Imagine you search your favorite search engine for "keyword phrase 1" and visit the first result (the 1st URL), but you return immediately to the search results; in search engine terms, you bounce back. You then click on the 2nd URL and never return to the search results. Let's say the exact same thing happens with 1000 other users searching for "keyword phrase 1". What should that search engine conclude? That the 2nd URL is more fit to be in the 1st position for "keyword phrase 1". The 1st URL might not even be fit to appear on the first page of the results for that search!
This is a very simple scenario, but it illustrates the concept. Imagine you are running a search engine and want to keep your customers (searchers) happy. That means giving them the most relevant results as high in the search results as possible, so they can click them as soon as possible. Wouldn't you want to use the past customer satisfaction of each particular search to improve subsequent searches? I would. If I could use searchers' behavior, I would be able to improve my search results over time. That is what the major search engines do, or at least ought to do.
Google uses this concept at least in its personalized search results, which are available to any Google user who has a Gmail, AdSense or AdWords account and specifically enables them through the Google History service. With this service, Google effectively rearranges the search results to match your preferences and past behavior.
Neither the concept nor the news that search engines might be using this is new to us. They have been discussed quite extensively in webmaster forums among search engine optimization professionals. Click-through rates are usually discussed as part of the same subject. Click-through rate is the percentage of times a listing (a URL in the search results) gets clicked out of the times it appears in them.
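To make the definition concrete, here is the click-through rate as a tiny function (just the arithmetic from the definition above, not any search engine's actual code):

```javascript
// Click-through rate: clicks over impressions, expressed as a percentage.
function clickThroughRate(clicks, impressions) {
  if (impressions === 0) return 0; // an unseen listing has no meaningful CTR
  return (clicks / impressions) * 100;
}

// A listing shown 1000 times and clicked 250 times:
// clickThroughRate(250, 1000) -> 25 (percent)
```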
How much does each one affect the position of each URL?
Lately, search results (at least Google's) don't seem to be steady. I see domains jump 10, 20 or even more positions up or down in the course of minutes or hours. High search engine positions are not the privilege of a few established domains anymore, it seems. I suspect it could be the effect of click-through and bounce rates being taken into account.
Google is recording click-throughs
Look at the HTML source of a Google results page and you will see that each result link, even the 99th result, gets a mousedown event handler.
When the user clicks on the link, the clk function fires a complex tracking URL call back to Google.
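To illustrate the mechanism, here is a purely hypothetical sketch of how such a handler could work. This is not Google's actual clk code (whose exact contents are not reproduced here); the function name, parameter names and the /click path are all made up for illustration:

```javascript
// Hypothetical sketch: on mousedown, build a tracking URL that carries the
// clicked result, the query, and the result's position in the listing.
function clk(destination, query, position) {
  const params = new URLSearchParams({
    q: query,               // the search phrase
    pos: String(position),  // rank of the clicked result
    dest: destination,      // where the user is actually going
  });
  return "/click?" + params.toString();
}

// A result link could then be written along these lines, firing the tracking
// URL (e.g. as an image beacon) just before the browser navigates away:
// <a href="http://example.com/"
//    onmousedown="beacon(clk(this.href, 'keyword phrase 1', 99))">
// (beacon() is likewise a made-up helper that would issue the request.)
```

The point is simply that a mousedown handler can report the click before navigation happens, which is all the engine needs to count click-throughs per position.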
How about bounce rate?
I also looked for cookies set in order to record the time a searcher stayed at the visited URL, but did not find any (maybe I did not look hard enough). Thinking about it, the exact time is not really needed to form an opinion about a URL visit. A subsequent visit to another URL from the same search results would mean that the first URL was not satisfactory enough. The time between subsequent URL visits can also be an indicator of the level of satisfaction.
Let's make our original example a bit more detailed to make this point clearer. Assume the example user above clicks on the 1st URL at 12:00 PM, then on the 2nd URL at 12:01 PM, on the 3rd URL at 12:10 PM, and on the 4th URL at 12:11 PM. By itself this might not mean anything, but if similar behavior is recorded from, say, 1000 different (unique) users, one can see that the 2nd URL kept the visitors longer, which probably means it was more relevant to the "keyword phrase 1" search. Again, this is a very simple example, but it serves as an illustration.
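The inference sketched above can be expressed in a few lines of code. This is illustrative only (not any search engine's known algorithm): the gap until the next click on the same results page is taken as an estimate of how long a result held the user.

```javascript
// Given one user's clicks on a results page, ordered by time, estimate how
// long each clicked result kept them before they came back and clicked again.
function dwellSeconds(clicks) {
  // clicks: array of { url, time } with time in milliseconds since the epoch.
  const estimates = [];
  for (let i = 0; i < clicks.length - 1; i++) {
    estimates.push({
      url: clicks[i].url,
      seconds: (clicks[i + 1].time - clicks[i].time) / 1000,
    });
  }
  return estimates; // the final click has no successor, so no estimate for it
}

// The example above: clicks at 12:00, 12:01, 12:10 and 12:11.
const t = (h, m) => Date.UTC(2023, 0, 1, h, m);
const visits = dwellSeconds([
  { url: "1st URL", time: t(12, 0) },
  { url: "2nd URL", time: t(12, 1) },
  { url: "3rd URL", time: t(12, 10) },
  { url: "4th URL", time: t(12, 11) },
]);
// 1st URL: 60s, 2nd URL: 540s, 3rd URL: 60s -- the 2nd URL held the visitor longest
```

Aggregated over many unique users, estimates like these would let an engine single out the 2nd URL as the most satisfying result, exactly as in the example.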
What does Google do with these statistics?
Lately there is a lot of turmoil in search engine positions. No site can keep a steady position for long; positions change even within the hour. I have seen this with client and personal sites competing for very competitive, or even moderately competitive, terms. What is happening, then? I believe Google can never be sure which specific URLs are the only ones fit to be on the first or even second page of the results, so it keeps changing the rankings. It lets other sites into the top results, giving all of them an equal chance to be clicked by searchers and to have their click-through and visited-time (or bounce) rates compared with the rest of the URLs. In theory, this sounds to me like a parameter in the Google algorithm that gives more URLs a fair chance. Mind you, links are still relevant, and I believe they always will be, since without them Google would not even know a site exists.
Improve your rankings
With these in mind, here is what I think is important for improving your search engine rankings.
a) Good and relevant link titles and snippets in the search results
The first impression a searcher has of your site is what Google presents in the search results. Usually Google creates a listing that consists of the URL's TITLE tag text and a small snippet taken from your site's content that contains all or part of the search phrase. If Google does not find such a snippet to present, it will go to DMOZ.org and fetch the site's listing as the snippet. If Google does not find a DMOZ listing, it will use the page's META description instead. It is therefore in the webmaster's hands what Google will display in the results. Short, relevant TITLEs are usually better than long, spammy ones, because a searcher usually reads them before clicking the link to your site; users have gotten smarter with the web and usually won't click on TITLEs that read like spam (everything and their dog is in them). Also, if a webmaster makes sure to include the phrases they want to be found for, I am sure Google will display those instead of the usually less relevant DMOZ listing.
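As a concrete sketch, these are the head elements the listing is typically drawn from (the site name and wording here are made up for illustration):

```html
<head>
  <!-- A short, relevant TITLE that contains the phrase you want to be found for -->
  <title>Blue Widgets - Handmade Blue Widgets from Athens</title>
  <!-- A META description Google can fall back on for the snippet -->
  <meta name="description"
        content="Handmade blue widgets, shipped worldwide. Photos, prices and ordering information.">
</head>
```

Keeping both concise and on-topic gives you some control over what the searcher reads before deciding whether to click.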
b) Fast load, good presentation and structure, and relevant content on the web page
The second impression a searcher has of your site is its immediate appearance. But even before that, how fast your site loads might be the deciding factor that keeps a visitor on your site. Studies show that a user takes only a fraction of a second to decide whether to spend their time on your site or go back. Web surfers don't have all day, so your page had better load fast. Once a visitor decides to stay, your content becomes the deciding factor that will keep them or make them bounce back.
Content IS KING after all
These factors were always important for keeping visitors happy and getting them to bookmark your site so they would visit it again. In light of the click-through and visit-time statistics that search engines might be using in their ranking algorithms, they become even more important to your site's improvement.
I should also mention that this discussion covered how Google records and, I believe, handles this user behavior data; but it is very probable that the other major search engines utilize such data as well (please come forward if you have any such evidence).

This system can be manipulated!
Any system with a finite set of rules that can be easily deduced can be manipulated. That does not mean, though, that the search engines won't take measures against it. If they actually have such a system in place, they probably have ways to check whether a request is man-made and comes from a unique computer. Such anti-fraud systems were probably needed in many other applications the search engines operate, and they should already have them in place.
Please share your thoughts or findings regarding the use of click-through and bounce statistics.