Google – the word that has become synonymous with answers, and the verb without which life seems incomplete. This article covers the evolution of the Google Search algorithm and how it knows exactly what you have been searching for. More importantly, it explains the new algorithm introduced in November 2011 and how it improves the results returned by the search giant.
Google Search Algorithm:
Google was nicknamed BackRub when it was just a research project at Stanford, developed by Larry Page and Sergey Brin in 1997. Since then, the search algorithm has seen many improvements and refinements, with the latest major update arriving in November 2011.
BackRub has been modified since its inception and renamed PageRank, which is now the basis of all Google search results. The patented PageRank algorithm ranks web pages that match a given search string, recursively filtering them across various categories. The final score earned by each web page is then used to rank the pages. A web page can be ranked from 1 to 10, with 10 being the highest; the higher the rank, the higher the position when people search for related terms.
Basic Comparison with other Search Engines:
Older search algorithms relied on the frequency of occurrence of the search string on a web page or, at best, its association with other text on the page. PageRank, by contrast, computes scores recursively: it assumes that web pages linked from many important pages are themselves more important than those that are not. As a result, Google Search results correlate more closely with a human's notion of importance.
Beyond this base technology, Google is reported to use more than two hundred indicators, including adaptive and contextual signals, to produce its final results. Hence, if one has been searching for a particular item and then searches for something else, Google is known to return results for the second item along with links correlating the two. In other words, Google Search refines itself with continued searching. This is what lets Google's search "think" like a human, and it is what has earned Google more than 50% market share in the search domain.
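The recursive scoring described above can be sketched in a few lines of Python. This is only an illustration of the principle, not Google's actual implementation; the damping factor of 0.85 is the value commonly cited from the original PageRank paper, and the three-page "web" below is entirely hypothetical.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively score pages: each page shares its current score
    equally among the pages it links to, so pages linked from many
    well-scored pages accumulate higher scores.

    `links` maps each page to the list of pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        # every page keeps a small base score regardless of links
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, B links back to A.
web = {"A": ["B"], "B": ["A"], "C": ["B"]}
scores = pagerank(web)
# B receives links from two pages, so it ends up with the highest score.
print(max(scores, key=scores.get))  # prints "B"
```

Note how C, which no page links to, keeps only the base score, while B, linked from two pages, rises to the top; this is the "linked from important pages" effect the article describes.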
Google’s Updated Algorithm:
The latest update, also called the "freshness" algorithm, is expected to affect more than one third of the searches made today.
Google moved to the Caffeine infrastructure last year, enabling faster crawling and indexing of far more data and thereby returning results faster. Building on this architecture, Google has changed its search algorithm to show fresher results.
In addition to the original PageRank technology, the freshness algorithm also takes into account a web page's relative "freshness", that is, how frequently it is updated. Google expects this to affect about 35% of searches and groups them into the following categories:
- Recent events or hot topics: For events or topics that begin trending on the web, the latest information will be displayed immediately. The freshness algorithm looks for web pages that have been updated most recently with the latest authentic news and displays these to the user. Here the quality of content is given equally high importance. Examples include wars, protests, and new inventions.
- Regularly recurring events: Events that occur periodically are handled slightly differently. Since their period may range from days to years, the algorithm identifies the frequency of updates and then displays news related to the most recent occurrence. This covers domains such as elections, the Olympics, and live sports scores.
- Frequent updates: Topics such as reviews of models and gadgets fall into this category; they require the latest posts for a comprehensive picture of the product.
- There are a few other sectors, such as recipes, where frequent updating is not required. In such cases the algorithm concentrates on providing authentic results.
The freshness algorithm may have a negative impact on small businesses, for whom it will be nearly impossible to keep their websites constantly updated, leaving them with poor visibility. Especially in industries where there is regular news to report, updates must be made rapidly or the website will be lost among the millions of results Google returns.
Though a freshness check in search results is a change most will welcome, some critics feel it might lead to spamming. Because Google has not disclosed how it verifies that content on a web page has been updated, one question remains unanswered: does every change made to a website count as an update? If so, bloggers might simply make small alterations or re-post earlier posts to gain favor with Google. Google has indirectly answered this, saying that along with freshness, quality of content will also be taken into consideration, and that it will continuously try to ensure that the best and most relevant websites feature at the top.