Search engines are interested in delivering timely, relevant, high-quality search results to their users. They are constantly researching, developing, testing and refining ways to enhance the service they provide – looking to optimise the relevance and quality of the results they serve back to the user on every single query. The better the search experience for the user, the better the reputation of the search engine and the more users it will attract. The more users a search engine has, the more attractive it is to advertisers, and the more ad revenue it can pull in. Putting users first makes search engines richer, and that makes search engine shareholders happy.
Scouring the web
To deliver accurate, relevant, high-quality search results to their users, search engines need to gather detailed information about the billions of web pages out there. They do this using automated programs called “bots” (robots), also known as “spiders”, which they send out to crawl the web. Spiders follow hyperlinks and gather information about the pages they find.
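The link-following behaviour described above can be sketched as a simple breadth-first traversal. This is a toy model only: the pages and links live in an in-memory dictionary (all names here are hypothetical), whereas a real spider fetches pages over HTTP and parses the HTML for links.

```python
from collections import deque

# A toy, in-memory "web": page URL -> list of outbound hyperlinks.
# Hypothetical URLs; a real spider would fetch and parse live pages.
TOY_WEB = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example", "d.example"],
    "d.example": [],
}

def crawl(seed):
    """Breadth-first crawl: follow hyperlinks, visiting each page once."""
    seen = {seed}
    queue = deque([seed])
    visited = []
    while queue:
        page = queue.popleft()
        visited.append(page)
        for link in TOY_WEB.get(page, []):
            if link not in seen:   # avoid re-crawling pages already found
                seen.add(link)
                queue.append(link)
    return visited

print(crawl("a.example"))
# ['a.example', 'b.example', 'c.example', 'd.example']
```

The `seen` set is the essential detail: because the web is full of link cycles (here, `c.example` links back to `a.example`), a spider that did not remember visited pages would loop forever.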
Once a page has been crawled by a spider, the search engine stores details about the page’s content, and the links both into and out of it, in a massive database called an index. This index is highly optimised so that results for any of the hundreds of millions of search requests received every day can be retrieved from it almost instantly.
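At its core, the index described above is an inverted index: a mapping from each word to the set of pages that contain it, so a query can be answered by lookup rather than by re-reading every page. A minimal sketch, using hypothetical page texts in place of crawled content:

```python
# Hypothetical crawled pages: URL -> page text (stand-ins for real content).
PAGES = {
    "page1": "digital marketing strategies",
    "page2": "search engine marketing",
    "page3": "search engine ranking algorithms",
}

def build_index(pages):
    """Build an inverted index: word -> set of page URLs containing it."""
    index = {}
    for url, text in pages.items():
        for word in text.split():
            index.setdefault(word, set()).add(url)
    return index

index = build_index(PAGES)
print(sorted(index["search"]))     # ['page2', 'page3']
print(sorted(index["marketing"]))  # ['page1', 'page2']
```

Production indexes add far more (tokenisation, word positions, link data, compression), but the lookup-instead-of-scan structure is what makes near-instant retrieval over billions of pages possible.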
The list of results for any given search query is run through the search engine’s complex ranking algorithms: special programs that use a variety of closely guarded proprietary formulas to “score” a site’s relevance to the user’s original query. The output is sorted in order of relevance and presented to the user in the search engine results pages (SERPs).
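The real ranking formulas are, as noted, closely guarded and draw on hundreds of signals. Purely to illustrate the scoring-then-sorting idea, here is a deliberately naive sketch that scores pages by how often the query terms appear in them (page names and texts are hypothetical):

```python
# Hypothetical pages: URL -> page text.
PAGES = {
    "page1": "digital marketing strategies",
    "page2": "search engine marketing search",
    "page3": "search engine ranking",
}

def score(page_text, query):
    """Naive relevance score: total occurrences of each query term.
    Real engines combine hundreds of proprietary signals, not just this."""
    words = page_text.split()
    return sum(words.count(term) for term in query.split())

def rank(pages, query):
    """Return page URLs in descending score order (a toy 'SERP')."""
    return sorted(pages, key=lambda url: score(pages[url], query),
                  reverse=True)

print(rank(PAGES, "search engine"))
# ['page2', 'page3', 'page1']
```

Even this toy version shows the pipeline the text describes: every candidate page gets a numeric relevance score, and the results page is simply that list sorted from highest score to lowest.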
Ryan, D., 2016. Understanding digital marketing: marketing strategies for engaging the digital generation. Kogan Page Publishers.