Site Monitoring

What Search Engines Do

A search engine is a special kind of software that analyses web pages and builds an index of them. Users can then ask the search engine to find web pages by entering keywords that appear on those pages.
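
To picture the indexing idea (this is only an illustration, not any particular engine's implementation), the sketch below builds a tiny inverted index that maps each keyword to the pages containing it, then answers a keyword query against it. The sample pages and URLs are made up for the example.

```python
from collections import defaultdict

# Tiny inverted index: keyword -> set of page URLs containing it.
# The pages below are invented purely for illustration.
pages = {
    "http://example.com/a": "cheap flights to paris and rome",
    "http://example.com/b": "hotel deals in paris",
    "http://example.com/c": "rome travel guide",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results

print(search("paris"))        # both Paris pages
print(search("paris hotel"))  # only the hotel-deals page
```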

The search engine builds up its very large database (there are billions of web pages) by scanning web sites periodically in the background. If a page is linked from a site the engine has already indexed, its robot is bound to find it sooner or later. This is a huge task, and because there are hundreds of search engines a popular web site can be visited thousands of times a week by the robots building these indexes. To avoid repeatedly visiting pages that do not change, robots tend to revisit frequently changing pages much more often than those whose content is fairly static. The main engines will visit important pages once a day and less 'important' ones perhaps once a month.
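
The revisit scheduling described above can be sketched roughly as follows; the interval bounds and the doubling/halving rule are invented for illustration, and fetch() is just a stand-in for a real HTTP download, not taken from any real crawler.

```python
import hashlib
import heapq
import time

def fetch(url):
    # Stand-in for a real HTTP download.
    return "<html>page body for %s</html>" % url

MIN_INTERVAL = 60 * 60          # 1 hour (illustrative lower bound)
MAX_INTERVAL = 30 * 24 * 3600   # 30 days (illustrative upper bound)

class RevisitScheduler:
    """Revisit pages that change often more frequently than static ones."""

    def __init__(self, urls):
        now = time.time()
        self.queue = [(now, url) for url in urls]   # (next_visit_time, url)
        heapq.heapify(self.queue)
        self.fingerprints = {}                      # url -> content hash
        self.intervals = {url: MIN_INTERVAL for url in urls}

    def crawl_next(self):
        due, url = heapq.heappop(self.queue)
        digest = hashlib.sha1(fetch(url).encode()).hexdigest()
        if self.fingerprints.get(url) != digest:
            # Content changed since last visit: come back sooner.
            self.intervals[url] = max(MIN_INTERVAL, self.intervals[url] // 2)
        else:
            # Content unchanged: back off towards the monthly visit.
            self.intervals[url] = min(MAX_INTERVAL, self.intervals[url] * 2)
        self.fingerprints[url] = digest
        heapq.heappush(self.queue, (due + self.intervals[url], url))
        return url
```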

When a robot has found a page it reads it. The different search engines keep secret exactly what information on a page is analysed; in the early days this was just the META keywords, but now it includes all the principal page elements. The engine builds up sets of keywords mentioned somewhere on the web page and assigns each a weighting to try to establish how relevant the page is to a particular word or phrase. The information is then stored away in the search engine's gigantic database. Some time later this data will be made available to users of the search engine.
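
One way to picture this weighting, purely as an illustration and not any engine's actual formula, is to score a term more highly when it appears in prominent elements such as the title or headings than when it only appears in body text:

```python
from collections import Counter
from html.parser import HTMLParser

# Illustrative element weights; real engines keep their weighting secret.
WEIGHTS = {"title": 10, "h1": 5, "h2": 3}
DEFAULT_WEIGHT = 1  # ordinary body text

class WeightedTermExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.open_tags = []        # stack of currently open tags
        self.scores = Counter()    # term -> accumulated weight

    def handle_starttag(self, tag, attrs):
        self.open_tags.append(tag)

    def handle_endtag(self, tag):
        if self.open_tags and self.open_tags[-1] == tag:
            self.open_tags.pop()

    def handle_data(self, data):
        tag = self.open_tags[-1] if self.open_tags else ""
        weight = WEIGHTS.get(tag, DEFAULT_WEIGHT)
        for word in data.lower().split():
            self.scores[word.strip(".,;:")] += weight

extractor = WeightedTermExtractor()
extractor.feed("<html><title>Paris hotels</title>"
               "<body><h1>Cheap hotels</h1><p>Book a hotel in Paris.</p></body></html>")
print(extractor.scores.most_common(3))  # 'hotels' and 'paris' score highest
```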

A user types in a search word or key phrase to find relevant web pages. The search engine receives millions of search requests over HTTP as people use their web browsers, and responds to them with information from its database. A search engine must keep the time taken to return results as short as possible, so some very clever hardware and software is used to index the database. It would not be sufficient to put the whole database on one server; it is spread over large clusters of computers. If one server fails, the request can be routed to others that are still functioning correctly.
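
The failover idea can be sketched as below; the replica list and the query function are hypothetical stand-ins for whatever lookup service a real engine runs on each index server.

```python
import random

class ReplicaDown(Exception):
    """Raised by a replica that is not currently serving."""

def query_replica(replica, terms):
    # Stand-in for a real remote lookup on one index server.
    if not replica["alive"]:
        raise ReplicaDown(replica["name"])
    return ["result from %s for %r" % (replica["name"], terms)]

def search_with_failover(replicas, terms):
    """Try replicas in random order; fall back if one is down."""
    for replica in random.sample(replicas, len(replicas)):
        try:
            return query_replica(replica, terms)
        except ReplicaDown:
            continue  # this server failed; route to the next one
    raise RuntimeError("no replica available")

replicas = [
    {"name": "index-server-1", "alive": False},  # simulated failure
    {"name": "index-server-2", "alive": True},
]
print(search_with_failover(replicas, "paris hotels"))
```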

Displaying the data may not be the end of the matter. Some search engines track which result links a user clicks on, and whether they go on to the next page of results. They can use this information to deduce whether the user found what they were looking for, and give popular results a higher ranking.
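
As a rough illustration of that feedback loop (the boost formula here is invented, not any engine's real ranking), one could nudge a result's score by its observed click-through rate:

```python
from collections import defaultdict

impressions = defaultdict(int)  # times a result was shown
clicks = defaultdict(int)       # times it was clicked

def record_impression(url):
    impressions[url] += 1

def record_click(url):
    clicks[url] += 1

def boosted_score(url, base_score, boost=0.5):
    """Blend the original relevance score with click-through rate."""
    shown = impressions[url]
    ctr = clicks[url] / shown if shown else 0.0
    return base_score * (1.0 + boost * ctr)

# A result that users actually click creeps up the ranking over time.
for _ in range(100):
    record_impression("http://example.com/popular")
    record_impression("http://example.com/ignored")
for _ in range(60):
    record_click("http://example.com/popular")

print(boosted_score("http://example.com/popular", 1.0))  # greater than 1.0
print(boosted_score("http://example.com/ignored", 1.0))  # stays at 1.0
```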

For more information in this section please visit:
Which are the major search engines and how important each one is
A brief history of search engines
The basics of the algorithms used by search engines
How web page design can affect search engine ranking
The status of the old META keywords that used to be all important

Web site monitoring

See also: How search engines work

Site Vigil
Our leading product Site Position checks your position on all the important search engines; follow the download link for a free trial. We provide a unique Overall measure of position across all the search engines you choose. For more information, look at the features.