What do you mean by a spider?

Many search engines use programs called spiders to index websites. Spiders, also known as crawlers or robots, are automated data-gathering tools that visit sites to find new or updated web pages and links; this process is called web crawling. A spider follows hyperlinks from page to page, gathering textual and meta information and relaying it back to the search engine's servers for indexing.
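To make the crawling loop concrete, here is a minimal sketch in Python using only the standard library. It follows the steps described above: fetch a page, extract its text and hyperlinks, store the text, and queue the links for later visits. Function names such as crawl and LinkAndTextParser are invented for this illustration, and real spiders add much more (robots.txt checks, politeness delays, distributed queues), all omitted here.

```python
# A minimal sketch of the crawling loop described above. Assumptions:
# Python standard library only; no robots.txt handling or rate limiting.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkAndTextParser(HTMLParser):
    """Collects hyperlinks and visible text from one fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        # Record the target of every <a href="..."> hyperlink.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        # Keep non-empty runs of visible text for the index.
        if data.strip():
            self.text.append(data.strip())


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, store its text, follow its links."""
    queue, seen, index = [seed_url], set(), {}
    while queue and len(index) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable page: skip it, as a spider would
        parser = LinkAndTextParser()
        parser.feed(html)
        index[url] = " ".join(parser.text)  # textual info for the index
        # Resolve relative links against the current URL and queue them.
        queue.extend(urljoin(url, link) for link in parser.links)
    return index


if __name__ == "__main__":
    pages = crawl("https://example.com")
    print(f"Indexed {len(pages)} page(s)")
```

The breadth-first queue is what makes the spider "follow hyperlinks": each visited page seeds the next round of visits, so the crawl spreads outward from the starting URL.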

Spiders may also rate the content they index, helping the search engine judge how relevant a page is to a given search. The name comes from the way they crawl across the Web, following link after link much as a spider moves across its web, often spanning many sites in parallel. Search engines rely on spiders to build and refresh their indexes.
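As a toy illustration of the relevancy rating mentioned above, the snippet below scores each indexed page by how often a query term appears in its text, reusing the index dict from the earlier sketch. This term-count measure is purely an assumption for the example; real engines combine many more signals, such as link structure and freshness.

```python
# A toy relevancy score over the `index` dict produced by crawl() above:
# count occurrences of the query term in each page's stored text.
# Illustrative only; real ranking uses far richer signals.
def rank(index, term):
    term = term.lower()
    scores = {url: text.lower().count(term) for url, text in index.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```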
