Crawler Lag

A crawler is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "spider" or a "bot."
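
The crawl loop itself is simple in outline: fetch a page, record it for the index, extract its links, and queue those links for later visits. The sketch below illustrates that loop using only the Python standard library; the seed URL, page limit, and in-memory index are illustrative assumptions, not how any particular search engine implements its crawler.

    # A minimal sketch of the crawl loop described above, using only the
    # Python standard library. The seed URL and page limit are illustrative.
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen


    class LinkExtractor(HTMLParser):
        """Collects href values from <a> tags on a fetched page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def crawl(seed_url, max_pages=10):
        """Visit pages breadth-first, building a tiny index of URL -> raw HTML."""
        index = {}
        queue = [seed_url]
        while queue and len(index) < max_pages:
            url = queue.pop(0)
            if url in index:
                continue  # skip pages already visited
            try:
                with urlopen(url, timeout=10) as response:
                    html = response.read().decode("utf-8", errors="replace")
            except OSError:
                continue  # skip unreachable or failing pages
            index[url] = html  # the "entry" a real engine would analyze further
            parser = LinkExtractor()
            parser.feed(html)
            # resolve relative links against the current page and enqueue them
            queue.extend(urljoin(url, link) for link in parser.links)
        return index


    if __name__ == "__main__":
        pages = crawl("https://example.com")
        print(f"Indexed {len(pages)} page(s)")

A production crawler adds politeness (robots.txt, rate limiting), deduplication, and distributed scheduling on top of this basic loop, which is where the lag between a page changing and the index reflecting it comes from.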

See Also:

SEO, Page Rank, Deep Linking, Linkage
