Thursday, November 3, 2011

Web Crawlers (blog 3 on the search engine optimization series)


A web crawler is a type of bot, or software agent. Other names for it include ants, automatic indexers, bots, web spiders, web robots, or, especially in the FOAF community, skutters.
In general, a crawler starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies the hyperlinks on each page and adds them to the list of URLs to visit, called the crawler frontier. URLs are visited recursively according to a set of policies.
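The seed-and-frontier loop described above can be sketched in a few lines. This is a minimal illustration, not a production crawler: the `fetch` function is passed in as a parameter (an assumption made here so the sketch works without network access), link extraction uses a crude regular expression rather than a real HTML parser, and politeness policies (robots.txt, rate limiting) are omitted.

```python
from collections import deque
import re

def crawl(seeds, fetch, max_pages=10):
    """Breadth-first crawl: start from seed URLs, follow hyperlinks via a frontier queue."""
    frontier = deque(seeds)   # URLs waiting to be visited: the "crawler frontier"
    visited = set()
    pages = {}                # url -> page content, the copies kept for later indexing
    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        html = fetch(url)     # injected fetcher; returns page HTML or None on failure
        if html is None:
            continue
        pages[url] = html
        # Crude link extraction; a real crawler would parse HTML and resolve relative URLs.
        for link in re.findall(r'href="(https?://[^"]+)"', html):
            if link not in visited:
                frontier.append(link)
    return pages
```

With a dictionary standing in for the web, `crawl(["http://a.example"], site.get)` visits each reachable page exactly once, which is the essence of what a search-engine crawler does at scale.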
Many websites, and search engines in particular, use web crawlers as a means of keeping their data up to date. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches. Web crawlers can also be used for automated maintenance tasks on a website.
