A website crawler is a program that browses the Internet in a methodical, automated fashion. Website crawlers are also known as automatic indexers, bots, worms, web robots, ants, or spiders.

GoogleBot

Web crawling is often referred to as spidering. Search engines, and some other sites, use this method to index the web: crawling lets a search engine such as Google save a copy of a website for later processing, so the page can be indexed for faster searching. Link checking and HTML validation are other forms of crawling, used to automate otherwise redundant maintenance tasks on a website. Crawlers are also used to gather specific types of information from web pages, such as email addresses (usually for sending spam, which is why that practice is frowned upon).
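The crawling process described above boils down to two steps repeated in a loop: fetch a page, then extract its links and add the unseen ones to a queue. A minimal sketch in Python using only the standard library might look like this (the `fetch` callback, page limit, and function names are illustrative assumptions, not any particular crawler's API; a real crawler would also respect robots.txt and rate limits):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links like "/about" to absolute URLs.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def crawl(start_url, fetch, max_pages=50):
    """Breadth-first crawl: fetch is any callable that maps a URL to HTML.

    Injecting fetch keeps the sketch testable without network access;
    in practice it could wrap urllib.request.urlopen.
    """
    visited = set()
    frontier = [start_url]
    while frontier and len(visited) < max_pages:
        url = frontier.pop(0)
        if url in visited:
            continue
        visited.add(url)
        for link in extract_links(fetch(url), url):
            if link not in visited:
                frontier.append(link)
    return visited
```

For example, running `crawl` over a small in-memory "site" (a dict of URL-to-HTML) visits every reachable page exactly once, which is the essence of how an indexer walks a site.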
