What is Crawler/Spider/Robot in SEO?

In the context of Search Engine Optimisation (SEO), a crawler, spider, or robot is a software program that search engines use to scan and index the content of websites. Crawlers follow links across the web and visit pages, collecting information about the content and structure of each page they reach. The search engine then uses this information to understand the pages and to determine their relevance and ranking in search results. Crawlers are an essential part of the search engine process: they are how search engines discover and index new content on the web.

In short, the crawler is the component of a search engine that indexes websites automatically. It copies each web page's source code into the engine's index database and follows the links on that page to discover further pages.
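
To make the fetch-store-follow cycle concrete, here is a minimal sketch of that behaviour in Python, using only the standard library. The function and variable names (`crawl`, `LinkExtractor`, `index`) are illustrative assumptions, not part of any real search engine, and the example deliberately omits things a production crawler must handle, such as robots.txt rules, crawl delays, and duplicate-content detection.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, store its source, follow its links."""
    queue = deque([seed_url])
    index = {}           # url -> page source, a stand-in for the search engine's index
    seen = {seed_url}

    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load

        index[url] = html  # copy the page source into the index

        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)  # follow links to other pages

    return index


if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=5)
    print(f"Indexed {len(pages)} pages")
```

The breadth-first queue mirrors how a crawler expands outward from a seed URL: every page it fetches both adds content to the index and supplies new links to visit next.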