Final answer:
Web crawlers are software tools designed to index the internet for search engines by systematically browsing webpages and storing the information they find.
Step-by-step explanation:
Web crawlers, also known as spiders or bots, are software tools that search the Internet for pages and sites and store what they find in a search engine's database. Unlike gateways, shopping carts, or payment systems, web crawlers systematically browse the World Wide Web to index the content found on webpages.
Their primary purpose is to keep a search engine's index current, so that the content available to users through the engine stays up to date.
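To make the idea concrete, here is a minimal sketch of how such a crawler might work, written in Python using only the standard library. The seed URL, page limit, and the use of a plain dictionary as the "database" are assumptions made for this example; a real crawler would also respect robots.txt, throttle its requests, and parse page content for indexing rather than storing raw HTML.

```python
# A minimal, illustrative web crawler sketch (not production code).
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Systematically browse pages starting from seed_url, storing each
    page's HTML in a dict that stands in for a search engine's database."""
    index = {}                     # url -> page content (the "database")
    queue = deque([seed_url])      # frontier of pages still to visit
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in index:
            continue               # skip pages already indexed
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue               # skip unreachable pages
        index[url] = html          # store the page's information
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:  # follow discovered links
            queue.append(urljoin(url, link))
    return index

if __name__ == "__main__":
    pages = crawl("https://example.com", max_pages=3)
    print(f"Indexed {len(pages)} page(s):", list(pages))
```

The breadth-first queue is what makes the browsing "systematic": every link discovered on an indexed page is added to the frontier, so the crawler spreads outward from the seed page, which is how crawlers can keep a search engine's view of the web current.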