Good bots are beneficial to businesses as well as individuals. When you search for a website, or for phrases related to a website's products or services, you get relevant results on the search results page. This is made possible by search engine spider bots, also known as crawler bots (such as Googlebot, Bingbot, and Baiduspider, to name a few).
Good bots are generally deployed by reputable companies, and for the most part they respect the rules webmasters create to regulate their crawling activity and indexing rate, which can be declared in a website's robots.txt file for crawlers to read. Certain crawlers can also be prevented from indexing a website altogether if they are not useful or needed by the business. For example, the Baidu crawler can be blocked if a business does not operate in China and/or does not cater to the Chinese market.
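As a concrete sketch, a robots.txt file along the following lines would block Baidu's crawler while throttling Bing's. The /private/ path is purely illustrative, and note that Crawl-delay is a non-standard directive: Bingbot honors it, while Googlebot ignores it.

```
# Block Baidu's crawler entirely (its user agent is "Baiduspider")
User-agent: Baiduspider
Disallow: /

# Ask Bing's crawler to wait 10 seconds between requests
# (Crawl-delay is non-standard; honored by Bingbot, ignored by Googlebot)
User-agent: Bingbot
Crawl-delay: 10

# All other crawlers may index everything except an illustrative private directory
User-agent: *
Disallow: /private/
```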
Apart from search engine crawlers, good bots also include partner bots (e.g. Slackbot), social network bots (e.g. the Facebook crawler), website monitoring bots (such as Pingdom), backlink checker bots (e.g. SemrushBot), aggregator bots (like Feedly), and more. Even good bots such as crawlers can cause problems at times, for instance when their traffic spikes and approaches the limits of server capacity, or when their volumes skew analytics.
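To keep analytics from being skewed, bot traffic is typically filtered out by user agent. The sketch below assumes simple (path, user-agent) log records and an illustrative signature list; real deployments should also verify crawlers via reverse-DNS lookups, since user-agent strings are easily spoofed.

```python
# Minimal sketch: flag requests from known good bots so they can be
# excluded from analytics. The substrings below are illustrative, not
# an exhaustive or authoritative list.

KNOWN_BOT_SIGNATURES = [
    "Googlebot",    # Google search crawler
    "Bingbot",      # Bing search crawler
    "Baiduspider",  # Baidu search crawler
    "Slackbot",     # Slack link-preview partner bot
    "Pingdom",      # uptime-monitoring bot
    "SemrushBot",   # backlink-checker bot
    "Feedly",       # feed-aggregator bot
]

def is_known_bot(user_agent: str) -> bool:
    """Return True if the user agent matches a known good-bot signature."""
    ua = user_agent.lower()
    return any(sig.lower() in ua for sig in KNOWN_BOT_SIGNATURES)

# Example: separate bot hits from human traffic in a list of log records
requests = [
    ("/pricing", "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"),
    ("/pricing", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Gecko/20100101 Firefox/126.0"),
]
human_traffic = [(path, ua) for path, ua in requests if not is_known_bot(ua)]
print(human_traffic)
```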