ShieldSquare classifies bot traffic into four categories:

  1. Data Center Bots:
    Bots with malicious intent that operate out of data centers.

  2. Bad User Agent:
    Unsophisticated bots used by amateur scrapers. They rely on readily available tools such as Scrapy and identify themselves with User-Agent strings like Scrapy, curl, or Wget.

  3. Integrity Check Failed:
    The most sophisticated bots, which mimic human behaviour. The ShieldSquare engine runs a few Turing tests before classifying traffic under this category.

  4. Legitimate Bots:
    Bots without malicious intent that crawl the site, predominantly for market intelligence purposes; they may even bring traffic to your site. They fall into five subcategories:
  • Monitoring bots – Bots used to monitor the system health of the customer's websites. (e.g. Pingdom)
  • Backlink checker bots – Bots that check the backlinks of URLs. (e.g. UASlinkChecker)
  • Social network bots – Bots run by social network sites. (e.g. facebookbot)
  • Partner bots – Bots from partners that are useful to the website. (e.g. PayPal IPN)
  • Aggregator bots – Bots that collate information from other websites. (e.g. WikioFeedBot)
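The four categories above can be sketched as a toy classifier. This is an illustrative sketch only: the category names come from the text, but the detection heuristics (user-agent substring matching, a boolean data-center flag, an integrity-check result) are simplified assumptions, not ShieldSquare's actual engine.

```python
# Hypothetical sketch of the four-way classification described above.
# The substring lists use only the example names given in the text.
BAD_USER_AGENTS = ("scrapy", "curl", "wget")
LEGITIMATE_BOTS = ("pingdom", "uaslinkchecker", "facebookbot")

def classify(user_agent: str, is_data_center_ip: bool,
             passed_integrity_check: bool) -> str:
    """Return one of the four bot categories, or 'Human'."""
    ua = user_agent.lower()
    if any(bot in ua for bot in LEGITIMATE_BOTS):
        return "Legitimate Bot"          # known good crawlers
    if is_data_center_ip:
        return "Data Center Bot"         # malicious, data-center origin
    if any(tool in ua for tool in BAD_USER_AGENTS):
        return "Bad User Agent"          # off-the-shelf scraping tools
    if not passed_integrity_check:
        return "Integrity Check Failed"  # failed behavioural Turing tests
    return "Human"

print(classify("Scrapy/2.11", False, True))   # Bad User Agent
print(classify("Mozilla/5.0", False, False))  # Integrity Check Failed
```

The ordering matters: legitimate bots are whitelisted first so that, for example, a monitoring bot running from a data center is not misclassified as malicious.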