Highlights:

  • Among the U.S. websites subjected to testing, 68% lacked protection against simple bot attacks, and a mere 10% were successful in blocking all bot requests.
  • The statistics show only a modest improvement for websites employing both specialized bot detection and a CAPTCHA tool.

A newly published report from software-as-a-service bot protection startup DataDome SAS finds that roughly two-thirds of U.S. websites lack adequate protection against simple bot attacks.

The finding is outlined in DataDome’s U.S. Bot Security report, which involved testing over 9,500 of the largest U.S.-based websites spanning industries such as banking, ticketing, e-commerce, and gambling. The report underscores the urgent need for stronger bot protection measures, detailing the substantial risks websites face and asserting that traditional CAPTCHAs are no longer effective at thwarting automated attacks.

Bots come in various forms, and malicious bots alone account for more than 30% of total internet traffic. Cybercriminals deploy these bots to target online businesses with fraud and other attacks. DataDome notes that bots can disrupt digital business operations, jeopardizing data security and the customer experience; the consequences can include financial losses and reputational damage for the affected companies.

Of the U.S. websites tested, only 10% succeeded in blocking all bot requests, nearly 22% identified and blocked some, but not all, of the bots, and the remaining 68% lacked protection against simple bot attacks, allowing all nine bot types used in the testing to pass through.

When categorized by sector, e-commerce websites were the most vulnerable to basic bot attacks, with approximately 72% failing the tests. Classified ad websites followed closely, with 65% susceptible to such attacks. Gambling websites, by contrast, were notably more secure, with 31% blocking all the test bots. By company size, three-quarters of companies with 50 or fewer employees had entirely unprotected sites, compared with just under 60% of companies with more than 10,000 employees.

A notable finding in the report is that CAPTCHAs are no longer effective in countering malicious bots. Among the 2,587 sites using only a CAPTCHA tool, fewer than 5% could identify and block all types of bots, and CAPTCHAs failed to prevent bots on 77% of the sites relying solely on them.

The statistics show a modest improvement for websites employing both specialized bot detection and a CAPTCHA tool. Nearly 15% succeeded in blocking all malicious bots, while 30% managed to block some. However, the remaining 55% still failed to block any malicious bots.

Antoine Vastel, the Head of Research at DataDome, said, “Bots are becoming more sophisticated by the day, and U.S. businesses are clearly not prepared for the financial and reputational damage these silent assassins can cause. From ticket scalping and inventory hoarding to account fraud, bad bots wreak chaos on consumers and businesses alike. Businesses which do not deal adeptly with bad bots risk significant reputational damage, as well as exposing their customers to unnecessary risk. They must act now to protect themselves against this growing threat.”