OpenAI's web crawler caused a severe disruption at Triplegangers, a seven-person e-commerce company, sending tens of thousands of requests that overwhelmed the company's server and crashed its website. The CEO described the episode as "basically a DDoS attack." The incident has drawn attention to the implications of aggressive web scraping, particularly because Triplegangers' robots.txt file was not configured to block the crawler. It highlights the challenges small businesses face when confronted with the crawling capacity of large AI companies.
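For context, OpenAI documents that its training crawler identifies itself as GPTBot and honors robots.txt. A minimal directive to opt a site out (an illustrative sketch, not Triplegangers' actual file) would be:

```
# Block OpenAI's documented training crawler site-wide
User-agent: GPTBot
Disallow: /
```

Note that robots.txt is advisory: it only protects against crawlers that choose to respect it.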
➡️ OpenAI's bot overwhelmed a seven-person company's website, resulting in a crash that resembled the effects of a DDoS attack due to high traffic. https://t.co/6GEMQ7XSnk
➡️ OpenAI's bot caused a seven-person company's website to crash, exhibiting effects similar to a DDoS attack from the overwhelming bot traffic. https://t.co/E4YBsbyNes
Google's crawler keeps reporting 500 errors for weirdly long URLs, but I can't reproduce them. A week ago I already 301'd the long URLs to better pages (and 404'd the ones that aren't valid), but Google says it was still getting 500s yesterday! Example: https://t.co/Dnzd5KFHiU https://t.co/CdwBQbGlKY