We operate a large number of servers running Imunify360.
As every website and server operator is well aware, unwanted bot activity has increased hugely, and Imunify360 is not very effective at identifying and blocking these bots. As a multi-layered security solution, Imunify360 should be able to keep websites safe, but it completely misses this threat.
Today's bad bots do not respect robots.txt or even identify themselves as bots. Their behaviour often mimics a distributed denial-of-service attack because of:
- High request rate
- Requests only for website pages, not static assets; in our case that typically means PHP/SQL-generated pages, which are resource-heavy to generate
- Only one request per IP (even if you detect an IP as malicious and block it, that IP is never seen again)
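The pattern above is visible in ordinary access logs even when no single IP is abusive on its own. As a minimal sketch of what such detection could look like (the sample log lines, regex, and static-extension list are illustrative assumptions, not anything Imunify360 actually provides), this flags IPs that made exactly one request, and only to a dynamic page:

```python
import re

# Hypothetical combined-format access-log lines; IPs and paths are
# invented purely for illustration.
SAMPLE_LOG = """\
203.0.113.1 - - [10/May/2024:10:00:01 +0000] "GET /products.php?id=1 HTTP/1.1" 200 5120
203.0.113.2 - - [10/May/2024:10:00:02 +0000] "GET /products.php?id=2 HTTP/1.1" 200 5120
203.0.113.3 - - [10/May/2024:10:00:02 +0000] "GET /news.php HTTP/1.1" 200 4096
198.51.100.7 - - [10/May/2024:10:00:03 +0000] "GET /style.css HTTP/1.1" 200 1024
198.51.100.7 - - [10/May/2024:10:00:03 +0000] "GET /logo.png HTTP/1.1" 200 2048
"""

STATIC_EXT = (".css", ".js", ".png", ".jpg", ".gif", ".ico", ".woff2")
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "GET ([^ ]+)')

def single_hit_dynamic_ips(log_text):
    """Return IPs that made exactly one request, and only to a dynamic page."""
    hits = {}  # ip -> list of requested paths (query strings stripped)
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if not m:
            continue
        ip, path = m.groups()
        hits.setdefault(ip, []).append(path.split("?", 1)[0])
    return sorted(
        ip for ip, paths in hits.items()
        if len(paths) == 1 and not paths[0].endswith(STATIC_EXT)
    )

print(single_hit_dynamic_ips(SAMPLE_LOG))
# → ['203.0.113.1', '203.0.113.2', '203.0.113.3']
```

Each flagged IP is harmless in isolation, which is exactly the problem: per-IP rate limiting never fires, so the useful signal is the aggregate (hundreds of one-shot IPs all requesting dynamic pages in a short window), and that signal only becomes actionable if it can be shared across servers.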
Imunify360 already has the building blocks to address this:
- It already has an RBL feature (so bot IPs identified on one server could be propagated effectively to every other server)
- It already has a captcha feature (but it's almost impossible to trigger this from just one request!)