Submitted by Ryan Barnett 2/27/2009
Distinguishing real human clients from automated clients interacting with your web application is critical to preventing a range of issues such as abuse of functionality, data scraping, brute force attacks, and denial of service. Unfortunately, most of today's web applications severely lack this type of client behavior profiling to identify automated clients and then initiate an appropriate response.
One recent example of a company that has implemented some anti-automation functionality is Facebook, with its "Friend Throttling" technology: a threshold on how quickly users are able to add friends. Some may think this particular example seems trivial and that an automated attack exploiting this function would not negatively impact the site. To those folks, I have one word: SAMY. Don't forget that as that automated attack escalated, MySpace was eventually taken offline in order to fix the issue. Having a mission-critical web application offline for any period of time can be devastating for many companies.
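The throttling idea above boils down to a sliding-window threshold: allow at most N occurrences of a sensitive action per user per time window, and treat anything faster as likely automation. Here is a minimal, self-contained sketch of that logic; the class and parameter names are invented for illustration and are not Facebook's or any library's actual implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/**
 * Minimal sliding-window throttle sketch: permits at most maxEvents
 * occurrences of an action within any windowMillis span. One instance
 * would be kept per user (e.g., in a session or cache).
 */
public class ActionThrottle {
    private final int maxEvents;
    private final long windowMillis;
    private final Deque<Long> timestamps = new ArrayDeque<>();

    public ActionThrottle(int maxEvents, long windowMillis) {
        this.maxEvents = maxEvents;
        this.windowMillis = windowMillis;
    }

    /** Returns true if the action is allowed at time nowMillis, false if throttled. */
    public synchronized boolean tryAcquire(long nowMillis) {
        // Discard timestamps that have aged out of the window.
        while (!timestamps.isEmpty()
                && nowMillis - timestamps.peekFirst() >= windowMillis) {
            timestamps.pollFirst();
        }
        if (timestamps.size() >= maxEvents) {
            // Threshold exceeded within the window: likely automated behavior.
            return false;
        }
        timestamps.addLast(nowMillis);
        return true;
    }
}
```

A human user rarely trips a sanely chosen threshold, while a script hammering the "add friend" endpoint does so immediately; the response to a trip (delay, CAPTCHA, account lock) is a separate policy decision.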
Anti-Automation defense needs more attention.
Organizations with custom-coded Java applications can add detection capabilities by implementing the OWASP ESAPI code and then configuring the IntrusionDetector class to enforce arbitrary access thresholds. For sites that either don't use Java or don't have direct access to the source code, you can move this logic upstream and enforce the same access thresholds within a web application firewall.
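In ESAPI, those thresholds live in the ESAPI.properties configuration file: the application reports a named security event to the IntrusionDetector (e.g., via ESAPI.intrusionDetector().addEvent(...)), and the properties define how many occurrences within what interval trigger which responses. A sketch of what such a configuration might look like, using a hypothetical "friendAdd" event name:

```
# Hypothetical event "friendAdd": if a user triggers it more than 10
# times in 60 seconds, log the violation and log the user out.
IntrusionDetector.event.friendAdd.count=10
IntrusionDetector.event.friendAdd.interval=60
IntrusionDetector.event.friendAdd.actions=log,logout
```

The exact property names and available actions depend on your ESAPI version, so check the IntrusionDetector documentation shipped with the release you deploy.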