AppSecUSA 2015
Friday, September 25 • 10:30am - 11:25am
Detecting and managing bot activity more efficiently


Bots, also commonly referred to as scrapers or spiders, are omnipresent on the Internet. Studies show that bot activity accounts for a large share of overall Internet traffic. Bots are built for many purposes, from simple health checks that verify a site is up, to spidering a site to index its content or collect specific information en masse. Not all bots are bad:
- Bots operated by search engines, audience analytics, SEO companies, web site performance monitoring services, or partners drive users to the site and are vital to its success and the business it supports. But as with any automated activity, even well-intentioned bot traffic can put a significant load on the web site infrastructure.
- Other bot activity, sometimes harder to detect, brings more questionable benefits: it can hurt the image of the company that owns the targeted site, or even cut into the company's revenue in the case of content theft or competitive scraping.

The amount of bot activity seen on a given web site is generally proportional to the value of the content hosted on the site. The value of the content is defined by the dollar amount that can be gained by exploiting the data collected.

Bots are usually part of botnets and come in all shapes and sizes. Some are very simplistic scripts that run on a single machine and can only support a single task. Others are highly distributed and have the same abilities as a web browser and support a wide variety of tasks.

To efficiently detect as much bot activity as possible, it is essential to combine several techniques that match the different types of bots. In this talk, we'll discuss detection methods including evaluating the HTTP header signature, testing the client's capabilities, and evaluating the client's behavior.
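The first of those methods, HTTP header signature evaluation, can be sketched as follows. This is a minimal, illustrative heuristic, not the speaker's actual implementation: the header names and User-Agent markers checked here are assumptions chosen for the example.

```python
# Hypothetical sketch of HTTP header signature evaluation: flag requests
# whose headers do not look like those a typical browser would send.

# Headers a mainstream browser normally sends; their absence is suspicious.
EXPECTED_BROWSER_HEADERS = {"user-agent", "accept", "accept-language", "accept-encoding"}

# User-Agent substrings commonly used by scripts and HTTP libraries.
SCRIPT_UA_MARKERS = ("curl", "wget", "python-requests", "scrapy", "httpclient")

def header_signature_score(headers: dict) -> int:
    """Return a suspicion score: 0 looks like a browser, higher looks like a bot."""
    names = {name.lower() for name in headers}
    score = len(EXPECTED_BROWSER_HEADERS - names)  # one point per missing header
    ua = headers.get("User-Agent", "").lower()
    if not ua:
        score += 2  # no User-Agent at all is a strong signal
    elif any(marker in ua for marker in SCRIPT_UA_MARKERS):
        score += 2  # self-identified tooling
    return score

browser = {"User-Agent": "Mozilla/5.0 ...", "Accept": "text/html",
           "Accept-Language": "en-US", "Accept-Encoding": "gzip"}
script = {"User-Agent": "python-requests/2.31"}
print(header_signature_score(browser))  # 0
print(header_signature_score(script))   # 5
```

In practice a score like this would only be one signal, combined with capability tests and behavioral analysis before any action is taken.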

Detecting bots, however, is only half of the battle. Just because a bot has a certain header signature or behavior doesn't mean it has bad intentions or will have a negative impact on the business. Clearly identifying and categorizing the bots is key, and this talk will provide guidelines on how to do both.
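Categorization can start from a bot's declared identity. The sketch below is illustrative only; the marker strings and category names are assumptions, and as the comment notes, a declared identity must be verified before it is trusted:

```python
# Hypothetical sketch of bot categorization by declared identity.
# The markers and categories here are illustrative, not exhaustive.

KNOWN_GOOD = {
    "googlebot": "search engine",
    "bingbot": "search engine",
    "pingdom": "performance monitoring",
}

def categorize(user_agent: str) -> str:
    ua = user_agent.lower()
    for marker, category in KNOWN_GOOD.items():
        if marker in ua:
            # The UA string alone can be spoofed; in practice the claim
            # should be verified (e.g. reverse DNS lookup of the source IP).
            return category
    return "unverified"

print(categorize("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # search engine
print(categorize("python-requests/2.31"))                     # unverified
```

Bots that land in the "unverified" bucket are the ones whose behavior then needs closer scrutiny.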

Once detected and categorized, bots that are considered good for the business should be allowed access to the content. The ones that do not appear to bring any benefit, however, should be handled appropriately. Denying the traffic outright has the immediate effect of signaling to the bot operators that their activity has been detected. Although such action may provide immediate relief, the operators may adapt, redeploy, and resume their activity undetected. To conclude the session, we’ll go over some guidelines on how best to respond to “bad bot” activity.
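The trade-off described above — outright denial tips off the operator — suggests a graduated response policy. The categories and actions below are a hypothetical sketch, not the speaker's recommendations:

```python
# Hypothetical response policy: good bots pass; bad bots are not simply
# blocked (which tells the operator they were detected) but can instead
# be slowed down ("tarpitted") or served alternate content.

RESPONSE_POLICY = {
    "good": "allow",       # search engines, monitoring services, partners
    "neutral": "allow",    # no clear benefit, no clear harm
    "bad": "tarpit",       # slow responses rather than deny outright
    "hostile": "deny",     # clearly abusive: block and log
}

def respond(category: str) -> str:
    # Default to "allow" for unknown categories to avoid blocking real users.
    return RESPONSE_POLICY.get(category, "allow")

print(respond("bad"))      # tarpit
print(respond("unknown"))  # allow
```

Defaulting unknown categories to "allow" trades some exposure for a lower risk of turning away legitimate visitors; the right default depends on the value of the content being protected.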


David Senecal

Product Architect, Akamai Technologies
15 years of network technology, web performance, and web security support and consulting experience from 50+ large-scale projects for Global 1000 companies as well as start-ups. Proven ability to conceive, develop, deploy, and operate complex systems and applications.

Friday September 25, 2015 10:30am - 11:25am PDT
Room B