HTTP Robot Mitigation


Synopsis

The Robot Mitigation module is a rewrite of the Roboo module (https://github.com/yuri-gushin/Roboo) in C.
The Robot Mitigation module defends against HTTP robot attacks, mainly malicious scans and DDoS attacks. Robot Mitigation uses a "challenge" verification method: it sends clients specific responses that can be interpreted by browsers. If the client is a browser, the original request is re-generated and carries a specific cookie value. Robot Mitigation then decides whether to let the request pass according to the inserted cookie.

Robot Mitigation has the following enhancements added to the original Roboo module:

  • More efficient, thanks to the C implementation; robots are detected before the request reaches the content handler phase.
  • Simplified configuration, easier to use.
  • The calling sequence and actions are handled by the NetEye security layer along with the other security modules.
  • For JavaScript challenges, Robot Mitigation returns JavaScript code randomly chosen from a user-defined set instead of returning the same code every time.
  • Based on the secure session mechanism, supports configurable actions, including Pass, Block, and Blacklist (with support from the dynamic blacklist module).
  • Supports returning notification messages as HTML pages to clients when HTTP requests are blocked or added to the blacklist.

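The directives described in the next section are typically combined in a single location block. The following minimal sketch shows the module enabled with a JavaScript challenge; the location path and the numeric values are illustrative assumptions, not recommendations:

server {
    location / {
        # enable robot mitigation with a JavaScript challenge
        robot_mitigation on;
        robot_mitigation_mode js;

        # add a client to the blacklist after 3 failed challenges
        robot_mitigation_blacklist 3;

        # re-challenge clients after 600 seconds
        robot_mitigation_timeout 600;
    }
}
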

Directives

robot_mitigation

Syntax robot_mitigation on / off
Default off
Context Location

Enables or disables Robot Mitigation.
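Example:

robot_mitigation on;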

robot_mitigation_cookie_name

Syntax robot_mitigation_cookie_name cookie_name
Default SENGINX-ROBOT-MITIGATION
Context Location

Sets the cookie name used by Robot Mitigation.
Example:

robot_mitigation_cookie_name robot_cookie;


robot_mitigation_mode

Syntax robot_mitigation_mode js / swf
Default js
Context Location

Specifies the challenge type, either JavaScript or Flash (see the example after this list):

  • If js is selected, Robot Mitigation returns a randomly chosen piece of user-defined JavaScript code to the client. If JavaScript is enabled in the client browser, the browser runs the code and initiates a new request.
  • If swf is selected, Robot Mitigation returns a predefined Flash file to the client. If the Flash plugin is installed in the client browser, the browser runs this Flash file and initiates a new request.

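Example (swf is chosen here purely for illustration; the default is js):

robot_mitigation_mode swf;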

robot_mitigation_blacklist

Syntax robot_mitigation_blacklist failed_count
Default 5
Context Location

Sets the threshold for failing the robot check, for example:

robot_mitigation_blacklist 3;

This means that if a client fails the check 3 times, it will be added to the blacklist.


robot_mitigation_timeout

Syntax robot_mitigation_timeout timeout
Default 60
Context Location

Sets the time, in seconds, after which the next challenge is initiated.
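Example (the value is illustrative; the default is 60 seconds):

robot_mitigation_timeout 300;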

robot_mitigation_challenge_ajax

Syntax robot_mitigation_challenge_ajax on/off;
Default off
Context Location
Version Since 1.5.5

This directive controls whether AJAX requests, identified by the XMLHttpRequest header, are also challenged.
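Example:

robot_mitigation_challenge_ajax on;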

robot_mitigation_global_whitelist

Syntax robot_mitigation_global_whitelist ua_var_name=UA_whitelist_variable ip_var_name=IP_whitelist_variable ip_var_value=value;
Default -
Context Location
Version Since 1.5.11

This directive specifies the globally defined IP whitelist and User-Agent whitelist. The IP whitelist is provided by nginx's geo module; in the example below, the User-Agent whitelist is defined with the whitelist_ua directive.
Example:

#define an ip whitelist
geo $ip_wl {
    ranges;
    default 0;

    127.0.0.1-127.0.0.1 1;
    3.0.0.1-3.2.1.254 1;
}

#define a UA whitelist
whitelist_ua $ua_wl {
    "autotest" ".*\.test\.com";
}

server {
    location / {
         robot_mitigation_global_whitelist ua_var_name=ua_wl ip_var_name=ip_wl ip_var_value=1;
    }
}

