Protecting your blog with NAXSI

I have been pondering how to make WordPress more secure, which is when I stumbled upon NAXSI, a web application firewall (WAF) developed specifically for nginx. As it happens, I provide an nginx Debian package for Squeeze that I plan to keep updated. So here is the package for nginx 1.2.6 (amd64) built against naxsi 0.48. I am using Debian Squeeze as a server.

First, credit where it is due: I based this blog entry on the entries of two friends, Guigui and iMil.

You will need to edit /etc/nginx/nginx.conf and add:

http {
    include        /etc/nginx/naxsi_core.rules;

    server {
        listen 80;
        listen [::]:80; # only if you are using IPv6
        root /where/your/awesome/blog/is;
        proxy_set_header Proxy-Connection "";

        location /RequestDenied {
                return 403;
        }

        location / {
                index  index.html index.php;
                include    /etc/nginx/naxsi.rules;
        }
    }
}

Then create the file /etc/nginx/naxsi.rules with the following content:

DeniedUrl "/RequestDenied";

## check rules
CheckRule "$SQL >= 8" BLOCK;
CheckRule "$RFI >= 8" BLOCK;
CheckRule "$TRAVERSAL >= 4" BLOCK;
CheckRule "$EVADE >= 4" BLOCK;
CheckRule "$XSS >= 8" BLOCK;

# WordPress naxsi rules

BasicRule wl:1000,1001,1005,1007,1010,1011,1013,1200,1308,1309,1315 "mz:$HEADERS_VAR:cookie";
# xmlrpc
BasicRule wl:1402 "mz:$HEADERS_VAR:content-type";

### simple BODY (POST)
# comments
BasicRule wl:1000,1010,1011,1013,1015,1200 "mz:$BODY_VAR:post_title";
BasicRule wl:1000 "mz:$BODY_VAR:original_publish";
BasicRule wl:1000 "mz:$BODY_VAR:save";
BasicRule wl:1008,1010,1011,1015 "mz:$BODY_VAR:sk2_my_js_payload";
BasicRule wl:1009,1005,1100,1310 "mz:$BODY_VAR:url";
BasicRule wl:1009,1100 "mz:$BODY_VAR:referredby";
BasicRule wl:1100 "mz:$BODY_VAR:_wp_original_http_referer";
BasicRule wl:1000,1001,1008,1009,1010,1011,1013,1015,1016,1100,1200,1302,1303,1310,1311,1315,1400 "mz:$BODY_VAR:comment";
BasicRule wl:1100 "mz:$BODY_VAR:redirect_to";
BasicRule wl:1000,1009,1315 "mz:$BODY_VAR:_wp_http_referer";
BasicRule wl:1000 "mz:$BODY_VAR:action";
BasicRule wl:1001,1013 "mz:$BODY_VAR:blogname";
BasicRule wl:1015,1013 "mz:$BODY_VAR:blogdescription";
BasicRule wl:1015 "mz:$BODY_VAR:date_format_custom";
BasicRule wl:1015 "mz:$BODY_VAR:date_format";
BasicRule wl:1015 "mz:$BODY_VAR:tax_input%5bpost_tag%5d";
BasicRule wl:1100 "mz:$BODY_VAR:siteurl";
BasicRule wl:1100 "mz:$BODY_VAR:home";
BasicRule wl:1000 "mz:$BODY_VAR:submit";
# news content matches pretty much everything
BasicRule wl:0 "mz:$BODY_VAR:content";
BasicRule wl:1000 "mz:$BODY_VAR:delete_option";
BasicRule wl:1000 "mz:$BODY_VAR:prowl-msg-message";
BasicRule wl:1100 "mz:$BODY_VAR:_url";
BasicRule wl:1001 "mz:$BODY_VAR:c2c_text_replace%5btext_to_replace%5d";
BasicRule wl:1200 "mz:$BODY_VAR:ppn_post_note";
BasicRule wl:1100 "mz:$BODY_VAR:author";

BasicRule wl:1000 "mz:$BODY_VAR:delete_option|NAME";

### Simple ARGS (GET)
# WP login screen
BasicRule wl:1100 "mz:$ARGS_VAR:redirect_to";
BasicRule wl:1000,1009 "mz:$ARGS_VAR:_wp_http_referer";
BasicRule wl:1000 "mz:$ARGS_VAR:wp_http_referer";
BasicRule wl:1000 "mz:$ARGS_VAR:action";
BasicRule wl:1000 "mz:$ARGS_VAR:action2";
# load and load[] GET variable
BasicRule wl:1015 "mz:$ARGS_VAR:load";
BasicRule wl:1015 "mz:$ARGS_VAR:load[]";
BasicRule wl:1015 "mz:$ARGS_VAR:q";

### URL
BasicRule wl:1000 "mz:URL|$URL:/wp/wp-admin/update-core.php";
BasicRule wl:1000 "mz:URL|$URL:/wp/wp-admin/update.php";
BasicRule wl:1009,1100 "mz:$URL:/wp/wp-admin/post.php|$BODY_VAR:_wp_http_referer";
BasicRule wl:1016 "mz:$URL:/wp/wp-admin/post.php|$BODY_VAR:metakeyselect";
BasicRule wl:11 "mz:$URL:/wp/xmlrpc.php|BODY";
BasicRule wl:11 "mz:$URL:/wp/wp-cron.php|BODY";
BasicRule wl:1100 "mz:$URL:/wp/wp-admin/post.php|$BODY_VAR:_wp_original_http_referer|NAME";
BasicRule wl:1000 "mz:$URL:/wp/wp-admin/post.php|$BODY_VAR:metakeyselect|NAME";
BasicRule wl:1000 "mz:$URL:/wp/wp-admin/user-edit.php|$BODY_VAR:from|NAME";
BasicRule wl:1310,1311 "mz:$URL:/wp/wp-admin/load-scripts.php|$ARGS_VAR:load[]|NAME";
BasicRule wl:1000 "mz:$URL:/wp/wp-admin/users.php|$ARGS_VAR:delete_count|NAME";

There is a learning mode that you can enable to train the ruleset against your application; I suggest you download the following script and then take a look at Guigui’s article. Here I just focused on a ready-to-go configuration.
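To give a feel for what learning mode produces, here is a minimal sketch (not the official tooling, which is nx_intercept/nx_extract) that turns NAXSI_FMT lines from the nginx error log into BasicRule whitelists like the ones above. The sample log line is illustrative; check your own error.log for the exact format your naxsi version emits.

```python
import re

# Illustrative NAXSI_FMT line; real lines from naxsi may carry more
# fields (scores, multiple zoneN/idN/var_nameN triples, etc.).
SAMPLE = ('2013/01/28 12:00:00 [error] 1234#0: *5 NAXSI_FMT: '
          'ip=192.0.2.1&server=blog.example.org&uri=/wp-login.php'
          '&learning=1&total_processed=12&total_blocked=1'
          '&zone0=BODY&id0=1000&var_name0=log')

def whitelists(line):
    """Yield one BasicRule per (zoneN, idN, var_nameN) triple in the line."""
    zones = dict(re.findall(r'zone(\d+)=(\w+)', line))
    ids = dict(re.findall(r'id(\d+)=(\d+)', line))
    names = dict(re.findall(r'var_name(\d+)=([^&,\s]*)', line))
    for n, rule_id in ids.items():
        zone = zones.get(n, '')
        var = names.get(n, '')
        # With a variable name, target that variable; otherwise the whole zone.
        mz = '$%s_VAR:%s' % (zone, var) if var else zone
        yield 'BasicRule wl:%s "mz:%s";' % (rule_id, mz)

for wl in whitelists(SAMPLE):
    print(wl)
# prints: BasicRule wl:1000 "mz:$BODY_VAR:log";
```

Of course, blindly whitelisting every hit defeats the purpose; review each generated rule before adding it, as discussed in the comments below.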

This configuration works on the latest Debian Squeeze with the latest packaged WordPress; of course, YMMV. I deployed it about two weeks ago and it is proving quite nice in terms of security. It protects 2 blogs on my main server.

13 thoughts on “Protecting your blog with NAXSI”

  1. Hello! Well, I have compiled nginx myself for the last 2 years, because back then the Squeeze version was fairly old compared to the current stable (0.7.67 vs 1.2.6). I needed the IPv6 changes in nginx 1.x.

    As for compiling your own nginx, I took the Debian source package, modified the version, and removed a few options.

    As for your issue: which version of nginx and which platform are you using?

    1. Thanks for the reply. I am running Debian Squeeze with nginx 1.2.6, the latest stable one. In the thread I linked, I also gave the compile command I used.
      Just to explain a bit: the reason I’m doing this is that the Debian nginx-naxsi package only contains a minimal number of modules, so I had to compile it manually. Check this for more or less up-to-date info:

      What I am looking for is nginx running with the modules I need, plus naxsi with a working learning mode and UI. But regarding the UI there is absolutely no info that I could find via Google; your blog is the first real clue I found 🙂

      1. I have not tried the UI yet, just looked at the logs. I will take a look at it later. How about trying my build, just to see how well it works for you?

        1. I would gladly use your build, but I would need to know how you built it, so I know which modules are included and which are not.

          And how does it work? I can start using your rules for WordPress, since I host mostly WP sites, but what if I have, say, an xtcommerce site too? How do I proceed to “learn” and create rules for that one particular site/vhost?

        2. Oh, and the link you gave above shows a README that says: “This are OLD naxsi’s rule generators. Do not use them, except if you know what you are doing.”

          Is that link still current or is there something newer as suggested by the README?

          And another reason why I was trying to build my own nginx is that I was planning to use nginx + naxsi + ngx_pagespeed.

          You don’t happen to want to build one like that too? 😉

  2. Hello,

    The UI has been lacking documentation for some time now; it should be improved soon, but in the meanwhile:

    The naxsi utils (nx_intercept and nx_extract) are two tools that are used to:
    help the user generate whitelists
    generate statistics and reporting

    They are available on the Google Code space (naxsi-ui package), and here are some links on how to use them: Performing learning from log files

    Once you have fed nx_intercept with your log files, start nx_extract without the -o option and it will start a web daemon. You will find the interface, IP, port & passwords in your naxsi-ui.conf (included within the package), and from there you should be able to find your way.

    If you are still in trouble, feel free to join us on #naxsi @ freenode.

    1. Thanks, I had a look and found the naxsi-ui/doc folder; that helps as well. Your pointers above should help me get started 🙂

      Can you explain in one simple sentence the different needs that nx_extract and nx_intercept fulfill? To me they both sound very similar, but only one offers the web interface…

  3. Actually, they are indeed very similar and will be merged into a single tool pretty soon (in fact, it should already have been released, but we’re late :D).
    nx_intercept is in charge of filling the DB with exceptions, either from logs or from live learning (avoid live learning unless you really know what you are doing). nx_extract reads from this database and outputs whitelists and statistics.

    Both can be started as daemons, but that is going to disappear; they will be merged into a single command-line-only tool. The HTML reports will stay, but will be written to files directly, so it can be cron’d and e-mailed 😉

    1. Thanks for all the feedback, but I’ve got another question while trying to wrap my head around the whole concept: I put my sites in learning mode, then extract rules from the hits in the error log and create whitelists. But how do I know all those hits in the error log are benign? What if there were real malicious attacks in there and now I am generating whitelists from them?

      1. Well, common sense mostly: you should be able to spot what does not look like legitimate traffic. To generate whitelists, you can look at how the application behaves, and even click through the site yourself and whitelist only what you have done; that way you know it is legitimate access.

Comments are closed.