
Varnish + Expression Engine + nginx Config

Recently the Marketing team at my workplace asked me how much load our web server could handle, specifically for a web site running on Ellis Lab's Expression Engine ahead of an upcoming promotion.

Typically our nginx/php-fpm stack is capable of more than 1000 concurrent users without error, so I was confident it could do the business, but I said I would load test Expression Engine to be sure.

To start with I spun up a basic Blitz.io rush, ramping from 1 to 1000 concurrent users over 1 minute, expecting good results with some fine tuning work to do. I was shocked when the server fell over completely at ~40 users, locking up all the PHP-FPM processes and returning "502 Bad Gateway". For reference, our standard WordPress and PHP sites running on the same box can easily handle 1000 concurrent users with a ~10ms average response time…

Here’s the first round results from Blitz.io:

Digging deeper with our NewRelic application monitoring, it became clear the issue lay within Expression Engine's index.php and was not database or external content related. It appears that the Expression Engine architecture is its own bottleneck and does not play well with standard php-fpm FastCGI and nginx static file caching methods, which are very good. Furthermore, the native limitation of around 35 users also seems to be backed by Ellis Lab's own cache testing here.

I also tested an Expression Engine plugin called CE Cache, which only created more overhead in the same Blitz.io load testing, topping out at just 17 concurrent users. Possibly a more complex FastCGI or nginx config would have worked better here, but ultimately the best caching gains are realised outside of the application layer, and CE Cache is simply a waste of money.

Application Level Transaction Trace on NewRelic:

Enter Varnish.

For those that don't know Varnish, it's self-described as a web application accelerator. Varnish caches your web server's output and serves that cached HTTP content before your users ever hit the web server, giving fast response times. Typically Varnish works well on sites with fairly static content, and it's simple to set up. Varnish sits in front of your web server, serving cached content and only sending requests through to the web server when required.

In the end I bit the bullet and injected Varnish into our stack, simplifying the rest of the stack configurations. Using Ellis Lab's post and the Varnish wiki as starting points, we set about creating our own Varnish config for Expression Engine.

For reference we are running Varnish 3.0.5, Expression Engine 2.5.5, nginx 1.4.7 and PHP 5.3.

Varnish 3.0 configuration (/etc/varnish/default.vcl):

backend default {
    .host = "127.0.0.1"; # the ip address nginx is listening on
    .port = "8000";  # the port nginx is listening on
}

sub vcl_recv {

    # Forward client's IP to backend
    remove req.http.X-Forwarded-For;
    set req.http.X-Forwarded-For = client.ip;

    # Set the URI of your system directory (EE 2.5 support added)
    if (req.url ~ "^/admin.php" || req.url ~ "^/system/" || req.url ~ "ACT=" || req.request == "POST" || (req.url ~ "member_box" && req.http.Cookie ~ "exp_sessionid")) {
        return (pass);
    }

    unset req.http.Cookie;

    return(lookup);
}

sub vcl_fetch {

    # Enable ESI includes
    set beresp.do_esi = true;

    if (beresp.ttl > 0s) {
        # Remove Expires from backend, it's not long enough
        unset beresp.http.expires;

        # Set the clients TTL on this object
        set beresp.http.cache-control = "max-age=900";

        # Set how long Varnish will keep it
        set beresp.ttl = 1d;

        # marker for vcl_deliver to reset age
        set beresp.http.magicmarker = "1";
    }

    return(deliver);
}

sub vcl_deliver {
    if (resp.http.magicmarker) {
        # Remove the magic marker
        unset resp.http.magicmarker;

        # By definition we have a fresh object
        set resp.http.age = "0";
    }
}
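Since vcl_fetch enables ESI (set beresp.do_esi = true), you can punch dynamic holes in otherwise cached pages. As a hypothetical example (the /member_box URL here is an assumption matching the pass rule in vcl_recv above, so Varnish would fetch that fragment fresh on each request while the surrounding page stays cached):

```html
<!-- Cached page template -->
<html>
  <body>
    <!-- This fragment is fetched separately by Varnish on every request -->
    <esi:include src="/member_box" />
    <p>The rest of the page is served from Varnish's cache.</p>
  </body>
</html>
```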

This is a fairly standard config that puts Varnish up front listening on port 80, caching for ~1 day and passing non-cached requests to nginx listening on port 8000 (the exact port doesn't really matter). To change the ports Varnish uses during your testing phase, edit /etc/sysconfig/varnish (RedHat/CentOS/Amazon Linux) or /etc/default/varnish (Debian/Ubuntu).
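As a sketch, the relevant settings look something like the following; defaults vary by distro and Varnish packaging, so treat the paths and values here as assumptions to verify against your own install:

```
# /etc/sysconfig/varnish (RedHat/CentOS/Amazon Linux)
VARNISH_LISTEN_PORT=80

# /etc/default/varnish (Debian/Ubuntu)
DAEMON_OPTS="-a :80 \
             -T localhost:6082 \
             -f /etc/varnish/default.vcl \
             -S /etc/varnish/secret \
             -s malloc,256m"
```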

With Varnish ready to roll, we need to set up nginx listening on port 8000 and tailored to Expression Engine. I've read a few nginx configs for Expression Engine, but many of these are outdated or overly complex for my liking. I've found it's best to keep nginx simple, with as few rewrites as possible.

nginx server configuration (/etc/nginx/sites-enabled/my-ee-app.conf):

server {
    listen 8000;
    server_name www.my-ee-app.com.au;

    root /var/www/www.my-ee-app.com.au;

    # access logs off for min io
    access_log off;
    error_log /var/www/www.my-ee-app.com.au_error_log warn;

    # Deny access to hidden files
    location ~* /\.ht {
        deny all;
        log_not_found off;
    }

    location / {
        try_files $uri $uri/ /index.php;

        # Remove index.php
        rewrite ^/index\.php/?(.*)$ /$1 permanent;
    }

    # EE PHP-FPM Config
    location ~* \.php$ {
        fastcgi_split_path_info ^(.+\.php)(.*)$;
        try_files $uri =404;
        include fastcgi_params;
        fastcgi_pass 127.0.0.1:9000;
        #fastcgi_pass unix:/var/run/php-fpm/www.sock;
    }
}

And finally my PHP-FPM pool configuration (/etc/php-fpm.d/www.conf):

[www]
listen = 127.0.0.1:9000
#listen = /var/run/php-fpm/www.sock
listen.backlog = -1

user = www-data
group = www-data

pm = dynamic
pm.max_children = 50
pm.start_servers = 5
pm.min_spare_servers = 5
pm.max_spare_servers = 10
pm.max_requests = 500

slowlog = /var/log/php-fpm/www-slow.log

php_admin_value[error_log] = /var/log/php-fpm/www-error.log
php_admin_flag[log_errors] = on
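A note on pm.max_children: it's worth sizing it to your RAM rather than guessing. A rough rule of thumb (my assumption, not from the original tuning above) is to divide the memory you can spare for PHP by the average resident size of one FPM worker:

```shell
# Hedged rule of thumb for sizing pm.max_children.
# Both figures are assumptions; measure real worker size on your own
# box with something like: ps -o rss -C php-fpm
ram_for_php_mb=2048   # RAM reserved for PHP-FPM (assumed 2 GB)
avg_worker_mb=40      # average resident size per worker (assumed)
echo $(( ram_for_php_mb / avg_worker_mb ))
```

With those assumed figures this prints 51, which is in the same ballpark as the pm.max_children = 50 used above.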

That's about it. Now restart your services so everything picks up the new config:

/etc/init.d/varnish restart
/etc/init.d/php-fpm restart   (or /etc/init.d/php5-fpm restart)
nginx -s reload

Now to load test on Blitz.io…

Blitz.io chart: 1000 concurrent users, that's better!

Blitz.io stats: 1000 concurrent users

And voila: 42 million hits/day based on 1000 concurrent users over 1 minute, with sub-10ms response times and only a 0.05% error rate!
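For the curious, that headline number is just Blitz's one-minute rush extrapolated to a full day. Roughly (the per-minute figure is my back-calculation from the daily total, not a number Blitz reported):

```shell
# Blitz.io extrapolates the 60-second rush to a 24-hour rate.
hits_per_minute=29167        # back-calculated assumption: ~42M / 1440 minutes
minutes_per_day=$(( 60 * 24 ))
echo $(( hits_per_minute * minutes_per_day ))   # ~42 million hits/day
```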

From here you'll want to fine-tune according to your caching requirements, and/or look at load balancing if you expect more traffic, but most people will get great value from a single server with this stack.

Let me know your thoughts and how this worked for you; if you have any improvements, please post them below.

cheers!

Mike

The post Varnish + Expression Engine + nginx Config appeared first on Mike Walton's Technical Journal.

