Nginx caching

Nginx has built-in request caching capabilities, and it is much simpler to set up than a dedicated cache such as Varnish.

What should be cached?

Server-side caching avoids regenerating the same pages on every request (for example, a WordPress post list), which can take seconds of processing time. Instead, the application generates a page once, and the result is stored in the cache. The next time a user visits the same page, it is not generated again; the cached version is returned instead. After a specified time period (the TTL, time to live) expires, the stored version is deleted and a fresh one is generated by your app.

In almost all cases you can cache pages for users who are not logged in, so caching works best for websites with mostly public content.

Enabling caching in Nginx

First of all, you need to configure the cache storage: where cached responses are kept on disk, how much shared memory is reserved for cache keys, and (optionally) the maximum total size of the cache. This is done in the configuration file (nginx.conf) in the 'http' section:

http {
...
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=all:32m max_size=1g;
...
}

# The "all" zone reserves 32 MB of shared memory for cache keys (roughly
# 8000 keys per megabyte); max_size caps the on-disk cache, stored in
# /var/cache/nginx, at 1 GB

Remember to create the cache folder and make it writable by the nginx worker user:

mkdir -p /var/cache/nginx
chown www-data: /var/cache/nginx   # replace www-data with your nginx user

Hosts setup

For caching to work, create a new host that listens on port 80 and move the main host to another port (81). The caching host forwards requests to the main host and serves responses from the cache.

Cache host

server {
        listen 80;

        location / {
                proxy_pass http://127.0.0.1:81/;
                proxy_set_header Host $host;   # pass the original Host header upstream
                proxy_cache all;
                proxy_cache_valid any 1h;
        }
}

# Pages will be stored in cache for 1 hour
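To verify that caching actually works, you can expose nginx's built-in $upstream_cache_status variable as a response header. A minimal sketch of the same cache host with the debug header added:

```nginx
server {
        listen 80;

        location / {
                proxy_pass http://127.0.0.1:81/;
                proxy_cache all;
                proxy_cache_valid any 1h;

                # reports HIT / MISS / BYPASS / EXPIRED — useful while debugging
                add_header X-Cache-Status $upstream_cache_status;
        }
}
```

Request the same page twice with `curl -I`: the first response should report MISS and the second HIT.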

Main host

server {
        listen 81;

        location / {
                # fpm and other
        }
}

# The application's usual configuration, but on port 81

Cookies and personalization

Many websites have pages with personalized content blocks. For pages that mix many personalized blocks into otherwise static content, SSI (Server Side Includes) allows more advanced caching. In the simplest case, you can just bypass the cache whenever the user has cookies.

server {
        listen 80;

        location / {
                set $do_not_cache 0;
                if ($http_cookie ~* ".+" ) {
                        set $do_not_cache 1;
                }
                # skip the cache lookup for users with cookies...
                proxy_cache_bypass $do_not_cache;
                # ...and do not store their responses either, so a
                # personalized page is never served to other visitors
                proxy_no_cache $do_not_cache;
                proxy_pass http://127.0.0.1:81/;
                proxy_cache all;
                proxy_cache_valid any 1h;
        }
}
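The SSI approach mentioned above caches the static page shell while a personalized fragment is fetched fresh on every request. A minimal sketch (the /personal/ location and the upstream port are assumptions for illustration):

```nginx
server {
        listen 80;

        location / {
                ssi on;                          # process SSI directives in cached pages
                proxy_pass http://127.0.0.1:81/;
                proxy_cache all;
                proxy_cache_valid any 1h;
        }

        # personalized fragments are never cached
        location /personal/ {
                proxy_pass http://127.0.0.1:81/;
        }
}

# In the page body the application emits an include such as:
#   <!--# include virtual="/personal/greeting" -->
# nginx caches the surrounding page but resolves the include on each request.
```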

Errors

It also makes sense to cache failed responses for a short time period. This helps avoid hammering broken parts of your website with repeated requests.

server {
        listen 80;

        location / {
                set $do_not_cache 0;
                if ($http_cookie ~* ".+" ) {
                        set $do_not_cache 1;
                }
                proxy_cache_bypass $do_not_cache;
                proxy_no_cache $do_not_cache;
                proxy_pass http://127.0.0.1:81/;
                proxy_cache all;
                proxy_cache_valid 404 502 503 1m;
                proxy_cache_valid any 1h;
        }
}
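A related option is serving a stale cached copy when the backend is down or slow, instead of returning an error at all. A sketch using proxy_cache_use_stale (the exact set of conditions is a judgment call):

```nginx
location / {
        proxy_pass http://127.0.0.1:81/;
        proxy_cache all;
        proxy_cache_valid any 1h;

        # if the backend errors out or times out, serve the expired
        # cached copy rather than failing the request
        proxy_cache_use_stale error timeout updating http_500 http_502 http_503;

        # let only one request refresh an expired entry;
        # concurrent requests get the stale copy meanwhile
        proxy_cache_lock on;
}
```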

FastCGI caching

Nginx can also cache responses from a FastCGI backend directly, without a separate proxy host. To enable this cache, set up its parameters in the 'http' section of the nginx.conf file:

fastcgi_cache_path /var/cache/fpm levels=1:2 keys_zone=fcgi:100m;
fastcgi_cache_key "$scheme$request_method$host$request_uri";

# Reserves 100 MB of shared memory for cache keys in the "fcgi" zone;
# add max_size=... to cap the on-disk cache size

Do not forget to create the folder and make it writable by the nginx worker user:

mkdir -p /var/cache/fpm
chown www-data: /var/cache/fpm   # replace www-data with your nginx user

Add caching rules to main host configuration:

server {
    listen   80;

    location ~ \.php$ {
        fastcgi_pass unix:/var/run/php5-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_cache fcgi;
        fastcgi_cache_valid 200 60m;
    }
}

# Responses with 200 code will be cached for 60 minutes
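As with the proxy cache, you will usually want to skip the FastCGI cache for users with cookies and for non-GET requests. A sketch that mirrors the cookie check from the earlier example:

```nginx
location ~ \.php$ {
        set $do_not_cache 0;
        if ($http_cookie ~* ".+") {
                set $do_not_cache 1;   # user has cookies: likely personalized
        }
        if ($request_method = POST) {
                set $do_not_cache 1;   # never cache POST responses
        }

        fastcgi_pass unix:/var/run/php5-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_cache fcgi;
        fastcgi_cache_valid 200 60m;
        fastcgi_cache_bypass $do_not_cache;  # skip the cache lookup...
        fastcgi_no_cache $do_not_cache;      # ...and skip storing the response
}
```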

The bottom line

Take advantage of caching. It's pretty simple to set up, but can give a ten-fold increase in website performance.
