
NGINX

HTTP server.

Documentation

CGI

Tips & Tricks

Optimize Nginx

more than one worker process

http://articles.slicehost.com/2008/5/15/ubuntu-hardy-nginx-configuration/

user www-data www-data;
# Nginx can have more than one worker process running at the same time.
# To take advantage of SMP and to enable good efficiency I would recommend changing this to read:
worker_processes  4;
events {
    worker_connections  1024;
}
http {
    tcp_nodelay  on;
    include      /usr/local/nginx/sites-enabled/*;
}

worker_connections sets the number of connections each worker process can handle. 1024 is a good default.

You can work out the maximum clients value from this and the worker_processes settings:

max_clients = worker_processes * worker_connections
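As a quick sketch of this formula: assuming worker_processes is set to the number of CPU cores (via nproc, a common convention not stated on this page) and worker_connections keeps the 1024 default above, the theoretical maximum number of clients can be computed like this:

```shell
# Illustrative only: compute max_clients = worker_processes * worker_connections,
# assuming one worker per CPU core and the default of 1024 connections per worker.
worker_processes=$(nproc)
worker_connections=1024
max_clients=$((worker_processes * worker_connections))
echo "max_clients = $max_clients"
```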

sendfile is used when the server (Nginx) can ignore the contents of the file it is sending: it relies on the kernel's sendfile support instead of using its own resources to copy the file through user space.

It is generally used for larger files (such as images) that do not need a multiple request/confirmation exchange to be served, thus freeing resources for items that do need that level of 'supervision' from Nginx.

Leave it on unless you know why you need to turn it off.
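The http block above does not show the directive itself; as a sketch, it could be enabled like this (tcp_nopush is a common companion setting, added here as an assumption, not taken from the original page):

```nginx
http {
    # serve static files via the kernel's sendfile() instead of
    # copying them through user space
    sendfile    on;
    # often paired with sendfile: send response headers and the
    # beginning of the file in a single packet
    tcp_nopush  on;
    tcp_nodelay on;
}
```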

nginx proxy cache for OSM tiles

nginx configuration for setting up an OSM tile cache, by CQuest: https://gist.github.com/cquest/ef82d82e7700e116b340ca3f77532880

# tilecache.conf
# keep tiles in /var/cache, for 24h, up to 16 GB
proxy_cache_path /var/cache/nginx-tilecache levels=1:2 keys_zone=tilecache:100m inactive=24h max_size=16G;

server {
  server_name tilecache.mondomaine.tld a.tilecache.mondomaine.tld b.tilecache.mondomaine.tld c.tilecache.mondomaine.tld;
  listen 80;

  location / {
    proxy_pass            http://tilecache.openstreetmap.fr;
    proxy_cache           tilecache;
    proxy_cache_valid     200 302  24h;
    proxy_cache_valid     404      1m;
    proxy_cache_lock      on;

    # add the client IP to the request sent to the upstream
    proxy_set_header      X-Forwarded-For $remote_addr;

    # report the cache status in the response to the client
    add_header            X-Cache-Status $upstream_cache_status;
    # if the upstream is down, serve the copy we have in cache
    proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
  }
}
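A quick way to check the cache (assuming tilecache.mondomaine.tld resolves to this server, and using a hypothetical z/x/y tile path) is to request the same tile twice and watch the X-Cache-Status header added above: the first request should report MISS, the second HIT:

```shell
# First request: tile fetched from the upstream, expect "X-Cache-Status: MISS"
curl -sI http://tilecache.mondomaine.tld/14/8180/5905.png | grep -i x-cache-status
# Second request: tile served from /var/cache/nginx-tilecache, expect "X-Cache-Status: HIT"
curl -sI http://tilecache.mondomaine.tld/14/8180/5905.png | grep -i x-cache-status
```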
informatique/nginx.txt · Last modified: 03/10/2019 08:36 by cyrille