 

Optimize Nginx + PHP-FPM for faster response times (for OpenX ad serving)


I'm currently running Nginx + PHP-FPM to serve ads with OpenX. My response times are terrible, even under low load, yet CPU and memory usage are fine, so I can't figure out where the bottleneck is.

My current config for nginx and php-fpm is:

    worker_processes 20;
    worker_rlimit_nofile 50000;

    error_log  /var/log/nginx/error.log;
    pid        /var/run/nginx.pid;

    events {
        worker_connections  15000;
        multi_accept off;
        use epoll;
    }

    http {
        include       /etc/nginx/mime.types;

        access_log  /var/log/nginx/access.log;

        sendfile        on;
        tcp_nopush     off;

        keepalive_timeout  0;
        #keepalive_timeout  65;
        tcp_nodelay        on;

        gzip  on;
        gzip_disable "MSIE [1-6]\.(?!.*SV1)";
        gzip_comp_level 2;
        gzip_proxied    any;
        gzip_types    text/plain text/html text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript;

        include /etc/nginx/conf.d/*.conf;
        include /etc/nginx/sites-enabled/*;
    }

    server {
        listen   80;
        server_name  localhost;
        access_log  /var/log/nginx/localhost.access.log;

        # Default location
        location / {
            root   /var/www;
            index  index.php;
        }

        ## Parse all .php files in the /var/www directory
        location ~ .php$ {
            fastcgi_pass   localhost:9000;
            fastcgi_index  index.php;
            fastcgi_param  SCRIPT_FILENAME  /var/www$fastcgi_script_name;
            include fastcgi_params;
            fastcgi_param  QUERY_STRING     $query_string;
            fastcgi_param  REQUEST_METHOD   $request_method;
            fastcgi_param  CONTENT_TYPE     $content_type;
            fastcgi_param  CONTENT_LENGTH   $content_length;
            fastcgi_ignore_client_abort     off;
        }
    }

PHP-FPM:

    rlimit_files = 50000
    max_children = 500

I've only included the PHP-FPM parameters I changed from the defaults.

Does anyone have tips on how I can optimize this setup to serve more requests?

Asked by Fariz on Feb 16 '10.

1 Answer

First off: way too many workers, and limits set excessively high. The PHP-FPM worker count alone would bog your server down quite a bit. Uncapping limits on a server won't necessarily speed it up; it can actually have the opposite effect.

  1. Worker Count: 20 makes little sense unless you have a 20-processor/core machine; you're actually hurting performance, as the workers will cause excessive context switching. If you're running a dual-core processor, 2 workers should suffice.
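As a rough sketch, the worker count can simply be tied to the core count (a dual-core machine is assumed here for illustration):

```nginx
# Sketch: match worker_processes to the number of CPU cores
# (assuming a dual-core machine, per the answer's example).
worker_processes 2;

# Newer nginx versions can also detect the core count themselves:
# worker_processes auto;
```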

  2. Worker Connections: Again, just throwing a limit into the heavens doesn't solve your problems. If your ulimit -n output is something like 1024, then worker_connections should be set to 1024 or less (maybe even 768). It's unlikely that you'll have 2 x 1024 simultaneous connections, especially with something like PHP in front.
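For instance, on a system where ulimit -n reports 1024, a conservative events block might look like this (the values are illustrative, not tuned for any particular box):

```nginx
events {
    # Keep this at or below the per-process open-file limit (ulimit -n);
    # each connection consumes at least one file descriptor.
    worker_connections  1024;
    use epoll;
}
```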

  3. Root location and PHP settings: refer to http://wiki.nginx.org/Pitfalls . It works best to put your root directive at the server {} level, not the location level. Once you do that, you can use $document_root$fastcgi_script_name as the SCRIPT_FILENAME value, since $document_root is automatically propagated to the location blocks below it.
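Following that advice, the server block from the question might be restructured roughly like this (paths and the FastCGI address are carried over from the original config):

```nginx
server {
    listen      80;
    server_name localhost;

    # root at the server level, so it is inherited by every location block
    root  /var/www;
    index index.php;

    location ~ \.php$ {
        include fastcgi_params;
        # $document_root comes from the server-level root above
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass  localhost:9000;
    }
}
```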

  4. You may wish to handle static files directly, in other words:

        location ~* \.(ico|css|js|gif|jpe?g|png)$ {
            expires max;
            add_header Pragma public;
            add_header Cache-Control "public, must-revalidate, proxy-revalidate";
        }
  5. Use a PHP accelerator, namely APC (with apc.enabled=1 in php.ini) or XCache, and be mindful of your PHP settings, such as memory_limit. For example, on a system with only 2 GB of RAM, it makes very little sense to allow 500 workers with a limit of 128 MB each. That's especially true if you're also running other services on the server.
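The arithmetic behind that warning, as a back-of-the-envelope sketch in a PHP-FPM pool config (the per-worker footprint and headroom figures are assumptions for illustration, not measurements):

```ini
; Worst case memory = max_children * memory_limit
;   500 workers * 128 MB = 64000 MB -- impossible on a 2 GB machine.
;
; A saner sizing: if ~1536 MB of RAM is left for PHP after other services,
; and each worker actually uses ~40 MB, then:
;   1536 MB / 40 MB ~= 38 workers
pm.max_children = 38
```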

Answered by KBeezie on Sep 24 '22.