
Server response gets cut off halfway through

I have a REST API that returns JSON responses. Sometimes, seemingly at random, the JSON response gets cut off halfway through. So the returned JSON string looks like:

...route_short_name":"135","route_long_name":"Secte // end of response

I'm pretty sure it's not an encoding issue, because the cut-off point keeps changing position depending on the JSON string returned. Nor have I found a particular response size at which the cut-off happens (I've seen a 65 KB response come through intact while a 40 KB one was truncated).

Looking at the response header when the cut off does happen:

{
    "Cache-Control" = "must-revalidate, private, max-age=0";
    Connection = "keep-alive";
    "Content-Type" = "application/json; charset=utf-8";
    Date = "Fri, 11 May 2012 19:58:36 GMT";
    Etag = "\"f36e55529c131f9c043b01e965e5f291\"";
    Server = "nginx/1.0.14";
    "Transfer-Encoding" = Identity;
    "X-Rack-Cache" = miss;
    "X-Runtime" = "0.739158";
    "X-UA-Compatible" = "IE=Edge,chrome=1";
}

Doesn't ring a bell either. Anyone?
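One way to see the truncation outside the app (the URL below is a placeholder, not my real endpoint) is to fetch the same resource twice with curl and compare byte counts; sizes that vary between identical requests point at server-side truncation rather than a client parsing problem:

```shell
# URL is a placeholder. Fetch the same endpoint twice and compare byte
# counts; a size that varies between identical requests suggests the
# response is being truncated on the server side.
curl -s http://api.example.com/routes -o run1.json
curl -s http://api.example.com/routes -o run2.json
wc -c run1.json run2.json
```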

asked May 11 '12 by samvermette

4 Answers

I had the same problem:

Nginx cut off some responses from the FastCGI backend. For example, I couldn't generate a proper SQL backup from phpMyAdmin. I checked the logs and found this:

2012/10/15 02:28:14 [crit] 16443#0: *14534527 open() "/usr/local/nginx/fastcgi_temp/4/81/0000004814" failed (13: Permission denied) while reading upstream, client: *, server: , request: "POST / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "", referrer: "http://*/server_export.php?token=**"

All I had to do to fix it was give proper permissions to the /usr/local/nginx/fastcgi_temp folder, as well as to client_body_temp.
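The fix above might look something like this (a sketch only: the paths assume a /usr/local/nginx prefix as in the log line, and www-data is a guess at the worker user, so adjust both to your setup):

```shell
# Find out which user the nginx worker processes actually run as
# (commonly www-data, nginx, or nobody).
ps -o user= -C nginx

# Hand the temp directories to that user and lock them down to it.
# Paths and user are assumptions; adjust to your nginx build.
chown -R www-data:www-data /usr/local/nginx/fastcgi_temp
chown -R www-data:www-data /usr/local/nginx/client_body_temp
chmod -R 700 /usr/local/nginx/fastcgi_temp /usr/local/nginx/client_body_temp
```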

Fixed!

Thanks a lot samvermette, your Question & Answer put me on the right track.

answered Nov 12 '22 by Clement Nedelcu


Looked up my nginx error.log file and found the following:

13870 open() "/var/lib/nginx/tmp/proxy/9/00/0000000009" failed (13: Permission denied) while reading upstream...

Looks like nginx's proxy was trying to save the response content (passed in by Thin) to a file. It only does so when the response size exceeds proxy_buffers (64 KB by default on 64-bit platforms). So in the end the bug was tied to the size of my response.

I ended up fixing the issue by setting proxy_buffering to off in my nginx config file, instead of upping proxy_buffers or fixing the file permission issue.

Still not sure about the purpose of nginx's buffer. I'd appreciate it if anyone could elaborate on that. Is disabling buffering completely a bad idea?
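For reference, the two alternatives mentioned above would look something like this inside the relevant server or location block (buffer sizes are illustrative, not recommendations):

```nginx
# Option 1: stream the upstream response straight to the client,
# never buffering it to memory or disk.
proxy_buffering off;

# Option 2: keep buffering on but enlarge the in-memory buffers so
# large responses no longer spill over into temp files on disk.
proxy_buffers 16 16k;
proxy_buffer_size 16k;
```

As for its purpose: buffering lets nginx absorb the response quickly from a fast backend and then drip-feed it to slow clients, freeing the backend worker early. Disabling it ties the backend up for the whole transfer, which is usually fine for small API responses but can hurt behind slow clients.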

answered Nov 12 '22 by samvermette


I had a similar problem with the server cutting off responses.

It happened only when I added a JSON header before returning the response: header('Content-Type: application/json');

In my case, gzip caused the issue.

I solved it by specifying gzip_types in nginx.conf and adding application/json to the list before turning gzip on:

gzip_types text/plain text/html text/css application/x-javascript text/xml application/xml application/xml+rss text/javascript application/json;
gzip on;

answered Nov 12 '22 by Dralac


It's possible you ran out of inodes, which prevents nginx from using the fastcgi_temp directory properly.

Try df -i; if you have 0% inodes free, that's the problem.

Try find /tmp -mtime +10 (files older than 10 days) to see what might be filling up your disk.

Or maybe it's another directory with too many files. For example, go to /home/www-data/example.com and count the files:

find . -print | wc -l
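If that count is huge, a quick way to see which subdirectories hold the most files (a generic shell sketch, not specific to any particular layout) is:

```shell
# Count regular files under each immediate subdirectory of the current
# directory and list the worst offenders first.
for d in */ ; do
  printf '%s %s\n' "$(find "$d" -type f | wc -l)" "$d"
done | sort -rn | head
```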

answered Nov 12 '22 by PJ Brunet