I am building a high-load web statistics system that works by embedding an <img> tag into sites. What I want to do is:
I am working with Ruby, and I plan to write a pure-Rack app that reads the request headers and puts them into a queue for further calculations.
The problem I can't solve is: how can I configure nginx to pass the headers to the Rack app, and return a static image as the reply without waiting for a response from the Rack application?
Also, Rack is not required if there is a more common Ruby solution.
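For reference, here is a minimal sketch of the kind of Rack app I have in mind (a config.ru). The in-process Queue and the pixel.gif file are placeholders for illustration; a real deployment would push to something like Redis or RabbitMQ:

# config.ru -- minimal sketch of the header-collecting Rack app
require 'thread'

# placeholder queue; swap in a real message queue for production
STATS_QUEUE = Queue.new

# placeholder: a 1x1 transparent GIF stored next to this file
PIXEL = File.binread(File.expand_path('pixel.gif', File.dirname(__FILE__)))

run lambda { |env|
  # collect the request headers (the HTTP_* keys of the Rack env) for later processing
  STATS_QUEUE << env.select { |k, _| k.start_with?('HTTP_') }

  [200,
   { 'Content-Type' => 'image/gif', 'Content-Length' => PIXEL.bytesize.to_s },
   [PIXEL]]
}

The app itself only enqueues the headers and replies; the actual calculations would happen in a separate worker that reads from the queue.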
A simple option is to terminate the client connection as soon as possible while the request continues to be processed by the backend.
server {
    location /test {
        # map 402 error to backend named location
        error_page 402 = @backend;
        # pass request to backend
        return 402;
    }

    location @backend {
        # close client connection after 1 second
        # Not bothering with sending gif
        send_timeout 1;
        # Pass the request to the backend.
        proxy_pass http://127.0.0.1:8080;
    }
}
The option above, while simple, may result in the client receiving an error message when the connection is dropped. The ngx.say call ensures that a "200 OK" header is sent first, and since it is an asynchronous call it will not hold things up. This requires the ngx_lua module.
server {
    location /test {
        content_by_lua '
            -- send a dot to the user and transfer the request to the backend;
            -- ngx.say is an async call, so processing continues without waiting
            ngx.say(".")
            res = ngx.location.capture("/backend")
        ';
    }

    location /backend {
        # named locations are not allowed for ngx.location.capture
        # needs "internal" if it is not to be public
        internal;
        # Pass the request to the backend.
        proxy_pass http://127.0.0.1:8080;
    }
}
A more succinct Lua-based option:
server {
    location /test {
        rewrite_by_lua '
            -- send a dot to the user
            ngx.say(".")
            -- exit rewrite_by_lua and continue the normal event loop
            ngx.exit(ngx.OK)
        ';
        proxy_pass http://127.0.0.1:8080;
    }
}
Definitely an interesting challenge.
After reading here about post_action and reading "Serving Static Content Via POST From Nginx" (http://invalidlogic.com/2011/04/12/serving-static-content-via-post-from-nginx/), I have accomplished this using:
server {
    # this server only serves the static file 200.txt
    listen 8888;
    root /usr/share/nginx/html/;
}

server {
    listen 8999;

    location / {
        rewrite ^ /200.txt break;
    }

    # a POST against the static file produces a 405, which is remapped to 200 and handled by @405
    error_page 405 =200 @405;

    location @405 {
        # post_action: after this location is done, do @post
        post_action @post;
        # this nginx serves the static file 200.txt
        proxy_method GET;
        proxy_pass http://127.0.0.1:8888;
    }

    location @post {
        # this will go to an apache backend server;
        # it will take a long time to process this request
        proxy_method POST;
        proxy_pass http://127.0.0.1/$request_uri;
    }
}