
How to change the status code of a proxied server response in nginx?

I'm having a hard time configuring nginx to act as a proxy for a public S3 endpoint. My use case requires altering the status code of the S3 response while preserving the response payload.

The possible status codes returned by S3 include 200 and 403. For my use case, I need to map those status codes to 503.

I have tried the following, which does not work:

location ~* ^/.* {
  [...]
  proxy_intercept_errors on;
  error_page             200 =503 $upstream_http_location;
}

Nginx outputs the following error:

nginx: [emerg] value "200" must be between 300 and 599 in /etc/nginx/nginx.conf:xx

Here's a more complete snippet:

server {
  listen       80;

  location ~* ^/.* {
    proxy_http_version         1.1;
    proxy_method               GET;
    proxy_pass                 http://my-s3-bucket-endpoint;
    proxy_pass_request_body    off;

    proxy_set_header       Content-Length "";
    proxy_set_header       Connection "";
    proxy_set_header       Host my-s3-bucket-endpoint;
    proxy_set_header       Authorization '';

    proxy_hide_header      x-amz-id-2;
    proxy_hide_header      x-amz-request-id;
    proxy_hide_header      Set-Cookie;
    proxy_ignore_headers   "Set-Cookie";

    proxy_cache            S3_CACHE;
    proxy_cache_valid      200 403 503 1h;
    proxy_cache_bypass     $http_cache_purge;
    add_header             X-Cached $upstream_cache_status;

    proxy_intercept_errors on;
    error_page             200 =503 $upstream_http_location;
  }
}

Is it possible to achieve what I need with nginx?

Asked Jan 10 '17 by Wenzil




1 Answer

I found a more or less suitable solution. It's a bit hackish, but it works.

The key was to set the index document of my S3 bucket to a non-existent filename. This causes requests for / on the S3 bucket endpoint to return 403.

Since the nginx proxy maps all incoming requests to / on the S3 bucket endpoint, the result is always a 403, which the nginx proxy can intercept. From there, the error_page directive tells it to respond by requesting a specific document (in this case error.json) from the S3 bucket endpoint and to use 503 as the response status code.

location ~* ^/.* {
  proxy_intercept_errors on;
  error_page             403 =503 /error.json;
}
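
For reference, here's roughly how that intercept slots into the fuller proxy block from the question. This is an untested sketch, trimmed for brevity; my-s3-bucket-endpoint, S3_CACHE and /error.json are the same placeholders used above and would need to match your own bucket host, cache zone and error document:

server {
  listen 80;

  location ~* ^/.* {
    proxy_http_version       1.1;
    proxy_method             GET;
    proxy_pass               http://my-s3-bucket-endpoint;
    proxy_set_header         Host my-s3-bucket-endpoint;

    # Cache the upstream 403 as well as the rewritten 503.
    proxy_cache              S3_CACHE;
    proxy_cache_valid        200 403 503 1h;

    # The bucket's index document points at a missing object, so / comes back
    # as 403; intercept it and serve /error.json with a 503 status instead.
    # The internal redirect to /error.json re-enters this location and is
    # proxied to the bucket like any other request.
    proxy_intercept_errors   on;
    error_page               403 =503 /error.json;
  }
}

Nothing here is new relative to the question's snippet; it just shows the caching directives and the intercept together in one place.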

This solution involves two requests to the S3 bucket endpoint (/ and /error.json), but at least caching seems to work for both requests with the configuration in the more complete snippet above; the X-Cached header added there exposes $upstream_cache_status, so hits and misses can be checked per request.

Answered Nov 15 '22 by Wenzil