 

Dealing with HTTP content in HTTPS pages

Tags: http, image, https

We have a site which is accessed entirely over HTTPS, but it sometimes displays external content served over HTTP (mainly images from RSS feeds). The vast majority of our users are also stuck on IE6.

I would ideally like to do both of the following

  • Prevent the IE warning message about insecure content (so that I can show a less intrusive one, e.g. by replacing the images with a default icon as below)
  • Present something useful to users in place of the images they can't otherwise see; if there were some JS I could run to figure out which images haven't loaded and replace them with an image of ours instead, that would be great.

I suspect that the first aim is simply not possible, but the second may be sufficient.

A worst-case scenario is that I parse the RSS feeds when we import them, grab the images, and store them locally so that users can access them that way, but it seems like a lot of pain for reasonably little gain.

asked Jun 10 '10 by El Yobo




1 Answer

Your worst case scenario isn't as bad as you think.

You are already parsing the RSS feed, so you already have the image URLs. Say you have an image URL like http://otherdomain.com/someimage.jpg. Rewrite it as https://mydomain.com/imageserver?url=http://otherdomain.com/someimage.jpg&hash=abcdeafad. That way the browser always makes its request over HTTPS, so the insecure-content warning goes away.
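As a rough illustration, here is a minimal sketch of that rewriting step in Java. The class and method names and the imageserver path are illustrative (the path just matches the example above); the hash argument is the MD5 signature described in the security note further down.

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

/* Hypothetical helper that rewrites an external image URL so the browser
   fetches it from our HTTPS proxy instead of the original HTTP host. */
public final class ProxyUrlBuilder {

    private static final String PROXY_BASE = "https://mydomain.com/imageserver";

    public static String buildProxyUrl(String imageUrl, String hash)
            throws UnsupportedEncodingException {
        /* URL-encode the original URL so it survives as a single query parameter */
        return PROXY_BASE + "?url=" + URLEncoder.encode(imageUrl, "UTF-8")
                + "&hash=" + hash;
    }
}

When you import the feed, emit the result of buildProxyUrl(imageUrl, hash) into your HTML in place of the original src attribute.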

The next part is to create a proxy page or servlet that does the following (a minimal handler sketch follows the list):

  1. Read the url parameter from the query string, and verify the hash
  2. Download the image from the server, and proxy it back to the browser
  3. Optionally, cache the image on disk
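Here is a minimal sketch of that handler in Java. It is illustrative only: the two abstract methods stand in for the proxyResponse() code at the end of this answer and for a computeHash() helper like the one sketched after the security note, and caching (step 3) is left as a comment.

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/* Illustrative servlet skeleton for the image proxy. */
public abstract class ImageProxyServlet extends HttpServlet {

    /* md5(image_url + secret_key), as described in the security note below */
    protected abstract String computeHash(String imageUrl);

    /* Streams the image back to the browser; see the code at the end of the answer */
    protected abstract void proxyResponse(String targetURL, HttpServletRequest request,
                                          HttpServletResponse response) throws IOException;

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        String targetURL = request.getParameter("url");
        String hash = request.getParameter("hash");

        /* Step 1: only proxy URLs that we signed ourselves */
        if (targetURL == null || hash == null || !hash.equals(computeHash(targetURL))) {
            response.sendError(HttpServletResponse.SC_FORBIDDEN);
            return;
        }

        /* Step 2: download the image and proxy it back to the browser */
        proxyResponse(targetURL, request, response);

        /* Step 3 (optional): also write the bytes to a disk cache keyed by the URL */
    }
}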

This solution has some advantages: you don't have to download the image at the time of generating the HTML, and you don't have to store the images locally. It is also stateless; the URL contains all the information necessary to serve the image.

Finally, the hash parameter is for security; you only want your servlet to serve images for URLs you have constructed yourself. So, when you create the URL, compute md5(image_url + secret_key) and append it as the hash parameter. Before you serve a request, recompute the hash and compare it to the one that was passed in. Since the secret_key is known only to you, nobody else can construct valid URLs.
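A minimal sketch of that signing scheme, using the illustrative class name referenced above; SECRET_KEY is a placeholder and should come from configuration, not source control.

import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

/* Illustrative implementation of md5(image_url + secret_key). */
public final class HashUtil {

    private static final String SECRET_KEY = "change-me";

    public static String computeHash(String imageUrl) {
        try {
            MessageDigest md5 = MessageDigest.getInstance("MD5");
            byte[] digest = md5.digest((imageUrl + SECRET_KEY).getBytes(StandardCharsets.UTF_8));
            /* Hex-encode the 16-byte digest, zero-padded to 32 characters */
            return String.format("%032x", new BigInteger(1, digest));
        } catch (NoSuchAlgorithmException e) {
            /* MD5 is available in every standard JVM */
            throw new IllegalStateException(e);
        }
    }

    public static boolean verifyHash(String imageUrl, String hash) {
        return computeHash(imageUrl).equals(hash);
    }
}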

If you are developing in Java, the servlet is just a few lines of code; you should be able to port the code below to any other back-end technology.

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Enumeration;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.httpclient.Header;
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;
import org.apache.commons.io.IOUtils;

/*
 * targetURL is the URL you get from the RSS feed; request and response are
 * the browser's request and response. Assumes commons-io and
 * commons-httpclient (3.x) are on your classpath.
 */

/* m_httpClient is a field of the servlet */
private final HttpClient m_httpClient = new HttpClient();

protected void proxyResponse(String targetURL, HttpServletRequest request,
        HttpServletResponse response) throws IOException {
    GetMethod get = new GetMethod(targetURL);
    get.setFollowRedirects(true);

    /*
     * Proxy the request headers from the browser to the target server
     */
    Enumeration headers = request.getHeaderNames();
    while (headers != null && headers.hasMoreElements()) {
        String headerName = (String) headers.nextElement();
        String headerValue = request.getHeader(headerName);
        if (headerValue != null) {
            get.addRequestHeader(headerName, headerValue);
        }
    }

    /*
     * Make the request to the target server
     */
    m_httpClient.executeMethod(get);

    /*
     * Set the status code
     */
    response.setStatus(get.getStatusCode());

    /*
     * Proxy the response headers back to the browser
     */
    Header[] responseHeaders = get.getResponseHeaders();
    for (int i = 0; i < responseHeaders.length; i++) {
        String headerName = responseHeaders[i].getName();
        String headerValue = responseHeaders[i].getValue();
        if (headerValue != null) {
            response.addHeader(headerName, headerValue);
        }
    }

    /*
     * Proxy the response body back to the browser
     */
    InputStream in = get.getResponseBodyAsStream();
    OutputStream out = response.getOutputStream();

    /*
     * If the server sends a 204 No Content (or 304 Not Modified) response,
     * the InputStream will be null.
     */
    if (in != null) {
        IOUtils.copy(in, out);
    }
}
answered Oct 13 '22 by Sripathi Krishnan