Browser aborting Ajax requests sporadically without returning any errors [duplicate]

In my project (PHP with Symfony 2) I make a lot of Ajax requests on every page. I'm having a lot of problems with them, because it looks like browsers (tested in Google Chrome and Firefox) are aborting requests sporadically without giving me any error. I've created a clean page to test what could be causing this issue, and the problem persists there as well. The test fires 10 requests inside a for loop (that shouldn't be a problem by itself, right?).

Here is my code:

<!DOCTYPE html>
<html>
    <head>
        <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
        <title>Test page</title>
    </head>
    <body>Test page.
        <script type="text/javascript" src="/js/compressed_jquery-1.8.2.min_1.js"></script>
        <script type="text/javascript">
            $(document).ready(function() {
                for (var i = 0; i < 10; i++) {
                    $.get('/i18n/javaScript/pt.json', function(data) {
                        console.log(data);
                    });
                }
            });
        </script>
    </body>
</html>

And here is a screenshot of the request results in Firebug:

Ajax requests screenshot

As you can see, some requests are completed and others are not. Sometimes the browser completes all 10 requests without errors. What could be causing this?
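
For reference, here is a minimal sketch of the same test loop with jQuery's standard .fail() callback attached, which at least reports why each request ends (textStatus is "abort", "timeout", "error" or "parsererror"); nothing in it is specific to my setup:

$(document).ready(function() {
    for (var i = 0; i < 10; i++) {
        $.get('/i18n/javaScript/pt.json', function(data) {
            console.log(data);
        }).fail(function(jqXHR, textStatus, errorThrown) {
            // "abort" vs. "timeout" vs. "error" distinguishes a browser abort
            // from a slow or failing server response.
            console.log('request failed:', jqXHR.status, textStatus, errorThrown);
        });
    }
});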


I've tested all the suggested solutions, but I'm pretty sure it's a Windows, Apache, or PHP configuration issue. Today I configured a VM on my machine with VirtualBox, running Ubuntu 13.04 (Raring Ringtail) with Apache 2.2 + PHP, and NO ERRORS happened, which shows the problem is not in my JavaScript, HTML, or PHP code. So it does seem to be a configuration issue; how do I track down which setting is responsible?

asked May 14 '13 by Hugo Nogueira

3 Answers

This could fit your needs: sending the requests one by one should avoid the server rejecting some of the parallel requests.


$(document).ready(function () {
    var tot = 30; // to simulate 30 requests
    (function request(i) {
        if (i === tot) return;
        // '/echo/html/' is the jsFiddle demo endpoint; replace it with your own URL.
        $.get('/echo/html/?' + i, function (data) {
            console.log("success");
        }).always(function () { request(++i); }); // start the next request only when this one has finished
    })(0);
});
answered Oct 19 '22 by A. Wolff


You can serialize requests with a global object:

function AjaxRequester() {
    this.queue = [];
}

AjaxRequester.prototype.doRequest = function (request) {
    this.queue.push(request);
    // Only start handling if no other request is already in flight;
    // otherwise the request simply waits in the queue.
    if (this.queue.length === 1) {
        this.handleRequest(request);
    }
};

AjaxRequester.prototype.handleRequest = function (request) {
    // Actually perform the Ajax request; on completion, remove it from
    // the queue and, if the queue is not empty, handle the next element.
    var self = this;
    $.ajax(request).always(function () {
        self.queue.shift();
        if (self.queue.length > 0) {
            self.handleRequest(self.queue[0]);
        }
    });
};

var requester = new AjaxRequester();

Then, in your code, call:

requester.doRequest(yourRequest);
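
For example, assuming handleRequest passes each queued item to $.ajax (as sketched above), yourRequest can simply be a jQuery settings object; this is a hypothetical adaptation of the original 10-request loop:

for (var i = 0; i < 10; i++) {
    requester.doRequest({
        url: '/i18n/javaScript/pt.json', // the JSON endpoint from the question
        dataType: 'json',
        success: function(data) {
            console.log(data);
        }
    });
}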
answered Oct 19 '22 by p91paul


I think your server is taking too long to respond and the browser is timing out. It may be that the browser only keeps two open connections to the server and aborts the remaining 8 of the 10 requests when the first two take too long. I would check your server logs to confirm this.

What is the server doing when it sees a request to /i18n/javaScript/pt.json? Four seconds is a long time. Try fetching some static content like an image or some static HTML instead of pt.json and see if that fixes the problem.
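
As a rough client-side check, you can time a static file against pt.json from the browser; the static path below is only a placeholder, so point it at any file that really exists on your server:

// '/images/logo.png' is a placeholder; use any static file on your server.
['/images/logo.png', '/i18n/javaScript/pt.json'].forEach(function(url) {
    var start = Date.now();
    $.get(url).done(function() {
        console.log(url + ' took ' + (Date.now() - start) + ' ms');
    }).fail(function(jqXHR, textStatus) {
        console.log(url + ' failed: ' + textStatus);
    });
});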

If you need to do a lot of computation to produce pt.json, can it be cached? Is it very large? Are you using Apache?

answered Oct 19 '22 by Clayton Rabenda