I'm having trouble with one of my sites, on which two AJAX requests are executed when the page loads. I'm using jQuery in combination with a PHP application based on the Zend Framework.
The relevant HTML (simplified) looks like:
<select id="first">
<option value="1">First Option</option>
<option value="2">Second Option</option>
</select>
<div class="first_block"></div>
<select id="second">
<option value="1">First Option</option>
<option value="2">Second Option</option>
</select>
<div class="second_block"></div>
Here is what my jQuery looks like:
$(document).ready(function(){
    // function to update the first block
    var updateFirstBlock = function(){
        var param = $(this).val();
        $.ajax('module/controller/action/param/' + param, {
            'success': function(data){
                $('.first_block').html(data);
            }
        });
    };
    // bind and trigger the first update function
    $('select#first').bind('change', updateFirstBlock);
    $('select#first').trigger('change');

    // function to update the second block
    var updateSecondBlock = function(){
        var param = $(this).val();
        $.ajax('module/controller/another-action/param/' + param, {
            'success': function(data){
                $('.second_block').html(data);
            }
        });
    };
    // bind and trigger the second update function
    $('select#second').bind('change', updateSecondBlock);
    $('select#second').trigger('change');
});
The PHP application simply returns some content depending on the value that is submitted.
Now, when the page is loaded, in nine out of ten cases one of the two requests gets no answer: one returns 200 OK while the other times out. There is no pattern to which of the two fails.
Is it possible that something is wrong in the web server's configuration (Apache 2.2), so that two simultaneously fired requests block each other?
EDIT
If I set both requests to async: false, they are always executed properly. So there must be a collision, I think.
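For reference, this is the workaround applied to the first request. It is a diagnostic sketch rather than a fix, since async: false blocks the browser until the response arrives:

$.ajax('module/controller/action/param/' + param, {
    async: false,  // serialize: wait for this response before continuing
    'success': function(data){
        $('.first_block').html(data);
    }
});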
EDIT 2
A possible reason for this behavior could be PHP's session lock. I will examine this further.
The jQuery library provides a way to make a callback function wait for multiple AJAX calls. This mechanism is based on an object called Deferred. A Deferred object can register callback functions that run depending on whether the Deferred object is resolved or rejected.
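A minimal sketch of that idea, using the selectors and URLs from the question. $.ajax() returns a jqXHR, which behaves like a Deferred, so both requests can be handed to $.when(); note that this coordinates the client-side callbacks but does not by itself remove a server-side lock:

var first = $.ajax('module/controller/action/param/' + $('#first').val());
var second = $.ajax('module/controller/another-action/param/' + $('#second').val());

$.when(first, second).done(function(firstResult, secondResult){
    // with multiple Deferreds, each argument is [data, statusText, jqXHR]
    $('.first_block').html(firstResult[0]);
    $('.second_block').html(secondResult[0]);
});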
To prevent multiple AJAX requests across the whole site (for example, an AJAX request issued from a page that was itself loaded via AJAX, or AJAX calls generated in a PHP loop, either of which can leave you with multiple requests for a single result), I have a solution: use a flag on the window object.
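A minimal sketch of that window-flag idea (all names here are illustrative, not from the question):

window.ajaxBusy = false;

function loadBlock(url, target){
    if (window.ajaxBusy) {
        return; // a request is already in flight; drop the duplicate
    }
    window.ajaxBusy = true;
    $.ajax(url, {
        success: function(data){
            $(target).html(data);
        },
        complete: function(){
            window.ajaxBusy = false; // always clear the flag, even on error
        }
    });
}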
It seems you're definitely on the right track with the PHP session lock:
The default PHP session model locks a session until the page has finished loading. So if you have two or three frames that load, and each one uses sessions, they will load one at a time. This is so that only one PHP execution context has write access to the session at any one time.
Some people work around this by calling session_write_close() as soon as they've finished writing any data to $_SESSION - they can continue to read data even after they've called it. The disadvantage of session_write_close() is that your code will still lock on that first call to session_start() on any session-enabled page, and that you have to sprinkle session_write_close() everywhere you use sessions, as soon as you can. This is still a very good method, but if your session access follows some particular patterns, you may have another way which requires less modification of your code.
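A minimal sketch of that pattern (the $_SESSION keys and the render_block() helper are hypothetical):

<?php
session_start();                    // acquires the session lock

$_SESSION['last_view'] = 'blocks';  // finish all session writes first
$userId = $_SESSION['user_id'];     // reads are fine too

session_write_close();              // release the lock right away

// Slow work below no longer blocks parallel requests from the same
// session. $_SESSION stays readable, but further writes are not saved.
echo render_block($userId);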
The idea is that if your session code mostly reads from sessions, and rarely writes to them, then you can allow concurrent access. To prevent completely corrupted session data, we will lock the session's backing store (tmp files usually) while we write to them. This means the session is only locked for the brief instant that we are writing to the backing store. However, this means that if you have two pages loading simultaneously, and both modify the session, the Last One Wins. Whichever one loads first will get its data overwritten by the one that loads second. If this is okay with you, you may continue - otherwise, use the session_write_close method, above.
If you have complicated bits of code that depend on some state in the session, and some state in a database or text file, or something else - again, you may not want to use this method. When you have two simultaneous pages running, you might find that one page runs halfway through, modifying your text file, then the second one runs all the way through, further modifying your text file, then the first one finishes - and your data might be mangled, or completely lost.
So if you're prepared to debug potentially very, very nasty race conditions, and your access patterns for your sessions are read-mostly and write-rarely, then you can try the following system.
Copy the example from session_set_save_handler() into your include file, above where you start your sessions. Modify the session write() method:
function write($id, $sess_data)
{
    global $sess_save_path, $sess_session_name;

    $sess_file = "$sess_save_path/sess_$id";
    if ($fp = @fopen($sess_file, "w")) {
        // hold an exclusive lock only for the brief moment we write
        flock($fp, LOCK_EX);
        $results = fwrite($fp, $sess_data);
        flock($fp, LOCK_UN);
        fclose($fp);  // close the handle
        return $results;
    } else {
        return false;
    }
}
You will probably also want to add a gc() (garbage collection) method for the sessions.
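A possible gc() for the file-based handler above; $sess_save_path is taken from the session_set_save_handler() example this extends:

function gc($maxlifetime)
{
    global $sess_save_path;

    // delete session files that have not been modified within $maxlifetime
    foreach (glob("$sess_save_path/sess_*") as $file) {
        if (filemtime($file) + $maxlifetime < time()) {
            @unlink($file);
        }
    }
    return true;
}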
And of course, take this advice with a grain of salt - we currently have it running on our testing server, and it seems to work OK there, but people have reported terrible problems with the shared-memory session handler, and this method may be as unsafe as that.
You can also consider implementing your own locks for scary concurrency-sensitive bits of your code.
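A minimal sketch of such a hand-rolled lock around a concurrency-sensitive section, using an flock()'ed lock file (the path is an assumption, not from the question):

$fp = fopen('/tmp/myapp-critical.lock', 'c');
if ($fp && flock($fp, LOCK_EX)) {   // blocks until the lock is free
    // ... read-modify-write the shared state here ...
    flock($fp, LOCK_UN);
}
if ($fp) {
    fclose($fp);
}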
Ref: http://ch2.php.net/manual/en/ref.session.php#64525
It might be worthwhile implementing a database session handler. Using a database eliminates this problem and can actually improve performance slightly, if a good database structure is used.
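A rough sketch of what that could look like with PDO and MySQL (the sessions table - with id as primary key - and the connection details are assumptions):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

session_set_save_handler(
    function () { return true; },                  // open
    function () { return true; },                  // close
    function ($id) use ($pdo) {                    // read
        $stmt = $pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute(array($id));
        $row = $stmt->fetch(PDO::FETCH_ASSOC);
        return $row ? $row['data'] : '';
    },
    function ($id, $data) use ($pdo) {             // write
        $stmt = $pdo->prepare(
            'REPLACE INTO sessions (id, data, updated_at) VALUES (?, ?, NOW())'
        );
        return $stmt->execute(array($id, $data));
    },
    function ($id) use ($pdo) {                    // destroy
        return $pdo->prepare('DELETE FROM sessions WHERE id = ?')
                   ->execute(array($id));
    },
    function ($maxlifetime) use ($pdo) {           // gc
        return $pdo->prepare(
            'DELETE FROM sessions WHERE updated_at < NOW() - INTERVAL ? SECOND'
        )->execute(array($maxlifetime));
    }
);
session_start();

Because each read and write is a single short statement, the database's own row-level locking replaces the request-long file lock of the default handler.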