How to optimize a web based application against latency due to multiple asynchronous requests in background?

I am designing a modular RIA based on a thin-server, client-side MVC architecture. Right now the application is only about 10% complete, so it is not too late to incorporate design changes.

The application is designed in such a way that it initially loads with a very small footprint; depending on the actions performed by the user, large volumes of data are then fetched asynchronously. This data would potentially include both data stored on my servers and data from third-party web services, including social networking and microblogging services.

However, what I am concerned about is this: can multiple data-heavy Ajax requests running in the background stall the browser? I recently observed serious latency problems in some social content aggregation services, and upon analysing the client-side code I was surprised to find that the client-side footprint was quite small, well within 300 KB. Yet when running the application, the browser (both Firefox and IE) frequently hung and took several seconds to recover. Analysing the asynchronous requests showed that the application was simultaneously fetching user content from Gmail, Facebook and Twitter, pushing it all into the DOM, and consuming too much memory.

It would be great if someone could point me to some guidelines/best practices for preventing such issues. Would it be advisable to write a custom wrapper script that loads the content in the background sequentially, in a pre-specified order of importance, rather than loading it all in parallel, which may result in several callbacks executing concurrently? Something along the lines of the sketch below is what I have in mind.
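
A rough sketch of the kind of wrapper I am considering, where the URLs and element ids are placeholders rather than my actual API:

```javascript
// Tasks are listed in order of importance; placeholders, not a real API.
function insertInto(id, html) {
    document.getElementById(id).innerHTML = html;
}

var loadQueue = [
    { url: '/api/core-content',   target: 'main' },   // highest priority
    { url: '/api/twitter-stream', target: 'tweets' },
    { url: '/api/facebook-feed',  target: 'feed' }    // lowest priority
];

function loadNext() {
    var task = loadQueue.shift();
    if (!task) { return; }                  // queue drained
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
            if (xhr.status === 200) { insertInto(task.target, xhr.responseText); }
            loadNext();                     // next request starts only after this one finishes
        }
    };
    xhr.open('GET', task.url, true);
    xhr.send();
}

loadNext(); // kick off the chain
```

Each request starts only when the previous one has finished, so at most one heavy callback is running at any time.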

Any advice would be greatly appreciated.

asked Jun 24 '11 by lorefnon

1 Answer

One solution (this is not the killer solution for every case) is to delegate content aggregation to the server side, rather than doing it all in the final browser.

This can be done with ESI gateways. One of them is Varnish's ESI support, but it doesn't cover the whole ESI specification. Another one (esigate.org) is also open source and may have better coverage (not tested yet). The ESI approach means your application layout can be a combination of various blocks with different cache policies (TTLs) and different providers. The ESI server will take over a part of the traffic that you initially offloaded to the final browser, so this will cost you a lot more bandwidth, but at least you'll get more control over this piece of software than over the different browsers used by the HTTP clients.
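
As a minimal sketch of the idea (the fragment URLs and TTL values here are invented for illustration), an ESI-assembled layout could look like this:

```html
<!-- Hypothetical layout assembled by an ESI gateway (e.g. Varnish or esigate).
     Each fragment is fetched and cached by the gateway, not the browser. -->
<html>
  <body>
    <div id="main">
      <!-- core content, short TTL -->
      <esi:include src="http://app.example.com/fragments/main" />
    </div>
    <div id="social">
      <!-- aggregated third-party content, cached longer (e.g. TTL 15 min) -->
      <esi:include src="http://aggregator.example.com/fragments/twitter" />
    </div>
  </body>
</html>
```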

At the very least it could improve the caching policy for asynchronously loaded data on your server, and in that way speed things up for the final browser (better response times, less parallel work).

Now, on the browser side, in terms of priority on your page you should certainly establish which content is the most important, the content the user can start interacting with, and which is only 'decoration' (this implies your service has a good information/noise ratio; if your website has nothing to provide except social network aggregation, you'll have problems).

I assume that since your application is a small static application with a lot of asynchronously loaded data, you are using a lot of Ajax and few page changes. This means that once content is loaded, it will stay on the page for a long time.

So having the social network and other web service contents delayed and chained, instead of one big parallel load, shouldn't be a problem. Maybe the content won't be there in the first 15 seconds, but if it stays on the page for the next 15 minutes, that's probably acceptable (if the most important content is already there, the user may not even notice that the decorative content wasn't available yet). One IE6 (and sometimes IE7) tip here: use setTimeout() calls everywhere to force repaints of the page; you'll see that available content shows up faster. A sketch of that technique follows.
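
For example, here is a sketch of inserting a large batch of fetched items into the DOM in small chunks, yielding between chunks so the browser can repaint (the container id and chunk size are assumptions for illustration):

```javascript
// Append items to a list in small chunks instead of all at once.
// Between chunks, setTimeout(fn, 0) yields to the UI thread so the
// browser can repaint; this is especially helpful on old IE.
function appendInChunks(items, containerId, chunkSize) {
    var container = document.getElementById(containerId);
    function nextChunk() {
        var chunk = items.splice(0, chunkSize); // take the next slice
        for (var i = 0; i < chunk.length; i++) {
            var li = document.createElement('li');
            li.appendChild(document.createTextNode(chunk[i]));
            container.appendChild(li);
        }
        if (items.length > 0) {
            setTimeout(nextChunk, 0); // yield, then continue with the rest
        }
    }
    nextChunk();
}

// e.g. appendInChunks(arrayOfTweets, 'tweet-list', 20);
```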

Last tip: regular Ajax checks for updated content. If you really run these checks for 10 pieces of content every minute, you'll always have parallel load problems and a lot of activity, the same problem as in the initial load. There are usually two things you can use to fix that. The first is the COMET family of long-running HTTP connections (so you can push data and/or get faster responses, but only with your server tuned for that sort of HTTP traffic). The second is adding a time factor to the checks, so that the first check happens after 1 minute, the next after 2 minutes, then 3, 15, 25, etc.; in the end you check only every hour, maybe. You can also reduce the next-check latency when you detect some activity from the user (some user interaction), because you can assume the user is really looking for fresh data only when he's actually doing something with your page. You'll save some user CPU, and you'll also reduce your server load. A sketch of this back-off idea is below.
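
A sketch of such a growing poll interval that resets on user activity; the interval values, the doubling schedule and the /api/updates endpoint are arbitrary assumptions for illustration:

```javascript
// Poll for updates with a growing interval; reset on user interaction.
var pollDelay = 60000;    // start at 1 minute (arbitrary)
var maxDelay  = 3600000;  // cap at 1 hour (arbitrary)
var timer     = null;

function poll() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            // ...merge xhr.responseText into the page here...
        }
    };
    xhr.open('GET', '/api/updates', true); // hypothetical endpoint
    xhr.send();

    pollDelay = Math.min(pollDelay * 2, maxDelay); // grow the interval each time
    timer = setTimeout(poll, pollDelay);
}

// User interaction suggests fresh data matters again: reset the back-off.
document.onkeydown = document.onmousedown = function () {
    if (pollDelay > 60000) {
        clearTimeout(timer);
        pollDelay = 60000;
        timer = setTimeout(poll, pollDelay);
    }
};

timer = setTimeout(poll, pollDelay); // first check after 1 minute
```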

answered Oct 08 '22 by regilero