What are some general (not specific to LAMP, .NET, Ruby, MySQL, etc.) tactics and best practices to improve page loading speed?
I am looking for tips about caching, HTTP headers, external file minification (CSS, JS), etc.
And for good tools like Google PageSpeed and Yahoo YSlow.
A "ultimate resource" wiki style checklist of "things not to forget" (moderated and updated by all the wizards here on SO) is the end goal. So folks don't have to Google around endlessly for outdated blog posts on the subject. ;)
I hope the "subjective" mods go easy on me, I know this is a bit open ended. And similar questions have been asked here before. And this material overlaps with the domain of ServerFault and Webmasters too a bit. But there is no central "wiki" question that really covers this so I am hoping to start one. There are great questions like this that I refer to on SO all the time! Thanks
If you want a quick answer: Google's recommended page load time is under two seconds: "Two seconds is the threshold for ecommerce website acceptability. At Google, we aim for under a half-second."

How fast should a website load in 2021? Best practice is to get your website to load in less than two to three seconds, according to John Mueller, Senior Webmaster Trends Analyst at Google. Ideally, though, your website should load as fast as possible.
Enable browser caching. Browser caching is another form of caching you can leverage to improve page loading speed. This technique lets the browser store a variety of resources, including stylesheets, images, and JavaScript files, so it doesn't have to re-download them all on every visit.
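For example, here is a minimal sketch using Node's built-in http module (any stack can send the same headers); the /static/ path, the one-year max-age, and the fingerprinted file names are assumptions:

```typescript
import http from "node:http";

const server = http.createServer((req, res) => {
  if (req.url?.startsWith("/static/")) {
    // Fingerprinted assets (e.g. app.3f9a.js, styles.1b2c.css) can be
    // cached "forever": the file name changes whenever the content does.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
  } else {
    // HTML pages: force the browser to revalidate so users see fresh content.
    res.setHeader("Cache-Control", "no-cache");
  }
  res.end("...");
});

server.listen(8080);
```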
Use <H#> tags for headers.

Make your dynamic content more static.
If you can render your public pages as static content, you'll help proxies, caches, reverse proxies, and things like web application accelerators and DDoS-prevention infrastructure.
This can be done in several ways: by handling the cache headers, of course, but you can also consider truly static pages with AJAX queries feeding the dynamic content, or a mix of the two, using cache headers to make your main pages effectively static for hours for most browsers and reverse proxies.
The static-with-AJAX solution has a major drawback: SEO. Bots will not see your dynamic content, so you need a way to feed bots that data (and a way to handle users reaching it from search engine URLs, which is a big headache). So the anti-pattern to avoid is putting the really important SEO data in AJAX-loaded content; keep it in the static page and limit AJAX to fancy user interactions. The user experience on a composite page can still be more dynamic than the search engine bot's experience: for example, refresh the latest news every hour for bots, but every minute for users.
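A minimal sketch of that hybrid, assuming a hypothetical /api/latest-news fragment endpoint and a #latest-news element in the otherwise static, server-rendered page:

```typescript
// SEO-critical content is already in the static HTML that bots index;
// only the frequently changing block is refreshed client side.
async function refreshLatestNews(): Promise<void> {
  const res = await fetch("/api/latest-news");
  if (!res.ok) return; // keep the server-rendered fallback on failure
  const html = await res.text();
  document.querySelector("#latest-news")!.innerHTML = html;
}

// Bots see the cached static markup; users get a fresher block every minute.
setInterval(refreshLatestNews, 60_000);
refreshLatestNews();
```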
You also need to prevent premature use of session cookies. Most proxy caches will refuse to cache any HTTP request carrying a cookie (and this is the official HTTP specification behavior). The usual culprit is an application that puts the login form on every page and requires an existing session on the POST of that form. This can be fixed with a separate login page, or with redirects on the login POST. Cookie handling in a reverse proxy cache can also be tuned in a modern cache like Varnish with some configuration settings.
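A sketch of the separate-login-POST fix, assuming hypothetical /login and /account routes; the point is that Set-Cookie is only ever sent on the login POST, so every ordinary GET stays cookie-free and cacheable:

```typescript
import http from "node:http";
import { randomUUID } from "node:crypto";

const server = http.createServer((req, res) => {
  if (req.method === "POST" && req.url === "/login") {
    // The session starts here, and only here.
    res.setHeader(
      "Set-Cookie",
      `sid=${randomUUID()}; Path=/; HttpOnly; Secure`
    );
    // Redirect after the POST so the next page is a plain GET.
    res.writeHead(303, { Location: "/account" });
    res.end();
    return;
  }
  // Public pages: no Set-Cookie, so proxy caches remain free to cache them.
  res.setHeader("Cache-Control", "public, max-age=3600");
  res.end("<html>...</html>");
});

server.listen(8080);
```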
edit: One advanced use of a reverse proxy can be really useful: ESI (Edge Side Includes), for example with Varnish's ESI support. You put tags in your rendered HTML that the ESI-aware reverse proxy will identify. Each of these identified regions can have a different TTL (Time To Live): say, 1 day for the whole page, 10 minutes for a latest-news block, 0 for the chat block. The reverse proxy then fills these blocks from its own cache or by requesting your backend.
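A sketch of what the backend side might look like; the fragment URLs and TTLs are assumptions mirroring the example above, and the actual ESI assembly happens in the proxy (e.g. Varnish with ESI enabled), not in this code:

```typescript
import http from "node:http";

const server = http.createServer((req, res) => {
  if (req.url === "/fragments/latest-news") {
    res.setHeader("Cache-Control", "s-maxage=600"); // proxy caches 10 min
    res.end("<ul><li>newest item...</li></ul>");
  } else if (req.url === "/fragments/chat") {
    res.setHeader("Cache-Control", "no-store"); // TTL 0: fetched every time
    res.end("<div>live chat...</div>");
  } else {
    res.setHeader("Cache-Control", "s-maxage=86400"); // whole page: 1 day
    // The ESI-aware proxy replaces each esi:include with the fragment,
    // served from its own cache or fetched from this backend.
    res.end(`<html><body>
      <esi:include src="/fragments/latest-news"/>
      <esi:include src="/fragments/chat"/>
    </body></html>`);
  }
});

server.listen(8080);
```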
For as long as the web has existed, handling proxies and caches has been the main technique for fooling users into thinking the web is fast.