 

General web page loading speed and performance best practices [closed]

Tags:

performance

What are some general (not specific to LAMP, .NET, Ruby, MySQL, etc.) tactics and best practices to improve page loading speed?

I am looking for tips about caching, HTTP headers, external file minification (CSS, JS), etc.

And for good tools like Google PageSpeed and Yahoo YSlow.

A "ultimate resource" wiki style checklist of "things not to forget" (moderated and updated by all the wizards here on SO) is the end goal. So folks don't have to Google around endlessly for outdated blog posts on the subject. ;)

I hope the "subjective" mods go easy on me; I know this is a bit open ended, and similar questions have been asked here before. This material also overlaps a bit with the Server Fault and Webmasters domains. But there is no central "wiki" question that really covers this, so I am hoping to start one. There are great questions like this that I refer to on SO all the time! Thanks.

asked Jan 26 '11 by thaddeusmt


2 Answers

  • Caching of page content
  • Load JavaScript at the bottom of the page
  • Minify CSS (and JavaScript)
  • CSS and JavaScript should be in their own [external] files
  • If possible, combine all JS or CSS files into one of each type (saves server requests; see the sketch after this list)
  • Use Google's hosted jQuery and jQuery UI (they're likely already cached in some visitors' browsers)
  • Gzip compression
  • Images should match the width and height given in the markup (avoid in-browser resizing)
  • Use image sprites when appropriate (but don't overdo it)
  • Proper use of HTML elements, i.e. using <h#> tags for headings
  • Avoid div-itis (or the now more popular ul-itis)
  • Scope JavaScript selectors as tightly as possible, i.e. $('h1.title') is much quicker than $('.title')
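
For illustration, here's a minimal page skeleton that puts several of these together (one combined, minified stylesheet in the head, scripts at the bottom, jQuery from Google's CDN, a scoped selector). The file names and the jQuery version are only examples:

    <!DOCTYPE html>
    <html>
    <head>
      <title>Example</title>
      <!-- one combined, minified stylesheet: a single cacheable request -->
      <link rel="stylesheet" href="/css/site.min.css">
    </head>
    <body>
      <h1 class="title">Hello</h1>

      <!-- scripts at the bottom so they don't block rendering -->
      <!-- jQuery from Google's CDN, possibly already in the visitor's cache -->
      <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
      <!-- one combined, minified site script -->
      <script src="/js/site.min.js"></script>
      <script>
        // scoped selector: much quicker than $('.title')
        $('h1.title').addClass('ready');
      </script>
    </body>
    </html>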
answered Nov 08 '22 by RDL


Make your dynamic content more static.

If you can render your public pages as static content, you'll help proxies, caches, reverse proxies, and things like web application accelerators and DDoS-prevention infrastructure.

This can be done in several ways. By handling the cache headers, of course, but you can also think about truly static pages with Ajax queries feeding the dynamic content, and certainly a mix of these two solutions, using the cache headers to make your main pages static for hours for most browsers and reverse proxies.
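
As a minimal sketch of the cache-header side (the values and dates are only examples), the response for a public page could carry headers like:

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=utf-8
    Cache-Control: public, max-age=3600
    Last-Modified: Wed, 26 Jan 2011 16:00:00 GMT
    Vary: Accept-Encoding

This tells browsers and shared caches that the page can be reused for an hour without going back to the backend.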

The static-with-Ajax solution has a major drawback: SEO. Bots will not see your dynamic content. You need a way to feed bots this dynamic data (and a way to handle users reaching that data from search-engine URLs, which is a big pain). So the rule is to keep the really important SEO data in the static page, not in the Ajax-loaded dynamic content, and to limit the fancy Ajax interactions to the user experience. But the user experience on a composite page can be more dynamic than what the search-engine bots get. I mean: refresh the latest news every hour for bots, every minute for users.
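
A minimal sketch of that split, assuming jQuery and a hypothetical fragment URL: the crawlable content sits in the static markup, and only the fast-changing block is filled in client-side.

    <!-- static, cacheable, crawlable part of the page -->
    <div id="article">... full article text rendered server-side ...</div>

    <!-- fast-changing block, filled in client-side; bots only see the placeholder -->
    <div id="latest-news">Loading latest news...</div>
    <script>
      // jQuery .load() fetches the fragment and injects it into the placeholder
      $('#latest-news').load('/fragments/latest-news');
    </script>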

You also need to prevent premature use of session cookies. Most proxy caches will refuse to cache any HTTP request that carries a cookie (and this follows the official HTTP specification). The problem is often an application that puts the login form on every page and needs an existing session on the POST of the login form. This can be fixed by separate login pages, or by advanced redirects on the login POST. Cookie handling in a reverse-proxy cache can also be managed in a modern proxy cache like Varnish with some configuration settings.
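
A minimal sketch of that kind of configuration, assuming Varnish 3-style VCL (the URL pattern is only an example): cookies are stripped from requests for static assets so those objects stay cacheable.

    # on the way in: drop cookies for static assets so the cache can serve them
    sub vcl_recv {
        if (req.url ~ "\.(css|js|png|jpg|gif|ico)$") {
            unset req.http.Cookie;
        }
    }

    # on the way out: drop Set-Cookie on those responses so they remain cacheable
    sub vcl_fetch {
        if (req.url ~ "\.(css|js|png|jpg|gif|ico)$") {
            unset beresp.http.Set-Cookie;
        }
    }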

edit: One advanced usage of a reverse proxy can be really useful: ESI, for example with Varnish's ESI support. You put tags in your rendered HTML that the ESI reverse proxy will identify. Each of these identified regions can have a different TTL (Time To Live): let's say 1 day for the whole page, 10 minutes for a latest-news block, 0 for the chat block. And the reverse proxy will fill these blocks from its own cache or from your backend.
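
A minimal sketch of the idea, assuming Varnish's ESI support (the fragment URLs and TTLs are only examples). The backend marks the regions with <esi:include> tags:

    <!-- whole page: cacheable for 1 day -->
    <html>
      <body>
        <h1>Home</h1>
        <!-- latest-news fragment: cached for 10 minutes -->
        <esi:include src="/fragments/latest-news"/>
        <!-- chat fragment: not cached at all -->
        <esi:include src="/fragments/chat"/>
      </body>
    </html>

and a small piece of Varnish 3-style VCL turns the processing on, with each fragment keeping its own TTL via its own cache headers:

    sub vcl_fetch {
        set beresp.do_esi = true;
    }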

For as long as the web has existed, handling proxies and caches has been the main technique for fooling users into thinking the web is fast.

answered Nov 08 '22 by regilero