I have bundled all my JS libraries into one large file in order to save a number of HTTP requests.
But for some reason it takes 9.29 seconds (sometimes 15+ seconds) to download this 1.2 MB bundle.
In this case the bundle isn't minified, but even when it is, it takes around 4-7 seconds for 783 kB, so not much better.
But the biggest mystery is: if I refresh the page 5-6 times quickly, the load time becomes normal (~150 ms), and it stays normal on every refresh after that. But if I wait about 5 minutes without making any requests, the load time is slow again.
And when I run my application in my local environment, it always loads fast.
Now I have two questions for you all:
1: Is it wrong to concatenate ALL my libraries into one single file?
2: Why does it take almost 10 seconds to download ~1mb in my case?
Please also take a look at the pictures showing the request load time and my request headers.
JavaScript bundling is an optimization technique that reduces the number of server requests for JavaScript files by merging multiple files into a single one, so the page makes fewer requests overall.
You can bundle your JavaScript from the CLI by providing an entry file and an output path. Webpack will automatically resolve all dependencies from import and require statements and bundle them, together with your app's script, into a single output file. But that's just the bare minimum it can do.
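For instance, a minimal webpack configuration producing one minified bundle might look like the sketch below (the entry and output paths are assumptions, not taken from the question):

```js
// webpack.config.js - a minimal sketch; paths and file names are assumptions
const path = require('path');

module.exports = {
  entry: './src/index.js',                   // your app's entry script
  output: {
    path: path.resolve(__dirname, 'dist'),   // where the bundle is written
    filename: 'bundle.min.js',
  },
  mode: 'production',                        // enables minification out of the box
};
```

Running `npx webpack` with this config in place produces a single dist/bundle.min.js you can reference from your page.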
1: "Is it wrong to concatenate ALL my libraries into one single file?"
There is no right or wrong answer here; it very much depends on what's in the bundle and how it's used (so I'm assuming you're concerned with load speed in general). Some thoughts:
In general it's pretty much always a good idea to combine & minify JS and CSS (and images should be combined into sprites). Fewer bytes on the wire will transfer quicker, fewer requests create less overhead, and it costs less. But...
2: "Why does it take almost 10 seconds to download ~1mb in my case?"
You said it yourself - "local environment ... always loads fast". Your problem is the distance the data has to travel, not how well packed it is. Travelling over localhost is going to be pretty much instantaneous however many scripts you load, going out over the internet and back adds latency to establishing connections. Your transfer speed is restricted by the slowest "link in the chain" between browser and server.
In order to reduce the distance between your computer and the server, you should consider caching your files and hosting them behind a CDN. Requests from browsers are routed to a CDN Edge Location server that's geographically local to the requester. If the CDN has previously cached the request, it's returned immediately (and over a much shorter distance, so faster). If the CDN Edge Location hasn't yet cached the file, it will make a request on your end user's behalf (via a very fast private network) to your server (referred to as the Origin), and, if the headers allow, cache the file for future requests.
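As an illustration (assuming a Node/Express server; the /static path and dist folder are hypothetical), you can send long-lived caching headers on the bundle so both the CDN and the browser are allowed to cache it:

```js
// A sketch, assuming an Express server: serve the bundled assets with
// long-lived Cache-Control headers so a CDN edge and the browser can cache them.
const express = require('express');
const app = express();

app.use('/static', express.static('dist', {
  maxAge: '365d',    // Cache-Control: max-age of one year for versioned assets
  immutable: true,   // tells caches the file never changes at this URL
}));

app.listen(3000);
```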
Caching can cause big problems, so my advice is to use a cache-busting query string. This gives you the benefit of CDN & browser level caching, i.e. huge speed improvements, but still allows you to easily update your code and ensure visitors retrieve the most recent version. Assuming you had a minified file ~/minified.js, you would reference it as ~/minified.js?v=1 (the name/value isn't important). In the future you can then replace ~/minified.js and update your markup to ~/minified.js?v=2. This requires that your actual HTML isn't cached, or at least uses a short-lived cache, but it means the browser will treat v=1 and v=2 as two separate requests, and so will download and cache them.
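If your script tags are added from JavaScript rather than static markup, the same idea looks like the sketch below (BUNDLE_VERSION is a hypothetical constant you would bump on each deploy):

```js
// A sketch of cache busting from script: bump BUNDLE_VERSION on every release
// so browsers and the CDN treat each release as a brand new URL.
var BUNDLE_VERSION = 2;   // hypothetical constant, incremented per deploy

var script = document.createElement('script');
script.src = '/minified.js?v=' + BUNDLE_VERSION;   // v=1 and v=2 are cached separately
document.head.appendChild(script);
```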
A few other thoughts: you could also split your JS in two. First, a small "critical path" script that downloads quickly and contains the bare minimum to allow the page to start rendering; then a larger, lazy-loaded script containing everything else that will download at some point later in the page load (see the sketch below). While this increases the total overhead in transferring the files, and is basically side-stepping the problem, it could allow your pages to begin rendering much sooner, making them "appear" faster. Also, you can embed your critical path code into the HTML itself: this adds a couple of KB to the initial payload, in return for the script being available as soon as the HTML is parsed.
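A minimal sketch of the lazy-loading half (the bundle name and version below are assumptions): this tiny snippet could itself be inlined in the HTML as part of the critical path, and it pulls in the large bundle once the page has finished its initial load.

```js
// A sketch of lazy-loading the large bundle after the initial render.
// "/bundle.min.js?v=2" is an assumed file name and version, not from the question.
window.addEventListener('load', function () {
  var script = document.createElement('script');
  script.src = '/bundle.min.js?v=2';
  script.async = true;                 // don't block parsing when it's injected
  document.head.appendChild(script);
});
```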