
Referencing separate JS files vs one JS file

Tags:

javascript

Which would result in greater speed/efficiency: referencing one JavaScript file from every page in the directory, or referencing a different JavaScript file for each page in the directory?

So basically, referencing the same JavaScript file in all web pages vs a unique JavaScript file for every webpage.

Note: I thought that referencing the single file would be slower, as it contains code that is irrelevant to some pages, thus loading useless code and making the page run less efficiently.

Robert Tossly asked Aug 16 '15 06:08

2 Answers

There are tradeoffs involved, so you may ultimately need to measure in your specific circumstances to be sure. But I'll explain some of the tradeoffs.

  1. If you have giant amounts of data or giant amounts of code that are only used in one or a few pages, then you will probably want to separate that out into its own file just so you can ONLY load it, initialize it and have it take memory when it's actually needed. But, note with the amount of memory in modern computers (even phones these days), the data or code has to be pretty large to warrant a separate download.

  2. Other than item 1, you pretty much always want to optimize for maximum caching efficiency. Retrieving a file (even a larger file than needed) from the cache is so much faster than retrieving any file (even a small file) over the network that you really want to optimize for caching. And the time to retrieve these files generally dwarfs any of the JS parse time (CPUs are pretty fast these days), so triggering an extra download to save some JS parse time is unlikely to be faster.

  3. The best way to optimize for caching is to have most of your pages reference the same common script files. Then, they get loaded once when the viewer first hits your site and all subsequent loads come right from the browser cache. This is ideal. This caching efficiency easily overcomes having some unused or untriggered code in the master file that is not used in some pages.

  4. Lots of small downloads (even from the cache) are less efficient than one larger download. More separate requests generally just isn't as efficient for either the browser or the server. So, combining JS files into larger concatenated files is generally a good thing.

  5. There are limits to all of this. If you had completely separate code for 100 separate pages all concatenated together, and each piece of code searched the DOM for multiple page elements (failing to find them 99% of the time), then that's probably not an efficient way to do things either. But usually you can make your shared code smarter than that by breaking things into categories based on a high-level class name. For example, based on the presence of a class name on the <body> tag, you would run only the relevant part of the initialization code, skipping the rest because its marker class is not present. So, when combining code, much of which won't be relevant on any given page, it's wise to be smart in how you decide what initialization code in the shared file to actually run.
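The gating idea in item 5 can be sketched roughly like this. The page names and init functions below are made up for illustration; the real signal is a marker class such as `<body class="checkout">`:

```javascript
// Hypothetical registry: one init function per page type, all shipped
// in the single combined script. Names here are invented examples.
const initializers = {
  checkout: () => "checkout ready",
  gallery: () => "gallery ready",
  blog: () => "blog ready",
};

// Run only the initializers whose marker class is present on <body>,
// so unrelated setup code is skipped entirely on pages that lack it.
function initPage(bodyClasses) {
  return Object.keys(initializers)
    .filter((name) => bodyClasses.includes(name))
    .map((name) => initializers[name]());
}

// In a browser you would call something like:
//   initPage(Array.from(document.body.classList));
```

This way the combined file is cached once across the whole site, but each page only pays the execution cost of its own initialization.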

jfriend00 answered Oct 04 '22 22:10


You need to measure for your specific case, as every site/page has its own balance between loading fewer files and loading extra unnecessary scripts (the same applies to CSS, too).

Generally, a single file is faster over HTTP/1.x, as there are restrictions on the number of parallel downloads per host; HTTP/2 should remove most of the difference.

Alexei Levenkov answered Oct 04 '22 22:10