Given the vast efforts to make the web as efficient as possible, why can't HTML (and the various other plain-text resources, e.g. CSS and JavaScript) be compiled into a single resource and sent down the wire? (I'm aware of .chm files; they're along the lines of this concept.)
I understand the open nature of the web - an effort I stand behind - but one could conceive of an open specification that requires multiple resources to be compiled down into a binary format. Decompilation by the user-agent could be required by the specification (thus allowing individuals to view the DOM etc.).
I guess I'm just surprised, given the efforts at performance in other areas, that we're still relying on plain text to push pages around. Or am I just over-estimating the savings a binary format would provide?
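For a rough sense of the numbers, here is a quick sketch that gzips a made-up HTML string with Node's built-in zlib module. The page contents are invented and the ratio will vary with real markup; it's only meant to show how much a plain-text page shrinks under the kind of compression already applied on the wire.

```typescript
import { gzipSync } from "node:zlib";

// A made-up page body standing in for a typical HTML document; real pages
// contain a lot of similarly repetitive markup, which compresses well.
const html =
  "<!DOCTYPE html><html><head><title>Example</title></head><body>" +
  "<p>Some repetitive markup, as real pages tend to have.</p>".repeat(200) +
  "</body></html>";

const raw = Buffer.from(html, "utf-8");
const compressed = gzipSync(raw);

console.log(`raw:  ${raw.length} bytes`);
console.log(
  `gzip: ${compressed.length} bytes ` +
  `(${Math.round((compressed.length / raw.length) * 100)}% of original)`
);
```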
An important factor in the development of the web has been the extensibility of its languages. Browser vendors can support more functionality than the standards require. Although this has always been a hassle for developers, it has helped move the web forward.
By compiling web pages you would limit their capabilities to the set that the compiler supports. It would not be possible to use any new feature in any browser until the compiler caught up with that feature. This would slow down the development of the web.
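To illustrate the point: because pages arrive as plain text, they can probe for a new capability at runtime and fall back gracefully in browsers that don't have it yet, rather than waiting for a fixed, precompiled format to add a slot for it. The sketch below uses the real navigator.share API as the example feature; the fallback path is purely illustrative.

```typescript
// Minimal ambient declarations so the sketch is self-contained outside a
// browser-typed project; in a real page these come from the DOM lib.
declare const navigator: any;
declare const window: any;

// A feature shipped by one vendor can be adopted by pages immediately,
// while older browsers simply take the fallback path.
function sharePage(url: string, title: string): void {
  if (typeof navigator.share === "function") {
    // Newer browsers: use the natively supported Web Share API.
    navigator.share({ url, title });
  } else {
    // Older browsers never see the new code path and keep working.
    window.location.href =
      `mailto:?subject=${encodeURIComponent(title)}&body=${encodeURIComponent(url)}`;
  }
}

sharePage("https://example.com", "An example page");
```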