I am a fan of valid web pages and always spend time running new sites through the W3C validator.
When trying to make a case for why companies should validate their web pages, I quickly thought of accessibility, and of future-proofing the site for more primitive devices such as phones, fridges, watches, the next big thing, etc.
However, I then wondered: is there a computational overhead involved in rendering web pages that do not validate?
Has any research been done in this area? And do some browsers handle invalid content better than others?
It's likely that an invalid page will take longer to render, because the browser has to perform some fault recovery (deductive work to find where the next valid content begins and how rendering can continue from there), and this can introduce an overhead.
The actual difference can only be determined through careful measurement and, if possible, analysis of the browser's source code.
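As a rough sketch of how such a measurement could be set up, here is a small Python script that times parsing of a well-formed document against an equivalent one with unclosed tags. Note the assumptions: this uses Python's standard-library `html.parser`, not a real browser engine, so the numbers say nothing about how Gecko, WebKit, or Trident actually perform error recovery; the documents and iteration count are arbitrary. It only illustrates the shape of the experiment.

```python
from html.parser import HTMLParser
import timeit

# A well-formed document and one with the same content but
# unclosed <p> tags, which forces the parser to recover.
VALID = "<html><body>" + "<p>hello</p>" * 200 + "</body></html>"
INVALID = "<html><body>" + "<p>hello" * 200 + "</body></html>"

class CountingParser(HTMLParser):
    """Counts start tags so the parse actually does some work."""
    def __init__(self):
        super().__init__()
        self.tags = 0

    def handle_starttag(self, tag, attrs):
        self.tags += 1

def parse(doc):
    parser = CountingParser()
    parser.feed(doc)
    parser.close()
    return parser.tags

# Time each variant over many iterations to smooth out noise.
t_valid = timeit.timeit(lambda: parse(VALID), number=50)
t_invalid = timeit.timeit(lambda: parse(INVALID), number=50)
print(f"valid: {t_valid:.4f}s  invalid: {t_invalid:.4f}s")
```

A real study would need to drive actual browser engines (for example via their developer tooling) and control for caching and layout, not just tokenising, but the comparison structure would be the same.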
I think your stronger argument will be maintenance. I designed the intranet for the last major company I worked for. I created page templates with valid markup and used conditional comments to target stylesheets at different versions of IE. The site was launched two years ago, and it looks the same in Firefox 2 & 3, Safari 2 & 3, and IE 6, 7, & 8! No markup or style changes were necessary. This means that when the organization finally upgrades to IE 7, the web developer team will not have to do anything. That's a big win in reduced maintenance costs.