So, I've read the MDN disclaimers and warnings, I've read a great answer on the subject, but there's still something I want to know. This question actually came from an answer I gave to another question, here.
Let's say I decide to do the dirty deed. Something that I will regret for the rest of my life. Something that will stain me with shame forever and dishonor my family name. A purposeful, deliberate ending of --
Alright, enough of that. Anyway, here it is:
let proto = Object.getPrototypeOf(Function.prototype);
Object.setPrototypeOf(Function.prototype, {
    iBetterHaveAGoodReasonForDoingThis: "Bacon!"
});

// Just to prove it actually worked
let f = (function(){});
console.log(f.iBetterHaveAGoodReasonForDoingThis);

// Quick, hide the evidence!!
Object.setPrototypeOf(Function.prototype, proto);
Basically, what I did there was change the prototype of Function.prototype, an object that impacts pretty much every piece of JavaScript code you could write. Then I changed it back.
I wanted to illustrate a big change in the prototype chain that would impact a lot of code and cause a lot of optimizations to go down the drain. I don't expect changing it back would fix anything (if anything, I expect it would make things worse performance-wise). I'd love to know if it would or wouldn't, but if it does, that wasn't my intention.
I just want to know: after a change like this, will the JavaScript environment begin to recover and start optimizing things again? Or will it just give up forever and run everything in deoptimized mode? Are there optimizations that will never be achieved because of this? Can I trust that, eventually, after a period of recovery, it will return to its regular state?
For context, I'm talking about engines like the most recent version of V8, not the primitive engines of old browsers like Internet Explorer. I understand the answer could differ between engines, but I hope there is some commonality among them.
It comes down to the prototype. A prototype is an object associated with every function and object by default in JavaScript: a function's prototype property is accessible and modifiable, while an object's internal prototype (sometimes called its prototype attribute) is not directly visible. Every function gets a prototype object by default.
Yes. If you change the prototype that existing object instances share, it will change for all of them.
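A quick sketch of that behavior (the object and method names here are made up for illustration):

const shared = { greet() { return "hi"; } };
const a = Object.create(shared);
const b = Object.create(shared);

shared.greet = function () { return "hello"; };

console.log(a.greet()); // "hello"
console.log(b.greet()); // "hello": every instance sees the change at once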
Objects in JavaScript have an internal property known as the prototype. It is simply a reference to another object and holds the attributes/properties common to all instances of the object. An object's prototype specifies the object from which it inherits properties.
In JavaScript, objects have a special hidden property, [[Prototype]] (as it is named in the specification), that is either null or references another object. That object is called “a prototype”: when we read a property from an object and it's missing, JavaScript automatically takes it from the prototype.
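For example (animal and rabbit are placeholder names):

const animal = { eats: true };
const rabbit = Object.create(animal); // rabbit's [[Prototype]] is animal

console.log(rabbit.eats); // true: not an own property, found on the prototype
rabbit.eats = false;      // an own property shadows the inherited one
console.log(rabbit.eats); // false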
V8 developer here. This question does not have a simple answer.
Most optimizations will "come back" (at the cost of spending additional CPU time, of course). For example, optimized code that had to be thrown away will eventually get recompiled.
Some optimizations will remain disabled forever. For example, V8 skips certain checks when (and as long as) it knows that prototype chains have not been mucked with. If it sees an app modify prototype chains, it plays it safe from then on.
To make things even more complicated, the details can and will change over time. (Which is why there's not much point in listing more specific circumstances here, sorry.)
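For what it's worth, you can watch the throw-away-and-recompile cycle yourself in Node.js using V8's tracing flags. This is only a sketch, based on the holey-array example discussed under Background below; the exact log output varies between V8 versions:

// Run with: node --trace-opt --trace-deopt demo.js
function hot(arr) { return arr[1]; }

const holey = ['a', , 'c'];
for (let i = 0; i < 1e6; i++) hot(holey); // let V8 optimize hot()

Array.prototype[1] = 'b'; // violate the "clean prototype chain" assumption
hot(holey);               // the optimized code for hot() is likely discarded

for (let i = 0; i < 1e6; i++) hot(holey); // V8 will typically recompile it
delete Array.prototype[1];                // tidy up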
Background:
There are many places in JavaScript where code might do a certain thing, which the JavaScript engine must check for, but most code doesn't do it. (Take, for example, inheriting missing elements from an array's prototype: ['a', ,'c'][1] almost always returns undefined, except if someone did Array.prototype[1] = 'b' or Object.prototype[1] = 'b'; there's a short demo of this after the two options below.) So when generating optimized code for a function, the engine has to decide between two options:
(A) Always check for the thing in question (in the example: walk the array's prototype chain and check every prototype to see if it has an element at that index). Let's say executing this code will take 2 time units.
(B) Optimistically assume that array prototypes have no elements, and skip the check (in the example: don't even look at prototypes, just return undefined). Let's say this brings execution time down to 1 time unit (twice as fast, yay!). However, in order to be correct, the engine must now keep a close eye on the prototype chains of all arrays, and if any elements show up anywhere, all code based on this assumption must be found and thrown away, at a cost of 1000 time units.
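To make the example concrete, here is what those semantics look like in a console:

console.log(['a', , 'c'][1]); // undefined: index 1 is a hole
Array.prototype[1] = 'b';     // the hole now inherits through the chain
console.log(['a', , 'c'][1]); // 'b'
delete Array.prototype[1];    // clean up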
Given this tradeoff, it makes sense that the engine at first follows the fast-but-risky strategy (B), but when that fails even just once, it switches to the safer strategy (A), in order to avoid the risk of having to pay the 1000-time-unit penalty again.
You can argue whether "even just once" is the best threshold, or whether a site should get 2, 3, or even more free passes before giving up on (B), but that doesn't change the fundamental tradeoff.
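To put rough numbers on that tradeoff (reusing the made-up costs from above, purely for illustration):

// Illustrative arithmetic only, with the answer's made-up costs:
// (A) always check   -> 2 units per call
// (B) skip the check -> 1 unit per call, plus 1000 units per invalidation
const calls = 10000;
const costA = 2 * calls;             // 20000 units
const costB = 1 * calls + 1 * 1000;  // 11000 units
console.log(costB < costA);          // true: (B) wins while invalidations stay rare

With one invalidation per 10,000 calls, (B) still comes out ahead; if invalidations were frequent, (A)'s steady 2 units per call would win, which is why the engine switches strategies once its assumption has been proven wrong.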