So if you look at this fiddle http://jsfiddle.net/r0k3t/z8f2N/1/ you can see that
var me = { fName: "ken", lName: "n" };
console.log(Object.prototype === Object.getPrototypeOf(me));
returns true. Why doesn't
console.log(Object.prototype === me.prototype);
return true as well? Given that I created the "me" object as an object literal, surely its prototype should be Object.prototype, and the first line seems to confirm that.
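For reference, here is a minimal sketch, reusing the "me" object above, of what me.prototype actually evaluates to:

var me = { fName: "ken", lName: "n" };

// A plain object has no own "prototype" property, and Object.prototype
// doesn't define one either, so the lookup yields undefined.
console.log(me.prototype);                                    // undefined
console.log(Object.prototype === me.prototype);               // false
console.log(Object.prototype === Object.getPrototypeOf(me));  // true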
The Object.setPrototypeOf() method sets the prototype (i.e., the internal [[Prototype]] property) of a specified object to another object or null. All JavaScript objects inherit properties and methods from a prototype. It is generally considered the proper way to set the prototype of an object.
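For example, a small sketch (the animal and dog objects here are made up purely for illustration):

var animal = { speak: function () { return "generic noise"; } };
var dog = { name: "Rex" };

// Point dog's internal [[Prototype]] at animal so it inherits speak().
Object.setPrototypeOf(dog, animal);

console.log(Object.getPrototypeOf(dog) === animal); // true
console.log(dog.speak());                           // "generic noise"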
prototype is a property of a Function object. It is the prototype of objects constructed by that function. __proto__ is an internal property of an object, pointing to its prototype. Current standards provide an equivalent, Object.getPrototypeOf(obj), for reading it.
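A short sketch of that distinction, using a made-up Person constructor:

function Person(name) {
  this.name = name;
}

var alice = new Person("Alice");

// "prototype" is a property of the function, not of the instance...
console.log(typeof Person.prototype);  // "object"
console.log(alice.prototype);          // undefined

// ...while the instance's internal [[Prototype]] points back to it.
console.log(Object.getPrototypeOf(alice) === Person.prototype); // true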
Note that if an object literal tries to set __proto__ to a primitive value such as undefined or the number 15, the assignment is ignored: because only an object or null is allowed to be a prototype, objects like objUndefined and objNumber keep their default prototype, Object.prototype.
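A quick sketch of that behaviour:

var objUndefined = { __proto__: undefined };
var objNumber = { __proto__: 15 };

// A primitive __proto__ value in a literal is silently ignored,
// so both objects keep Object.prototype as their prototype.
console.log(Object.getPrototypeOf(objUndefined) === Object.prototype); // true
console.log(Object.getPrototypeOf(objNumber) === Object.prototype);    // true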
Object.prototype.__proto__ is deprecated: this feature is no longer recommended.
Object.prototype === me.constructor.prototype; // true
I'll let you guess now how getPrototypeOf works :-)
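Spelled out as a sketch, the constructor route looks like this:

var me = { fName: "ken", lName: "n" };

// "me" has no own "constructor" property; the lookup finds it on
// Object.prototype, where it refers to the Object function itself.
console.log(me.constructor === Object);                      // true
console.log(me.constructor.prototype === Object.prototype);  // true

// Object.getPrototypeOf reads the internal [[Prototype]] directly.
console.log(Object.getPrototypeOf(me) === Object.prototype); // true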
Also, the non-standard-yet-but-works-almost-everywhere solution (thanks jAndy):
Object.prototype === me.__proto__; // true