Bluebird's util.js file contains the following function:
function toFastProperties(obj) {
    /*jshint -W027*/
    function f() {}
    f.prototype = obj;
    ASSERT("%HasFastProperties", true, obj);
    return f;
    eval(obj);
}
For some reason, there's a statement after the return statement, and I'm not sure why it's there.
It also seems to be deliberate, as the author silenced the JSHint warning about it:
Unreachable 'eval' after 'return'. (W027)
What exactly does this function do? Does util.toFastProperties really make an object's properties "faster"?
I've searched through Bluebird's GitHub repository for any comments in the source code or an explanation in their list of issues, but I couldn't find any.
2017 update: First, for readers coming today - here is a version that works with Node 7 (4+):
function enforceFastProperties(o) {
    function Sub() {}
    Sub.prototype = o;
    var receiver = new Sub(); // create an instance
    function ic() { return typeof receiver.foo; } // perform access
    ic();
    ic();
    return o;
    eval("o" + o); // ensure no dead code elimination
}
Sans one or two small optimizations - all the below is still valid.
Let's first discuss what it does and why that's faster and then why it works.
The V8 engine uses two object representations:

- Dictionary mode - in which objects are stored as key-value maps (a hash map).
- Fast mode - in which objects are stored like structs, with no computation involved in property access.
Here is a simple demo that demonstrates the speed difference; we use the delete statement to force one of the objects into slow dictionary mode and then time property access on both.
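A minimal sketch along those lines (the names and loop counts are illustrative; it needs node --allow-natives-syntax for the %HasFastProperties check, and it deletes a property other than the most recently added one so the object reliably drops into dictionary mode on current V8 as well):

// node --allow-natives-syntax demo.js
function makePoint() {
    return { x: 1, y: 2, z: 3 };
}

var fast = makePoint();
var slow = makePoint();
delete slow.x; // deleting a non-last property forces `slow` into dictionary mode

console.log(%HasFastProperties(fast)); // expected: true
console.log(%HasFastProperties(slow)); // expected: false

function sum(o) {
    var total = 0;
    for (var i = 0; i < 1e7; i++) total += o.y + o.z;
    return total;
}

console.time('fast mode');       sum(fast); console.timeEnd('fast mode');
console.time('dictionary mode'); sum(slow); console.timeEnd('dictionary mode');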
The engine tries to use fast mode whenever possible and generally whenever a lot of property access is performed - however sometimes it gets thrown into dictionary mode. Being in dictionary mode has a big performance penalty so generally it is desirable to put objects in fast mode.
This hack is intended to force the object into fast mode from dictionary mode.
In JavaScript, prototypes typically store functions shared among many instances and rarely change much dynamically. For this reason it is very desirable to have them in fast mode to avoid the extra penalty every time a function is called.
For this - v8 will gladly put objects that are the .prototype property of functions in fast mode since they will be shared by every object created by invoking that function as a constructor. This is generally a clever and desirable optimization.
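To see the kind of object this targets, here is a minimal, illustrative sketch (the Queue name and its methods are made up, not Bluebird's code) of the common pattern where methods live on a constructor's .prototype and are shared by every instance:

function Queue() {
    this.items = [];
}
// These functions live on Queue.prototype and are shared by every instance,
// so the prototype is consulted on every method call - exactly the hot path
// that benefits from the prototype being in fast mode.
Queue.prototype.enqueue = function (item) {
    this.items.push(item);
};
Queue.prototype.dequeue = function () {
    return this.items.shift();
};

var q = new Queue();
q.enqueue(1);
q.enqueue(2);
q.dequeue(); // 1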
Let's first go through the code and figure out what each line does:
function toFastProperties(obj) {
    /*jshint -W027*/ // suppress the "unreachable code" warning
    function f() {} // declare a new function
    f.prototype = obj; // assign obj as its prototype to trigger the optimization
    // assert the optimization passes to prevent the code from breaking in the
    // future in case this optimization breaks:
    ASSERT("%HasFastProperties", true, obj); // requires the "native syntax" flag
    return f; // return it
    eval(obj); // prevent the function from being optimized through dead code
               // elimination or further optimizations. This code is never
               // reached, but even using eval in unreachable code causes v8
               // to not optimize functions.
}
We don't have to find the code ourselves to assert that v8 does this optimization; we can instead read the v8 unit tests:
// Adding this many properties makes it slow.
assertFalse(%HasFastProperties(proto));
DoProtoMagic(proto, set__proto__);
// Making it a prototype makes it fast again.
assertTrue(%HasFastProperties(proto));
Reading and running this test shows us that this optimization indeed works in v8. However - it would be nice to see how.
If we check objects.cc, we can find the following function (L9925):
void JSObject::OptimizeAsPrototype(Handle<JSObject> object) {
  if (object->IsGlobalObject()) return;
  // Make sure prototypes are fast objects and their maps have the bit set
  // so they remain fast.
  if (!object->HasFastProperties()) {
    MigrateSlowToFast(object, 0);
  }
}
Now, JSObject::MigrateSlowToFast just explicitly takes the Dictionary and converts it into a fast V8 object. It's a worthwhile read and an interesting insight into v8 object internals - but it's not the subject here. I still warmly recommend reading it, as it's a good way to learn about v8 objects.
If we check out SetPrototype in objects.cc, we can see that it is called in line 12231:
if (value->IsJSObject()) {
  JSObject::OptimizeAsPrototype(Handle<JSObject>::cast(value));
}
Which in turn is called by FunctionSetPrototype, which is what we get with .prototype =.
Doing __proto__ = or .setPrototypeOf would have also worked, but these are ES6 functions and Bluebird runs on all browsers since Netscape 7, so using them to simplify the code here is out of the question. For example, if we check .setPrototypeOf we can see:
// ES6 section 19.1.2.19.
function ObjectSetPrototypeOf(obj, proto) {
  CHECK_OBJECT_COERCIBLE(obj, "Object.setPrototypeOf");

  if (proto !== null && !IS_SPEC_OBJECT(proto)) {
    throw MakeTypeError("proto_object_or_null", [proto]);
  }

  if (IS_SPEC_OBJECT(obj)) {
    %SetPrototype(obj, proto); // MAKE IT FAST
  }

  return obj;
}
Which is installed directly on Object:
InstallFunctions($Object, DONT_ENUM, $Array(
  ...
  "setPrototypeOf", ObjectSetPrototypeOf,
  ...
));
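Just for illustration, a hypothetical variant of the hack built on Object.setPrototypeOf (not what Bluebird ships, for the compatibility reasons above) could look like the sketch below - making obj the prototype of a throwaway object routes through %SetPrototype and therefore through the same JSObject::OptimizeAsPrototype path shown earlier:

// Hypothetical ES6-flavoured variant - not Bluebird's code.
function toFastPropertiesViaSetPrototypeOf(obj) {
    // Making obj the prototype of a throwaway object goes through
    // %SetPrototype and thus JSObject::OptimizeAsPrototype, just like
    // assigning obj to a function's .prototype does.
    Object.setPrototypeOf({}, obj);
    return obj;
}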
So - we have walked the path from the code Petka wrote to the bare metal. This was nice.
Remember this is all implementation detail. People like Petka are optimization freaks. Always remember that premature optimization is the root of all evil 97% of the time. Bluebird does something very basic very often so it gains a lot from these performance hacks - being as fast as callbacks isn't easy. You rarely have to do something like this in code that doesn't power a library.
Reality check from 2021 (Node.js 12+): it seems a big optimization has landed - objects with deleted fields and sparse arrays don't become slow. Or am I missing something?
// run in Node with the natives-syntax flag enabled:
// node --allow-natives-syntax script.js
function Point(x, y) {
    this.x = x;
    this.y = y;
}

var obj1 = new Point(1, 2);
var obj2 = new Point(3, 4);
delete obj2.y;

var arr = [1, 2, 3];
arr[100] = 100;

console.log('obj1 has fast properties:', %HasFastProperties(obj1));
console.log('obj2 has fast properties:', %HasFastProperties(obj2));
console.log('arr has fast properties:', %HasFastProperties(arr));
All three checks print true:
obj1 has fast properties: true
obj2 has fast properties: true
arr has fast properties: true
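A likely explanation (worth double-checking on your own Node version): when the most recently added property is deleted, V8 can simply roll the object's hidden class back, so obj2 stays fast; deleting a property that was not added last generally still drops the object into dictionary mode. Note also that %HasFastProperties looks at named properties, not at an array's element backing store, so a sparse array can still report true here. A small sketch of the first point, with hypothetical names:

// node --allow-natives-syntax script.js
function Point(x, y) {
    this.x = x;
    this.y = y;
}

var a = new Point(1, 2);
var b = new Point(3, 4);

delete a.y; // y was added last, so V8 can roll the shape back - stays fast
delete b.x; // x was not the last-added property - expected to go to dictionary mode

console.log('a has fast properties:', %HasFastProperties(a)); // expected: true
console.log('b has fast properties:', %HasFastProperties(b)); // expected: false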
V8 developer here. The accepted answer is a great explanation, I just wanted to highlight one thing: the so-called "fast" and "slow" property modes are unfortunate misnomers, they each have their pros and cons. Here is a (slightly simplified) overview of the performance of various operations:
|                                        | struct-like properties | dictionary properties |
| -------------------------------------- | ---------------------- | --------------------- |
| adding a property to an object         | --                     | +                     |
| deleting a property                    | ---                    | +                     |
| reading/writing a property, first time | -                      | +                     |
| reading/writing, cached, monomorphic   | +++                    | +                     |
| reading/writing, cached, few shapes    | ++                     | +                     |
| reading/writing, cached, many shapes   | --                     | +                     |
| colloquial name                        | "fast"                 | "slow"                |
So as you can see, dictionary properties are actually faster for most of the lines in this table, because they don't care what you do, they just handle everything with solid (though not record-breaking) performance. Struct-like properties are blazing fast for one particular situation (reading/writing the values of existing properties, where every individual place in the code only sees very few distinct object shapes), but the price they pay for that is that all other operations, in particular those that add or remove properties, become much slower.
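To illustrate the difference between the "few shapes" and "many shapes" rows, here is a rough, hypothetical micro-benchmark (names and loop counts are arbitrary; exact numbers vary by V8 version and machine). The same property-access site first sees objects of a single shape, then objects of many different shapes, which degrades its inline cache from monomorphic to megamorphic:

function readX(o) {
    return o.x; // this access site gets an inline cache keyed on object shape
}

// All of these objects share a single shape: the access stays monomorphic.
var sameShape = [];
for (var i = 0; i < 1000; i++) sameShape.push({ x: i, y: i });

// Each of these objects has a different shape (a different extra property name),
// so the access site sees many shapes and goes megamorphic.
var manyShapes = [];
for (var j = 0; j < 1000; j++) {
    var o = { x: j };
    o['extra' + j] = j;
    manyShapes.push(o);
}

function sum(list) {
    var total = 0;
    for (var k = 0; k < 1e4; k++) {
        for (var m = 0; m < list.length; m++) total += readX(list[m]);
    }
    return total;
}

console.time('monomorphic'); sum(sameShape);  console.timeEnd('monomorphic');
console.time('megamorphic'); sum(manyShapes); console.timeEnd('megamorphic');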
It just so happens that the special case where struct-like properties have their big advantage (+++) is particularly frequent and really important for many apps' performance, which is why they acquired the "fast" moniker. But it's important to realize that when you delete properties and V8 switches the affected objects to dictionary mode, then it isn't being dumb or trying to be annoying: rather it attempts to give you the best possible performance for what you're doing. We have landed patches in the past that have achieved significant performance improvements by making more objects go to dictionary ("slow") mode sooner when appropriate.
Now, it can happen that your objects would generally benefit from struct-like properties, but something your code does causes V8 to transition them to dictionary properties, and you'd like to undo that; Bluebird had such a case. Still, the name toFastProperties is a bit misleading in its simplicity; a more accurate (though unwieldy) name would be spendTimeOptimizingThisObjectAssumingItsPropertiesWontChange, which would indicate that the operation itself is costly, and it only makes sense in certain limited cases. If someone took away the conclusion "oh, this is great, so I can happily delete properties now, and just call toFastProperties afterwards every time", then that would be a major misunderstanding and cause pretty bad performance degradation.
If you stick with a few simple rules of thumb, you'll never have a reason to even try to force any internal object representation changes:

- Avoid delete.
- When properties come and go, prefer using a Map over the ES5-era "object-as-map" pattern.
- When an object can toggle into and out of a certain state, prefer boolean (or equivalent) properties (e.g. o.has_state = true; o.has_state = false;) over adding and deleting an indicator property (see the sketch after this list).

Lastly, if your team lead tells you "I've heard that there are 'fast' and 'slow' properties, please make sure that all of ours are 'fast'", then point them at this post :-)
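To make those rules concrete, here is a small hypothetical sketch (names like userFlags and Session are illustrative, not from any library) contrasting the delete-based patterns with the suggested alternatives:

// Rule: when properties come and go, prefer a Map over "object-as-map".
// Instead of:
//   var userFlags = {}; userFlags[name] = true; ... delete userFlags[name];
var userFlags = new Map();
userFlags.set('alice', true);
userFlags.delete('alice'); // Map.delete doesn't change any object's shape

// Rule: toggle state with a boolean property instead of adding/deleting one.
function Session() {
    this.isActive = false; // initialize in the constructor, never delete it
}
var s = new Session();
s.isActive = true;  // enter the state
s.isActive = false; // leave it again - the object's shape never changes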