The accepted wisdom is that a construct like `Dim dict As New Dictionary` performs worse than `Dim dict As Dictionary` / `Set dict = New Dictionary`.
The explanation is that the former form - auto-instantiation - defers instantiation until the first use of the variable dict. Thus, every time dict is referenced, the compiled code must first check whether dict is Nothing.
But it occurs to me that compiled code does this anyway: you will get an error any time you try to make use of an object reference that is Nothing.
So, in tribute to science, I ran some tests. And the results suggest there is no performance difference between the two approaches. (Run on Excel 2007)
Call "create dictionary & add 2 items" 100,000 times.
Reverse the order of test calls:
Call "create dictionary & add 6 items" 100,000 times.
Create dictionary and add 100,000 items.
Create dictionary and add 1,000,000 items.
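For reference, the timing harness looked roughly like this (a minimal sketch of the kind of test described above; the procedure name, use of `Timer`, and loop counts are illustrative, not the exact code that was run):

```vba
' Rough benchmark sketch: compares auto-instantiation against explicit
' Set for a Scripting.Dictionary, timing 100,000 create-and-add passes.
Sub BenchmarkInstantiation()
    Dim i As Long, t As Double

    t = Timer
    For i = 1 To 100000
        Dim d1 As New Scripting.Dictionary   ' auto-instantiation
        d1.Add "a", 1
        d1.Add "b", 2
        Set d1 = Nothing    ' forces re-instantiation on the next pass
    Next i
    Debug.Print "Auto-instantiation: "; Timer - t

    t = Timer
    For i = 1 To 100000
        Dim d2 As Scripting.Dictionary
        Set d2 = New Scripting.Dictionary    ' explicit instantiation
        d2.Add "a", 1
        d2.Add "b", 2
        Set d2 = Nothing
    Next i
    Debug.Print "Explicit Set:       "; Timer - t
End Sub
```

Swapping the order of the two loops (as in the second test above) guards against warm-up effects skewing the comparison.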
I see nothing to indicate that auto-instantiation performs any worse than explicit instantiation. (To be clear, I would avoid auto-instantiation for other reasons; I'm just interested in the performance angle here.)
So is this a myth?
UPDATE
Let me lay out why the performance argument doesn't make sense to me. It is said that
x.Add "Key", "Item"
in an auto-instantiated object is equivalent to the following:
If x Is Nothing Then
    Set x = New Dictionary
End If
x.Add "Key", "Item"
which makes it look like "frightening overhead" if you're calling this thousands of times. But in the explicit instantiation case, the compiled code contains exactly the same form of logic:
If x Is Nothing Then
    Err.Raise 91    ' "Object variable or With block variable not set"
End If
x.Add "Key", "Item"
It doesn't necessarily follow that auto is longer, which is why I'm asking whether there was any truth to this. I wonder if I've identified another one of the many untested performance myths.
I know there is a holy war over whether it's okay to Dim As New, but I've never heard it said to cause poor performance. The short answer is: not really. Yes, it litters your code with unnecessary checks for Nothing, but you won't notice a speed difference on today's machines. It's like saying that looping over 10,000 characters is faster than looping over 10,001. To start seeing any difference, you need to run your tests at a much larger scale, in the millions or tens of millions of iterations.
That being said, Dim As New is frowned upon, but not for performance reasons.
Granted if you are just using VBA to automate some worksheet stuff or manipulate data, you probably won't care about these, but the moment you look at more sophisticated code, losing the ability to test if an object is Nothing and not controlling when it's initialized is huge and can generate unexpected behavior, not to mention make testing a pain in the butt. All that for saving a few lines of code.
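To illustrate the "losing the ability to test if an object is Nothing" point, here is a small sketch of the well-known gotcha (the procedure name is mine; the behavior follows from the As New semantics):

```vba
' With Dim As New, a Nothing check can never succeed: merely
' referencing the variable re-creates the object behind the scenes.
Sub NothingCheckGotcha()
    Dim d As New Scripting.Dictionary
    Set d = Nothing
    ' The reference to d in the If condition re-instantiates it,
    ' so "d Is Nothing" is always False and this branch never runs:
    If d Is Nothing Then
        Debug.Print "Never printed"
    Else
        Debug.Print "d was silently re-created"
    End If
End Sub
```

With explicit `Dim d As Scripting.Dictionary` / `Set d = New Scripting.Dictionary`, the same check would correctly report Nothing after the `Set d = Nothing` line.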
Then there are the micro-optimizers, who will argue that adding anything unneeded to your code makes for poor performance. While they are right in some ways, in this case you'd most likely be saving on the order of nanoseconds.