It's been a while since I read McConnell's "Code Complete". Now I've come across the same advice again in Hunt & Thomas' "The Pragmatic Programmer": use assertions! Note: not unit-testing assertions; I mean Debug.Assert().
According to the SO questions When should I use Debug.Assert()? and When to use assertion over exceptions in domain classes, assertions are useful during development because "impossible" situations can be found quickly, and they seem to be commonly used. As far as I understand assertions, in C# they are often used to check input variables for "impossible" values.
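For illustration, here is a minimal sketch of that conventional usage (the Greeter class and its contract are hypothetical, made up just for this question):

```csharp
using System.Diagnostics;

public class Greeter
{
    public string Greet(string name)
    {
        // By the design contract an empty name is "impossible" here,
        // so it is only checked in Debug builds:
        Debug.Assert(!string.IsNullOrEmpty(name), "name must be non-empty");
        return "Hello, " + name;
    }
}
```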
To keep unit tests as concise and isolated as possible, I feed classes and methods with nulls and "impossible" dummy input (like an empty string). Such tests explicitly document that they don't rely on specific input. Note: I am practicing what Meszaros' "xUnit Test Patterns" describes as a Minimal Fixture.
And that's the point: if I had assertions guarding these inputs, they would blow up my unit tests.
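To make the conflict concrete with the hypothetical Greeter from above: a minimal-fixture NUnit test deliberately feeds the "impossible" empty string, and in a Debug build the assertion fires before the method body even runs:

```csharp
using NUnit.Framework;

[TestFixture]
public class GreeterTests
{
    // Minimal fixture: only the input relevant to this test is supplied.
    // In a Debug build, the Debug.Assert inside Greet pops its dialog
    // and interrupts the test run instead of letting the test pass.
    [Test]
    public void Greet_ToleratesEmptyName()
    {
        var greeter = new Greeter();
        Assert.AreEqual("Hello, ", greeter.Greet(""));
    }
}
```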
I like the idea of assertive programming, but on the other hand I don't need to force it. Currently I can't think of any use for Debug.Assert(). Maybe there is something I'm missing? Do you have any suggestions for where it could be really useful? Maybe I just overestimate the usefulness of assertions? Or maybe my way of testing needs to be revisited?
Edit: Best practice for debug Asserts during Unit testing is very similar, but it does not answer the question that bothers me: should I care about Debug.Assert() in C# if I test the way I have described? If yes, in which situations is it really useful? From my current point of view, such unit tests would make Debug.Assert() unnecessary.
Another point: if you really think this is a duplicate question, just post a comment.
In theory, you're right: exhaustive testing makes asserts redundant. In theory. In practice, they're still useful for debugging your tests, and for catching future developers who might try to use interfaces contrary to their intended semantics.
In short, they just serve a different purpose from unit tests. They're there to catch mistakes that by their very nature aren't going to be made when writing unit tests.
I would recommend keeping them, since they offer another level of protection from programmer mistakes.
They're also a local error protection mechanism, whereas unit tests are external to the code being tested. It's far easier to "inadvertently" disable unit tests when under pressure than it is to disable all the assertions and runtime checks in a piece of code.
I generally see asserts being used for sanity checks on internal state rather than things like argument checking.
IMO the inputs to a solid API should be guarded by checks that remain in place regardless of the build type. For example, if a public method expects an argument that is a number between 5 and 500, it should be guarded with an ArgumentOutOfRangeException. Fail fast and fail often using exceptions, as far as I'm concerned, particularly when an argument is stored somewhere and only used much later.
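A sketch of such a guard, using the 5-to-500 range from above (the Widget class is made up):

```csharp
using System;

public class Widget
{
    private readonly int _size;

    public Widget(int size)
    {
        // Unlike Debug.Assert, this check survives in Release builds:
        if (size < 5 || size > 500)
            throw new ArgumentOutOfRangeException(
                nameof(size), size, "size must be between 5 and 500");
        _size = size;
    }
}
```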
However, in places where internal, transient state is being sanity-checked (e.g. checking that some intermediate value stays within reasonable bounds during a loop), Debug.Assert seems more at home. What else are you meant to do when your algorithm has gone wrong despite being passed valid arguments? Throw an EpicFailException? :) I think this is where Debug.Assert is still useful.
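For example (a made-up sketch of what I mean by sanity-checking intermediate state): the arguments were already validated at the public boundary, and the assert only watches an invariant of the algorithm itself:

```csharp
using System.Diagnostics;

public static class MathUtil
{
    // Assumes the caller already validated 'values' at the API boundary.
    public static int SumOfPositives(int[] values)
    {
        int sum = 0;
        foreach (int v in values)
        {
            if (v > 0) sum += v;
            // Invariant: adding only positive numbers can never yield a
            // negative sum. If this fires (e.g. on integer overflow),
            // the algorithm is broken, not the caller.
            Debug.Assert(sum >= 0, "running sum overflowed");
        }
        return sum;
    }
}
```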
I'm still undecided on the best balance between the two. I've stopped using Debug.Assert so much in C# since I started unit testing, but there's still a place for it, IMO. I certainly wouldn't use it to check correctness of API usage, but sanity checking in hard-to-reach places? Sure.
The only downside is that the assertion dialog can pop up and halt NUnit, but you can write an NUnit plugin to detect asserts and fail any test that triggers one.
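Rather than a full NUnit plugin, one lighter-weight approach (a sketch of my own, assuming a .NET Framework Debug build and NUnit 3; ThrowingTraceListener is a made-up name) is to replace the DefaultTraceListener, whose Fail() shows the modal dialog, with a listener that throws, so a failed Debug.Assert simply fails the test:

```csharp
using System.Diagnostics;
using NUnit.Framework;

// Debug.Assert routes failures through the registered trace listeners,
// so a listener that throws turns an assert into a test failure.
public class ThrowingTraceListener : DefaultTraceListener
{
    public override void Fail(string message, string detailMessage)
    {
        throw new AssertionException(
            "Debug.Assert failed: " + message + " " + detailMessage);
    }
}

[SetUpFixture]
public class DebugAssertSetup
{
    [OneTimeSetUp]
    public void RedirectDebugAsserts()
    {
        Trace.Listeners.Clear(); // remove the dialog-showing default listener
        Trace.Listeners.Add(new ThrowingTraceListener());
    }
}
```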