My current position is this: if I thoroughly test my ASP.NET applications using web tests (in my case via the VS.NET ’08 test tools, and possibly WatiN) with code coverage and a broad spectrum of data, I should have no need to write individual unit tests, because my code will be tested in conjunction with the UI through all layers. Code coverage will ensure I’m hitting every functional piece of code (or reveal unused code), and I can provide data that covers all reasonably expected conditions.
However, if you have a different opinion, I’d like to know:
- What additional benefit does unit testing give that justifies the effort of including it in a project? (Keep in mind, I’m doing the web tests anyway, so in many cases the unit tests would cover code the web tests already cover.)
- Can you explain your reasons in detail, with concrete examples? Too often I see responses like ‘that’s not what it’s meant for’ or ‘it promotes higher quality’, which don’t address the practical question I actually face: how can I justify, with tangible results, spending more time on testing?
"Hit" does not mean "Testing"
The problem with only doing web testing is that it only ensures you hit the code, and that it appears to be correct at a high level.
Just because you loaded the page and it didn’t crash doesn’t mean it actually works correctly. Here are some things I’ve encountered where ‘web tests’ covered 100% of the code, yet completely missed some very serious bugs that unit testing would have caught.
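To make this concrete, here is a hypothetical illustration (not from the original answer; the names and the shipping rule are invented). A page-load test that renders a cart with typical totals executes this method and reports 100% line coverage, yet it never checks the boundary where the behavior actually changes. The sketch is in Java rather than C# purely for illustration; the idea is identical in .NET.

```java
// Hypothetical example: free shipping at or above a threshold, flat fee below it.
// A web test that loads the page with totals like 20.00 or 100.00 "hits" this
// line and the page renders fine either way -- coverage says 100%.
public class ShippingCalc {
    static double shippingFor(double orderTotal) {
        // A buggy version using "> 50.0" instead of ">= 50.0" would pass every
        // casual page-load test, but silently charge customers whose order is
        // exactly 50.00. Only a test that pins the boundary notices.
        return orderTotal >= 50.0 ? 0.0 : 5.99;
    }

    public static void main(String[] args) {
        // A unit test pins the boundary explicitly:
        if (shippingFor(49.99) != 5.99) throw new AssertionError("below threshold");
        if (shippingFor(50.0) != 0.0) throw new AssertionError("exactly at threshold");
        if (shippingFor(100.0) != 0.0) throw new AssertionError("above threshold");
        System.out.println("boundary cases pass");
    }
}
```

The web test and the unit test execute the same line of code; only the unit test asserts anything about the value it produces at the edge.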
It is easy to come up with hundreds more examples of things like this.
You need both: unit tests to make sure your code actually does what it is supposed to do at a low level, and then functional/integration (what you’re calling web) tests on top of those, to prove that everything actually works when all those small unit-tested pieces are chained together.
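The two layers can be sketched like this (a minimal illustration with invented names, again in Java for convenience): each small piece gets its own unit tests, including the edge cases, and a separate check exercises the pieces chained together the way a web test would.

```java
import java.util.List;

public class OrderPipeline {
    // Small unit: parse a quantity field from a form, rejecting junk.
    static int parseQuantity(String raw) {
        int q = Integer.parseInt(raw.trim());
        if (q < 1) throw new IllegalArgumentException("quantity must be >= 1");
        return q;
    }

    // Small unit: price for one line item.
    static double lineTotal(int qty, double unitPrice) {
        return qty * unitPrice;
    }

    // The chained behavior an integration/web test exercises end-to-end.
    static double orderTotal(List<String> quantities, List<Double> prices) {
        double total = 0.0;
        for (int i = 0; i < quantities.size(); i++) {
            total += lineTotal(parseQuantity(quantities.get(i)), prices.get(i));
        }
        return total;
    }

    public static void main(String[] args) {
        // Unit level: each piece in isolation, including awkward input.
        if (parseQuantity(" 3 ") != 3) throw new AssertionError("parse");
        if (lineTotal(3, 2.5) != 7.5) throw new AssertionError("line total");
        // Integration level: the pieces chained together, as a page would use them.
        if (orderTotal(List.of("1", "2"), List.of(10.0, 5.0)) != 20.0)
            throw new AssertionError("order total");
        System.out.println("unit and integration checks pass");
    }
}
```

The unit tests tell you *which* piece broke; the integration check only tells you that *something* in the chain did, which is exactly why the two layers complement rather than duplicate each other.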