We have noticed that many of the bugs in our C# (or Java) software cause a NullReferenceException.
Is there a reason why 'null' was even included in the language?
After all, if there were no ‘null’, I would have no bug, right?
In other words, what feature in the language couldn’t work without null?
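To make the question concrete, here is a minimal sketch of the kind of bug we keep hitting (FindCustomer and Customer are hypothetical names, not from our real code base):

    using System;

    class Customer { public string Name; }

    class Example
    {
        // Hypothetical lookup: returns null when no customer matches the id.
        static Customer FindCustomer(int id)
        {
            return null; // nothing found
        }

        static void Main()
        {
            Customer c = FindCustomer(42);
            // The compiler accepts this, but at run time it throws a
            // NullReferenceException because c is null.
            Console.WriteLine(c.Name);
        }
    }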
Anders Hejlsberg, "C# father", just spoke about that point in his Computerworld interview:
Cyrus Najmabadi, a former software design engineer on the C# team (now working at Google) discuss on that subject on his blog: (1st, 2nd, 3rd, 4th). It seems that the biggest hindrance to the adoption of non-nullable types is that notation would disturb programmers’ habits and code base. Something like 70% of references of C# programs are likely to end-up as non-nullable ones.
If you really want non-nullable reference types in C#, you should try Spec#, a C# extension that allows the use of "!" as a non-nullable marker.
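As a rough illustration of what that "!" notation looks like (an illustrative fragment with made-up names, not checked against the current Spec# release):

    class Account
    {
        string! owner;            // "!" marks the reference as non-nullable

        public Account(string! name)
        {
            owner = name;         // OK: name is statically known to be non-null
        }

        public void Rename(string newName)
        {
            // owner = newName;   // rejected: a plain string may be null
            if (newName != null)
                owner = newName;  // accepted once the null check has been made
        }
    }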