This seems like a bug to me…
I accept that automatic properties, defined as such:
public decimal? Total { get; set; }
Will be null when they are first accessed. They haven’t been initialized, so of course they are null.
But, even after setting their value through +=, this decimal? still remains null. So after:
Total += 8;
Total is still null. How can this be correct? I understand that it’s computing (null + 8), but it seems strange that it doesn’t work out that the result should just be 8…
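A minimal console sketch of the behavior being described (the property from the question reduced to a local variable):

```csharp
using System;

class Program
{
    static void Main()
    {
        decimal? total = null;   // same default as an uninitialized auto-property

        total += 8;              // lifted operator: null + 8 yields null
        Console.WriteLine(total.HasValue);   // False – still null

        total = 0;               // initialize first...
        total += 8;              // ...and now += behaves as expected
        Console.WriteLine(total);            // 8
    }
}
```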
Addendums:
I made the “null + 8” point in my question – but notice that it works with strings. So it does null + “hello” just fine, and returns “hello”. Therefore, behind the scenes, it is initializing the string to a string object with the value “hello”. The behavior should be the same for the other types, IMO. It might be because a string can accept null as a value, but still, a null string is not an initialized object, correct?
Perhaps it’s just because a string isn’t a nullable…
Think of null as “unknown value”. If you have an unknown quantity of something and you add 8 more, how many do you have now? Answer: unknown.
Operations on Nullable Variables
There are cases where operations on unknown values give you knowable results.
The following operations have knowable results even though they involve an unknown value: ANDing with false is false no matter what the unknown operand is, and ORing with true is true no matter what. See the pattern? The result is knowable only when it doesn’t depend on the unknown operand – and null + 8 depends on it entirely, so the result stays unknown (null).
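C#’s lifted bool? operators follow exactly this three-valued logic; a minimal sketch (variable names are my own):

```csharp
using System;

class NullableLogic
{
    static void Main()
    {
        bool? unknown = null;

        // AND with false is false regardless of the unknown operand.
        Console.WriteLine(unknown & false);          // False

        // OR with true is true regardless of the unknown operand.
        Console.WriteLine(unknown | true);           // True

        // But AND with true depends on the unknown operand,
        // so the result stays null.
        Console.WriteLine((unknown & true) == null); // True
    }
}
```

(Note that only the non-short-circuiting & and | are lifted for bool?; && and || are not defined for it.)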
Of course, if you want Total to be treated as 0 when it is null, you can use the null-coalescing operator and write something like this:
Total = (Total ?? 0) + 8;
That will use the value of Total in your equation unless it is null, in which case it will use the value 0.
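A runnable sketch of that fix, assuming the same Total property from the question (the Order class is mine, for illustration):

```csharp
using System;

class Order
{
    public decimal? Total { get; set; }   // starts out null

    static void Main()
    {
        var order = new Order();

        // Coalesce to 0 before adding, so null means "no total yet".
        order.Total = (order.Total ?? 0) + 8;
        Console.WriteLine(order.Total);   // 8

        order.Total = (order.Total ?? 0) + 2;
        Console.WriteLine(order.Total);   // 10
    }
}
```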