I’ve got the following code:
```cpp
#include <iostream>
#include <string>
using namespace std;

int main(int argc, char *argv[]) {
    string a = "a";
    for (unsigned int i = a.length() - 1; i + 1 >= 1; --i) {
        if (i >= a.length()) {
            cerr << (signed int)i << '?' << endl;
            return 0;
        }
    }
}
```
If I compile in MSVC with full optimizations, the output I get is `-1?`. If I compile in Debug mode (no optimizations), I get no output, as expected.
I thought the standard guaranteed that unsigned integers wrap around predictably (arithmetic modulo 2^N), so that when `i == (unsigned int)(-1)`, `i + 1 == 0`, and the loop condition `i + 1 >= 1` fails. Instead, the test is somehow passing. Is this a compiler bug, or am I invoking undefined behaviour somewhere?
I remember having this problem in 2001. I’m amazed it’s still there. Yes, this is a compiler bug.
The optimiser is seeing the loop condition

```cpp
i + 1 >= 1
```

Theoretically, it can "optimise" this by putting all of the constants on the same side:

```cpp
i >= 0
```

Because `i` is unsigned, it will always be greater than or equal to zero, so the condition folds to "always true". That transformation is only valid for signed arithmetic, where overflow is undefined; for unsigned arithmetic it is wrong, because when `i == UINT_MAX` the original `i + 1` wraps to `0` and the condition is false.
See this newsgroup discussion here.