EDIT: Originally I had transcribed i++, not i--. The code is now as it was, and the code in the code block compiles and works.
Why, if unsigned int i; is used instead of int i; in the code snippet below, does using the function result in a segfault?
void insertion_sort_int_array(int * const Ints, unsigned int const len) {
    unsigned int pos;
    int key;
    int i;
    for (pos = 1; pos < len; ++pos) {
        key = Ints[pos];
        for (i = (pos - 1); (i >= 0) && Ints[i] > key; i--) {
            Ints[i + 1] = Ints[i];
        }
        Ints[i + 1] = key;
    }
}
The only difference between the standard insertion sort algorithm and your code is that you're incrementing i instead of decrementing. That's your problem. I bet that in the code you're actually compiling and running, you have i-- instead of i++ in the inner loop. That's why the unsigned i makes a difference: it cannot be negative, so the inner loop will never end. Did you copy the code wrong when you posted?
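(As an aside: a test like i >= 0 on an unsigned variable is always true, and compilers can flag it. A minimal sketch, assuming GCC with -Wextra, which enables -Wtype-limits:)

#include <stdio.h>

int main(void) {
    unsigned int i = 5;
    /* GCC with -Wextra warns here:
       "comparison of unsigned expression >= 0 is always true" */
    while (i >= 0) {
        printf("%u\n", i);
        if (i == 0)
            break;  /* without this explicit break, the loop never ends */
        i--;
    }
    return 0;
}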
EDIT:
Well, now that you've changed the posted code, it all makes sense, right? An unsigned i will simply wrap around to UINT_MAX when you decrement it past 0, which will cause you to access memory far outside the array bounds.
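If you want to keep the index unsigned anyway, one common way out (a sketch, not the only option) is to shift the index up by one so it never has to go below zero:

void insertion_sort_int_array(int * const Ints, unsigned int const len) {
    unsigned int pos;
    unsigned int i;
    int key;
    for (pos = 1; pos < len; ++pos) {
        key = Ints[pos];
        /* i runs from pos down to 0; the element under comparison is
           Ints[i - 1], and i > 0 is checked before that access, so the
           unsigned index never wraps around */
        for (i = pos; i > 0 && Ints[i - 1] > key; i--) {
            Ints[i] = Ints[i - 1];
        }
        Ints[i] = key;
    }
}

Testing i > 0 before touching Ints[i - 1] is what keeps this well-defined: the index stays in [0, pos] for the whole loop.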