I am trying to convert an int into three bytes representing that int (big endian).
I'm sure it has something to do with bitwise AND and bit shifting, but I have no idea how to go about it.
For example:
```java
int myInt; // some code
byte b1, b2, b3; // b1 is most significant, then b2, then b3
```
*Note: I am aware that an int is 4 bytes, so the three bytes may over/underflow.
To get the least significant byte: `b3 = (byte) (myInt & 0xFF);`

The 2nd least significant byte: `b2 = (byte) ((myInt >> 8) & 0xFF);`

And the 3rd least significant byte: `b1 = (byte) ((myInt >> 16) & 0xFF);`
Explanation:
Bitwise ANDing a value with 0xFF (11111111 in binary) keeps only the least significant 8 bits (bits 0 to 7) of that number. Shifting the number right by 8 bits moves bits 8 to 15 into bit positions 0 to 7, so ANDing the shifted value with 0xFF returns the second byte. Similarly, shifting right by 16 bits moves bits 16 to 23 into bit positions 0 to 7, so ANDing with 0xFF returns the third byte.
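Putting those steps together, here is a minimal, self-contained sketch; the class and method names (`ByteSplit`, `toThreeBytes`, `fromThreeBytes`) are my own for illustration, not from the original post:

```java
public class ByteSplit {
    // Split the low 24 bits of an int into three bytes, big-endian:
    // result[0] is most significant, result[2] is least significant.
    public static byte[] toThreeBytes(int value) {
        byte b1 = (byte) ((value >> 16) & 0xFF); // bits 16-23
        byte b2 = (byte) ((value >> 8) & 0xFF);  // bits 8-15
        byte b3 = (byte) (value & 0xFF);         // bits 0-7
        return new byte[] { b1, b2, b3 };
    }

    // Reassemble the int, masking each byte with 0xFF first to undo
    // the sign extension that happens when a byte is promoted to int.
    public static int fromThreeBytes(byte[] b) {
        return ((b[0] & 0xFF) << 16) | ((b[1] & 0xFF) << 8) | (b[2] & 0xFF);
    }

    public static void main(String[] args) {
        byte[] b = toThreeBytes(0x123456);
        System.out.printf("%02X %02X %02X%n", b[0], b[1], b[2]); // 12 34 56
        System.out.println(fromThreeBytes(b) == 0x123456);       // true
    }
}
```

Note the cast to `byte` in each step: in Java the result of `&` on an `int` is an `int`, so it must be narrowed explicitly. The round trip only works for values that fit in 24 bits, which is exactly the overflow caveat from the question.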