I raised this question because I have seen 64-bit Windows 7 run many 32-bit programs without any problem; of course, some run with issues and some refuse to run at all.
I am not sure why some 32-bit programs run just fine on 64-bit systems. But suppose a 128-bit architecture and OS are released in the future: what can we do, in terms of programming, to make our programs run on a different bit architecture? Or is that not the programmer's job?
There are two possible questions here: what to do to allow the binary to run, and what to do to allow the source to compile and run.
There isn’t that much you can do to make the binary future-proof. Go strictly by the published API, and avoid using anything undocumented. It will run if the future system supports it, and the future system is far more likely to support the standard API than anything undocumented. This was the problem with many early Macintosh programs: instead of using the API (which was clumsy for some things early on), they used shortcuts that worked in OS 5 or whatever, and didn’t in OS 7.
This advice is mostly for C and C++, as languages like Java define things much better. Any pure Java program should run fine on any later JVM. (Yes, this has its own costs.)
- Abstract out all the architecture-dependent stuff you can. Use types like `size_t` and `ptrdiff_t` in C and C++, rather than any particular type of integer.
- When you need a type of a particular bit size, don't give it a type like `int` or `long`. Use typedefs. There's a C99 header (`<stdint.h>`) with useful typedefs for the purpose, but you can always have something like `typedef int int32_t` and change the `int` later, as needed, in one obvious place rather than scattered around the program in hard-to-find places.
- Try to encapsulate OS calls, since those could change in a future architecture. If you must do anything with an undocumented OS feature, document it very noticeably.
If your program has anything to do with networking, assume nothing about the byte order. Network byte order is unlikely to change, but your program might wind up on a chip with a different architecture (cf. the Macintosh, which has used three different architectures in its time).
In general, assume as little as you can get away with. Use types specifically designated for machine-dependent things, and use them consistently. Do everything outside the program as written in the most formal, standard, and documented way possible.