Is there a real reason to use dynamic linking and binary distributions these days?
For binary distributions there's an alternative: distribute everything as source code and let the platform choose whether or not to compile binaries. But whether that is usable depends on how well today's computers can compile from source.
Dynamic linking belongs to the question, since it allows distributing libraries as binaries as well.
So, how well can a compiler perform? With optimizations or without? What can be done to get better performance out of a compiler?
The Gentoo Linux distribution does just that. We're close to the point where it becomes cheaper to distribute source instead of binaries. Currently, there are a couple of issues which need to be solved.

make allows compiling code in parallel, but it's still slower than copying a file from the installation media to disk. A lot slower. OSS solves issue #3. Gentoo solves #4. Right now, we're just stuck at #2, really. Today's CPUs are just too slow to build something like games or MS Office from source code.