I’m coming from the .NET camp, where virtualization was much more prevalent due to the need to run on server software and deal with system-wide entities such as the GAC.
Now that I’m doing Java development, does it make sense to continue employing virtualization? We were using Virtual PC, which, IMO, wasn’t the greatest offering. If we were to move forward, we would, hopefully, be using VMware.
We are doing web development and wouldn’t use virtualization to test different flavors of server deployment.
Pros:
- Allows development environments to be identical across the team
- Allows for isolation from the host
- Cross-platform/browser testing
Cons:
- Multiple-monitor support is lacking (not in VMware?)
- Performance degradation – mostly I/O
- Huge virtual disks
One possible advantage is that you could test the same program on a variety of operating systems and a variety of JVMs.
Contrary to popular opinion, Java is not 100% portable, and it is very possible to write non-portable code. In addition, there can be subtle differences between library versions, and if you work with different JVMs, there could be behavioral differences between those as well.
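A classic source of non-portable Java code is hard-coded line endings or path separators. A minimal sketch of the problem and the portable fix (class name is made up for illustration):

```java
public class PortabilityPitfall {
    public static void main(String[] args) {
        // A text file written on Windows uses \r\n line endings.
        String windowsText = "first\r\nsecond";

        // Non-portable: splitting on a hard-coded "\n" leaves a stray \r
        // on each token when the input came from Windows.
        String[] naive = windowsText.split("\n");
        System.out.println(naive[0].endsWith("\r")); // true: token is corrupted

        // Portable: the \R linebreak matcher (Java 8+) handles
        // \n, \r\n, and \r uniformly.
        String[] portable = windowsText.split("\\R");
        System.out.println(portable[0]); // "first", no stray \r
    }
}
```

Running the same suite on a Windows guest and a Linux guest is exactly the kind of case where the naive version passes on one OS and fails on the other.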
However, since Java IDEs are heavyweight, running an IDE within the VM may not be fun.
Java does support some forms of remote deployment; it might be beneficial to explore those while still doing the IDE work locally.
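One common setup along these lines is remote debugging via JDWP: the app runs inside the VM, while the IDE stays on the host. A sketch, assuming the app is packaged as a jar (the jar name, host, and port here are placeholders):

```shell
# On the VM: start the app with a JDWP debug agent listening on port 5005.
java -agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005 \
     -jar myapp.jar

# On the host: point the IDE's remote-debug configuration at vm-host:5005
# (e.g. Eclipse "Remote Java Application" or IntelliJ "Remote JVM Debug").
```

This keeps the heavyweight IDE out of the VM while still exercising the code in the virtualized environment.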