Java reads the locale, time zone, and encoding information (and perhaps more) from the system it is installed on.
This often brings bad surprises (it brought me one just yesterday). Say your development servers are set to the GMT+2 time zone, and then you deploy on a production server set to GMT. A two-hour shift may not be easy to observe immediately. And although you can pass a TimeZone to your calendars, APIs might be instantiating calendars (or dates) using the default time zone.
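A minimal sketch of the kind of shift described, using only the standard `java.util` classes: the same instant yields different calendar fields depending on which zone the `Calendar` happens to use.

```java
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;

public class DefaultTimeZoneDemo {
    public static void main(String[] args) {
        Date epoch = new Date(0L); // 1970-01-01T00:00:00 UTC, a zone-independent instant

        // Interpreted under GMT+2, that instant reads as 02:00 ...
        Calendar gmtPlus2 = Calendar.getInstance(TimeZone.getTimeZone("GMT+2"));
        gmtPlus2.setTime(epoch);
        System.out.println(gmtPlus2.get(Calendar.HOUR_OF_DAY)); // 2

        // ... but under GMT it reads as 00:00. Calendar.getInstance() with
        // no argument would silently pick whichever zone the server is set to.
        Calendar gmt = Calendar.getInstance(TimeZone.getTimeZone("GMT"));
        gmt.setTime(epoch);
        System.out.println(gmt.get(Calendar.HOUR_OF_DAY)); // 0
    }
}
```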
Now, I know one should be careful with these settings, but they are easy to miss, which makes programs more error-prone.
So why doesn’t Java have its own defaults – UTF-8, GMT, en_US (yes, I’m on a non-en_US locale, but having it as the default is fine)? Applications could read the system settings via some API if needed.
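The API for this already exists, in fact: an application can read the system-derived defaults and can also pin its own at startup, which gets you the predictability argued for here without waiting on the platform to change.

```java
import java.util.Locale;
import java.util.TimeZone;

public class PinDefaults {
    public static void main(String[] args) {
        // The defaults Java picked up from the host system at startup:
        System.out.println(TimeZone.getDefault().getID());
        System.out.println(Locale.getDefault());

        // Nothing stops an application from overriding them early in main(),
        // so every later Calendar/Date/format call behaves the same everywhere:
        TimeZone.setDefault(TimeZone.getTimeZone("UTC"));
        Locale.setDefault(Locale.US);

        System.out.println(TimeZone.getDefault().getID()); // UTC
    }
}
```

The same can be done without touching code via JVM system properties, e.g. `-Duser.timezone=UTC -Duser.language=en -Duser.country=US`.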
Thus programs would be more predictable.
So, what is the reason behind this decision?
This isn’t unique to Java. Many systems default to the system time zone. After all, what else can they do?
Time zones are a thorny issue, particularly when an application needs to deal with several of them. That’s why sites such as this one store everything in UTC.
As for your situation, it’s hard to comment because the description is rather vague, but it sounds like the error is yours: if you save a date (without a time zone) in one place at GMT+2 and then load it in another at GMT, you’ve done something wrong.
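One way to avoid that trap is to make the zone explicit when a date crosses a process boundary. A sketch using `SimpleDateFormat` pinned to UTC (the formatter and pattern here are illustrative, not a prescribed schema): the stored string round-trips to the same instant no matter what default zone either server has.

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class UtcRoundTrip {
    public static void main(String[] args) throws ParseException {
        SimpleDateFormat utc = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        utc.setTimeZone(TimeZone.getTimeZone("UTC")); // fixed zone, not the default

        Date original = new Date(1234567890000L);
        String stored = utc.format(original);   // what goes into the database

        // Reading it back with the same UTC formatter recovers the exact
        // instant, regardless of the default zone on the saving or loading host.
        Date loaded = utc.parse(stored);
        System.out.println(original.equals(loaded)); // true
    }
}
```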