date - Trying to understand timezones in Java
I have a simple Java program, as follows:
public static void main(String[] args) {
    Calendar cal = Calendar.getInstance();
    System.out.println(new Date(cal.getTimeInMillis()));
    System.out.println(cal.get(Calendar.ZONE_OFFSET));
    cal.set(Calendar.ZONE_OFFSET, 0);
    System.out.println(new Date(cal.getTimeInMillis()));
}
My home time zone is GMT. For the purpose of the experiment I set my computer's time zone to EDT and observed that the system clock moved back 5 hours.
When I run the program, the output is:
Sat Apr 25 10:09:23 EDT 2015
-18000000
Sat Apr 25 05:09:23 EDT 2015
The line
Sat Apr 25 10:09:23 EDT 2015
indicates the system time and time zone, as expected.
The
-18000000
indicates the zone offset in milliseconds, which is minus 5 hours, as expected.
When I set the zone offset to 0, I expect the time to read my real local time of 15:09, but instead it reads 05:09. In other words, it has taken off 5 hours instead of adding them.
Why? I'm confused!
I think what you have done is:

1. Set a calendar to the current time in the current time zone (-5 hours), i.e. 10:09.
2. Got the time in milliseconds. This returns the milliseconds of that wall-clock time converted to GMT, i.e. it adds 5 hours (15:09).
3. System.out.println(new Date(milliseconds)) interprets that instant in the current time zone (-5 hours), so it prints 10:09.
4. You change the zone offset to 0, which keeps the day and time fields unchanged (still 10:09).
5. Taking the time in milliseconds again now treats 10:09 as if it were already GMT, so it adds nothing (10:09).
6. System.out.println(new Date(timeInMillis)) again interprets that instant in the current time zone (-5 hours), so it prints 05:09.
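Here is a minimal sketch of those steps (my own, not from the question), assuming the JVM default time zone is US Eastern as in the question; the exact values printed will differ on other machines, and the class name is only for illustration.

import java.util.Calendar;
import java.util.Date;

public class ZoneOffsetSketch {
    public static void main(String[] args) {
        Calendar cal = Calendar.getInstance();

        // Steps 1-3: the calendar holds the current wall-clock fields for the
        // default zone; getTimeInMillis() converts them to UTC milliseconds,
        // and new Date(...).toString() shifts them back into the default zone.
        long before = cal.getTimeInMillis();
        System.out.println(new Date(before));   // e.g. Sat Apr 25 10:09:23 EDT 2015

        // Steps 4-5: declare that the same wall-clock fields (10:09) belong to a
        // zone with offset 0. The fields do not change, but the UTC milliseconds
        // they map to do: 10:09 read as GMT is an earlier instant than 10:09 read
        // with a negative offset.
        cal.set(Calendar.ZONE_OFFSET, 0);
        long after = cal.getTimeInMillis();

        // Step 6: printing shifts the new, earlier instant into the default zone.
        System.out.println(new Date(after));    // e.g. Sat Apr 25 05:09:23 EDT 2015

        // The two millisecond values differ by exactly the offset that was zeroed out.
        System.out.println("shift in hours: " + (after - before) / 3600000);
    }
}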
The key piece of information here is that a Date is GMT (UTC) internally; the time zone is only applied when you format it or call toString(), which is what the println(...) call does.
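To make that point concrete, here is a small sketch (my own, not part of the original answer): the same Date instant is formatted in two zones, and only the rendering changes, never the stored millisecond value.

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class DateRenderingSketch {
    public static void main(String[] args) {
        Date now = new Date();   // wraps a UTC millisecond count

        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss z");

        // Same instant, two different renderings.
        fmt.setTimeZone(TimeZone.getTimeZone("GMT"));
        System.out.println("GMT view:     " + fmt.format(now));

        fmt.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        System.out.println("Eastern view: " + fmt.format(now));

        // toString() uses the JVM default time zone, which is why the question's
        // println(new Date(...)) lines always showed EDT.
        System.out.println("toString():   " + now);
    }
}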
Calendar.getTimeInMillis() returns the number of milliseconds since the start of 01/01/1970 UTC (the epoch).
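As a quick check of that claim, here is a sketch (mine, assuming nothing adjusts the system clock between the two calls): the value from a freshly created Calendar matches System.currentTimeMillis() to within a few milliseconds regardless of the default time zone, and millisecond 0 is the epoch itself.

import java.util.Calendar;
import java.util.Date;

public class EpochSketch {
    public static void main(String[] args) {
        long fromCalendar = Calendar.getInstance().getTimeInMillis();
        long fromSystem = System.currentTimeMillis();

        // Both count milliseconds since 01/01/1970 00:00:00 UTC, so the
        // difference is only the few milliseconds between the two calls.
        System.out.println("difference in ms: " + (fromSystem - fromCalendar));

        // Millisecond 0 is the epoch; toString() shifts it into the default
        // zone (e.g. Wed Dec 31 19:00:00 EST 1969 on a US Eastern machine).
        System.out.println(new Date(0));
    }
}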