In Python, using calendar.timegm(), I get a 10-digit result for a Unix timestamp. When I put this into JavaScript’s setTime() function, it comes up with a date in 1970. It evidently needs a Unix timestamp that is 13 digits long. How can this happen? Are they both counting from the same date?
How can I use the same Unix timestamp between these two languages?
In Python:
In [60]: parseddate.utctimetuple()
Out[60]: (2009, 7, 17, 1, 21, 0, 4, 198, 0)
In [61]: calendar.timegm(parseddate.utctimetuple())
Out[61]: 1247793660
In Firebug:
>>> var d = new Date(); d.setTime(1247793660); d.toUTCString()
"Thu, 15 Jan 1970 10:36:55 GMT"
timegm() is based on Unix’s gmtime() method, which returns seconds since Jan 1, 1970.
JavaScript’s setTime() method expects milliseconds since that same date. You’ll need to multiply your seconds by 1000 to convert them to the format JavaScript expects.
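For example, reusing the 1247793660 value from above in Firebug, multiplying by 1000 before calling setTime() should give back the original UTC date:

>>> var d = new Date(); d.setTime(1247793660 * 1000); d.toUTCString()
"Fri, 17 Jul 2009 01:21:00 GMT"

Passing the milliseconds value straight to the constructor, new Date(1247793660 * 1000), works the same way.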