The seconds-vs-milliseconds gotcha
The single biggest source of timestamp bugs is mixing seconds and milliseconds. Unix timestamps are traditionally seconds; JavaScript's Date uses milliseconds. Most language standard libraries lean one way or the other, so when you cross language or system boundaries, double-check. A unit dropdown is provided here for exactly this reason.
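When the unit isn't known up front, you can usually guess it from magnitude alone, since second and millisecond timestamps near the present differ by three orders of magnitude. A minimal sketch (the `1e11` cutoff is an assumption, chosen because no plausible seconds value exceeds it before the year 5138):

```javascript
// Heuristic: guess whether a numeric timestamp is seconds or milliseconds
// by magnitude. Recent times are ~1.7e9 in seconds but ~1.7e12 in ms.
function toDate(ts) {
  // Values above ~1e11 can't plausibly be seconds, so treat them as ms;
  // everything else is treated as seconds and scaled up.
  const ms = ts > 1e11 ? ts : ts * 1000;
  return new Date(ms);
}

console.log(toDate(1700000000).toISOString());    // seconds input
console.log(toDate(1700000000000).toISOString()); // same instant, in ms
```

Both calls resolve to the same instant, which is exactly the property the unit dropdown exists to guarantee when the heuristic would be too risky.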
FAQ
- Seconds or milliseconds?
- Unix is traditionally seconds. JavaScript's Date.now() returns milliseconds. The unit dropdown only matters when input is numeric; for ISO strings, the parser figures it out.
- Why does my timestamp show the wrong year?
- Almost always because seconds got passed as ms (or vice versa). 1.7 billion seconds (recent unix time) interpreted as ms lands in January 1970; the same instant as 1.7 trillion ms interpreted as seconds lands tens of thousands of years in the future. Check the unit.
- Timezone handling?
- Unix timestamps are timezone-free (always UTC). Local time depends on the browser's timezone setting. ISO strings can include or omit a timezone offset.
- What about milliseconds-since-some-other-epoch?
- Anything not since the Unix epoch (1970-01-01 UTC) needs adjustment first. Excel's serial dates count from 1900; Windows file times count from 1601; PostgreSQL's internal epoch is 2000-01-01.
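The adjustments for those other epochs are fixed offsets. A sketch of each conversion; the offset constants below are the standard ones, but worth verifying against your data source:

```javascript
// Windows FILETIME: 100-nanosecond ticks since 1601-01-01 UTC.
// 11,644,473,600 seconds separate the 1601 and 1970 epochs.
function filetimeToUnixSeconds(filetime) {
  return filetime / 1e7 - 11644473600;
}

// Excel serial date: days since the 1900 epoch (effectively 1899-12-30,
// thanks to Excel's fictitious 1900-02-29). Serial 25569 = 1970-01-01.
function excelSerialToUnixSeconds(serial) {
  return (serial - 25569) * 86400;
}

// PostgreSQL's internal epoch is 2000-01-01 UTC, 946,684,800 s after Unix's.
function pgEpochToUnixSeconds(pgSeconds) {
  return pgSeconds + 946684800;
}
```

Each function normalizes its input to seconds since the Unix epoch, after which the usual seconds-vs-milliseconds care applies.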
Related tools
- Base64 Encode / Decode
UTF-8-safe encoding and decoding that round-trips emoji and CJK text, with URL-safe variants.
- URL Encode / Decode
URL encode / decode with three variants: encodeURIComponent, encodeURI, and form-urlencoded.
- UUID Generator (v4 / v7)
Generate UUID v4 (random) or v7 (time-ordered). Bulk regenerate up to 100 at once.
- JWT Decoder
Decode JWT header and payload, see expiration status. No signature verification.