What is a Unix Timestamp?
A Unix timestamp is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC (the "Unix epoch"), not counting leap seconds. It's the most common way to represent a point in time in computing because it's timezone-independent, a single integer, and trivially comparable.
For example, the Unix timestamp 1700000000 represents November 14, 2023, at 22:13:20 UTC.
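A quick sketch of that conversion in JavaScript (whose Date constructor takes milliseconds, so the seconds value is multiplied by 1000):

```javascript
// Convert the example timestamp above to a human-readable UTC date.
const ts = 1700000000; // seconds since the Unix epoch

const date = new Date(ts * 1000); // Date expects milliseconds
console.log(date.toISOString()); // "2023-11-14T22:13:20.000Z"
```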
Why Seconds Since 1970?
When Unix was developed in the late 1960s and early 1970s, the team chose January 1, 1970 as the epoch because it was recent, round, and easy to work with. The choice was somewhat arbitrary — other systems use different epochs (e.g., Windows uses January 1, 1601).
Milliseconds vs. Seconds
JavaScript uses milliseconds: Date.now() returns the number of milliseconds since the epoch. Most other languages and systems use seconds. Always check which unit an API expects — a common bug is passing milliseconds where seconds are expected, which multiplies the value by 1000 and produces dates tens of thousands of years in the future.
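A minimal sketch of that unit mismatch — treating a millisecond value as if it were seconds shifts the date by a factor of 1000:

```javascript
// Milliseconds vs. seconds: the same instant in both units.
const nowMs = Date.now();                 // milliseconds since epoch (JavaScript)
const nowSec = Math.floor(nowMs / 1000);  // seconds since epoch (most other systems)

// The bug: a millisecond value fed to an API that multiplies by 1000 itself.
const wrong = new Date(nowMs * 1000);
console.log(wrong.getUTCFullYear()); // a year tens of thousands of years from now

// Correct: convert to seconds first, then back to milliseconds for Date.
const right = new Date(nowSec * 1000);
console.log(right.getUTCFullYear()); // the current year
```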
The Year 2038 Problem
Traditional Unix time uses a 32-bit signed integer, whose maximum value is 2,147,483,647 (2^31 − 1). That number corresponds to January 19, 2038, at 03:14:07 UTC — after which 32-bit systems overflow. Modern 64-bit systems don't have this problem, but embedded systems and legacy code may.
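The overflow can be simulated in JavaScript with the `| 0` coercion, which truncates a number to a 32-bit signed integer just as the legacy systems described above would:

```javascript
// Simulate 32-bit signed overflow of a Unix timestamp.
const MAX_INT32 = 2147483647; // 2^31 - 1: Jan 19, 2038, 03:14:07 UTC

console.log(new Date(MAX_INT32 * 1000).toISOString());
// "2038-01-19T03:14:07.000Z" — the last representable second

const overflowed = (MAX_INT32 + 1) | 0; // wraps to -2147483648
console.log(new Date(overflowed * 1000).toISOString());
// "1901-12-13T20:45:52.000Z" — the timestamp jumps back to 1901
```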
Working with Timestamps
In JavaScript: Math.floor(Date.now() / 1000) gives the current Unix timestamp in seconds.
To convert a timestamp back to a Date: new Date(timestamp * 1000).
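The two operations above form a round trip; the only information lost is the sub-second part dropped by Math.floor:

```javascript
// Round trip: current time -> Unix timestamp in seconds -> Date object.
const nowSeconds = Math.floor(Date.now() / 1000); // current Unix timestamp
const restored = new Date(nowSeconds * 1000);     // back to a Date

console.log(nowSeconds, restored.toISOString());
```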
Always store and transmit times in UTC, then convert to local time only for display. This prevents timezone-related bugs in multi-region applications.
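A sketch of that store-UTC, display-local pattern; the "America/New_York" zone here is just an illustrative choice, not anything the application would hard-code:

```javascript
// The persisted/transmitted value is a timezone-free Unix timestamp.
const storedTs = 1700000000; // seconds since epoch, always UTC

const d = new Date(storedTs * 1000);
console.log(d.toISOString()); // UTC form, for storage, logs, and APIs

// Timezone enters only at the display step, per viewer.
console.log(d.toLocaleString("en-US", { timeZone: "America/New_York" }));
```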