timestamp · unix · javascript · epoch

What is a Unix Timestamp and the Year 2038 Problem

Understand what Unix time (epoch time) is, how to convert it in JavaScript, and why 32-bit systems will have a problem on January 19, 2038.

May 4, 2026 · 6 min read

The Unix timestamp — also called epoch time or POSIX time — is the most universal way to represent a moment in time in computer systems. It's an integer that expresses how many seconds have elapsed since January 1, 1970 at 00:00:00 UTC.

That simple. That powerful.

Why Unix timestamp exists

Before computer systems had a standard, each platform stored dates its own way: some as strings like "1970-01-01", others as days since some arbitrary epoch, others with implicit time zones that caused chaos when systems communicated with each other.

Unix, developed at Bell Labs in the 1960s and 70s, needed a compact, precise, unambiguous representation. The solution was to choose a universal reference point — January 1, 1970 at midnight UTC, known as the Unix epoch — and count seconds from that moment.

Today that standard is universal. Linux, macOS, Windows, iOS, Android, JavaScript, Python, Go, Rust and virtually every language and operating system use the same base.

Working with timestamps in JavaScript

JavaScript uses milliseconds (not seconds) since the Unix epoch. Keep this in mind when interacting with external APIs that typically use seconds.

Get the current timestamp

// In milliseconds (JavaScript native)
Date.now();                     // 1746316800000

// In seconds (as most external APIs expect)
Math.floor(Date.now() / 1000);  // 1746316800

Convert timestamp to readable date

const ts = 1746316800; // seconds
const date = new Date(ts * 1000); // × 1000 because JS uses milliseconds

date.toISOString();        // "2025-05-04T00:00:00.000Z"
date.toLocaleDateString(); // depends on system locale
date.toUTCString();        // "Sun, 04 May 2025 00:00:00 GMT"

Convert date to timestamp

const date = new Date("2025-05-04T00:00:00Z");
const timestamp = Math.floor(date.getTime() / 1000); // 1746316800

Calculate differences between dates

const start = 1746316800;
const end   = 1746403200;
const diffSec  = end - start;       // 86400 seconds
const diffHour = diffSec / 3600;    // 24 hours
const diffDays = diffSec / 86400;   // 1 day

Timestamps make date arithmetic trivial — something that requires complex parsing and generates time zone errors when working with strings.

Related formats

ISO 8601 is the international standard for representing dates as text: 2025-05-04T00:00:00Z. The Z indicates UTC. JavaScript generates it with date.toISOString(). It's the recommended format for REST APIs.
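A quick round trip, reusing the May 4, 2025 timestamp from the earlier examples:

// Generate ISO 8601 from a timestamp, then parse it back
const iso = new Date(1746316800 * 1000).toISOString(); // "2025-05-04T00:00:00.000Z"
const backToSeconds = Math.floor(Date.parse(iso) / 1000); // 1746316800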

RFC 2822 is the format used in email headers (HTTP headers use the nearly identical RFC 1123 variant): Sun, 04 May 2025 00:00:00 GMT. It's more human-readable but less suited to automated parsing; JavaScript produces it with date.toUTCString().

Relative time: instead of an absolute date, it indicates how long ago something happened: "3 hours ago", "yesterday", "last Tuesday". The starting point is the difference in seconds, Math.floor(Date.now() / 1000) - pastTimestamp, which is then bucketed into minutes, hours or days (see the sketch below).
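A minimal sketch of that bucketing; the timeAgo helper name and its thresholds are illustrative, not a standard API:

// Rough "time ago" helper; thresholds and wording are illustrative
function timeAgo(pastTimestamp) {
  const diffSec = Math.floor(Date.now() / 1000) - pastTimestamp;
  if (diffSec < 60) return `${diffSec} seconds ago`;
  if (diffSec < 3600) return `${Math.floor(diffSec / 60)} minutes ago`;
  if (diffSec < 86400) return `${Math.floor(diffSec / 3600)} hours ago`;
  return `${Math.floor(diffSec / 86400)} days ago`;
}

timeAgo(Math.floor(Date.now() / 1000) - 10800); // "3 hours ago"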

The Year 2038 Problem

Here's the part that keeps legacy systems engineers awake at night.

32-bit systems store the Unix timestamp as a signed 32-bit integer (int32_t). The maximum representable value is 2,147,483,647.

That number in seconds since 1970 corresponds exactly to January 19, 2038 at 03:14:07 UTC.

One second later, the counter overflows and wraps to the lowest negative value (-2,147,483,648), which corresponds to December 13, 1901 at 20:45:52 UTC. Systems relying on that variable will see the clock jump back roughly 136 years.
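You can check both moments from JavaScript, whose Date object isn't limited to 32 bits (the wraparound is only simulated here; JavaScript itself never overflows like this):

// Last second representable in a signed 32-bit time_t
new Date(2147483647 * 1000).toISOString();  // "2038-01-19T03:14:07.000Z"

// Where a wrapped-around 32-bit counter would point
new Date(-2147483648 * 1000).toISOString(); // "1901-12-13T20:45:52.000Z"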

Which systems are actually at risk?

64-bit systems have no problem: an int64_t can represent timestamps up to the year 292,277,026,596. More than enough.

The risk lies in:

  • Embedded systems with 32-bit processors — old routers, industrial machinery, SCADA systems, medical equipment
  • 32-bit Linux kernels — though patches exist (Linux kernel 5.6 resolved this for 32-bit architectures)
  • Legacy databases with timestamp columns stored as 32-bit INT
  • Old C/C++ code using time_t defined as int32_t in some compilers

JavaScript has no 2038 problem. It stores its millisecond timestamps as float64 (IEEE 754 double precision), and the Date object accepts values up to ±8,640,000,000,000,000 milliseconds from the epoch, which is about ±100,000,000 days and reaches the year 275,760.
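A quick check of that limit:

// Maximum instant JavaScript's Date can represent
new Date(8640000000000000).toISOString(); // "+275760-09-13T00:00:00.000Z"

// One millisecond past the limit is invalid
new Date(8640000000000001);               // Invalid Date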

Comparison with Y2K

The Year 2000 bug (Y2K) affected systems that stored the year with only 2 digits. The fix was urgent because year 2000 arrived for everyone simultaneously.

The 2038 bug is different: it affects specific systems, there's extensive documentation on how to migrate, and there's still over a decade of lead time. But complacency is the biggest risk — systems that have been running for decades without maintenance are the most vulnerable.

Timestamps and time zones

The Unix timestamp is always UTC. It has no implicit time zone. 1746316800 always means the same instant in time, whether you read it in Madrid or Tokyo.

Confusion arises when converting to readable dates. new Date(1746316800000).toLocaleString() in Spain (UTC+2 in summer) shows 02:00 on May 4, while the same code on a UTC server shows midnight; with a timestamp taken shortly before midnight UTC, the two machines wouldn't even agree on the calendar date.

Golden rule: always store in UTC (timestamp or ISO 8601 with Z). Convert to local time only when displaying to the end user.
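A small sketch of that rule: the stored value stays in UTC and only the formatting call picks a zone (the "Europe/Madrid" and "Asia/Tokyo" zones are just examples):

// Store: always UTC (seconds or ISO 8601 with Z)
const stored = 1746316800; // 2025-05-04T00:00:00Z

// Display: convert to the user's time zone only at render time
new Date(stored * 1000).toLocaleString("es-ES", { timeZone: "Europe/Madrid" });
// 02:00 on May 4, 2025 (UTC+2 in summer)

new Date(stored * 1000).toLocaleString("ja-JP", { timeZone: "Asia/Tokyo" });
// 09:00 on May 4, 2025 (UTC+9)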

Try it without code

If you need to convert a Unix timestamp to a readable date, get the current timestamp, calculate date differences or explore ISO 8601 and UTC formats, our online timestamp converter does it instantly with full support for all formats.

