What is a Unix timestamp?
A Unix timestamp (also called epoch time) is the number of seconds — or milliseconds — elapsed since January 1, 1970, 00:00:00 UTC. It's the universal language of time in software: databases store it, APIs return it, logs use it.
Seconds vs milliseconds
Most Unix timestamps are in seconds (10 digits, e.g. 1735689600), but JavaScript, Java, and many APIs use milliseconds (13 digits, e.g. 1735689600000). The converter above handles both automatically.
What is Unix Epoch Time?
Unix epoch time is a system for describing a point in time as a single number: the count of seconds that have elapsed since Thursday, January 1, 1970, at 00:00:00 Coordinated Universal Time (UTC). This reference point is called the Unix epoch, and it forms the backbone of timekeeping in virtually every modern operating system, programming language, and database.
The concept was introduced with the Unix operating system at Bell Labs in the early 1970s. The engineers needed a compact, unambiguous way to store time. A single integer, free of timezone offsets, daylight saving rules, and calendar quirks, turned out to be the most elegant solution. Decades later, that same design decision still powers the timestamps in your server logs, your database records, and the APIs you call every day.
How Epoch Time Works in Practice
Every second that passes increments the Unix timestamp by one. At the epoch itself (January 1, 1970 00:00:00 UTC), the value is 0. Negative values represent dates before 1970. For example, -86400 corresponds to December 31, 1969, 00:00:00 UTC (exactly one day before the epoch).
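You can verify both reference points directly; here is a small sketch using Python's standard library:

```python
from datetime import datetime, timezone

# The epoch itself: timestamp 0 is January 1, 1970, 00:00:00 UTC
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00

# Negative values reach back before 1970: -86400 is one day earlier
day_before = datetime.fromtimestamp(-86400, tz=timezone.utc)
print(day_before.isoformat())  # 1969-12-31T00:00:00+00:00
```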
Epoch timestamps are timezone-independent. The integer 1735689600 means the same instant everywhere in the world. To display that instant in a local timezone, software applies the appropriate UTC offset after the fact. This separation between storage and display is what makes Unix time so reliable for distributed systems where servers span multiple timezones.
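The storage-versus-display split is easy to demonstrate: one integer, rendered with three different UTC offsets, still compares as the same instant. A minimal Python sketch (using fixed offsets rather than named timezones, for simplicity):

```python
from datetime import datetime, timezone, timedelta

ts = 1735689600  # one fixed instant, identical everywhere

# The offset is applied only at display time, never in storage
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))
new_york = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=-5)))

print(utc)       # 2025-01-01 00:00:00+00:00
print(tokyo)     # 2025-01-01 09:00:00+09:00
print(new_york)  # 2024-12-31 19:00:00-05:00

# Different wall-clock readings, same moment in time
assert utc == tokyo == new_york
```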
Seconds vs. Milliseconds Timestamps
The original Unix timestamp counts in seconds and produces a 10-digit number for current dates (e.g. 1735689600). However, many modern platforms use millisecond precision, yielding a 13-digit number (e.g. 1735689600000). JavaScript's Date.now(), Java's System.currentTimeMillis(), and many REST APIs return milliseconds. Meanwhile, Unix shell commands like date +%s, Python's time.time(), and most database TIMESTAMP columns use seconds.
Converting between the two is straightforward: multiply seconds by 1,000 to get milliseconds, or integer-divide milliseconds by 1,000 to get seconds. The converter above auto-detects which format you've entered based on digit count.
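The same digit-count heuristic can be sketched in a few lines of Python (`normalize_to_seconds` is a hypothetical helper, not part of any library; a real converter might use a magnitude threshold instead):

```python
def normalize_to_seconds(ts: int) -> int:
    """Auto-detect seconds vs. milliseconds by digit count.

    Assumes modern dates: ~10 digits means seconds, 13+ means milliseconds.
    """
    if len(str(abs(ts))) >= 13:
        return ts // 1000  # milliseconds -> seconds (integer division)
    return ts

print(normalize_to_seconds(1735689600))     # 1735689600 (already seconds)
print(normalize_to_seconds(1735689600000))  # 1735689600 (was milliseconds)
print(1735689600 * 1000)                    # 1735689600000 (seconds -> ms)
```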
How to Get the Current Timestamp
Nearly every language has a one-liner for retrieving the current Unix timestamp:
- JavaScript: Math.floor(Date.now() / 1000) (seconds) or Date.now() (milliseconds)
- Python: import time; int(time.time())
- Bash: date +%s
- PHP: time()
- Java: Instant.now().getEpochSecond()
- Go: time.Now().Unix()
The Y2038 Problem
The maximum value a signed 32-bit integer can store is 2,147,483,647, which corresponds to January 19, 2038 at 03:14:07 UTC. After that second, a 32-bit counter wraps around to its minimum negative value, jumping the date back to December 13, 1901. This is known as the Y2038 problem or the "Unix Millennium Bug." Most modern operating systems and languages have already transitioned to 64-bit timestamps, which won't overflow for approximately 292 billion years, effectively solving the issue for any foreseeable future.
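You can simulate the wraparound yourself; this sketch uses Python's ctypes to force a value into a signed 32-bit integer, the same truncation an old 32-bit time_t would perform:

```python
import ctypes
from datetime import datetime, timezone

MAX_32BIT = 2_147_483_647  # last second a signed 32-bit counter can hold
print(datetime.fromtimestamp(MAX_32BIT, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, the counter wraps to its minimum negative value
wrapped = ctypes.c_int32(MAX_32BIT + 1).value
print(wrapped)  # -2147483648
print(datetime.fromtimestamp(wrapped, tz=timezone.utc))
# 1901-12-13 20:45:52+00:00
```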
Frequently Asked Questions
What is epoch time?
Epoch time (also called Unix time or POSIX time) is a system for tracking time as a running count of seconds since January 1, 1970, 00:00:00 UTC. This reference point is the Unix epoch. It provides a simple, timezone-independent way to represent any moment in time as a single integer, which is why it is the standard in programming, databases, and APIs.
Why does Unix time start at January 1, 1970?
The Unix operating system was developed at Bell Labs in the late 1960s. Its designers chose January 1, 1970 as a convenient, round date near Unix's creation. The original 32-bit signed integer gave about 136 years of range from that starting point, which was more than enough for the era. The convention stuck and became a universal standard.
What is the Y2038 problem?
On January 19, 2038 at 03:14:07 UTC, a 32-bit signed integer counting seconds since the epoch overflows. Systems still using 32-bit time will wrap to a negative value, interpreting the date as December 1901. Most modern platforms have migrated to 64-bit timestamps, which effectively eliminates the issue.
What's the difference between seconds and milliseconds timestamps?
Seconds timestamps are 10 digits for current dates and are used by most Unix tools, Python, PHP, and database columns. Milliseconds timestamps are 13 digits and are used by JavaScript, Java, and many web APIs. Multiply seconds by 1,000 to get milliseconds, or divide milliseconds by 1,000 (integer division) for seconds.
How do I get the current Unix timestamp?
JavaScript: Math.floor(Date.now() / 1000). Python: int(time.time()). Bash: date +%s. PHP: time(). Java: Instant.now().getEpochSecond(). Or just look at the live counter at the top of this page.
Is epoch time the same as Unix time?
In practice, yes. "Epoch time," "Unix time," "Unix timestamp," and "POSIX time" all refer to the same thing: seconds elapsed since January 1, 1970 UTC. Technically "epoch" just means a reference starting point and different systems could define different epochs, but in computing the term almost universally refers to the Unix epoch.