I got to thinking about the structure of storage on hard drives: bytes, kilobytes, megabytes, gigabytes, terabytes, and so on.
How does that compare to time? Specifically, what if we treated seconds the way we treat bytes? How long is a kilosecond, a megasecond, a gigasecond…? You get the idea. Each jump is a multiplication by 1024 (not one thousand).
My rough calculations:
One kilosecond (1,024 seconds) works out to 17 minutes and 4 seconds (17:04)
One megasecond (1,048,576 seconds) is 12 days, 3 hours, 16 minutes, and 16 seconds (12:03:16:16)
(In the calculations below I’m figuring that a year is roughly 365.25 days. Your mileage may vary.)
One gigasecond (1,073,741,824 seconds) is 34 years, 9 days, 1 hour, 37 minutes and 4 seconds (34:09:01:37:04).
And one terasecond (1,099,511,627,776 seconds) is 34,841 years, 153 days, 18 hours, 36 minutes and 16 seconds (more or less).
I leave it to you to figure out exactly how long a petasecond is, but it’s going to be somewhere close to 35,677,615 years.
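If you want to check these figures (or work out the petasecond for yourself), the whole exercise is just repeated division with remainders. Here’s a minimal sketch in Python; the function name `breakdown` and the 365.25-day year are my assumptions, matching the convention above:

```python
# A 365.25-day year, as assumed above: 31,557,600 seconds.
YEAR = int(365.25 * 24 * 3600)

def breakdown(seconds):
    """Split a second count into (years, days, hours, minutes, seconds)."""
    years, rem = divmod(seconds, YEAR)
    days, rem = divmod(rem, 86_400)   # seconds per day
    hours, rem = divmod(rem, 3_600)   # seconds per hour
    minutes, secs = divmod(rem, 60)
    return years, days, hours, minutes, secs

# Each prefix is another factor of 1024.
for power, name in [(1, "kilosecond"), (2, "megasecond"),
                    (3, "gigasecond"), (4, "terasecond"),
                    (5, "petasecond")]:
    print(f"{name}: {breakdown(1024 ** power)}")
```

Running it reproduces the kilosecond’s 17:04 and lets you scale up as far as you like.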
Well, those figures don’t jibe with what they’ve calculated here: Time Measurement but I think they are using 1000, not 1024, as the multiplier.
Which seems like cheating.
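The gap between the two conventions isn’t constant, either: each step up compounds another factor of 1.024, so the “binary” prefixes drift further from the round-number ones the higher you go. A quick sketch of that drift:

```python
# How far the 1024-based prefixes drift from the 1000-based (SI) ones.
# Each step multiplies the gap by another 1.024.
for n, name in enumerate(["kilo", "mega", "giga", "tera", "peta"], start=1):
    binary, si = 1024 ** n, 1000 ** n
    print(f"{name}second: {binary / si:.4f} x the SI value")
```

By the petasecond, the 1024-based figure is already about 12.6% larger than the SI one.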