2038 is the next big thing to hit older *nix-based OSes. It will be Y2K all over again.
Maybe on my 32-bit ARM server with an ancient kernel it will. Any 64-bit machine is immune.
…unless it’s running software that uses signed 32-bit timestamps, or stores data using that format.
The point about the “millennium bug” was that it was a category of problems that required (hundreds of) thousands of fixes. It didn’t matter if your OS was immune, because the OS isn’t where the value is.
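A minimal sketch of what that rollover looks like, assuming a system (or stored data format) where the timestamp is a signed 32-bit integer; on a 64-bit host the values are simply widened for printing:

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* The last second a signed 32-bit timestamp can represent:
       2^31 - 1 seconds after the 1970 epoch. */
    time_t last = (time_t)INT32_MAX;
    printf("last 32-bit second: %s", asctime(gmtime(&last)));
    /* -> Tue Jan 19 03:14:07 2038 */

    /* One tick later a wrapping 32-bit counter lands on INT32_MIN,
       so naive code suddenly sees a date in 1901. */
    time_t wrapped = (time_t)INT32_MIN;
    printf("one second later:   %s", asctime(gmtime(&wrapped)));
    /* -> Fri Dec 13 20:45:52 1901 */
    return 0;
}
```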
…timestamp is signed? Why?
Edit: Oh damn, I never noticed that the timestamp is indeed signed. For anyone curious, it is mostly historical, as early C didn’t really have a concept of unsigned types.
It also allows users to store dates back to ~1902.
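That follows from the sign bit splitting the range evenly around the 1970 epoch: 2^31 seconds is roughly 68 years in either direction, which is where both the 2038 and the ~1902 figures come from. A quick back-of-the-envelope check:

```c
#include <stdio.h>

int main(void) {
    /* 2^31 seconds expressed in average Gregorian years:
       how far a signed 32-bit timestamp reaches either side of 1970. */
    double seconds_per_year = 365.2425 * 24.0 * 60.0 * 60.0;
    double reach = 2147483648.0 / seconds_per_year;
    printf("+/- %.2f years from 1970\n", reach);
    /* ~68.05 -> late 1901 on one side, early 2038 on the other */
    return 0;
}
```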
It’ll be 911 times 1000.
It’ll be 911,000? As long as it’s stored with 32 bits that should be fine 🤷
I agree. We’ve been able to do 6-digit math for decades now.