I wrote my own date calculation functions. And during that, I had an aha moment to treat March 1 as the beginning of the year during internal calculations. I thought it was a stroke of genius. It turns out this article says that’s the traditional way.
A write-up of a new Gregorian date conversion algorithm.
It achieves a 30–40% speed improvement on x86-64 and ARM64 (Apple M4 Pro) by reversing the direction of the year count and reducing the operation count (4 multiplications instead of the usual 7+).
Paper-style explanation, benchmarks on multiple architectures, and full open-source C++ implementation.
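The "March 1 as start of year" trick mentioned above is the classic shifted-year approach (popularized by Howard Hinnant's civil-calendar algorithms, not the article's new method): moving the leap day to the end of the internal year makes month lengths follow a simple arithmetic pattern. A minimal sketch:

```cpp
#include <cstdint>

// Classic shifted-year date-to-epoch-days conversion (Hinnant-style),
// shown for illustration; this is NOT the article's new algorithm.
// Treating March 1 as the internal new year pushes Feb 29 to the very
// end, so day-of-year becomes a closed-form expression in the month.
constexpr int64_t days_from_civil(int64_t y, unsigned m, unsigned d) {
    y -= m <= 2;  // Jan/Feb belong to the previous shifted year
    const int64_t era = (y >= 0 ? y : y - 399) / 400;       // 400-year cycles
    const unsigned yoe = static_cast<unsigned>(y - era * 400);           // [0, 399]
    const unsigned doy = (153 * (m + (m > 2 ? -3 : 9)) + 2) / 5 + d - 1; // [0, 365]
    const unsigned doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;          // [0, 146096]
    return era * 146097 + static_cast<int64_t>(doe) - 719468; // days since 1970-01-01
}
```

For example, `days_from_civil(1970, 1, 1)` yields 0, and leap days fall out naturally because Feb 29 is simply the last day of the shifted year.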
Very nice writeup!
> Years are calculated backwards
How did that insight come about?
Nice to see that there are still some jewels left to be dug out of algorithm land.
Nice to see the micro-optimising folks are still making progress on really foundational pieces of the programming stack
Yes, and some are sharing vibe-coded slop.
> The algorithm provides accurate results over a period of ±1.89 Trillion years
I'm placing my bets that in a few thousand years we'll have changed calendar systems entirely, haha.
But really, it's interesting to see the insane methods used to achieve this.