I want to write my own version of ctime(), which converts a timestamp to a human-readable format. I found the following algorithm, which is used in several projects (such as .c#L230 and .c#L75), but nobody has really documented how it works:
//Convert Unix time to date
a = (uint32_t) ((4 * t + 102032) / 146097 + 15);
b = (uint32_t) (t + 2442113 + a - (a / 4));
c = (20 * b - 2442) / 7305;
d = b - 365 * c - (c / 4);
e = d * 1000 / 30601;
f = d - e * 30 - e * 601 / 1000;

//January and February are counted as months 13 and 14 of the previous year
if(e <= 13)
{
    c -= 4716;
    e -= 1;
}
else
{
    c -= 4715;
    e -= 13;
}

//Retrieve year, month and day
date->year = c;
date->month = e;
date->day = f;
In some places there is even a comment like "I don't know how this works but everyone uses it". I have found out that this algorithm appears to be based on one that converts a "Julian Date" to a "Gregorian Date". The interesting part seems to be the calculation of a and b.
Can someone break down what this algorithm does and which constants need to be adapted so it uses "days since 1.1.2025" as new base?
Tags: algorithm · Changing base of unix timestamp calculation to 1.1.2025 · Stack Overflow