I need to compute the equivalent of the JavaScript getTime method in C#.
For simplicity, I chose a fixed date in UTC and compared the C# result:
C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
TimeSpan t = (e - s);
var x = t.TotalMilliseconds.ToString();
=> 1325289600000
and the JavaScript results:
JavaScript
var d = new Date(2011, 12, 31, 0, 0, 0)
var utcDate = new Date(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate(), d.getUTCHours(), d.getUTCMinutes(), d.getUTCSeconds());
utcDate.getTime()
=> 1327960800000
Any hints on what I'm doing wrong?
Thanks!
asked Feb 27, 2014 by mo5470
Comments:
- 1 Why do you need UTC time to calculate the number of seconds between two dates? It doesn't matter what time zone, the answer is the same in any time zone. – EkoostikMartin Commented Feb 27, 2014 at 21:49
- Because in C# I'm reading the date from a DB which has stores dates in UTC. – mo5470 Commented Feb 27, 2014 at 21:54
- When I execute your js code I get - 1328004000000 - jsfiddle/k5Z7h/1 – EkoostikMartin Commented Feb 27, 2014 at 22:39
- 1 @EkoostikMartin The answer wouldn't be the same in any time zone, since time zones change over time and especially did in the last 40 years. – basilikum Commented Feb 27, 2014 at 22:44
3 Answers
JavaScript months are zero-based: a month of 12 means January of the next year. You want 11.
If you meant for the input to be at UTC, you should be doing this instead:
var ts = Date.UTC(2011, 11, 31, 0, 0, 0);
As SLaks pointed out, months run 0-11, but even then you must initialize the date as UTC if you want the response in UTC. In your code, you were initializing a local date and then converting it to UTC, so the result would be different depending on the time zone of the computer where the code is running. With Date.UTC, you get back a timestamp, not a Date object, and it will be the same result regardless of where it runs.
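A quick sketch of that behavior (the expected value is the one the C# snippet in the question produced):

```javascript
// Date.UTC treats its arguments as UTC components and returns a plain
// number of milliseconds since 1970-01-01T00:00:00Z - not a Date object.
const ts = Date.UTC(2011, 11, 31, 0, 0, 0);
console.log(ts);        // 1325289600000, same as the C# TotalMilliseconds
console.log(typeof ts); // "number"
```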
From Chrome's debugging console (screenshot omitted):
This is the same value returned from your .NET code, which looks just fine, except I would return a long, not a string.
The JS date is wrong, I believe. Omit the var utcDate line and output just d.getTime().
The time between two dates is the same regardless of timezone and offset. Timezones are relative to an ACTUAL point in time, so whether you call .getTime() on the UTC or EST or PST date, it will be the same relative to 1970-1-1 in the same timezone.
2011-12-31 EST - 1970-1-1 EST
== 2011-12-31 PST - 1970-1-1 PST
== 2011-12-31 UTC - 1970-1-1 UTC
EDIT: Per @SLaks above, you are also not using the 0-based month (which I also had no idea about).
Tags: Computing milliseconds since 1970 in C# yields different date than JavaScript - Stack Overflow