Computing milliseconds since 1970 in C# yields different date than JavaScript

I need to compute the result of the JavaScript getTime method in C#.

For simplicity, I chose a fixed date in UTC and compared the C# result:

C#
DateTime e = new DateTime(2011, 12, 31, 0, 0, 0, DateTimeKind.Utc);
DateTime s = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
TimeSpan t = (e - s);
var x = t.TotalMilliseconds.ToString();
=> 1325289600000

and the JavaScript results:

JavaScript
var d = new Date(2011, 12, 31, 0, 0, 0)
var utcDate = new Date(d.getUTCFullYear(), d.getUTCMonth(), d.getUTCDate(), d.getUTCHours(), d.getUTCMinutes(), d.getUTCSeconds());
utcDate.getTime()
=> 1327960800000

Any hints on what I'm doing wrong?

Thanks!


Asked Feb 27, 2014 at 21:43 by mo5470 (edited Feb 27, 2014 at 21:51). Comments:
  • 1 Why do you need UTC time to calculate the number of seconds between two dates? It doesn't matter what time zone, the answer is the same in any time zone. – EkoostikMartin Commented Feb 27, 2014 at 21:49
  • Because in C# I'm reading the date from a DB which stores dates in UTC. – mo5470 Commented Feb 27, 2014 at 21:54
  • When I execute your js code I get 1328004000000: jsfiddle/k5Z7h/1 – EkoostikMartin Commented Feb 27, 2014 at 22:39
  • 1 @EkoostikMartin The answer wouldn't be the same in any time zone, since time zones change over time and especially did in the last 40 years. – basilikum Commented Feb 27, 2014 at 22:44

3 Answers


JavaScript months are zero-based.
12 means January of the next year.

You want 11.
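
For illustration, a minimal sketch of the rollover (variable names are mine; the commented values are what any conformant engine returns):

JavaScript
// Month 12 rolls over into January of the following year.
var wrong = new Date(2011, 12, 31); // Jan 31, 2012, local time
var right = new Date(2011, 11, 31); // Dec 31, 2011, local time
wrong.getFullYear(); // => 2012
wrong.getMonth();    // => 0  (January)
right.getFullYear(); // => 2011
right.getMonth();    // => 11 (December)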

If you meant for the input to be at UTC, you should be doing this instead:

var ts = Date.UTC(2011,11,31,0,0,0);

As SLaks pointed out, months run 0-11, but even then you must initialize the date as UTC if you want the response in UTC. In your code, you were initializing a local date and then converting it to UTC, so the result would differ depending on the time zone of the computer where the code is running. With Date.UTC, you get back a timestamp, not a Date object, and it will be the same result regardless of where it runs.

From Chrome's debugging console:
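
(The screenshot isn't reproduced here; given the value referenced below, the session presumably amounted to:)

JavaScript
Date.UTC(2011, 11, 31, 0, 0, 0);
// => 1325289600000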

This is the same value returned from your .NET code, which looks just fine, except that I would return a long, not a string.

The JS date is wrong, I believe. Omit the var utcDate line and output just d.getTime().

The time between two dates is the same, regardless of timezone and offset. Timezones are relative to an ACTUAL point in time, so whether you call .getTime() on the UTC or EST or PST date, it will be the same relative to 1970-1-1 of the same timezone.

2011-12-31 EST - 1970-1-1 EST 
    == 2011-12-31 PST - 1970-1-1 PST 
    == 2011-12-31 UTC - 1970-1-1 UTC
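
A minimal sketch of that "actual point in time" idea (variable names are mine; note the caveat in the comments above that historical offset changes can still affect locally-constructed dates):

JavaScript
// getTime() counts milliseconds from 1970-01-01T00:00:00Z regardless of
// the local zone, so a given instant yields the same number everywhere.
var utcMs = Date.UTC(2011, 11, 31, 0, 0, 0); // 1325289600000 in every zone
var d = new Date(utcMs);                     // same instant; only its local rendering differs
d.getTime() === utcMs;                       // => true in every time zone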

EDIT: Per @SLaks above, you are also not using the 0-based month (which I also had no idea about).
