
I found the following weird behavior while working on JavaScript numbers.

var baseNum = Math.pow(2, 53);
console.log(baseNum); //prints 9007199254740992

console.log(baseNum + 1); //prints 9007199254740992 again!

console.log(baseNum + 2); //prints 9007199254740994, 2 more than +1

console.log(baseNum + 3); // prints 9007199254740996, 2 more than +2
console.log(baseNum + 4); // prints 9007199254740996, same as +3

What is happening here? I understand that JavaScript can only represent integers exactly up to 2^53 (they are internally doubles?), but why this behavior?

If 2^53 is the practical max, then why do we have Number.MAX_VALUE (1.7976931348623157e+308)?

asked Sep 26, 2012 at 5:44 by Ashwin Prabhu
  • var base, then baseNum everywhere else? – Marc B Commented Sep 26, 2012 at 5:45
  • @MarcB Copy paste demon strikes again! Corrected the error – Ashwin Prabhu Commented Sep 26, 2012 at 5:46
  • Article that goes into more detail at 2ality./2012/04/number-encoding.html. – Bill Commented Sep 26, 2012 at 5:48
  • @Bill Thanks, that explains it! – Ashwin Prabhu Commented Sep 26, 2012 at 5:52

3 Answers


The number is really a double. The mantissa has 52 bits (see the IEEE 754 double-precision format for details). 2^53 itself still fits exactly, but representing 2^53 + 1 would require a 53-bit mantissa, so its lowest bit is chopped off and the result rounds back to 2^53.

The number is stored using 3 pieces: a sign bit (fairly straightforward) and two other pieces, the mantissa M and the exponent E. For normal values the number is calculated as:

(1 + M/2^52) * 2^(E-1023)

So when the number is around 2^53, 2^(E-1023) = 2^53, and since there are only 52 bits in M, you can no longer represent the lowest (ones) bit.

The answer that @CrazyCasta gave is good.

The only thing to add is a response to your second question:

If 2^53 is the practical max, then why do we have Number.MAX_VALUE (1.7976931348623157e+308)?

As you've demonstrated, a Number can store values larger than 2^53, but with a gap between representable values larger than 1 (2^0). As the numbers grow ever larger, they lose more and more precision.

So the "max" in Number.MAX_VALUE means the largest magnitude a Number can represent; it does not mean that accuracy there is anything like accuracy near 2^1 or 2^53.

A corollary to this is that Number.MIN_VALUE is the smallest positive value that a Number can contain, not the most negative. That is, it is the closest non-zero number to zero: 5e-324 (notice that's a positive number!).
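A quick console session (values straight from the spec constants discussed above) makes the asymmetry obvious:

```javascript
// MAX_VALUE / MIN_VALUE bound magnitude, not exact-integer precision.
console.log(Number.MAX_VALUE);     // 1.7976931348623157e+308, largest finite double
console.log(Number.MIN_VALUE);     // 5e-324, smallest POSITIVE double
console.log(Number.MIN_VALUE > 0); // true — it is not the most negative value
console.log(-Number.MAX_VALUE);    // the actual most negative finite value
```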

The maximum value storable in a double is much larger than the maximum value storable with exact integer precision in a double. Floating-point numbers have a fixed maximum number of significant digits, but the magnitude of the number can get much larger.

There are a certain number of bits allocated for an exponent (power of two), and that's implicitly multiplied by the mantissa stored in the rest of the bits. Beyond a certain point, you lose precision, but you can keep incrementing the exponent to represent larger and larger magnitudes.

Tags: Strange JavaScript Number behavior (Stack Overflow)