
Why is this:

console.log("1100" ^ "0001")
=> 1101 // as expected

console.log("1100" ^ "1001")
=> 1957 // ???

Please explain. Thanks.

asked Mar 31, 2012 at 13:57 by lamu; edited Mar 31, 2012 at 14:11 by Tomalak
  • You are using the XOR operator, but it seems you actually want OR, since 1100 | 1001 = 1101 (OR), while 1100 ^ 1001 = 0101 (XOR). – Tomalak, Mar 31, 2012 at 14:17

1 Answer


Those numbers are interpreted as decimal numbers.

Try:

console.log(parseInt("1100", 2) ^ parseInt("1001", 2))

Of course the answer (0101) is printed in decimal (5).
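To see the implicit conversion at work, here is a minimal sketch (plain standard JavaScript, nothing beyond what the answer describes): the ^ operator converts each string operand to a number, and "1100" and "1001" parse as the decimal values 1100 and 1001.

console.log(Number("1100") ^ Number("1001")) // 1957, same as "1100" ^ "1001"
console.log(1100 ^ 1001)                     // 1957
console.log((1957).toString(2))              // "11110100101", the XOR of decimal 1100 and 1001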

The JavaScript token grammar supports numbers in decimal, octal, and hex, but not binary. Thus:

console.log(0xC ^ 0x9) // 5
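As a side note not in the original answer: ES2015 and later engines also accept binary literals with a 0b prefix, so in a modern browser or Node.js the same bit patterns can be written directly:

console.log(0b1100 ^ 0b1001) // 5, i.e. binary 0101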

The first one worked, by the way, only because decimal 1100 XORed with 1 happens to be decimal 1101.
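To spell that out with the question's own numbers (no assumptions beyond standard JavaScript): decimal 1100 has a 0 as its lowest bit, so XORing it with 1 merely sets that bit and produces 1101, which looks like the binary answer the asker expected; with 1001 the bits overlap and the coincidence breaks down.

console.log((1100).toString(2)) // "10001001100" -- lowest bit is 0
console.log(1100 ^ 1)           // 1101, only the lowest bit changed
console.log(1100 ^ 1001)        // 1957, many bits differ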
