
Python version | Javascript version | Whitepaper

So, I'm working on a website to calculate Glicko ratings for two-player games. It involves a lot of floating point arithmetic (square roots, exponents, division, all the nasty stuff) and for some reason I am getting a completely different answer from the Python implementation of the algorithm, which I translated line-for-line. The Python version gives basically the expected answer for the example in the original whitepaper describing the algorithm, but the Javascript version is quite a bit off.

Have I made an error in translation or is Javascript's floating point math just less accurate?

Expected answer: [1464, 151.4]
Python answer: [1462, 155.5]
Javascript answer: [1470.8, 89.7]

So the rating calculation isn't THAT bad, being 99.6% accurate, but the variance is off by 2/3!

Edit: People have pointed out that the default value of RD in the Pyglicko version is 200. This is a case of the original implementer leaving in test code I believe, as the test case is done on a person with an RD of 200, but clearly default is supposed to be 350. I did, however, specify 200 in my test case in Javascript, so that is not the issue here.

Edit: Changed the algorithm to use map/reduce. Rating is less accurate, variance is more accurate, both for no discernible reason. Wallbanging commences.


Asked Aug 7, 2011 at 17:56 by Austin Yun; edited Aug 9, 2011 at 5:23 by agf.
  • 1 This is likely because Python and JavaScript don't handle floating point numbers the same way. Do you know how floating point numbers work? (and when they don't) – Halcyon Commented Aug 7, 2011 at 18:00
  • 2 Don't know if this is of any significance, but rd defaults to 200 in python and 350 in javascript. @Frits: both use IEEE 754. – Daniel Baulig Commented Aug 7, 2011 at 18:01
  • Yes, I understand how floating point numbers work and that, for example, they can't exactly represent decimal numbers. @Daniel Baulig: RD defaults to 200 in Python, which is incorrect, but used for the test case. In the whitepaper, it specifies that RD defaults to 350, but the test case given is on a player with an RD of 200 -- so I set the proper default in Javascript but called a new player in the test case with an RD of 200. – Austin Yun Commented Aug 8, 2011 at 12:40
  • Is there a number type in JavaScript like Python's rational? Read the section on floating point arithmetic in TAoCP Volume 2 (Section 4.2.2). It gives strategies for tending to cancel out errors instead of magnifying them. – agf Commented Aug 9, 2011 at 5:28
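As Daniel Baulig's comment notes, both languages use IEEE 754 doubles, so a single arithmetic operation rounds identically in both; any divergence must come from the code, not the number type. A quick check you can run in either language:

```python
# Python floats, like JavaScript numbers, are IEEE 754 binary64 doubles,
# so a single operation rounds identically in both languages:
print(0.1 + 0.2)          # 0.30000000000000004 in both
print(0.1 + 0.2 == 0.3)   # False in both
```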

2 Answers


Typically you get errors like this where you are subtracting two similar numbers: the normally insignificant differences between values become amplified. For example, if you have two values that are 1.2345 and 1.2346 in Python, but 1.2344 and 1.2347 in JavaScript, then the differences are 1e-4 and 3e-4 respectively (i.e. one is 3x the other).
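This cancellation effect is easy to reproduce with the exact numbers above (a hypothetical illustration, not values from the Glicko code):

```python
# Two quantities rounded slightly differently in two environments:
a1, b1 = 1.2345, 1.2346   # as rounded in one implementation
a2, b2 = 1.2344, 1.2347   # the same quantities rounded the other way
d1 = b1 - a1              # ~1e-4
d2 = b2 - a2              # ~3e-4
# Inputs agree to 4 significant figures, yet the differences
# disagree by a factor of ~3 -- subtraction amplified the noise:
print(d2 / d1)
```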

So I would look at where you have subtractions in your code and check those values. You may find that you can either (1) rewrite the math to avoid the subtraction (often it turns out you can find an expression that calculates the difference in some other way), or (2) focus on why the values at that particular point differ between the two languages (perhaps the difference in pi that the other answer identified is being amplified in this way).
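Option (1) is often just algebra. A classic example (generic, not from the Glicko code) is computing sqrt(x+1) - sqrt(x) for large x, where multiplying by the conjugate removes the subtraction entirely:

```python
import math

x = 1e16
# Subtracting two nearly equal square roots loses almost all precision:
naive = math.sqrt(x + 1) - math.sqrt(x)
# Algebraically identical form with no subtraction; the true value
# is very close to 5e-9, which this form recovers accurately:
stable = 1.0 / (math.sqrt(x + 1) + math.sqrt(x))
print(naive, stable)
```

Same math on paper, wildly different accuracy in doubles: the naive form is off by a large relative error, the rewritten form is correct to nearly full precision.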

It's also possible, although less likely here, that you have a difference because something is treated as an integer in Python but as a float in JavaScript. Python distinguishes integers from floats, and if you are not careful you can do things like divide two integers to get another integer (e.g. 3/2 = 1 in Python 2), while in JavaScript all numbers are "really" floats, so this cannot occur.
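A quick illustration of the division pitfall (Python 3 syntax; in Python 2, `/` on two ints behaved like `//` below):

```python
# Python 3: `/` is true division, `//` is floor division.
print(3 / 2)    # 1.5  (what JavaScript's 3 / 2 gives)
print(3 // 2)   # 1    (what Python 2's 3 / 2 gave)
```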

Finally, it's possible there are small differences in how the calculations are performed, but those are "normal"; to get such drastic differences you also need something like what I described above to occur.

PS: Also note what Daniel Baulig said about the initial value of the parameter rd in the comments above.

My guess is that it involves the approximations you're using for some of the constants in the JavaScript version. Your pi2 in particular seems a little... brief. I believe Python is using full double-precision values there.
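A truncated pi matters because the whitepaper's g(RD) function divides by pi squared, so the error propagates into every expected-score term. A sketch using the g and q definitions from Glickman's paper (the truncated constant below is a hypothetical stand-in for a "brief" hand-typed pi2):

```python
import math

q = math.log(10) / 400  # Glicko constant q from the whitepaper

def g(rd, pi=math.pi):
    """Glicko g(RD) = 1 / sqrt(1 + 3 q^2 RD^2 / pi^2)."""
    return 1.0 / math.sqrt(1.0 + 3.0 * q**2 * rd**2 / pi**2)

full = g(200)                 # full double-precision pi
short = g(200, pi=3.1416)     # hypothetical hand-typed approximation
print(full, short)            # close, but not identical
```

The absolute difference per call is tiny, but as the first answer notes, a later subtraction of similar quantities can blow such small discrepancies up.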
