Possible Duplicate:
Is JavaScript’s Floating-Point Math Broken?
This is going to be a very rudimentary comp-sci question. Consider the following C# (the same holds true for JS; I assume this is just how math with doubles works in general).
var i = .01;
i+=.01; //i=.02
i+=.01; //i=.03
i+=.01; //i=.04
i+=.01; //i=.05
i+=.01; //i=.0600000000005 (I may have added/missed a few 0s in there)
i+=.01; //i=.07
i+=.01; //i=.08
i+=.01; //i=.09
i+=.01; //i=.0999999999992 (I may have added/missed a few 9s in there)
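To make the stored values easier to see, here is essentially the same accumulation written as a loop that prints each intermediate value with the "G17" round-trip format (the loop structure and names are just illustrative, not exactly what I ran; the defaults round the output and can hide part of the error):

using System;

class Program
{
    static void Main()
    {
        double i = 0.0;
        for (int step = 1; step <= 10; step++)
        {
            i += .01;
            // "G17" prints enough digits to round-trip the underlying double,
            // so any binary representation error shows up at each step.
            Console.WriteLine($"{step}: {i:G17}");
        }
    }
}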
So, what's happening, and how can I accurately predict the result of i += .01?