What are some important considerations for developers using JavaScript's Number type? I know it's an IEEE 754 double-precision float (essentially C's double type), but I'm a self-taught Python developer, so that doesn't take me very far.
Pointers to well-written articles will be great answers.
Thanks!
In general, these are the same problems that plague all floating-point arithmetic. For example, in JavaScript 0.1 + 0.2 evaluates to 0.30000000000000004, so 0.1 + 0.2 === 0.3 is false.
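To get a concrete feel for it, here's a quick sketch you can run in a browser console or Node. The nearlyEqual helper is just illustrative, not a built-in; it shows the common workaround of comparing within a small tolerance instead of using ===:

```js
// 0.1 and 0.2 have no exact binary64 representation, so the sum
// picks up rounding error:
console.log(0.1 + 0.2);         // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // false

// Common workaround: compare within a small tolerance.
// Number.EPSILON is the gap between 1 and the next representable double.
function nearlyEqual(a, b, epsilon = Number.EPSILON) {
  return Math.abs(a - b) < epsilon;
}

console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true
```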
A good overview of floating-point arithmetic in general, and some of the problems associated with it, is Wikipedia's article on floating-point arithmetic.
Edit: see also a previous answer to a similar question that proved quite popular: Is floating point math broken?