I’m using Java and referring to the `double` datatype. To keep it short, I’m reading some values from standard input that I store in my code as doubles (I would much rather use something like BigInteger, but right now that’s not possible).
I expect to get double values from the user, but sometimes they might input something like 999999999999999999999999999.9999999999, which I think is beyond what a double can accurately represent and gets rounded to 1.0E27 (probably).
I would like some pointers on how to detect whether a specific value cannot be accurately represented as a double and would require rounding (so that I could refuse the value or take some other action).
Thank you very much
Most values entered by humans won’t be exactly representable with a double.
For instance, do you want to prevent the user from entering 0.1? That’s not exactly representable as a double.
To find the scale of the error, you could do something like:
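One sketch (the helper name `representationError` is mine): parse the input string with `BigDecimal` to get the exact decimal value, round-trip it through `double`, and subtract. The `new BigDecimal(double)` constructor gives the exact binary value the double actually holds, so the difference is the rounding error:

```java
import java.math.BigDecimal;

public class RepresentationCheck {
    // Absolute error introduced by storing `input` as a double.
    static BigDecimal representationError(String input) {
        BigDecimal exact = new BigDecimal(input);      // exact decimal value of the input
        double asDouble = Double.parseDouble(input);   // rounded to the nearest double
        BigDecimal stored = new BigDecimal(asDouble);  // exact value that double holds
        return exact.subtract(stored).abs();
    }

    public static void main(String[] args) {
        // Exactly representable (a power of two): error is zero.
        System.out.println(representationError("0.5"));
        // Not exactly representable: small but nonzero error.
        System.out.println(representationError("0.1"));
        // Far beyond double precision: large error.
        System.out.println(representationError("999999999999999999999999999.9999999999"));
    }
}
```

Comparing the error against zero would reject almost every human-entered value (including 0.1, per the point above), so comparing it against a tolerance you choose is usually more practical.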
Then check whether the error is tolerably small for your purposes.