(This question is asked in a Ruby context, but I wouldn't be upset if it were answered in a more general way.)
So (just about) everyone knows that comparing the results of floating point math computations for equality is a no-go; instead you test that the result falls within some tolerance (in Ruby, like so):
assert_in_delta 5.0, 3.5+1.5, 0.0001
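To make the motivation concrete, here is a quick IRB-style sketch of why the tolerance is needed for computed values but not always triggered (the usual 0.1 + 0.2 example, plus the sum above, which happens to be exact):

```ruby
# The classic case: neither 0.1 nor 0.2 is exactly representable in
# binary floating point, so the computed sum is not bit-identical to
# the literal 0.3.
puts 0.1 + 0.2 == 0.3        # => false
puts (0.1 + 0.2 - 0.3).abs   # tiny, but nonzero

# By contrast, 3.5, 1.5, and 5.0 are all exactly representable
# (they are sums of powers of two), so this comparison is exact:
puts 3.5 + 1.5 == 5.0        # => true
```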
However, is this still necessary when sanity-checking a floating point number for equality after it has merely been passed from one subsystem to another, say (in a Ruby context) like so?
json = { :foo => 0.5 }.to_json
# POST the JSON to a Rails controller, which puts it in a Mongo database
found_obj = pull_obj_out_of_Mongo # details not important here
assert_in_delta 0.5, found_obj[:foo], 0.0001
My personal position is that assert_in_delta is still a good idea here, because I don't know (and perhaps many people don't know) whether the conceptual floating point value 0.5 is passed along in a way that would let it compare equal to the floating point literal 0.5. Is that actually the case, or am I being too paranoid about how floating point numbers are stored and passed along? How much does the answer depend on the language being used?
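For what it's worth, a quick sketch of just the JSON leg of the trip (leaving the Rails and Mongo stages out, since I can't easily isolate them) suggests the round-trip is exact, at least for these values in Ruby:

```ruby
require 'json'

# 0.5 is exactly representable in IEEE 754 binary floating point,
# so serializing and parsing it should give back the identical value.
restored = JSON.parse({ :foo => 0.5 }.to_json, symbolize_names: true)
puts restored[:foo] == 0.5   # => true

# Even 0.1, which is NOT exactly representable, survives the trip:
# Ruby's Float-to-string conversion emits a shortest string that
# parses back to the same double.
restored = JSON.parse({ :foo => 0.1 }.to_json, symbolize_names: true)
puts restored[:foo] == 0.1   # => true
```

Of course this only exercises Ruby's own JSON library on one machine; whether the Mongo storage step (or another language's JSON parser) preserves the bits is exactly the part I'm unsure about.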