Generate one random number over the total size of the ranges you expect, then distribute it between the ranges.
E.g. if you need a random number in either 0..10 or 100..110:
Generate a random number between 0 and 20. The lower 10 values get assigned to the 0..10 range, the rest get shifted into the other interval (or something like that - I may be off by one. Interval arithmetic is one of those things I never get right on the first try).
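A minimal sketch of the idea in Python (the function name and the half-open range convention are my own, not from any particular library):

```python
import random

def random_from_ranges(ranges):
    """Pick a uniform random value from disjoint half-open
    (start, end) ranges using a single random draw."""
    total = sum(end - start for start, end in ranges)
    r = random.randrange(total)  # one draw over the combined size
    # Walk the ranges and map r back into the one it falls in.
    for start, end in ranges:
        width = end - start
        if r < width:
            return start + r
        r -= width

# A value from 0..10 or 100..110 (upper bounds exclusive):
value = random_from_ranges([(0, 10), (100, 110)])
```

Because there is only one call to the generator per value, every number in the combined set is equally likely, which is exactly the point of the next paragraph.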
The reason behind this is that you often deal with imperfect random generators. These start to behave strangely if you spread successive random numbers over several dimensions (e.g. first choosing a random interval, then choosing a random value inside the chosen interval). That can lead to very obvious non-random behavior.
If you instead use a better random number generator that gets its data from true random sources, the two-step approach may waste precious random bits. If you do it just once per second that may not be a problem, but if you do it too often your program might stall while the true random sources catch up with your random-bit consumption.