In the book Algorithms, fourth edition, by Robert Sedgewick, page 200 says: "for example, if you have 1GB of memory on your computer (1 billion bytes), you cannot fit more than about 32 million int values."
I'm confused, because my own calculation gives a different number: 1,000,000,000 bytes / 4 bytes per int = 250 million int values.
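To double-check myself, here is the same arithmetic as a small Java snippet (Java being the language the book uses); it simply divides 1 billion bytes by the 4-byte size of a Java primitive int:

```java
public class IntCapacity {
    public static void main(String[] args) {
        long memoryBytes = 1_000_000_000L; // 1 GB taken as 1 billion bytes, per the book's quote
        long bytesPerInt = 4L;             // size of a Java primitive int
        long intCount = memoryBytes / bytesPerInt;
        System.out.println(intCount);      // prints 250000000, i.e. about 250 million
    }
}
```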
How did the author arrive at 32 million?
The book describes it as follows: