2 Answers

Load factor is independent of the number of items you insert. It's basically the percentage of the available space that's actually in use. If, for example, you currently have space for 100 elements allocated, a maximum load factor of 80% tells the table to start resizing once you have inserted 80 items.

Setting the maximum load factor is, therefore, largely independent of the number of elements you're going to store. Rather, it's (mostly) an indication of how much extra space you're willing to use to improve search speed. All else being equal, a table that's closer to full will have more collisions, which will slow searching.
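To make this concrete, here is a small sketch using std::unordered_set, whose max_load_factor, load_factor and bucket_count members expose exactly these knobs; the 80% threshold and the element count are just illustrative values:

#include <iostream>
#include <unordered_set>

int main() {
    std::unordered_set<int> table;

    // Ask the table to start resizing once it is roughly 80% full
    // (the default maximum load factor is 1.0).
    table.max_load_factor(0.8f);

    for (int i = 0; i < 1000; ++i)
        table.insert(i);

    // load_factor() == size() / bucket_count(); it stays at or below
    // the maximum because the table rehashes itself as it grows.
    std::cout << "size:            " << table.size() << '\n'
              << "buckets:         " << table.bucket_count() << '\n'
              << "load factor:     " << table.load_factor() << '\n'
              << "max load factor: " << table.max_load_factor() << '\n';
}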

Answered 2012-06-12T23:46:16.230

If you want to optimize a std::unordered_set to hold N elements, you can use the rehash member function. It takes an argument that sets the minimum number of buckets for the set, which prevents rehashes from occurring while you insert your elements.

For instance, if your desired maximum load factor is 75%, then the bucket count should be at least N / 0.75 (and you need to set max_load_factor to 0.75, since the default is 1.0).

#include <string>
#include <unordered_set>
// An unordered set optimized for 80 elements with a maximum load factor of 75%
std::unordered_set<std::string> myset;
myset.max_load_factor(0.75f);  // the default is 1.0, so set it explicitly
myset.rehash(120);             // 120 buckets >= 80 / 0.75 ≈ 107, so inserting 80 elements never triggers a rehash
Answered 2012-09-27T16:01:39.357