Why not use an LRU Cache?
From Java's LinkedHashMap documentation:
A special constructor is provided to create a linked hash map whose order of iteration is the order in which its entries were last accessed, from least-recently accessed to most-recently (access-order). This kind of map is well-suited to building LRU caches. Invoking the put or get method results in an access to the corresponding entry (assuming it exists after the invocation completes). The putAll method generates one entry access for each mapping in the specified map, in the order that key-value mappings are provided by the specified map's entry set iterator. No other methods generate entry accesses. In particular, operations on collection-views do not affect the order of iteration of the backing map.
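In other words, calling get counts as an access and moves that entry to the most-recently-accessed end of the iteration order. Here is a minimal sketch of that behavior (the class name and keys are just placeholders):

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // true = access-order: iteration runs from least- to most-recently accessed
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);

        map.get("a"); // touching "a" moves it to the end of the iteration order

        System.out.println(map); // {b=2, c=3, a=1}
    }
}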
So basically, once your map gets too big, just delete the first x entries that its iterator gives you, along the lines of the sketch below.
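Since operations on collection views do not count as accesses, you can walk the entry set and remove from the front. A rough sketch of that manual trimming, assuming an access-ordered LinkedHashMap (the helper name trimToSize is made up for illustration):

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

class CacheTrimmer {
    // Hypothetical helper: drop the least-recently-accessed entries until the
    // map is back down to maxSize. This works because the iterator of an
    // access-ordered LinkedHashMap visits entries from least- to most-recently
    // accessed, so the first entries it yields are the eviction candidates.
    static <K, V> void trimToSize(LinkedHashMap<K, V> map, int maxSize) {
        Iterator<Map.Entry<K, V>> it = map.entrySet().iterator();
        while (map.size() > maxSize && it.hasNext()) {
            it.next();
            it.remove(); // removes the eldest remaining entry
        }
    }
}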
See the documentation for removeEldestEntry to have this done for you automatically.

Here is code that demonstrates this:
import java.util.LinkedHashMap;
import java.util.Map;

public class CacheDemo {
    public static void main(String[] args) {
        // Access-ordered LinkedHashMap that evicts its least-recently-used
        // entry whenever it grows past maxCapacity.
        class CacheMap<K, V> extends LinkedHashMap<K, V> {
            private final int maxCapacity;

            public CacheMap(int initialCapacity, int maxCapacity) {
                super(initialCapacity, 0.75f, true); // true = access-order
                this.maxCapacity = maxCapacity;
            }

            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxCapacity;
            }
        }

        int[] popular = {1, 2, 3, 4, 5};
        CacheMap<Integer, Integer> myCache = new CacheMap<>(5, 10);

        for (int i = 0; i < 100; i++) {
            myCache.put(i, i);
            // Touch the popular keys so they stay most-recently used.
            for (int p : popular) {
                myCache.get(p);
            }
        }

        System.out.println(myCache);
        // {95=95, 96=96, 97=97, 98=98, 99=99, 1=1, 2=2, 3=3, 4=4, 5=5}
    }
}