This is a follow-up to my previous question HERE. I'm seeing a memory leak in my Java application. At first I thought the leak came from the server component of my application, but following other people's suggestions, it wasn't.
I used a tool to dump the heap memory and analyzed it with JProfiler. It seems the leak is, as I suspected, due to the `HashMap`s. But I'm not sure, because I'm not familiar with how to interpret the dump.
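In case it's relevant: the dump I'm analyzing is equivalent to one captured programmatically. Here is a minimal sketch using the HotSpot diagnostic MBean (the output file name is just a placeholder; I actually triggered the dump from the tool):

import java.lang.management.ManagementFactory;
import com.sun.management.HotSpotDiagnosticMXBean;

public class HeapDumper {
    public static void main(String[] args) throws Exception {
        // proxy to the HotSpot diagnostic MBean exposed by the running JVM
        HotSpotDiagnosticMXBean diag = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // write only live (reachable) objects to an .hprof file (placeholder path)
        diag.dumpHeap("myapp-heap.hprof", true);
    }
}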
Here is a short snippet of my application's structure (it caches some text data every 15 minutes for quick retrieval by the server threads).
What is causing the leak, and how can I identify it from the dump below? Apparently the way I'm doing `new Object()` and `HashMap.put()` has some leak issue?!
First, the entry/main class. Here I initialize the 7 main `HashMap`s, each mapping a key (just one for now; eventually there will be 16 keys) to a `NavigableMap` holding a time series of roughly 4000 one-line JSON strings.
public class MyCache {
    static HashMap<String, NavigableMap<Long, String>> map1 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map2 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map3 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map4 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map5 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map6 = new HashMap<String, NavigableMap<Long, String>>();
    static HashMap<String, NavigableMap<Long, String>> map7 = new HashMap<String, NavigableMap<Long, String>>();

    public static void main(String[] args) throws Exception {
        new Server();
        new Aggregation();
    }
}
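For context, the server threads read from these maps roughly like this. This is only a simplified sketch (the real handler code is not shown here, and "channel-1" plus the 15-minute window are placeholders):

import java.util.NavigableMap;

public class ServerReadSketch {
    static String buildResponse() {
        NavigableMap<Long, String> series = MyCache.map1.get("channel-1"); // placeholder key
        if (series == null) {
            return "";
        }
        long now = System.currentTimeMillis();
        StringBuilder sb = new StringBuilder();
        // all cached JSON lines from the last 15 minutes, in time order
        for (String json : series.subMap(now - 15 * 60 * 1000, now).values()) {
            sb.append(json).append('\n');
        }
        return sb.toString();
    }
}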
Then in `Aggregation()`, I fetch some text from an HTTP resource, convert it into JSON strings, and cache the strings in some temporary `NavigableMap`s, which I then put into the main `HashMap`s all at once (so the refresh doesn't affect the server too much).
public class Aggregation {
    static NavigableMap<Long, String> map1Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map2Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map3Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map4Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map5Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map6Temp = new ConcurrentSkipListMap<Long, String>();
    static NavigableMap<Long, String> map7Temp = new ConcurrentSkipListMap<Long, String>();
    public Aggregation() throws InterruptedException {
        // loop to cache the last 15 mins
        while (true) {
            logger.info("START REFRESHING ...");
            for (int i = 0; i < mylist.size(); i++) {
                long startepoch = getTime(mylist.get(i).time);
                MyItem m = mylist.get(i);
                String index = (i + 1) + "";
                process1(index, m.name, startepoch); // adds to map1Temp
                process2(index, m.name, startepoch); // adds to map2Temp
                process3(index, m.name, startepoch); // adds to map3Temp
                process4(index, m.name, startepoch); // adds to map4Temp
                process5(index, m.name, startepoch); // adds to map5Temp
                process6(index, m.name, startepoch); // adds to map6Temp
                process7(index, m.name, startepoch); // adds to map7Temp
            }
            // then `put` them in the main `HashMap` all at once:
            MyCache.map1.put(channel, new ConcurrentSkipListMap<Long, String>(map1Temp));
            MyCache.map2.put(channel, new ConcurrentSkipListMap<Long, String>(map2Temp));
            MyCache.map3.put(channel, new ConcurrentSkipListMap<Long, String>(map3Temp));
            MyCache.map4.put(channel, new ConcurrentSkipListMap<Long, String>(map4Temp));
            MyCache.map5.put(channel, new ConcurrentSkipListMap<Long, String>(map5Temp));
            MyCache.map6.put(channel, new ConcurrentSkipListMap<Long, String>(map6Temp));
            MyCache.map7.put(channel, new ConcurrentSkipListMap<Long, String>(map7Temp));
            // print the size of all HashMap entries. They don't grow :-/
            logger.info("\t" + "map1.size(): " + MyCache.map1.get(key).size());
            logger.info("\t" + "map2.size(): " + MyCache.map2.get(key).size());
            // ...and the other 5
            // then clear the temp maps so they don't grow over and over
            map1Temp.clear();
            map2Temp.clear();
            map3Temp.clear();
            map4Temp.clear();
            map5Temp.clear();
            map6Temp.clear();
            map7Temp.clear();
            // sleep for 15 min until the next caching cycle
            Thread.sleep(cacheEvery * 1000 * 60);
        }
    }
}
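To make the per-cycle allocation pattern easier to see in isolation, here is a stripped-down, self-contained version of what one refresh does for a single map (the key, values, and class name are placeholders, not my real code):

import java.util.HashMap;
import java.util.NavigableMap;
import java.util.concurrent.ConcurrentSkipListMap;

public class RefreshSketch {
    static HashMap<String, NavigableMap<Long, String>> cache =
            new HashMap<String, NavigableMap<Long, String>>();
    static NavigableMap<Long, String> temp =
            new ConcurrentSkipListMap<Long, String>();

    public static void main(String[] args) throws InterruptedException {
        String channel = "channel-1"; // placeholder key
        while (true) {
            // rebuild the ~4000-entry window in the temp map
            for (long epoch = 0; epoch < 4000; epoch++) {
                temp.put(epoch, "{\"t\":" + epoch + "}"); // placeholder JSON line
            }
            // publish a fresh copy under the same key; the copy it replaces
            // becomes garbage once nothing else still references it
            cache.put(channel, new ConcurrentSkipListMap<Long, String>(temp));
            temp.clear();
            Thread.sleep(15 * 60 * 1000); // next cycle in 15 minutes
        }
    }
}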