I am writing a Python extension that appears to be leaking memory, and I am trying to track down the source with Valgrind.
However, according to Valgrind, Python itself seems to be leaking memory. With the following trivial script:
hello.py
print "Hello World!"
and running
> valgrind --tool=memcheck python ./hello.py
(...)
==7937== ERROR SUMMARY: 580 errors from 34 contexts (suppressed: 21 from 1)
==7937== malloc/free: in use at exit: 721,878 bytes in 190 blocks.
==7937== malloc/free: 2,436 allocs, 2,246 frees, 1,863,631 bytes allocated.
==7937== For counts of detected errors, rerun with: -v
==7937== Use --track-origins=yes to see where uninitialised values come from
==7937== searching for pointers to 190 not-freed blocks.
==7937== checked 965,952 bytes.
==7937==
==7937== LEAK SUMMARY:
==7937== definitely lost: 0 bytes in 0 blocks.
==7937== possibly lost: 4,612 bytes in 13 blocks.
==7937== still reachable: 717,266 bytes in 177 blocks.
==7937== suppressed: 0 bytes in 0 blocks.
==7937== Rerun with --leak-check=full to see details of leaked memory.
Does anyone have an explanation for this strange behaviour? Does the Python interpreter really leak memory?
What tools do Python developers use to debug their memory leaks?
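
For what it's worth, I have seen mention of a suppression file (Misc/valgrind-python.supp) and a Misc/README.valgrind in the CPython source tree that are apparently meant to silence false positives from Python's small-object allocator (pymalloc). I have not verified this myself, but if that is the right approach, the run would presumably look something like:

> valgrind --tool=memcheck --leak-check=full \
      --suppressions=/path/to/cpython/Misc/valgrind-python.supp \
      python ./hello.py

(Here /path/to/cpython is just a placeholder for wherever the CPython source checkout lives.) Is this suppression file the recommended way to get meaningful Valgrind output, or is there a better workflow?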