
I can't understand why I'm consuming much more RAM in Google Colab than the actual memory usage reported by <dataframe>.info(memory_usage="deep").

I have a utility function that prints some information about RAM:

# module utils.py
import psutil

def print_available_ram():
    MB = 1024 * 1024
    ram_available = psutil.virtual_memory().available
    ram_tot = psutil.virtual_memory().total
    print("Ram available:", ram_available / MB)
    print("Ram tot      :", ram_tot / MB)
    print("              ", (ram_available / ram_tot) * 100, "%")

I run the following code in Google Colab:

# ... some other code
utils.print_available_ram()
train_dataset_raw = pd.read_feather(train_dataset_path)
utils.print_available_ram()
log(train_dataset_raw.info(memory_usage="deep"))
return

Output:

# values are in MB
Ram available: 12059.6640625
Ram tot      : 13021.0625
               92.61658994801691 %
Ram available: 2723.29296875
Ram tot      : 13021.0625
               20.91452190441448 %
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 2164804 entries, 0 to 2164803
Columns: 281 entries, 0-60 to target
dtypes: float64(281)
memory usage: 4.5 GB
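
As a sanity check on those numbers (a back-of-the-envelope calculation, assuming all 281 columns are float64 as info() reports):

# rough size check based on the figures printed above
rows, cols = 2164804, 281
frame_bytes = rows * cols * 8                 # 8 bytes per float64 value
print(frame_bytes / 1024**3)                  # ~4.53 GiB, consistent with the "4.5 GB" from info()

ram_drop_mb = 12059.6640625 - 2723.29296875   # available RAM before vs after read_feather
print(ram_drop_mb / 1024)                     # ~9.1 GiB consumed, roughly twice the frame size

So the drop in available RAM is roughly double what the DataFrame itself should occupy, which is what prompts these questions: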
  • Is the information reported by print_available_ram unreliable?
  • Is read_feather holding on to some extra memory instead of releasing it?