
How can I load a CSV file that is too large to fit in memory into IPython? It doesn't seem possible to load it all at once.


1 Answer


You can use the following code to read the file in chunks; it also distributes the chunks across multiple worker processes.

import pandas as pd
import multiprocessing as mp

LARGE_FILE = "yourfile.csv"
CHUNKSIZE = 100000  # process 100,000 rows at a time

def process_frame(df):
    # process one chunk; this example just counts its rows
    return len(df)

if __name__ == '__main__':
    # read_csv with chunksize returns an iterator of DataFrames
    reader = pd.read_csv(LARGE_FILE, chunksize=CHUNKSIZE)
    pool = mp.Pool(4)  # use 4 worker processes

    funclist = []
    for df in reader:
        # dispatch each chunk to the pool asynchronously
        f = pool.apply_async(process_frame, [df])
        funclist.append(f)

    result = 0
    for f in funclist:
        result += f.get(timeout=10)  # wait up to 10 seconds per chunk

    pool.close()
    pool.join()
    print("Processed {} rows in total".format(result))
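If you don't actually need multiple processes, a plain sequential loop over the same chunked reader is often enough and keeps memory bounded. A minimal sketch (the file name and the row-counting step are placeholders for your own data and processing):

import pandas as pd

total = 0
# pd.read_csv with chunksize yields one DataFrame per chunk,
# so only CHUNKSIZE rows are held in memory at a time
for chunk in pd.read_csv("yourfile.csv", chunksize=100000):
    total += len(chunk)  # replace with your own per-chunk processing

print(total)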
Answered 2015-07-21T23:09:06.077