
I have a CSV file with approximately 13,000 lines that need to be updated in a MySQL database. My PHP function stops after about 5,000 lines with a timeout or a memory overflow.

It loops over every line with fgetcsv and checks whether the line already exists, which I guess is what causes the timeout (a SELECT plus an INSERT or UPDATE query per line).

I can't drop the table and re-insert everything, because the table still has relations with other tables. Otherwise I could write a single INSERT statement with multiple value strings in blocks of e.g. 5000 lines (rough sketch below, after my code).

I need to find a way to read the file in chunks to prevent a timeout.

Thanks in advance!

Some code:

private function readFile()
    {
        $this->profiler = NULL;
        $this->auto_render = FALSE;
        $this->request->headers['Content-Type'] = 'application/json';

        if (array_key_exists('uploadedfile', $_FILES))
        {
            $filename = $_FILES['uploadedfile']['tmp_name'];
            if ($_FILES['uploadedfile']['type'] == 'application/vnd.ms-excel') // csv file
            {
                if (is_uploaded_file($filename)) // check the file really arrived via HTTP upload
                {
                    if ($_FILES['uploadedfile']['error'] == UPLOAD_ERR_OK) //check no errors
                    {                   
                        // time limit : unlimited
                        set_time_limit(0);

                        // file handler
                        $filepointer = fopen($filename,'r');
                        return $filepointer;
                    }
                }
            }
        }   
    }

Within another function I call readFile() and loop over the lines like this:

while (($line = fgetcsv($filepointer, 1000, ";")) !== false)
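
For reference, this is roughly what I have in mind for the batching. The table mytable, its columns (id, name, value) and the helper flushBatch() are made-up names for illustration, and ON DUPLICATE KEY UPDATE would only replace my per-line SELECT check if the table has a suitable unique key:

// hypothetical helper: send one multi-row INSERT for the collected tuples
function flushBatch(array $batch)
{
    if (empty($batch)) return;

    $sql = "INSERT INTO mytable (id, name, value) VALUES " . implode(",", $batch)
         . " ON DUPLICATE KEY UPDATE name = VALUES(name), value = VALUES(value)";
    mysql_query($sql);
}

$batch = array();

while (($line = fgetcsv($filepointer, 1000, ";")) !== false)
{
    // escape each field (needs an open mysql connection) and build one "(...)" tuple
    $escaped = array_map('mysql_real_escape_string', $line);
    $batch[] = "('" . implode("','", $escaped) . "')";

    if (count($batch) >= 5000)
    {
        flushBatch($batch); // one round trip per 5000 lines instead of 2-3 queries per line
        $batch = array();
    }
}

flushBatch($batch); // whatever is left after the loop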

1 Answer


Run the PHP file from the command line; by default, no timeout is set there.
Alternatively, increase the timeout in php.ini.
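
For example (the limits and the script name here are just illustrative):

// In CLI mode max_execution_time defaults to 0 (unlimited),
// so running e.g. "php import.php" may already be enough.
// For web requests, raise the limits at runtime instead:
ini_set('max_execution_time', 300); // seconds; 0 = unlimited
ini_set('memory_limit', '256M');    // also relevant for the memory overflow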

answered 2011-06-20T14:10:19.967