
I have a large amount of data (around 10 million records) and I'm trying to export it to .csv, but it isn't working at all: the page keeps loading and then ends up blank. I set max_execution_time and memory_limit in the php.ini file, but it still doesn't work. I then tried different approaches, with no success.

In CodeIgniter I used an array-to-CSV helper like this:

$this->load->helper('csv')

and also the built-in utility:

$this->load->dbutil()

but it's still not working.

So, any idea how to manage this large amount of data and export it to a CSV file without any issues?


5 Answers


You don't say which database you're using, but I've often found with large numbers of records that it's far faster to bypass the framework and export the CSV directly from the query.

In MySQL, this would look like:

SELECT id, name, price INTO OUTFILE '/tmp/products.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM products WHERE 1
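
If INTO OUTFILE isn't an option for you (the file is written on the database server's filesystem, and the MySQL user needs the FILE privilege), a streaming alternative in plain PHP is to write each row straight to the output with fputcsv() instead of building the whole CSV in memory. This is only a sketch: the helper below takes any iterable of rows (for 10 million records you'd feed it a generator over an unbuffered query), and the connection, table, and column names in the comments are assumptions.

```php
<?php
// Write rows to a CSV stream one at a time so memory use stays flat
// no matter how many rows there are.
function write_csv($stream, array $header, iterable $rows): int
{
    fputcsv($stream, $header);
    $count = 0;
    foreach ($rows as $row) {
        fputcsv($stream, $row);
        $count++;
    }
    return $count;
}

// In a controller you would stream straight to the browser, pulling
// rows one at a time with an unbuffered query (hypothetical names):
//
//   header('Content-Type: text/csv');
//   header('Content-Disposition: attachment; filename="products.csv"');
//   $result = $mysqli->query('SELECT id, name, price FROM products',
//                            MYSQLI_USE_RESULT); // unbuffered fetch
//   $rows = (function ($r) { while ($row = $r->fetch_row()) yield $row; })($result);
//   write_csv(fopen('php://output', 'w'), ['id', 'name', 'price'], $rows);
```

The key point is the unbuffered result mode (MYSQLI_USE_RESULT): rows are pulled from the server as you iterate, so neither PHP nor the driver ever holds the full result set.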
answered 2013-10-15T09:24:34.837

Add "max_input_vars = 3000" or "max_input_vars = 5000" to your php.ini file and try again. Don't forget to restart your Apache server afterwards. You can raise this limit further if the error persists. (Note: I faced a similar issue while exporting a large CSV file through the browser, and this fixed it.)

answered 2013-10-14T09:08:32.233

You can generate the CSV file with a query like the one below and then export it:

SELECT id, client, project, task, description, time, date INTO OUTFILE '/path/to/file.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM ts
answered 2013-10-15T09:50:58.037

I don't know exactly what you want to achieve, but raising set_time_limit is often not recommended. It can lead to a script that runs forever, and if you call that script every x hours it will eventually crash your server.

But as I understand it, you want to generate a CSV file for a user. What you do now (I guess) is: the user clicks a link and you start rendering the CSV right there.

What you could do instead (using JavaScript) is show a progress bar when the link is clicked and render the file in the background (you can even send an email when it's finished). That way the user can keep using the system, and won't press reload and kick off multiple CSV renders.

Second, I don't know the CodeIgniter framework, but native functions are always faster. If you need to render 1,000+ rows, perhaps you can look into spreading the load: load 100 rows, write 100 rows, load 100, write 100, and so on (you get the point, I hope; 100 is just a number).
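The "load 100, write 100" idea can be sketched as a small helper. Here $fetch stands in for whatever pages through the table, e.g. CodeIgniter's $this->db->get('products', $limit, $offset); the callback shape is my assumption, not a CodeIgniter API.

```php
<?php
// Page through the data in fixed-size chunks so only one chunk is
// ever held in memory. $fetch($limit, $offset) must return the next
// batch of rows (an empty array when there is nothing left).
function export_in_chunks(callable $fetch, $stream, int $chunkSize = 100): int
{
    $total  = 0;
    $offset = 0;
    do {
        $rows = $fetch($chunkSize, $offset);
        foreach ($rows as $row) {
            fputcsv($stream, $row);
        }
        $total  += count($rows);
        $offset += $chunkSize;
    } while (count($rows) === $chunkSize); // a short batch means we're done
    return $total;
}
```

With a database-backed $fetch (e.g. wrapping $this->db->get(...)->result_array()), each iteration becomes one LIMIT/OFFSET query; for very large tables, keyset pagination (WHERE id > $lastId ORDER BY id) avoids the growing cost of deep offsets.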

Hope these tips help.

edit --- some code example ---

Well, it took some time, but here are some code examples for the above.

First we check every x seconds (60 in the example, so every minute) whether the CSV file exists. To check if the file exists we use this Stack Overflow answer and add a simple JavaScript interval.

var checkTimer = window.setInterval(checkFile, 60000); // interval is in milliseconds

function checkFile() {
    if (UrlExists(url)) { // UrlExists is the function from the linked answer
        window.clearInterval(checkTimer);
        // add other stuff to do once the file is ready
    }
}

With this code you can check whether the file has finished rendering, and if so, show a message to the user or do something else.

answered 2013-10-14T08:56:24.800

A simple solution for exporting CSV files with data from the database: follow the links below.

http://writephp.tuxkiddos.com/2013/02/export-to-csv-ci-helper-file.html
https://gist.github.com/opnchaudhary/4744797#file-csv_helper-php (GitHub link)

answered 2014-01-15T05:51:33.680