I am using Python 2.7, pyodbc and MySQL 5.5 on Windows.
I have a query which returns millions of rows and I would like to process it in chunks using the fetchmany function.
Here is a portion of the code:
import pyodbc
connection = pyodbc.connect('Driver={MySQL ODBC 5.1 Driver};Server=127.0.0.1;Port=3306;'
                            'Database=XXXX;User=root;Password=;Option=3;')
cursor_1 = connection.cursor()
strSQLStatement = 'SELECT x1, x2 from X'
cursor_1.execute(strSQLStatement)
# the error occurs here
x1 = cursor_1.fetchmany(10)
print x1
connection.close()
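For context, this is the kind of chunked loop I am ultimately aiming for (process_chunk is just a placeholder name for my own per-chunk logic):

# Intended usage: pull the rows 1000 at a time instead of all at once.
# process_chunk is a hypothetical placeholder for whatever I do per chunk.
while True:
    rows = cursor_1.fetchmany(1000)
    if not rows:
        break
    process_chunk(rows)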
My problem:
I get the error "MySQL client ran out of memory".
I guess that this is because cursor_1.execute tries to read the whole result set into memory, and I tried the following (one by one) but to no avail:
- In the user interface (ODBC Data Source Administrator) I ticked "Don't cache results of forward-only cursors" (see the connection-string sketch after this list)
- cursor_1.execute("SET GLOBAL query_cache_size = 40000")
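As far as I understand, that checkbox corresponds to an option bit that can also be passed directly in the connection string, roughly like this (the 1048576 value is my reading of the Connector/ODBC documentation for the "don't cache forward-only cursor results" flag, so it may be off):

# Assumption: 1048576 is the Connector/ODBC option bit for
# "Don't cache results of forward-only cursors"; it is OR-ed with the
# Option=3 value already used in the connection above.
connection = pyodbc.connect('Driver={MySQL ODBC 5.1 Driver};Server=127.0.0.1;Port=3306;'
                            'Database=XXXX;User=root;Password=;'
                            'Option=%d;' % (3 | 1048576))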
My questions:
Does pyodbc have the possibility to run the query and serve the results only on demand?
The MySQL manual suggests invoking mysql with the --quick option. Can this also be done when not using the command line?
Thanks for your help.
P.S.: suggestions for an alternative MySQL module are also welcome, but I use Portable Python so my choice is limited.
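For example, if MySQLdb can be made to work under Portable Python, my understanding is that its server-side cursor (SSCursor) streams rows instead of loading the whole result set; a minimal sketch, assuming the MySQLdb package is available:

# Sketch assuming MySQLdb is installed; SSCursor keeps the result set on
# the server and fetches rows only as they are requested.
import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(host='127.0.0.1', user='root', passwd='', db='XXXX',
                       cursorclass=MySQLdb.cursors.SSCursor)
cur = conn.cursor()
cur.execute('SELECT x1, x2 FROM X')
while True:
    rows = cur.fetchmany(1000)
    if not rows:
        break
    # process the current chunk of rows here
conn.close()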