I have a MySQL database "DB" containing a table "TABLE" with a single column and about 8 million rows. I am trying to download that table into a text file as follows:
//Open connection.
connection.Open();
//Create command.
MySqlCommand command = connection.CreateCommand();
command.CommandText = "SELECT * FROM `DB`.`TABLE`";
//Execute command; dispose the reader and the file when done so everything is flushed.
using (MySqlDataReader result = command.ExecuteReader())
//Open stream to write the result (append mode).
using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Users\StackOverflow\Desktop\ID_List.txt", true))
{
    //For each row of the result.
    while (result.Read())
    {
        //Write the single column of the row as a line.
        file.WriteLine(result.GetValue(0).ToString());
    }
}
//Close connection.
connection.Close();
When I run it, it starts downloading the data and writing it to the text file, but after only a minute or two it throws:
An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
What might I change or do differently to make it work? Thanks for any suggestions and answers :)
[EDIT]: I added command.CommandTimeout = 240; (there is a small snippet after the debug output below showing where it goes) and tested three times. Each time it downloaded for a couple of minutes (about 40 MB of data) before giving:
An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Fatal error encountered during data read.
And these are the last few lines of the debug output:
at MySql.Data.MySqlClient.NativeDriver.FetchDataRow(Int32 statementId, Int32 columns)
at MySql.Data.MySqlClient.Driver.FetchDataRow(Int32 statementId, Int32 columns)
at MySql.Data.MySqlClient.ResultSet.GetNextRow()
at MySql.Data.MySqlClient.ResultSet.NextRow(CommandBehavior behavior)
at MySql.Data.MySqlClient.MySqlDataReader.Read()
Inner exception: System.IO.EndOfStreamException: Attempted to read past the end of the stream.
at MySql.Data.MySqlClient.MySqlStream.ReadFully(Stream stream, Byte[] buffer, Int32 offset, Int32 count)
at MySql.Data.MySqlClient.MySqlStream.LoadPacket()
An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Fatal error encountered during data read.
The program '[4548] test program.exe: Program Trace' has exited with code 0 (0x0).
The program '[4548] test program.exe: Managed (v4.0.30319)' has exited with code 0 (0x0).
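For reference, the timeout change from that edit just sits on the command, in seconds, before the reader is executed; the rest of the setup is unchanged from the snippet above:
//Create command and give it more time before the client-side timeout (value is in seconds).
MySqlCommand command = connection.CreateCommand();
command.CommandTimeout = 240;
command.CommandText = "SELECT * FROM `DB`.`TABLE`";
//Execute command.
MySqlDataReader result = command.ExecuteReader();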
[EDIT2]: I changed the reader to MySqlDataReader result = command.ExecuteReader(System.Data.CommandBehavior.SingleResult); in an effort to remedy the above error, and now I am getting:
An unhandled exception of type 'MySql.Data.MySqlClient.MySqlException' occurred in test program.exe
Additional information: Query execution was interrupted
After some light googling, I figure it is GoDaddy that is limiting the query execution time. A working solution is therefore to wrap the read in a try/catch, keep a counter of how many records were retrieved before the connection was cut, and re-open a new connection using that counter as the starting offset of a LIMIT clause, repeating until the table is exhausted.
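Roughly, the retry loop looks like this (the connection string and batch size are placeholders, the resume logic is simplified, and it assumes using MySql.Data.MySqlClient; at the top of the file):
string connectionString = "server=...;database=DB;uid=...;pwd=..."; //placeholder
int batchSize = 100000;  //rows per query, adjust as needed
long offset = 0;         //counter: rows already written to the file
bool done = false;

using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Users\StackOverflow\Desktop\ID_List.txt", true))
{
    while (!done)
    {
        try
        {
            using (MySqlConnection connection = new MySqlConnection(connectionString))
            {
                connection.Open();
                MySqlCommand command = connection.CreateCommand();
                command.CommandTimeout = 240;
                //Resume from wherever the previous attempt stopped.
                command.CommandText = "SELECT * FROM `DB`.`TABLE` LIMIT " + offset + ", " + batchSize;
                long rowsInBatch = 0;
                using (MySqlDataReader result = command.ExecuteReader())
                {
                    while (result.Read())
                    {
                        file.WriteLine(result.GetValue(0).ToString());
                        offset++;       //track progress row by row so a cut connection can resume
                        rowsInBatch++;
                    }
                }
                //An uninterrupted batch smaller than requested means the table is exhausted.
                done = rowsInBatch < batchSize;
            }
        }
        catch (MySqlException)
        {
            //The query was interrupted; offset already reflects the rows written,
            //so the next iteration re-opens a connection and continues from there.
        }
    }
}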
Thanks for all the help!