I would copy the whole file to your server and then use LOAD DATA LOCAL INFILE, since it supports CSV input:
LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
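If you generate the file yourself, the standard-library csv module can produce exactly the shape those FIELDS/LINES clauses expect (the filename, header, and column values below are just placeholders for illustration):

```python
import csv

# Write data.txt in the shape the LOAD DATA statement above expects:
# comma-separated, double-quoted fields, CRLF line endings, one header row
rows = [
    ["id", "name", "price"],   # header row, skipped by IGNORE 1 LINES
    [1, "widget", "9.99"],
    [2, "gadget", "19.50"],
]
with open("data.txt", "w", newline="") as f:
    writer = csv.writer(f, quoting=csv.QUOTE_ALL, lineterminator="\r\n")
    writer.writerows(rows)
```

`newline=""` is important: it stops Python from translating the explicit `\r\n` terminators a second time on Windows.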
If you don't like that solution, you can use mysql_ping() to reconnect automatically (assuming the connector you use supports it). It checks whether the connection to the server is still working. If the connection has gone down and auto-reconnect is enabled, it attempts to reconnect; if the connection has gone down and auto-reconnect is disabled, mysql_ping() returns an error.
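The ping-before-use pattern looks roughly like this. `FlakyConnection` here is a stand-in I made up for the demo; a real connector exposes the same idea (for example, MySQL Connector/Python's `ping(reconnect=True)`):

```python
# Sketch of the mysql_ping() pattern with a stub connection object.
# FlakyConnection is hypothetical; substitute your connector's object.
class FlakyConnection:
    def __init__(self):
        self.alive = False  # pretend the link has already dropped

    def ping(self, reconnect=False):
        # Like mysql_ping(): raise if the link is down and auto-reconnect
        # is disabled, otherwise re-establish the connection
        if not self.alive:
            if not reconnect:
                raise ConnectionError("MySQL server has gone away")
            self.alive = True  # pretend we reconnected

conn = FlakyConnection()
conn.ping(reconnect=True)  # silently restores the dropped connection
```

Calling ping right before each long-running statement keeps an idle connection from dying between batches.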
If your problem is that you can download the file but the connection times out because of MySQL latency, you can run the work in two threads and synchronize them with a queue:
import queue
import threading

# Prepare the queue and the end-of-work signal
q = queue.Queue()
done = threading.Event()

# Fetch items from q and insert them into the db once a
# certain batch size is reached
def store_db():
    items = []
    # Run until done is set
    while not done.is_set():
        try:
            # Use a timeout: we may have 500 buffered records when the
            # producer finishes -- without it, get() would block forever
            items.append(q.get(timeout=5))
            if len(items) > 1000:
                insert_into(items)
                items = []
            q.task_done()
        # Raised when nothing arrives within 5 seconds
        except queue.Empty:
            pass
    # Flush whatever is left in the buffer
    if items:
        insert_into(items)

# Fetch all rows in a loop
def continuous_reading():
    # Fetch row
    q.put(row)

# Start the storer thread
t = threading.Thread(target=store_db)
t.daemon = True
t.start()

continuous_reading()
q.join()    # Wait for all queued items to be processed
done.set()  # Signal store_db that it can terminate
t.join()    # Make sure the remaining items buffer is stored in the db
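Here is the same pattern as a runnable sketch, with a stand-in insert_into that just collects batches in a list and a fake row source of 25 integers (the batch size of 10 is made small for the demo):

```python
import queue
import threading

q = queue.Queue()
done = threading.Event()
batches = []  # stand-in for the database

def insert_into(items):
    # Hypothetical db call: here we just record the batch
    batches.append(list(items))

def store_db():
    items = []
    while not done.is_set():
        try:
            items.append(q.get(timeout=0.1))
            if len(items) >= 10:  # small batch size for the demo
                insert_into(items)
                items = []
            q.task_done()
        except queue.Empty:
            pass
    if items:          # flush the partial batch on shutdown
        insert_into(items)

t = threading.Thread(target=store_db, daemon=True)
t.start()

for row in range(25):  # fake "continuous reading" of 25 rows
    q.put(row)

q.join()    # all 25 items fetched and acknowledged
done.set()  # let store_db flush and exit
t.join()
```

Note that q.join() returns as soon as every item has been *fetched*, not stored; the final t.join() is what guarantees the last partial batch reaches the database.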