
I am using Oracle as the DBMS and Tuxedo as the application server. The customer needs to export data from Oracle to a SAM file for interfacing purposes. Unfortunately, the number of records is huge (over 10 million), so I am wondering what the best practice is for extracting a large amount of data to a file on the database server.

I am used to creating a cursor, fetching a record, and then writing it to the file. Is there a better, i.e. faster, way to handle this? It is a recurring task.
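For reference, a minimal sketch of the row-by-row pattern I mean (the directory object, file name, table and columns are placeholders, not the real ones):

```sql
-- Row-by-row export: one UTL_FILE call per record.
-- EXPORT_DIR, export.dat, source_table and col1/col2 are placeholder names.
DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'export.dat', 'w', 32767);
  FOR r IN (SELECT col1, col2 FROM source_table) LOOP
    UTL_FILE.PUT_LINE(l_file, r.col1 || '|' || r.col2);  -- one file I/O call per row
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/
```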


1 Answer


I suggest you read Adrian Billington's article on tuning UTL_FILE. It covers all the bases. Find it here.

The important thing is to buffer the records, thereby reducing the number of file I/O calls. You will need to benchmark the different implementations to see which works best in your situation.
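To illustrate the buffering idea (a sketch only, not Billington's exact code, with the same placeholder names as above): fetch rows in batches with BULK COLLECT and concatenate them into a large buffer, so each UTL_FILE call writes many records at once instead of one.

```sql
-- Buffered export: batch fetches, then flush a large buffer in one PUT call.
-- EXPORT_DIR, export.dat, source_table and col1/col2 are placeholder names.
DECLARE
  TYPE t_lines IS TABLE OF VARCHAR2(4000);
  l_lines  t_lines;
  l_file   UTL_FILE.FILE_TYPE;
  l_buffer VARCHAR2(32767);
  CURSOR c IS SELECT col1 || '|' || col2 AS line FROM source_table;
BEGIN
  l_file := UTL_FILE.FOPEN('EXPORT_DIR', 'export.dat', 'w', 32767);
  OPEN c;
  LOOP
    FETCH c BULK COLLECT INTO l_lines LIMIT 1000;  -- fetch in batches of 1000 rows
    EXIT WHEN l_lines.COUNT = 0;
    l_buffer := NULL;
    FOR i IN 1 .. l_lines.COUNT LOOP
      -- flush the buffer before it can exceed the 32767-byte UTL_FILE limit
      IF NVL(LENGTHB(l_buffer), 0) + NVL(LENGTHB(l_lines(i)), 0) + 1 > 32000 THEN
        UTL_FILE.PUT(l_file, l_buffer);
        UTL_FILE.FFLUSH(l_file);
        l_buffer := NULL;
      END IF;
      l_buffer := l_buffer || l_lines(i) || CHR(10);  -- append record plus newline
    END LOOP;
    IF l_buffer IS NOT NULL THEN
      UTL_FILE.PUT(l_file, l_buffer);
      UTL_FILE.FFLUSH(l_file);
    END IF;
  END LOOP;
  CLOSE c;
  UTL_FILE.FCLOSE(l_file);
END;
/
```

The BULK COLLECT LIMIT and the flush threshold are the tuning knobs to benchmark for your data and server.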

Note his advice on query performance: there is no point in optimizing the file I/O if most of the time is spent on data acquisition.

Answered 2015-04-17T07:14:09.593