
I'm using the code below to insert records from a large CSV (~100,000 records) into Oracle using jOOQ. Here's the snippet:

CSVReader csvReader = null;
String csvError;
try {
  csvReader = new CSVReader(new FileReader(tempFile));
} catch (FileNotFoundException e) {
  e.printStackTrace();
}

// delete previously uploaded rows from the same user
clearTable(user.getUserId());
List<BSearchRecord> records = new ArrayList<>();
boolean isFirst = true;
while (csvReader.hasNext()) {
  if (isFirst) {
    // validate headers
    String[] headers = csvReader.next();
    uploadedHeaders = headers;
    csvError = validateHeadersLength(headers);
    if (csvError != null) {
      return csvError;
    }
    for (int i = 0; i < headers.length; i++) {
      csvError = validateEachHeader(i, headers[i]);
      if (csvError != null) {
        return csvError;
      }
    }
    isFirst = false;
  } else {
    String[] row = csvReader.next();
    if (row != null) {
      BSearchRecord bSearchRecord = new BSearchRecord();
      bSearchRecord.set(RET_BSEARCH.UPLOADEDBY, user.getUserId());
      for (int i = 0; i < csvHeaders.length; i++) {
        Field field = bSearchRecord.field(backendColumns[i]);
        bSearchRecord.set(field, row[i]);
      }
      records.add(bSearchRecord);
    }
  }
}
db.batchInsert(records).execute(); // IS THIS OKAY? (is this batch enabled?)

I've come across suggestions such as: "Slow batch-insert performance with PostgreSQL/jOOQ when loading from CSV; how can I improve the process?"

However, my use case is a bit different, so I'm asking just to get advice on whether I'm doing this the right way.

Also, could you tell me whether jOOQ's batchInsert(..) implementation supports batched execution? (In the documentation I saw the .bind(..) method, so I'm asking for clarity.)


1 Answer


jOOQ has an out-of-the-box API for importing CSV data: https://www.jooq.org/doc/latest/manual/sql-execution/importing/importing-csv

ctx.loadInto(BOOK)
   .loadCSV(inputstream, encoding)
   .fields(BOOK.ID, BOOK.AUTHOR_ID, BOOK.TITLE)
   .execute();

It conveniently lets you specify bulk, batch, and commit sizes, among other behaviours. I recommend you use that.
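For example, the sizing options can be set on the Loader before loading the CSV. This is a hedged sketch: BOOK and its fields are the placeholder table from the manual's example, and the sizes shown are illustrative, not recommendations; it needs a live `DSLContext` (ctx) to run.

```java
// Sketch of the jOOQ Loader API with explicit bulk/batch/commit sizing.
// The numbers here are placeholders to tune for your own workload.
ctx.loadInto(BOOK)
   .bulkAfter(500)    // combine 500 rows into one multi-row insert
   .batchAfter(10)    // send 10 such statements per JDBC batch
   .commitAfter(10)   // commit after every 10 batches
   .loadCSV(inputstream, encoding)
   .fields(BOOK.ID, BOOK.AUTHOR_ID, BOOK.TITLE)
   .execute();
```

This avoids materialising all 100,000 records in memory at once, which the batchInsert(records) approach in the question does.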

answered 2020-05-12T13:16:48.723