I have an Excel file that I can currently import into a MySQL database, but I would like to speed up the import. My setup for reading the Excel files (approximately 40,000 rows per file, 10-15 columns) and persisting them is as follows:
Java code:

import java.io.FileInputStream;
import java.util.Date;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.ss.usermodel.WorkbookFactory;

public void importFileToDataBase(String file) {
    Workbook wb;
    try (FileInputStream latestExcelFile = new FileInputStream(file)) {
        // create() needs the stream we just opened
        wb = WorkbookFactory.create(latestExcelFile);
    } catch (Exception ex) {
        ex.printStackTrace(); // don't swallow open/parse failures silently
        return;
    }
    Sheet bigSheet = wb.getSheetAt(0);
    // Start at 1 to skip the header row; getLastRowNum() is zero-based, so <= is correct
    for (int i = 1; i <= bigSheet.getLastRowNum(); i++) {
        try {
            Row row = bigSheet.getRow(i);
            String bigSheetId = i + "";
            Date column1 = row.getCell(0).getDateCellValue();
            int column2 = (int) row.getCell(1).getNumericCellValue();
            int column3 = (int) row.getCell(3).getNumericCellValue();
            int column4 = (int) row.getCell(4).getNumericCellValue();
            int column5 = (int) row.getCell(5).getNumericCellValue();
            int column6 = (int) row.getCell(6).getNumericCellValue();
            int column7 = (int) row.getCell(7).getNumericCellValue();
            String column8 = row.getCell(9).toString();
            String column9 = row.getCell(11).toString();
            String column10 = row.getCell(12).toString();
            String column11 = row.getCell(13).toString();
            int column12 = (int) row.getCell(8).getNumericCellValue();
            Double column13 = row.getCell(10).getNumericCellValue();
            String column14 = row.getCell(2).toString();
            BigSheet mySheet = new BigSheet(
                    bigSheetId, column1, column2, column3, column4,
                    column5, column6, column7, column8, column9,
                    column10, column11, column12, column13, column14);
            // One database round trip per row; this is the part I suspect is slow
            putInDatabase(mySheet);
        } catch (Exception ex) {
            ex.printStackTrace(); // log bad rows instead of swallowing them
        }
    }
}
Is there a way to improve the speed of the import via batching? If so, could you give an example of how to do it?
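From what I have read, JDBC's PreparedStatement supports addBatch() and executeBatch(), so I am guessing something along these lines is what I want. To be clear, this is only a sketch: the table name big_sheet, its columns, and the getId()/getColumn1()/getColumn2() getters on BigSheet are placeholders I made up here (my real table has all 15 columns):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.List;

// Hypothetical batched replacement for putInDatabase(); only three of the
// 15 columns are shown to keep the sketch short.
private void putInDatabaseBatched(Connection conn, List<BigSheet> rows) throws Exception {
    String sql = "INSERT INTO big_sheet (sheet_id, col1, col2) VALUES (?, ?, ?)";
    conn.setAutoCommit(false); // one commit at the end instead of one per row
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        int count = 0;
        for (BigSheet s : rows) {
            ps.setString(1, s.getId());
            ps.setDate(2, new java.sql.Date(s.getColumn1().getTime()));
            ps.setInt(3, s.getColumn2());
            ps.addBatch();
            if (++count % 1000 == 0) {
                ps.executeBatch(); // flush every 1000 rows to keep memory bounded
            }
        }
        ps.executeBatch(); // flush the remainder
        conn.commit();
    }
}

I have also seen the MySQL Connector/J URL option rewriteBatchedStatements=true mentioned, which is supposed to rewrite a batch into multi-row INSERT statements; would combining that with the above be the right approach?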