
The problem I'm working on is the time it takes to execute three stored procedures inside the TRY{} CATCH{} block below, as part of a transaction that compares incoming data against two SQL tables. I have a data set created from an XML file, which is parsed into CARD objects and then inserted from those objects into the two tables. The job processes 15,198 records. Each table has an INSERTTMS column that records the timestamp when a row was inserted, and the inserts currently take 10 minutes 33 seconds to complete.

I'm looking for help restructuring this code to speed the process up. We are running on SQL Server 2003 and using Java 1.6. Any help/direction would be greatly appreciated. Thanks.

Here is the code:

    ResultSet results = null;
    CallableStatement call = null;
    PassThroughDBConnection con = null;
    try {
        con = new PassThroughDBConnection();
        con.setName("PBFDBConnection");
        con.setDBServer(System.getProperty(ATMServer.DBSERVER_PROPERTY));
        con.setDBServerType(System.getProperty(ATMServer.PROP_DBSERVERTYPE));
        con.setDBName(System.getProperty(ATMServer.DBNAME_PROPERTY));
        if (System.getProperty(ATMServer.DBUSER_PROPERTY) != null) {
            con.setDBUser(System.getProperty(ATMServer.DBUSER_PROPERTY));
        }
        if (System.getProperty(ATMServer.DBPASS_PROPERTY) != null) {
            con.setDBPass(System.getProperty(ATMServer.DBPASS_PROPERTY));
        }
        con.connect();

        call = con.prepareCall("{call clearData (?, ?)}");
        call.setInt("ServerID", ATMServer.getServerID());
        call.setInt("BankID", owningChannel.getBankID());
        call.executeUpdate();

        try {
            call.close();
        } catch (Exception e) {

        }
        results = null;
        call = null;
        LOGGER.trace("Preparing to save PBF data to database.  cards size: " + cards.size());           

        for (Card currentCard : cards.values()) {

            // add the card record
            // add any account records
            call = con.prepareCall("{call insertCardData (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)}");

            call.setInt("ServerID", ATMServer.getServerID());
            call.setInt("BankID", owningChannel.getBankID());
            call.setString("CardNumber", currentCard.cardnumber);
            call.setString("Zip", currentCard.zip);
            call.setString("Address", currentCard.address);
            call.setString("Expiration", currentCard.expire);
            call.setLong("PurchLimit", currentCard.purLimit);
            call.setLong("PurchUsed", currentCard.purUsed);
            call.setLong("ATMLimit", currentCard.atmLimit);
            call.setLong("ATMUsed", currentCard.atmUsed);
            call.setShort("Status", currentCard.status);
            call.setShort("Sequence", currentCard.cardSequence);
            call.setShort("CardType", currentCard.cardType.pbfValue);

            results = call.executeQuery();
            Long newId = null;
            if (results.next()) {
                newId = results.getLong("NewID");
                if (newId != null) {
                    currentCard.databaseId = newId;
                    for (Account account : currentCard.accounts) {
                        if (account != null) {
                            call = con.prepareCall("{call insertAccountData (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)}");
                            call.setInt("ServerID", ATMServer.getServerID());
                            call.setInt("BankID", owningChannel.getBankID());
                            call.setString("AccountID", account.accountID);
                            call.setString("Zip", account.zip);
                            call.setString("Address", account.address);
                            call.setLong("Available", account.available);
                            call.setLong("Balance", account.balance);
                            call.setLong("CreditLine", account.creditline);
                            call.setShort("AccountType", (short) account.accountType.pbfValue);
                            call.setLong("CardId", newId);
                            call.executeUpdate();
                            try {
                                call.close();
                            } catch (Exception e) {
                                LOGGER.fatal("Error saving PBF data to database:" + FormatData.formatStack(e));
                            }
                            results = null;
                            call = null;
                        }
                    }
                }
            }
        }

1 Answer


One approach is to insert the rows as a batch instead of one at a time. That avoids a round trip to the database per row and improves performance.

Add each insert to the batch with call.addBatch(), then execute them all at once with call.executeBatch().

Something like this:

                currentCard.databaseId = newId;
                // Prepare the statement once and reuse it for every account,
                // adding each insert to the same batch.
                call = con.prepareCall("{call insertAccountData (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)}");
                for (Account account : currentCard.accounts) {
                    if (account != null) {
                        call.setInt("ServerID", ATMServer.getServerID());
                        call.setInt("BankID", owningChannel.getBankID());
                        call.setString("AccountID", account.accountID);
                        call.setString("Zip", account.zip);
                        call.setString("Address", account.address);
                        call.setLong("Available", account.available);
                        call.setLong("Balance", account.balance);
                        call.setLong("CreditLine", account.creditline);
                        call.setShort("AccountType", (short) account.accountType.pbfValue);
                        call.setLong("CardId", newId);
                        call.addBatch();
                    }
                }
                // One round trip executes every queued insert.
                call.executeBatch();
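
Two further things usually help with a load of roughly 15,000 rows: turn off auto-commit so the whole load runs as one transaction instead of one implicit commit per call, and flush the batch in fixed-size chunks so the driver never buffers every row at once. The sketch below shows that pattern against a plain java.sql.Connection; it is only an illustration, not a drop-in replacement. The PassThroughDBConnection wrapper may expose its underlying connection differently, the AccountBatchLoader class name and the chunk size of 1,000 are made up for the example, and the positional parameters assume the stored procedure's parameter order matches the order of the setters in the original code.

    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.SQLException;

    public class AccountBatchLoader {

        // Assumed chunk size; tune it for your driver and memory budget.
        private static final int BATCH_SIZE = 1000;

        /**
         * Inserts all accounts of one card through insertAccountData in chunked
         * batches inside a single transaction. Card and Account are the classes
         * from the original code.
         */
        public void insertAccounts(Connection con, Card card, int serverId, int bankId)
                throws SQLException {
            boolean oldAutoCommit = con.getAutoCommit();
            con.setAutoCommit(false); // one transaction for the whole card
            CallableStatement call = null;
            try {
                call = con.prepareCall(
                        "{call insertAccountData (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)}");
                int pending = 0;
                for (Account account : card.accounts) {
                    if (account == null) {
                        continue;
                    }
                    call.setInt(1, serverId);              // ServerID
                    call.setInt(2, bankId);                // BankID
                    call.setString(3, account.accountID);  // AccountID
                    call.setString(4, account.zip);        // Zip
                    call.setString(5, account.address);    // Address
                    call.setLong(6, account.available);    // Available
                    call.setLong(7, account.balance);      // Balance
                    call.setLong(8, account.creditline);   // CreditLine
                    call.setShort(9, (short) account.accountType.pbfValue); // AccountType
                    call.setLong(10, card.databaseId);     // CardId from insertCardData
                    call.addBatch();
                    if (++pending == BATCH_SIZE) {
                        call.executeBatch(); // flush a full chunk
                        pending = 0;
                    }
                }
                if (pending > 0) {
                    call.executeBatch(); // flush whatever is left
                }
                con.commit();
            } catch (SQLException e) {
                con.rollback();
                throw e;
            } finally {
                if (call != null) {
                    try {
                        call.close();
                    } catch (SQLException ignore) {
                        // nothing useful to do here
                    }
                }
                con.setAutoCommit(oldAutoCommit);
            }
        }
    }

If positional parameters do not work against the procedure, the named setters from the original code (call.setInt("ServerID", ...)) can be kept as they are; the speed-up comes from preparing once and calling addBatch()/executeBatch(), not from how the parameters are bound. The card inserts themselves cannot be batched this way as long as insertCardData returns NewID through a result set that the caller reads per row.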
answered 2013-05-30 at 17:45