I want to bulk insert data parsed from JSON into the DB. I use the method below to apply the batch. The problem is that mDbWritable.beginTransaction(); takes far too long to execute, often around 6 seconds! I can't figure out what's wrong. Any ideas what could cause such a long execution time? Thanks a lot.
@Override
public ContentProviderResult[] applyBatch(ArrayList<ContentProviderOperation> operations)
        throws OperationApplicationException {
    long start = System.currentTimeMillis();
    mDbWritable.beginTransaction();
    long time = System.currentTimeMillis() - start;
    Alog.i(TAG, "Time applyBatch beginTransaction: " + time);
    final int numOperations = operations.size();
    final ContentProviderResult[] results = new ContentProviderResult[numOperations];
    try {
        for (int i = 0; i < numOperations; i++) {
            results[i] = operations.get(i).apply(this, results, i);
        }
        mDbWritable.setTransactionSuccessful();
    } finally {
        mDbWritable.endTransaction();
    }
    return results;
}
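A guess at the cause, not something confirmed in the post: beginTransaction() must acquire SQLite's write lock, so if another thread still holds an open transaction, the call blocks until that transaction ends, and the blocked time shows up as a slow beginTransaction(). A minimal plain-Java sketch of that blocking pattern, using a ReentrantLock as a stand-in for the database lock (LockContentionDemo and simulateBlockedBegin are made-up names for illustration):

```java
import java.util.concurrent.locks.ReentrantLock;

public class LockContentionDemo {
    // Simulates beginTransaction() blocking while another thread holds the
    // "database lock" for holdMillis; returns how long the caller waited.
    static long simulateBlockedBegin(long holdMillis) throws InterruptedException {
        ReentrantLock dbLock = new ReentrantLock(); // stand-in for SQLite's write lock

        // Background "transaction" grabs the lock and holds it.
        Thread other = new Thread(() -> {
            dbLock.lock();
            try {
                Thread.sleep(holdMillis);
            } catch (InterruptedException ignored) {
            } finally {
                dbLock.unlock();
            }
        });
        other.start();
        Thread.sleep(50); // make sure the background thread locks first

        long start = System.currentTimeMillis();
        dbLock.lock();   // analogous to beginTransaction(): blocks until free
        dbLock.unlock();
        other.join();
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("waited ~" + simulateBlockedBegin(2000) + " ms");
    }
}
```

If this is what is happening, the multi-second waits would come from overlapping transactions rather than from beginTransaction() itself being slow.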
Some samples from the log:
11-16 15:14:53.726: I/ApiProvider(21442): Time applyBatch beginTransaction: 6025
11-16 15:15:00.713: I/ApiProvider(21442): Time applyBatch beginTransaction: 4940
11-16 15:15:17.819: I/ApiProvider(21442): Time applyBatch beginTransaction: 8651
11-16 15:15:45.346: I/ApiProvider(21442): Time applyBatch beginTransaction: 12672
11-16 15:16:16.807: I/ApiProvider(21442): Time applyBatch beginTransaction: 12411
11-16 15:16:45.685: I/ApiProvider(21442): Time applyBatch beginTransaction: 12247
11-16 15:17:01.500: I/ApiProvider(21442): Time applyBatch beginTransaction: 12788
EDIT: When parsing the JSON, I call applyBatch inside a loop, i.e. for each item in the JSON I parse it and apply a batch. A batch contains insert, update, and delete operations.
Here is the code showing how I iterate and call applyBatch:
Cursor starredChannelsCursor =
        mContentResolver.query(ApiContract.Channels.CONTENT_URI,
                new String[] { BaseColumns._ID, ChannelsTable.ID, ChannelsTable.SLUG },
                ChannelsTable.IS_STARRED + "=?", new String[] { "1" }, null);
String userName = mSettings.getUserName();
if (starredChannelsCursor != null && starredChannelsCursor.moveToFirst()) {
    while (!starredChannelsCursor.isAfterLast()) {
        String channelSlug = starredChannelsCursor.getString(2);
        ChannelHandler channelHandler = new ChannelHandler(this);
        URI channelApiUri = Constants.getChannelApiURI(channelSlug, userName);
        // execute update, which makes the applyBatch call
        executeUpdate(channelApiUri, channelHandler);
        starredChannelsCursor.moveToNext();
    }
}
if (starredChannelsCursor != null) {
    starredChannelsCursor.close();
}
/**
 * Makes a call to the Uri, parses the response and applies the batch
 * operations to the contentResolver.
 *
 * @param apiUri
 * @param handler
 *            - handles parsing
 */
private boolean executeUpdate(URI apiUri, AbstractJSONHandler handler) {
    ApiResponse apiResponse = mHttpHelper.doHttpCall(apiUri);
    ArrayList<ContentProviderOperation> batch =
            new ArrayList<ContentProviderOperation>();
    if (apiResponse != null) {
        batch = handler.parse(apiResponse);
        Alog.v(TAG, "update user data from " + apiUri);
    }
    if (batch.size() > 0) {
        try {
            mContentResolver.applyBatch(ApiContract.CONTENT_AUTHORITY, batch);
        } catch (Exception e) {
            Alog.v(TAG, "Error: " + e.getMessage());
        }
    }
    return true;
}