
I'm struggling with responses of 20-50k JSON objects from the server, which I need to insert into our IndexedDB datastore.

The response is iterated over with a forEach and every single row is added individually. Calls with responses of fewer than 10k rows work fine and are inserted within a minute or so. But when the amounts get larger, the database goes unresponsive after a while and returns this error message:

"db Error err=transaction aborted for unknown reason"

I'm using Dexie as a wrapper for the database, plus an Angular wrapper for Dexie called ngDexie.

var deferred = $q.defer();
var progress = 0;

// make the call
$http({
    method: 'GET',
    headers: headers,
    url: '/Program.API/api/items/getitems/' + user
}).success(function (response) {
    // parse the response
    var items = angular.fromJson(response);
    // loop over the items and insert each one into the db
    angular.forEach(items, function (item) {
        ngDexie.put('stuff', item).then(function () {
            progress++;
            $ionicLoading.show({
                content: 'Loading',
                animation: 'fade-in',
                template: 'Inserting items to db: ' + progress
                          + '/' + items.length,
                showBackdrop: true,
                maxWidth: 200,
                showDelay: 0
            });

            if (progress == items.length) {
                setTimeout(function () {
                    $ionicLoading.hide();
                }, 500);
                deferred.resolve(items);
            }
        });
    });
}).error(function (error) {
    // $log is a service object, so call its error method,
    // and reject the promise so callers see the failure
    $log.error('something went wrong');
    $ionicLoading.hide();
    deferred.reject(error);
});
return deferred.promise;

Am I taking the wrong approach by dealing with the whole data set in one chunk? Are there better alternatives? This whole procedure is only done once, when the user first opens the site. All help is greatly appreciated. The target devices are tablets running Android with Chrome.


3 Answers


Since you are getting an unknown error, something is going wrong with the I/O. My guess is that the database underneath is having trouble with the amount of data. Try splitting the insert up into batches of at most 10k records each.

A transaction can fail for reasons not tied to a particular IDBRequest. For example due to IO errors when committing the transaction, or due to a quota limit where the implementation cannot tie exceeding the quota to a particular request. In this case the implementation must run the steps for aborting a transaction, using the transaction as transaction and the appropriate error type as error. For example, if quota was exceeded then QuotaExceededError should be used as error, and if an IO error happened, UnknownError should be used as error.

You can find this in the specification.
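
One way to apply that batching suggestion to the code in the question, as a minimal sketch: chain the inserts so at most one batch of puts is in flight at a time. The helper name insertInBatches and the 10k batch size are illustrative, not part of ngDexie:

function insertInBatches(items, batchSize) {
    batchSize = batchSize || 10000;

    // slice the full result set into chunks of at most batchSize
    var batches = [];
    for (var i = 0; i < items.length; i += batchSize) {
        batches.push(items.slice(i, i + batchSize));
    }

    // chain the batches sequentially: the next batch only starts
    // once every put of the previous batch has resolved
    return batches.reduce(function (chain, batch) {
        return chain.then(function () {
            return $q.all(batch.map(function (item) {
                return ngDexie.put('stuff', item);
            }));
        });
    }, $q.when());
}

The returned promise resolves once the last batch has been stored, so it could stand in for the deferred in the question.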

Another possibility: do you have any indexes defined on the object store? Every index you define has to be maintained on every insert, which slows large imports down.

answered 2015-11-08T14:33:43.000

If you are inserting many new records, I would suggest using add instead of put. It was added for performance reasons. See the documentation here:

https://github.com/FlussoBV/NgDexie/wiki/ngDexie.add
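
As a rough sketch, assuming ngDexie.add takes the same (storeName, item) arguments as the ngDexie.put call in the question:

angular.forEach(items, function (item) {
    // add() is meant for records whose keys do not exist yet,
    // so it can skip the overwrite handling that put() implies
    ngDexie.add('stuff', item).then(function () {
        progress++;
    });
});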

answered 2016-09-14T07:21:35.130

I had problems with large bulk inserts (100,000-200,000 records). I solved all of my IndexedDB performance problems using bulkPut() from the Dexie library. It has one important feature:

Dexie has outstanding performance. Its bulk methods take advantage of a little-known feature of IndexedDB that makes it possible to store things without listening to every onsuccess event. This maximizes performance.

Dexie: https://github.com/dfahlander/Dexie.js

bulkPut() -> http://dexie.org/docs/Table/Table.bulkPut()
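
A minimal sketch with a plain Dexie instance; the database name and the 'id' key path are assumptions, while the 'stuff' store name is taken from the question:

var db = new Dexie('mydb'); // database name is an assumption
db.version(1).stores({ stuff: 'id' }); // key path is an assumption
db.open();

// one bulkPut replaces the whole per-item loop; Dexie only
// listens for the last onsuccess event internally, which is
// what makes the bulk methods fast
db.stuff.bulkPut(items).then(function () {
    console.log('All ' + items.length + ' items stored');
}).catch(Dexie.BulkError, function (e) {
    // if some puts fail, the returned promise rejects with a
    // BulkError that lists the individual failures
    console.error(e.failures.length + ' items failed');
});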

answered 2017-06-16T10:46:50.377