
I am using a TRY...CATCH block to catch rows that violate constraints, for example inserting NULL into a NOT NULL column, inserting a duplicate record, or a type mismatch. Every source record that raises an error should go to an error-log table, and the rest should go to the target table. Because of the TRY...CATCH I cannot use a single bulk insert, so I insert row by row in a WHILE loop, and that takes forever since I have to insert 3,000,000 records. Is there any way to improve the performance of the loop so it can insert the 3,000,000 records in the shortest possible time? It currently takes 2 hours or more :(
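
For context, a minimal sketch of the kind of row-by-row loop described above; the names dbo.SourceTable, dbo.TargetTable, dbo.ErrorLog, the Id key and the Col1/Col2 columns are placeholders for illustration, not the actual schema. Every row pays the overhead of its own INSERT statement, which is why this pattern is slow at 3,000,000 rows.

declare @Id as Int = 0;
while 1 = 1
  begin
  -- Fetch the next source row by ascending key; stop when there are no more rows.
  select top ( 1 ) @Id = Id from dbo.SourceTable where Id > @Id order by Id;
  if @@RowCount = 0 break;
  begin try
    -- Attempt to move the single row to the target table.
    insert into dbo.TargetTable ( Id, Col1, Col2 )
      select Id, Col1, Col2 from dbo.SourceTable where Id = @Id;
  end try
  begin catch
    -- On a constraint or conversion error, divert the row to the error log instead.
    insert into dbo.ErrorLog ( Id, ErrorMessage )
      values ( @Id, Error_Message() );
  end catch
  end;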


2 Answers


Try doing the inserts in batches. For example, loop and attempt to insert 10,000 / 1,000 / 100 records at a time as a bulk insert. If a batch raises an error, catch it and re-run that batch as a row-by-row operation. You will have to experiment with the batch size and make it small enough that most batches are handled as bulk inserts and only the occasional batch has to be processed row by row.
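
A minimal sketch of that idea, reusing the placeholder names dbo.SourceTable, dbo.TargetTable, dbo.ErrorLog and a numeric Id key from the question sketch above (none of these come from the original answer): each batch is attempted as one set-based insert, and only a failing batch is redone row by row.

declare @BatchSize as Int = 10000;
declare @LastId as Int = 0;
declare @MaxId as Int;
declare @Id as Int;
select @MaxId = Max( Id ) from dbo.SourceTable;

while @LastId < @MaxId
  begin
  begin try
    -- Try the whole batch as one set-based insert.
    insert into dbo.TargetTable ( Id, Col1, Col2 )
      select Id, Col1, Col2
        from dbo.SourceTable
        where Id > @LastId and Id <= @LastId + @BatchSize;
  end try
  begin catch
    -- The batch contains at least one bad row: redo just this key range row by row.
    set @Id = @LastId;
    while @Id < @LastId + @BatchSize
      begin
      set @Id += 1;
      begin try
        insert into dbo.TargetTable ( Id, Col1, Col2 )
          select Id, Col1, Col2 from dbo.SourceTable where Id = @Id;
      end try
      begin catch
        -- Divert the single failing row to the error log.
        insert into dbo.ErrorLog ( Id, ErrorMessage )
          values ( @Id, Error_Message() );
      end catch
      end;
  end catch
  set @LastId += @BatchSize;
  end;

This relies on a failed INSERT ... SELECT rolling back as a single statement (assuming XACT_ABORT is off and there is no enclosing transaction), so re-running the failed batch row by row does not duplicate rows, and only batches that actually contain a bad row pay the row-by-row cost.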

Answered 2017-11-09T15:24:08.640

The following demonstrates bulk processing a bunch of sample data, with a "binary search" on the batch size whenever an error occurs.

set nocount on;

-- Set the processing parameters.
declare @InitialBatchSize as Int = 1024;
declare @BatchSize as Int = @InitialBatchSize;

-- Create some sample data with somewhat random Divisor values.
declare @RowsToProcess as Int = 10000;
declare @SampleData as Table ( Number Int, Divisor Int );
with Digits as ( select Digit from ( values (0), (1), (2), (3), (4), (5), (6), (7), (8), (9) ) as Digits( Digit ) ),
  Numbers as (
  select ( ( ( Ten_4.Digit * 10 + Ten_3.Digit ) * 10 + Ten_2.Digit ) * 10 + Ten_1.Digit ) * 10 + Ten_0.Digit + 1 as Number
    from Digits as Ten_0 cross join Digits as Ten_1 cross join Digits as Ten_2 cross join
      Digits as Ten_3 cross join Digits as Ten_4 )
  insert into @SampleData
    select Number, Abs( Checksum( NewId() ) ) % 1000 as Divisor -- Adjust "1000" to vary the chances of a zero divisor.
      from Numbers
      where Number <= @RowsToProcess;

-- Process the data.  
declare @FailedRows as Table ( Number Int, Divisor Int, ErrorMessage NVarChar(2048) );
declare @BitBucket as Table ( Number Int, Divisor Int, Quotient Int );
declare @RowCount as Int = 1; -- Force at least one loop execution.
declare @LastProcessedNumber as Int = 0;
while @RowCount > 0
  begin
  begin try
    -- Subject-to-failure INSERT.
    insert into @BitBucket
      select top ( @BatchSize ) Number, Divisor, 1 / Divisor as Quotient
        from @SampleData
        where Number > @LastProcessedNumber
        order by Number;
    set @RowCount = @@RowCount;
    select @LastProcessedNumber = Max( Number ) from @BitBucket;
    print 'Processed ' + Cast( @RowCount as VarChar(10) ) + ' rows.';
  end try
  begin catch
    if @BatchSize > 1
      begin
      -- Try a smaller batch.
      set @BatchSize /= 2;
      end
    else
      begin
      -- This is a failing row.  Log it with the error and reset the batch size.
      set @LastProcessedNumber += 1;
      print 'Row failed. Row number ' + Cast( @LastProcessedNumber as VarChar(10) ) + ', error: ' + Error_Message() + '.';
      insert into @FailedRows
        select Number, Divisor, Error_Message()
          from @SampleData
          where Number = @LastProcessedNumber;
      set @BatchSize = @InitialBatchSize;
      end
  end catch
  end;

-- Dump the results.
select * from @FailedRows order by Number;
select * from @SampleData order by Number;
select * from @BitBucket order by Number;
Answered 2017-11-10T20:04:10.793