
Is it possible to optimize this code with Parallel.ForEach or some other approach?

using (var zipStream = new ZipOutputStream(OpenWriteArchive()))
{
    zipStream.CompressionLevel = CompressionLevel.Level9;    
    foreach (var document in documents)
    {
        zipStream.PutNextEntry(GetZipEntryName(type));    
        using (var targetStream = new MemoryStream()) // document stream
        {
            DocumentHelper.SaveDocument(document.Value, targetStream, type);    
            targetStream.Position = 0; targetStream.CopyTo(zipStream);
        }    
        GC.Collect();
    }
}

The problem is that the ZipOutputStream in both DotNetZip and SharpZipLib does not support changing the position or seeking.

Writing to the zip stream from multiple threads leads to errors. Accumulating the resulting streams in a ConcurrentStack is not an option either, because the application may process more than 1,000 documents and has to compress the streams and save them to the cloud on the fly.

Is there any way to work around this?


2 Answers


Solved by using a ProducerConsumerQueue (the producer-consumer pattern).

using (var queue = new ProducerConsumerQueue<byte[]>(HandlerDelegate))
{
    Parallel.ForEach(documents, document =>
    {
        using (var documentStream = new MemoryStream())
        {
            // saving document here ...

            queue.EnqueueTask(documentStream.ToArray());
        }
    });
}

protected void HandlerDelegate(byte[] content)
{
    // Runs on the queue's single consumer thread, so ZipOutputStream is only ever written to sequentially.
    ZipOutputStream.PutNextEntry(Guid.NewGuid() + ".pdf");

    using (var stream = new MemoryStream(content))
    {
        stream.Position = 0; stream.CopyTo(ZipOutputStream);
    }
}
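
ProducerConsumerQueue is not a built-in .NET class. As a minimal sketch of what it might look like, assuming it wraps a BlockingCollection<T> with a single worker thread that drains items in order (the implementation below is an assumption, not necessarily the answerer's actual class; only the class name and EnqueueTask come from the answer):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical minimal ProducerConsumerQueue<T>: producers can enqueue from any thread,
// while a single worker thread invokes the handler, so the shared zip stream is only
// ever written to sequentially.
public sealed class ProducerConsumerQueue<T> : IDisposable
{
    private readonly BlockingCollection<T> _items = new BlockingCollection<T>();
    private readonly Task _worker;

    public ProducerConsumerQueue(Action<T> handler)
    {
        _worker = Task.Run(() =>
        {
            // Blocks until items are available; completes after CompleteAdding() is called.
            foreach (var item in _items.GetConsumingEnumerable())
                handler(item);
        });
    }

    public void EnqueueTask(T item)
    {
        _items.Add(item);
    }

    public void Dispose()
    {
        // Stop accepting new items and wait for the worker to drain the backlog.
        _items.CompleteAdding();
        _worker.Wait();
        _items.Dispose();
    }
}

Disposing the queue at the end of the using block is what guarantees that every enqueued document has been written to the zip stream before the archive is closed.
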
Answered on 2013-07-09T13:43:31.880

Try creating the zipStream inside the Parallel.ForEach, for example:

Parallel.ForEach(documents, (document) =>
{
    using (var zipStream = new ZipOutputStream(OpenWriteArchive()))
    {
        zipStream.CompressionLevel = CompressionLevel.Level9;
        zipStream.PutNextEntry(GetZipEntryName(type));
        using (var targetStream = new MemoryStream()) // document stream
        {
            DocumentHelper.SaveDocument(document.Value, targetStream, type);
            targetStream.Position = 0; targetStream.CopyTo(zipStream);
        }
        GC.Collect();
    }
});

Bye!

Answered on 2013-07-08T09:30:57.617