
I have to read a large amount of Blob data (over 300 GB) from one database and insert it into another. I am using the following code to read the data:

if (dr.HasRows)
{
    while (dr.Read())
    {
       media m = new media
       {
           docid = Convert.ToInt32(dr["Id"]),
           Content = Convert.ToByte(dr["BlobData"]),
           madiaName = Convert.ToString(dr["Name"])
       }
    }

    InsertInNewDb(m);
}

I am reading row by row and inserting the data into the other database. The problem is that after some of the data has been sent, an out-of-memory exception is thrown because I am not disposing of the objects. How can I dispose of the objects after each iteration?


2 Answers


To tie the various answers and comments together, try this:

// The SqlConnection, SqlCommand and SqlDataReader need to be in using blocks
// so that they are disposed in a timely manner. This does not clean up
// memory, it cleans up unmanaged resources like handles
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM OldTable", conn))
    {
        using (SqlDataReader dr = cmd.ExecuteReader())
        {
            if (dr.HasRows)
            {
                while (dr.Read())
                {
                   media m = new media
                   {
                       // Don't convert - cast instead. These are already the correct
                       // type.
                       docid = (int) dr["Id"],
                       // There are more efficient ways to do this, but
                       // Convert.ToByte was copying only a single byte
                       Content = (byte[]) dr["BlobData"],
                       madiaName = (string)dr["Name"]
                   };

                    // You probably want to insert _all_ of the rows.
                    // Your code was only inserting the last
                    InsertInNewDb(m);
                }
            }
        }
    }
}
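If memory pressure is still an issue with 300 GB of blobs, one option is to stream each blob in chunks rather than pulling whole rows into memory at once. Below is a sketch only, not the answer's original code: it assumes the `media` type, `InsertInNewDb`, and `connectionString` from the question, picks an arbitrary 8 KB chunk size, and assumes the column order `Id, Name, BlobData` (with `SequentialAccess`, columns must be read in order):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("SELECT Id, Name, BlobData FROM OldTable", conn))
{
    conn.Open();
    // SequentialAccess tells the reader not to buffer the entire row;
    // the blob column can then be read incrementally with GetBytes
    using (SqlDataReader dr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (dr.Read())
        {
            // Columns must be accessed in select-list order under SequentialAccess
            int docid = dr.GetInt32(0);
            string name = dr.GetString(1);

            byte[] buffer = new byte[8192]; // arbitrary chunk size
            long offset = 0;
            long read;
            using (MemoryStream ms = new MemoryStream())
            {
                // GetBytes returns 0 when the blob is exhausted
                while ((read = dr.GetBytes(2, offset, buffer, 0, buffer.Length)) > 0)
                {
                    ms.Write(buffer, 0, (int)read);
                    offset += read;
                }
                InsertInNewDb(new media { docid = docid, madiaName = name, Content = ms.ToArray() });
            }
        }
    }
}
```

Note that this sketch still materializes one blob at a time in the `MemoryStream`; ideally you would stream each chunk straight into the destination command's parameter so that no single blob needs to fit in memory.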
Answered 2013-01-15T02:29:00.187

You could try paging the DataReader; that should work. Try closing the connections to both the source and the destination after a certain number of rows. Remember to wrap the objects in `using` statements so their resources are released promptly.

Answered 2013-01-15T02:01:32.550