Hi, I'm trying to insert a large amount of data from JSON into SQL Server. Based on my research, I can convert the JSON to a DataTable and then use SqlBulkCopy's WriteToServer(dataTable) to write the data to the server. Is this the best approach?
And how can I get the SCOPE_IDENTITY() for each inserted row this way?
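Roughly, the flow I have in mind looks like this (using Newtonsoft.Json to turn the JSON into a DataTable; "my_table", the file name and the connection string are just placeholders):

string json = File.ReadAllText("data.json"); // large JSON array of objects
DataTable dt = JsonConvert.DeserializeObject<DataTable>(json);

using (SqlConnection conn = new SqlConnection(my_connection_string)) {
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn)) {
        bulk.DestinationTableName = "my_table";
        bulk.WriteToServer(dt); // bulk insert the whole DataTable
    }
}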
Extracting scope identity isn't really possible from a bulk insert command. If you want to accurately get back every single scope_identity() value in an insert statement, you really have to insert each record individually.
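To make that concrete, a minimal per-row sketch could look like the following (the items table, its columns, itemsToInsert and my_connection_string are placeholders, not taken from your schema); each ExecuteScalar call returns the SCOPE_IDENTITY() value for that single insert:

// Per-row inserts: slow for large volumes, but each row gets its identity back immediately
using (SqlConnection conn = new SqlConnection(my_connection_string)) {
    conn.Open();
    foreach (var item in itemsToInsert) {
        using (SqlCommand cmd = new SqlCommand(
            "INSERT INTO items (name, quantity) VALUES (@name, @quantity); SELECT SCOPE_IDENTITY();",
            conn)) {
            cmd.Parameters.AddWithValue("@name", item.Name);
            cmd.Parameters.AddWithValue("@quantity", item.Quantity);
            long newId = Convert.ToInt64(cmd.ExecuteScalar()); // identity generated for this one row
            item.Id = newId; // attach the new key to the in-memory object
        }
    }
}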
However, you could also consider using a table parameter insert (I wrote an article about it here: http://www.altdevblogaday.com/2012/05/16/sql-server-high-performance-inserts/).
First, create a table insert type:
CREATE TYPE item_drop_bulk_table_rev4 AS TABLE (
item_id BIGINT,
monster_class_id INT,
zone_id INT,
xpos REAL,
ypos REAL,
kill_time datetime
)
Second, create a procedure to insert the data:
CREATE PROCEDURE insert_item_drops_rev4
@mytable item_drop_bulk_table_rev4 READONLY
AS
BEGIN TRANSACTION
-- Lookup the current maximum ID (ISNULL covers the case where the table is still empty)
DECLARE @id_marker bigint
SELECT @id_marker = ISNULL(MAX(primary_key_column), 0) FROM item_drops_rev4
-- Insert all the data
INSERT INTO item_drops_rev4
(item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
SELECT
item_id, monster_class_id, zone_id, xpos, ypos, kill_time
FROM
@mytable
-- Return back the identity values
SELECT * FROM item_drops_rev4 WHERE primary_key_column > @id_marker
COMMIT TRANSACTION
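For a quick sanity check from T-SQL, you can call the procedure with a table variable of that type. This snippet is just an illustrative sketch (the sample values are made up):

-- Illustrative only: build a table variable of the new type and pass it to the procedure
DECLARE @drops item_drop_bulk_table_rev4

INSERT INTO @drops (item_id, monster_class_id, zone_id, xpos, ypos, kill_time)
VALUES (12345, 7, 3, 10.5, 22.0, GETDATE())

EXEC insert_item_drops_rev4 @mytable = @drops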
Third, write the C# code to insert this data:
using System;
using System.Data;
using System.Data.SqlClient;

DataTable dt = new DataTable();
dt.Columns.Add(new DataColumn("item_id", typeof(Int64)));
dt.Columns.Add(new DataColumn("monster_class_id", typeof(int)));
dt.Columns.Add(new DataColumn("zone_id", typeof(int)));
dt.Columns.Add(new DataColumn("xpos", typeof(float)));
dt.Columns.Add(new DataColumn("ypos", typeof(float)));
dt.Columns.Add(new DataColumn("timestamp", typeof(DateTime)));
for (int i = 0; i < MY_INSERT_SIZE; i++) {
// item_id, monster_class_id, zone_id, xpos and ypos stand in for your own values here
dt.Rows.Add(new object[] { item_id, monster_class_id, zone_id, xpos, ypos, DateTime.Now });
}
// Now we're going to do all the work with one connection!
using (SqlConnection conn = new SqlConnection(my_connection_string)) {
conn.Open();
using (SqlCommand cmd = new SqlCommand("insert_item_drops_rev4", conn)) {
cmd.CommandType = CommandType.StoredProcedure;
// Adding a "structured" parameter allows you to insert tons of data with low overhead
SqlParameter param = new SqlParameter("@mytable", SqlDbType.Structured);
param.Value = dt;
cmd.Parameters.Add(param);
SqlDataReader dr = cmd.ExecuteReader();
// TODO: Read back in the objects, now attached to their primary keys
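// Rough sketch of that TODO (an assumption, not spelled out in the original answer): the
// procedure's final SELECT returns the inserted rows, including primary_key_column, so read
// them back here. Matching purely by row order is risky for the reason noted below, so match
// on a natural key such as item_id where possible.
while (dr.Read()) {
    long newId = dr.GetInt64(dr.GetOrdinal("primary_key_column"));
    long itemId = dr.GetInt64(dr.GetOrdinal("item_id"));
    // ... find the in-memory object with this item_id and store newId on it
}
dr.Close();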
}
}
Now, in theory, you could retrieve back only the single "id marker". However, I'm not convinced that every single version of SQL Server out there will always insert records from a datatable insert in linear order, so I think it's safer to retrieve back the whole data batch.