32

Is it possible to bulk insert (SQL Server) a CSV file in which the fields are only OCCASIONALLY surrounded by quotes? Specifically, quotes only surround those fields that contain a ",".

In other words, my data looks like this (the first row contains the headers):

id, company, rep, employees
729216,INGRAM MICRO INC.,"Stuart, Becky",523
729235,"GREAT PLAINS ENERGY, INC.","Nelson, Beena",114
721177,GEORGE WESTON BAKERIES INC,"Hogan, Meg",253

Because the quotes aren't consistent, I can't use '","' as the delimiter, and I don't know how to create a format file that accounts for this.

I tried using ',' as the delimiter and loading into a temp table in which every column is a varchar, then using some kludgy processing to strip out the quotes, but that doesn't work either, because the fields that contain ',' get split into multiple columns.

Unfortunately, I have no way of manipulating the CSV file beforehand.

Is this hopeless?

Many thanks in advance for any advice.

By the way, I did see this post, SQL bulk import from csv, but in that case EVERY field was consistently wrapped in quotes. So in that case he could use ',' as the delimiter, then strip out the quotes afterwards.


17 Answers

19

It isn't possible to do a bulk insert for this file, from MSDN:

To be usable as a data file for bulk import, a CSV file must comply with the following restrictions:

  • Data fields never contain a field terminator.
  • Either none or all of the values in a data field are enclosed in quotation marks ("").

( http://msdn.microsoft.com/en-us/library/ms188609.aspx )

Some simple text processing should be all that's required to get the file ready for import. Alternatively, your users could be required either to format the file according to these guidelines or to use something other than a comma as the delimiter (e.g. |).
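
For example, once the file has been produced (or preprocessed) with a pipe delimiter, the import itself is straightforward. A minimal sketch, using a placeholder staging table and file path rather than anything from the original question:

BULK INSERT dbo.CompanyStaging
FROM 'C:\import\companies_pipe.csv'
WITH
(
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = '|',  -- pipe instead of comma, so embedded commas are harmless
    ROWTERMINATOR = '\n'
)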

Answered 2009-04-23T17:26:33.143
18

You are going to need to preprocess the file, period.

If you absolutely have to do this, here is the code. I wrote it because I had absolutely no other choice. It is utility code and I'm not proud of it, but it works. The approach is not to get SQL to understand quoted fields, but instead to manipulate the file to use an entirely different delimiter.

EDIT: Here is the code in a github repo. It's been improved, and now comes with unit tests! https://github.com/chrisclark/Redelim-it

This function takes an input file and will replace all field-delimiting commas (NOT commas inside quoted text fields, just the actual delimiting ones) with a new delimiter. You can then tell SQL Server to use the new field delimiter instead of a comma. In the version of the function here, the placeholder is <*TMP*> (I feel confident this won't appear in the original csv - and if it does, brace for explosions).

So after running this function, you can import in SQL by doing something like:

BULK INSERT MyTable
FROM 'C:\FileCreatedFromThisFunction.csv'
WITH
(
FIELDTERMINATOR = '<*TMP*>',
ROWTERMINATOR = '\n'
)

Without further ado, here is the terrible, awful function that I apologize in advance for inflicting on you (edit - I've published a working program that does this, rather than just the function, on my blog):

Private Function CsvToOtherDelimiter(ByVal InputFile As String, ByVal OutputFile As String) As Integer

        Dim PH1 As String = "<*TMP*>"

        Dim objReader As StreamReader = Nothing
        Dim count As Integer = 0 'This will also serve as a primary key'
        Dim sb As New System.Text.StringBuilder

        Try
            objReader = New StreamReader(File.OpenRead(InputFile), System.Text.Encoding.Default)
        Catch ex As Exception
            UpdateStatus(ex.Message)
        End Try

        If objReader Is Nothing Then
            UpdateStatus("Invalid file: " & InputFile)
            count = -1
            Return count 'Exit Function alone would discard the -1
        End If

        'hasHeaders was driven by a form checkbox in the original app; declared here so the snippet compiles
        Dim hasHeaders As Boolean = True

        'grab the first line
        Dim line = objReader.ReadLine()
        'and advance to the next line b/c the first line is column headings
        If hasHeaders Then
            line = Trim(objReader.ReadLine)
        End If

        While Not String.IsNullOrEmpty(line) 'loop through each line

            count += 1

            'Replace commas with our custom-made delimiter
            line = line.Replace(",", PH1)

            'Find a quoted part of the line, which could legitimately contain commas.
            'In that case we will need to identify the quoted section and swap commas back in for our custom placeholder.
            Dim starti = line.IndexOf(PH1 & """", 0)
            If line.IndexOf("""", 0) = 0 Then starti = 0

            While starti > -1 'loop through quoted fields

                Dim FieldTerminatorFound As Boolean = False

                'Find end quote token (originally a ",)
                Dim endi As Integer = line.IndexOf("""" & PH1, starti)

                If endi < 0 Then
                    FieldTerminatorFound = True
                    endi = line.Length - 1
                End If

                While Not FieldTerminatorFound

                    'Find any more quotes that are part of that sequence, if any
                    Dim backChar As String = """" 'thats one quote
                    Dim quoteCount = 0
                    While backChar = """"
                        quoteCount += 1
                        backChar = line.Chars(endi - quoteCount)
                    End While

                    If quoteCount Mod 2 = 1 Then 'odd number of quotes. real field terminator
                        FieldTerminatorFound = True
                    Else 'keep looking
                        endi = line.IndexOf("""" & PH1, endi + 1)
                    End If
                End While

                'Grab the quoted field from the line, now that we have the start and ending indices
                Dim source = line.Substring(starti + PH1.Length, endi - starti - PH1.Length + 1)

                'And swap the commas back in
                line = line.Replace(source, source.Replace(PH1, ","))

                'Find the next quoted field
                'If endi >= line.Length - 1 Then endi = line.Length 'During the swap, the length of line shrinks so an endi value at the end of the line will fail
                starti = line.IndexOf(PH1 & """", starti + PH1.Length)

            End While

            'Collect the re-delimited line so it can be written out at the end
            sb.AppendLine(line)

            line = objReader.ReadLine

        End While

        objReader.Close()

        SaveTextToFile(sb.ToString, OutputFile)

        Return count

    End Function
Answered 2010-01-23T20:53:03.040
8

I found Chris's answer extremely helpful, but I wanted to run it from within SQL Server using T-SQL (and without using the CLR), so I converted his code to T-SQL. Then I took it one step further and wrapped everything up in a stored procedure that does the following:

  1. Uses bulk insert to initially import the CSV file
  2. Cleans up the lines using Chris's code
  3. Returns the results in table format

For my needs, I further cleaned up the lines by removing the quotes around values and converting two double quotes into one double quote (I think that's the correct behaviour).

CREATE PROCEDURE SSP_CSVToTable

-- Add the parameters for the stored procedure here
@InputFile nvarchar(4000)
, @FirstLine int

AS

BEGIN

-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;

--convert the CSV file to a table
--clean up the lines so that commas are handled correctly

DECLARE @sql nvarchar(4000)
DECLARE @PH1 nvarchar(50)
DECLARE @LINECOUNT int -- This will also serve as a primary key
DECLARE @CURLINE int
DECLARE @Line nvarchar(4000)
DECLARE @starti int
DECLARE @endi int
DECLARE @FieldTerminatorFound bit
DECLARE @backChar nvarchar(4000)
DECLARE @quoteCount int
DECLARE @source nvarchar(4000)
DECLARE @COLCOUNT int
DECLARE @CURCOL int
DECLARE @ColVal nvarchar(4000)

-- new delimiter
SET @PH1 = '†'

-- create single column table to hold each line of file
CREATE TABLE [#CSVLine]([line] nvarchar(4000))

-- bulk insert into temp table
-- cannot use variable path with bulk insert
-- so we must run using dynamic sql
SET @Sql = 'BULK INSERT #CSVLine
FROM ''' + @InputFile + '''
WITH
(
FIRSTROW=' + CAST(@FirstLine as varchar) + ',
FIELDTERMINATOR = ''\n'',
ROWTERMINATOR = ''\n''
)'

-- run dynamic statement to populate temp table
EXEC(@sql)

-- get number of lines in table
SET @LINECOUNT = @@ROWCOUNT

-- add identity column to table so that we can loop through it
ALTER TABLE [#CSVLine] ADD [RowId] [int] IDENTITY(1,1) NOT NULL

IF @LINECOUNT > 0
BEGIN
    -- cycle through each line, cleaning each line
    SET @CURLINE = 1
    WHILE @CURLINE <= @LINECOUNT
    BEGIN
        -- get current line
        SELECT @line = line
          FROM #CSVLine
         WHERE [RowId] = @CURLINE

        -- Replace commas with our custom-made delimiter
        SET @Line = REPLACE(@Line, ',', @PH1)

        -- Find a quoted part of the line, which could legitimately contain commas.
        -- In that case we will need to identify the quoted section and swap commas back in for our custom placeholder.
        SET @starti = CHARINDEX(@PH1 + '"' ,@Line, 0)
        If CHARINDEX('"', @Line, 0) = 0 SET @starti = 0

        -- loop through quoted fields
        WHILE @starti > 0 
        BEGIN
            SET @FieldTerminatorFound = 0

            -- Find end quote token (originally  a ",)
            SET @endi = CHARINDEX('"' + @PH1, @Line, @starti)  -- sLine.IndexOf("""" & PH1, starti)

            IF @endi < 1
            BEGIN
                SET @FieldTerminatorFound = 1
                If @endi < 1 SET @endi = LEN(@Line) - 1
            END

            WHILE @FieldTerminatorFound = 0
            BEGIN
                -- Find any more quotes that are part of that sequence, if any
                SET @backChar = '"' -- thats one quote
                SET @quoteCount = 0

                WHILE @backChar = '"'
                BEGIN
                    SET @quoteCount = @quoteCount + 1
                    SET @backChar = SUBSTRING(@Line, @endi-@quoteCount, 1) -- sLine.Chars(endi - quoteCount)
                END

                IF (@quoteCount % 2) = 1
                BEGIN
                    -- odd number of quotes. real field terminator
                    SET @FieldTerminatorFound = 1
                END
                ELSE 
                BEGIN
                    -- keep looking
                    SET @endi = CHARINDEX('"' + @PH1, @Line, @endi + 1) -- sLine.IndexOf("""" & PH1, endi + 1)
                END

            END

            -- Grab the quoted field from the line, now that we have the start and ending indices
            SET @source = SUBSTRING(@Line, @starti + LEN(@PH1), @endi - @starti - LEN(@PH1) + 1) 
            -- sLine.Substring(starti + PH1.Length, endi - starti - PH1.Length + 1)

            -- And swap the commas back in
            SET @Line = REPLACE(@Line, @source, REPLACE(@source, @PH1, ','))
            --sLine.Replace(source, source.Replace(PH1, ","))

            -- Find the next quoted field
            -- If endi >= line.Length - 1 Then endi = line.Length 'During the swap, the length of line shrinks so an endi value at the end of the line will fail
            SET @starti = CHARINDEX(@PH1 + '"', @Line, @starti + LEN(@PH1))
            --sLine.IndexOf(PH1 & """", starti + PH1.Length)

        END

        -- get table based on current line
        IF OBJECT_ID('tempdb..#Line') IS NOT NULL
            DROP TABLE #Line

        -- converts a delimited list into a table
        SELECT *
        INTO #Line
        FROM dbo.iter_charlist_to_table(@Line,@PH1)

        -- get number of columns in line
        SET @COLCOUNT = @@ROWCOUNT

        -- dynamically create CSV temp table to hold CSV columns and lines
        -- only need to create once
        IF OBJECT_ID('tempdb..#CSV') IS NULL
        BEGIN
            -- create initial structure of CSV table
            CREATE TABLE [#CSV]([Col1] nvarchar(100))

            -- dynamically add a column for each column found in the first line
            SET @CURCOL = 1
            WHILE @CURCOL <= @COLCOUNT
            BEGIN
                -- first column already exists, don't need to add
                IF @CURCOL > 1 
                BEGIN
                    -- add field
                    SET @sql = 'ALTER TABLE [#CSV] ADD [Col' + Cast(@CURCOL as varchar) + '] nvarchar(100)'

                    --print @sql

                    -- this adds the fields to the temp table
                    EXEC(@sql)
                END

                -- go to next column
                SET @CURCOL = @CURCOL + 1
            END
        END

        -- build dynamic sql to insert current line into CSV table
        SET @sql = 'INSERT INTO [#CSV] VALUES('

        -- loop through line table, dynamically adding each column value
        SET @CURCOL = 1
        WHILE @CURCOL <= @COLCOUNT
        BEGIN
            -- get current column
            Select @ColVal = str 
              From #Line 
             Where listpos = @CURCOL

            IF LEN(@ColVal) > 0
            BEGIN
                -- remove quotes from beginning if exist
                IF LEFT(@ColVal,1) = '"'
                    SET @ColVal = RIGHT(@ColVal, LEN(@ColVal) - 1)

                -- remove quotes from end if exist
                IF RIGHT(@ColVal,1) = '"'
                    SET @ColVal = LEFT(@ColVal, LEN(@ColVal) - 1)
            END

            -- write column value
            -- make value sql safe by replacing single quotes with two single quotes
            -- also, replace two double quotes with a single double quote
            SET @sql = @sql + '''' + REPLACE(REPLACE(@ColVal, '''',''''''), '""', '"') + ''''

            -- add comma separator except for the last record
            IF @CURCOL <> @COLCOUNT
                SET @sql = @sql + ','

            -- go to next column
            SET @CURCOL = @CURCOL + 1
        END

        -- close sql statement
        SET @sql = @sql + ')'

        --print @sql

        -- run sql to add line to table
        EXEC(@sql)

        -- move to next line
        SET @CURLINE = @CURLINE + 1

    END

END

-- return CSV table
SELECT * FROM [#CSV]

END

GO

The stored procedure makes use of this helper function that parses a string into a table (thanks, Erland Sommarskog!):

CREATE FUNCTION [dbo].[iter_charlist_to_table]
                (@list      ntext,
                 @delimiter nchar(1) = N',')
     RETURNS @tbl TABLE (listpos int IDENTITY(1, 1) NOT NULL,
                         str     varchar(4000),
                         nstr    nvarchar(2000)) AS

BEGIN
  DECLARE @pos      int,
          @textpos  int,
          @chunklen smallint,
          @tmpstr   nvarchar(4000),
          @leftover nvarchar(4000),
          @tmpval   nvarchar(4000)

  SET @textpos = 1
  SET @leftover = ''
  WHILE @textpos <= datalength(@list) / 2
  BEGIN
     SET @chunklen = 4000 - datalength(@leftover) / 2
     SET @tmpstr = @leftover + substring(@list, @textpos, @chunklen)
     SET @textpos = @textpos + @chunklen

     SET @pos = charindex(@delimiter, @tmpstr)

     WHILE @pos > 0
     BEGIN
        SET @tmpval = ltrim(rtrim(left(@tmpstr, @pos - 1)))
        INSERT @tbl (str, nstr) VALUES(@tmpval, @tmpval)
        SET @tmpstr = substring(@tmpstr, @pos + 1, len(@tmpstr))
        SET @pos = charindex(@delimiter, @tmpstr)
     END

     SET @leftover = @tmpstr
  END

  INSERT @tbl(str, nstr) VALUES (ltrim(rtrim(@leftover)), ltrim(rtrim(@leftover)))

RETURN

END

Here is how I call it from T-SQL. In this case I insert the results into a temp table, so I create the temp table first:

-- create temp table for file import
CREATE TABLE #temp
(
    CustomerCode nvarchar(100) NULL,
    Name nvarchar(100) NULL,
    [Address] nvarchar(100) NULL,
    City nvarchar(100) NULL,
    [State] nvarchar(100) NULL,
    Zip nvarchar(100) NULL,
    OrderNumber nvarchar(100) NULL,
    TimeWindow nvarchar(100) NULL,
    OrderType nvarchar(100) NULL,
    Duration nvarchar(100) NULL,
    [Weight] nvarchar(100) NULL,
    Volume nvarchar(100) NULL
)

-- convert the CSV file into a table
INSERT #temp
EXEC [dbo].[SSP_CSVToTable]
     @InputFile = @FileLocation
    ,@FirstLine = @FirstImportRow

I haven't tested the performance much, but it works well for what I need - importing CSV files with fewer than 1000 rows. It might choke on really large files, though.

Hopefully someone else finds it useful too.

Cheers!

Answered 2012-03-02T16:00:32.603
5

I also created a function to convert a CSV into a format usable for bulk insert. I used Chris Clark's answer as the starting point for the following C# function.

I ended up using a regular expression to find the fields. I then recreated the file line by line, writing it out to a new file as I went, thereby avoiding loading the entire file into memory.

private void CsvToOtherDelimiter(string CSVFile, System.Data.Linq.Mapping.MetaTable tbl)
{
    char PH1 = '|';
    StringBuilder ln;

    //Confirm file exists. Else, throw exception
    if (File.Exists(CSVFile))
    {
        using (TextReader tr = new StreamReader(CSVFile))
        {
            //Use a temp file to store our conversion
            using (TextWriter tw = new StreamWriter(CSVFile + ".tmp"))
            {
                string line = tr.ReadLine();
                //If we have already converted, no need to reconvert.
                //NOTE: We make the assumption here that the input header file 
                //      doesn't have a PH1 value unless it's already been converted.
                if (line.IndexOf(PH1) >= 0)
                {
                    tw.Close();
                    tr.Close();
                    File.Delete(CSVFile + ".tmp");
                    return;
                }
                //Loop through input file
                while (!string.IsNullOrEmpty(line))
                {
                    ln = new StringBuilder();

                    //1. Use Regex expression to find comma separated values 
                    //using quotes as optional text qualifiers 
                    //(what MS EXCEL does when you import a csv file)
                    //2. Remove text qualifier quotes from data
                    //3. Replace any values of PH1 found in column data 
                    //with an equivalent character
                    //Regex:  \A[^,]*(?=,)|(?:[^",]*"[^"]*"[^",]*)+|[^",]*"[^"]*\Z|(?<=,)[^,]*(?=,)|(?<=,)[^,]*\Z|\A[^,]*\Z
                    List<string> fieldList = Regex.Matches(line, @"\A[^,]*(?=,)|(?:[^"",]*""[^""]*""[^"",]*)+|[^"",]*""[^""]*\Z|(?<=,)[^,]*(?=,)|(?<=,)[^,]*\Z|\A[^,]*\Z")
                            .Cast<Match>()
                            .Select(m => RemoveCSVQuotes(m.Value).Replace(PH1, '¦'))
                            .ToList<string>();

                    //Add the list of fields to ln, separated by PH1
                    fieldList.ToList().ForEach(m => ln.Append(m + PH1));

                    //Write to file. Don't include trailing PH1 value.
                    tw.WriteLine(ln.ToString().Substring(0, ln.ToString().LastIndexOf(PH1)));

                    line = tr.ReadLine();
                }


                tw.Close();
            }
            tr.Close();

            //Optional:  replace input file with output file
            File.Delete(CSVFile);
            File.Move(CSVFile + ".tmp", CSVFile);
        }
    }
    else
    {
        throw new ArgumentException(string.Format("Source file {0} not found", CSVFile));
    }
}
//The output file no longer needs quotes as a text qualifier, so remove them
private string RemoveCSVQuotes(string value)
{
    //if is empty string, then remove double quotes
    if (value == @"""""") value = "";
    //remove any double quotes, then any quotes on ends
    value = value.Replace(@"""""", @"""");
    if (value.Length >= 2)
        if (value.Substring(0, 1) == @"""")
            value = value.Substring(1, value.Length - 2);
    return value;
}
Answered 2011-01-21T20:40:47.677
3

More often than not, this issue is caused by users exporting an Excel file to CSV.

There are two ways around this problem:

  1. Export from Excel using a macro, per Microsoft's recommendation
  2. Or the really easy way:
    • Open the CSV in Excel.
    • Save it as an Excel file (.xls or .xlsx).
    • Import that file into SQL Server as an Excel file.
    • Chuckle to yourself because you didn't have to write any code like the solutions above... muhahahaha

Import as Excel file

Here's some SQL if you really want to script it (after saving the CSV as Excel):

select * 
into SQLServerTable FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 
    'Excel 8.0;Database=D:\testing.xls;HDR=YES', 
    'SELECT * FROM [Sheet1$]')
Answered 2013-11-22T11:25:32.857
2

This may be more complicated or involved than you're willing to use, but...

If you can implement the logic for parsing the lines into fields in VB or C#, you can do it using a CLR table-valued function (TVF).

A CLR TVF can be a well-performing way to read data from an external source when you want some C# or VB code to separate the data into columns and/or adjust the values.

You have to be willing to add a CLR assembly to your database (and one that allows external or unsafe operations so it can open files). That may get a bit complicated or involved, but it may be worth it for the flexibility you gain.

I had some large files that needed to be loaded into tables regularly and as fast as possible, but certain code translations had to be performed on some columns, and special handling was needed to load values that would otherwise have caused datatype errors with a plain bulk insert.

In short, a CLR TVF lets you run C# or VB code against each line of the file with bulk-insert-like performance (although you may need to worry about logging). The example in the SQL Server documentation lets you create a TVF that reads from the event log, which you can use as a starting point.

Note that the code in a CLR TVF can only access the database in an init stage before the first row is processed (e.g. no lookups per row - you use a normal TVF on top of it for that kind of thing). Based on your question, you don't appear to need this.

Also note that each CLR TVF must have its output columns explicitly specified, so you can't write a generic one that is reusable for every different csv file you might have.

You could write one CLR TVF that reads whole lines from the file, returning a one-column result set, and then use normal TVFs to parse the lines for each type of file. That requires the line-parsing code to be written in T-SQL, but it avoids having to write many CLR TVFs.
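
To give a rough idea of the T-SQL side of wiring this up (the assembly, class, and function names below are hypothetical placeholders, not the documentation sample), registering and calling such a TVF looks something like this sketch:

-- Hypothetical names; the assembly is built separately and needs EXTERNAL_ACCESS to open files
-- (the database must be TRUSTWORTHY or the assembly signed for that permission set)
CREATE ASSEMBLY CsvReader
FROM 'C:\clr\CsvReader.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS
GO

-- A CLR TVF must declare its output columns up front; here it just streams raw lines
CREATE FUNCTION dbo.ReadCsvLines(@path nvarchar(4000))
RETURNS TABLE (line nvarchar(max))
AS EXTERNAL NAME CsvReader.[CsvReader.LineReader].ReadLines
GO

-- Parse each line with an ordinary T-SQL TVF (dbo.ParseCsvLine is a placeholder for your own
-- splitter, e.g. something along the lines of iter_charlist_to_table shown earlier)
SELECT p.*
FROM dbo.ReadCsvLines(N'C:\import\companies.csv') AS r
CROSS APPLY dbo.ParseCsvLine(r.line) AS p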

Answered 2009-09-28T14:37:25.373
2

An alternative method - assuming you don't have a huge number of fields and don't expect quotes to appear in the data itself - is to use the REPLACE function.

UPDATE dbo.tablename 
        SET dbo.tablename.target_field = REPLACE(t.importedValue, '"', '')
FROM #tempTable t
WHERE dbo.tablename.target_id = t.importedID;

I have used it. I can't make any claims about performance. It's just a quick and dirty way to get around the problem.

Answered 2012-10-19T23:33:15.513
2

Preprocessing is required.

The PowerShell function Import-Csv supports this type of file. Export-Csv will then encapsulate every value in quotes.

A single file:

Import-Csv import.csv | Export-Csv -NoTypeInformation export.csv

Merging multiple files with paths of the form C:\year\input_date.csv:

$inputPath = 'C:\????\input_????????.csv'
$outputPath = 'C:\merged.csv'
Get-ChildItem $inputPath |
  Select -ExpandProperty FullName |
  Import-CSV |
  Export-CSV -NoTypeInformation -Path $outputPath

PowerShell can generally be run with SQL Server Agent using a PowerShell proxy account.

If the delimiter is not handled properly, explicitly specify another delimiter:

 Export-CSV -NoTypeInformation -Delimiter ';' -Path $outputPath
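
Since Export-Csv wraps every value in quotes, the rewritten file can then be loaded with the quoted-delimiter trick from the post linked in the question; a hedged sketch (placeholder table name, assuming a comma delimiter and a header row):

BULK INSERT dbo.CompanyStaging
FROM 'C:\merged.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '","',  -- safe once every field is quoted
    ROWTERMINATOR = '\n'
)
-- The first column keeps a leading quote and the last column a trailing quote;
-- strip those afterwards, e.g. with REPLACE as in the answer above.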
Answered 2018-12-11T15:32:08.803
1

You should be able to specify not only the field separator, which should be [,], but also the text qualifier, which in this case would be ["]. Using [] to enclose those so there's no confusion with ".

Answered 2009-04-23T15:50:53.467
1

Chris, thanks a bunch for this!! You saved my biscuits!! I could not believe the bulk loader wouldn't handle this case when XL does such a nice job... don't these guys see each other in the hallways??? Anyway... I needed a ConsoleApplication version, so here is what I hacked together. It's down and dirty, but it works like a champ! I hardcoded the delimiter and commented out the headers, as my app didn't need them.

I wish I could also paste a nice big beer in here for you too.

Geez, I have no idea why the End Module and Public Class are outside of the code block... srry!

    Module Module1

    Sub Main()

        Dim arrArgs() As String = Command.Split(",")
        Dim i As Integer
        Dim obj As New ReDelimIt()

        Console.Write(vbNewLine & vbNewLine)

        If arrArgs(0) <> Nothing Then
            For i = LBound(arrArgs) To UBound(arrArgs)
                Console.Write("Parameter " & i & " is " & arrArgs(i) & vbNewLine)
            Next


            obj.ProcessFile(arrArgs(0), arrArgs(1))

        Else
            Console.Write("Usage Test1 <inputfile>,<outputfile>")
        End If

        Console.Write(vbNewLine & vbNewLine)
    End Sub

 End Module

 Public Class ReDelimIt

    Public Function ProcessFile(ByVal InputFile As String, ByVal OutputFile As String) As Integer

        Dim ph1 As String = "|"

        Dim objReader As System.IO.StreamReader = Nothing
        Dim count As Integer = 0 'This will also serve as a primary key
        Dim sb As New System.Text.StringBuilder

        Try
            objReader = New System.IO.StreamReader(System.IO.File.OpenRead(InputFile), System.Text.Encoding.Default)
        Catch ex As Exception
            MsgBox(ex.Message)
        End Try

        If objReader Is Nothing Then
            MsgBox("Invalid file: " & InputFile)
            count = -1
            Exit Function
        End If

        'grab the first line
        Dim line = objReader.ReadLine()
        'and advance to the next line b/c the first line is column headings
        'Removed Check Headers can put in if needed.
        'If chkHeaders.Checked Then
        'line = objReader.ReadLine
        'End If

        While Not String.IsNullOrEmpty(line) 'loop through each line

            count += 1

            'Replace commas with our custom-made delimiter
            line = line.Replace(",", ph1)

            'Find a quoted part of the line, which could legitimately contain commas.
            'In that case we will need to identify the quoted section and swap commas back in for our custom placeholder.
            Dim starti = line.IndexOf(ph1 & """", 0)

            While starti > -1 'loop through quoted fields

                'Find end quote token (originally  a ",)
                Dim endi = line.IndexOf("""" & ph1, starti)

                'The end quote token could be a false positive because there could occur a ", sequence.
                'It would be double-quoted ("",) so check for that here
                Dim check1 = line.IndexOf("""""" & ph1, starti)

                'A """, sequence can occur if a quoted field ends in a quote.
                'In this case, the above check matches, but we actually SHOULD process this as an end quote token
                Dim check2 = line.IndexOf("""""""" & ph1, starti)

                'If we are in the check1 ("",) situation, keep searching for an end quote token
                'The +1 and +2 accounts for the extra length of the checked sequences
                While (endi = check1 + 1 AndAlso endi <> check2 + 2) 'loop through "false" tokens in the quoted fields
                    endi = line.IndexOf("""" & ph1, endi + 1)
                    check1 = line.IndexOf("""""" & ph1, check1 + 1)
                    check2 = line.IndexOf("""""""" & ph1, check2 + 1)
                End While

                'We have searched for an end token (",) but can't find one, so that means the line ends in a "
                If endi < 0 Then endi = line.Length - 1

                'Grab the quoted field from the line, now that we have the start and ending indices
                Dim source = line.Substring(starti + ph1.Length, endi - starti - ph1.Length + 1)

                'And swap the commas back in
                line = line.Replace(source, source.Replace(ph1, ","))

                'Find the next quoted field
                If endi >= line.Length - 1 Then endi = line.Length 'During the swap, the length of line shrinks so an endi value at the end of the line will fail
                starti = line.IndexOf(ph1 & """", starti + ph1.Length)

            End While

            'Add our primary key to the line
            ' Removed for now
            'If chkAddKey.Checked Then
            'line = String.Concat(count.ToString, ph1, line)
            ' End If

            sb.AppendLine(line)

            line = objReader.ReadLine

        End While

        objReader.Close()

        SaveTextToFile(sb.ToString, OutputFile)

        Return count

    End Function

    Public Function SaveTextToFile(ByVal strData As String, ByVal FullPath As String) As Boolean
        Dim bAns As Boolean = False
        Dim objReader As System.IO.StreamWriter
        Try
            objReader = New System.IO.StreamWriter(FullPath, False, System.Text.Encoding.Default)
            objReader.Write(strData)
            objReader.Close()
            bAns = True
        Catch Ex As Exception
            Throw Ex
        End Try
        Return bAns
    End Function

End Class
Answered 2010-05-19T06:20:59.450
1

I found some issues when ',' appeared inside our fields, e.g. Mike,"456 2nd St, Apt 5".

A workaround for this problem is at http://crazzycoding.blogspot.com/2010/11/import-csv-file-into-sql-server-using.html

Thanks, - Ash

Answered 2010-11-12T08:13:18.910
0

This code worked for me:

 public bool CSVFileRead(string fullPathWithFileName, string fileNameModified, string tableName)
    {
        SqlConnection con = new SqlConnection(ConfigurationSettings.AppSettings["dbConnectionString"]);
        string filepath = fullPathWithFileName;
        StreamReader sr = new StreamReader(filepath);
        string line = sr.ReadLine();
        string[] value = line.Split(',');
        DataTable dt = new DataTable();
        DataRow row;
        foreach (string dc in value)
        {
            dt.Columns.Add(new DataColumn(dc));
        }
        while (!sr.EndOfStream)
        {
            //string[] stud = sr.ReadLine().Split(',');
            //for (int i = 0; i < stud.Length; i++)
            //{
            //    stud[i] = stud[i].Replace("\"", "");
            //}
            //value = stud;
            value = sr.ReadLine().Split(',');
            if (value.Length == dt.Columns.Count)
            {
                row = dt.NewRow();
                row.ItemArray = value;
                dt.Rows.Add(row);
            }
        }
        SqlBulkCopy bc = new SqlBulkCopy(con.ConnectionString, SqlBulkCopyOptions.TableLock);
        bc.DestinationTableName = tableName;
        bc.BatchSize = dt.Rows.Count;
        con.Open();
        bc.WriteToServer(dt);
        bc.Close();
        con.Close();

        return true;
    }
Answered 2014-05-31T05:25:23.680
0

You don't need to preprocess the file outside of SQL.

What worked for me was changing

ROWTERMINATOR = '\n'

to ROWTERMINATOR = '0x0a'.
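
For context, a sketch of where that change sits in a full statement (table and path are placeholders); '0x0a' is the hex code for a line feed, which matches Unix-style line endings:

BULK INSERT dbo.CompanyStaging
FROM 'C:\import\companies.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a'  -- hex for a line-feed character
)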

Answered 2019-07-24T06:43:55.357
0

SQL Server 2017 added a new option for the BULK INSERT command: WITH (FORMAT = 'CSV').

An example from the Microsoft GitHub page:

BULK INSERT Product
FROM 'product.csv'
WITH (  DATA_SOURCE = 'MyAzureBlobStorage',
        FORMAT='CSV', CODEPAGE = 65001, --UTF-8 encoding
        FIRSTROW=2,
        ROWTERMINATOR = '0x0a',
        TABLOCK); 

Detailed documentation for the option is available here: https://docs.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017#input-file-format-options

I have successfully used this option with CSV data containing optional quotes, just like the example the OP gave.

Answered 2019-08-15T19:14:29.440
0

I put the following together to solve my case. I needed to preprocess very large files and sort out inconsistent quoting. Just paste it into a blank C# application, set the consts to your requirements, and away you go. This worked on very large CSVs of over 10 GB.

namespace CsvFixer
{
    using System.IO;
    using System.Text;

    public class Program
    {
        private const string delimiter = ",";
        private const string quote = "\"";
        private const string inputFile = "C:\\temp\\input.csv";
        private const string fixedFile = "C:\\temp\\fixed.csv";

        /// <summary>
        /// This application fixes inconsistently quoted csv (or delimited) files with support for very large file sizes.
        /// For example :  1223,5235234,8674,"Houston","London, UK",3425,Other text,stuff 
        /// Must become :  "1223","5235234","8674","Houston","London, UK","3425","Other text","stuff" 
        /// </summary>
        /// <param name="args"></param>
        static void Main(string[] args)
        {
            // Use streaming to allow for large files. 
            using (StreamWriter outfile = new StreamWriter(fixedFile))
            {
                using (FileStream fs = File.Open(inputFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
                using (BufferedStream bs = new BufferedStream(fs))
                using (StreamReader sr = new StreamReader(bs))
                {
                    string currentLine;

                    // Read each input line in and write each fixed line out
                    while ((currentLine = sr.ReadLine()) != null)
                    {
                        outfile.WriteLine(FixLine(currentLine, delimiter, quote));
                    }
                }
            }
        }

        /// <summary>
        /// Fully quote a partially quoted line 
        /// </summary>
        /// <param name="line">The partially quoted line</param>
        /// <returns>The fully quoted line</returns>
        private static string FixLine(string line, string delimiter, string quote)
        {
            StringBuilder fixedLine = new StringBuilder();

            // Split all on the delimiter, accepting that some quoted fields
            // that contain the delimiter will be split into many pieces.
            string[] fieldParts = line.Split(delimiter.ToCharArray());

            // Loop through the fields (or parts of fields)
            for (int i = 0; i < fieldParts.Length; i++)
            {
                string currentFieldPart = fieldParts[i];

                // If the current field part starts and ends with a quote it is a field, so write it to the result
                if (currentFieldPart.StartsWith(quote) && currentFieldPart.EndsWith(quote))
                {
                    fixedLine.Append(string.Format("{0}{1}", currentFieldPart, delimiter));
                }
                // else if it starts with a quote but doesn't end with one, it is part of a longer field.
                else if (currentFieldPart.StartsWith(quote))
                {
                    // Add the start of the field
                    fixedLine.Append(string.Format("{0}{1}", currentFieldPart, delimiter));

                    // Append any additional field parts (we will only hit the end of the field when 
                    // the last field part finishes with a quote. 
                    while (!fieldParts[++i].EndsWith(quote))
                    {
                        fixedLine.Append(string.Format("{0}{1}", fieldParts[i], delimiter));
                    }

                    // Append the last field part - i.e. the part containing the closing quote
                    fixedLine.Append(string.Format("{0}{1}", fieldParts[i], delimiter));
                }
                else
                {
                    // The field has no quotes, add the field part with quotes as bookends
                    fixedLine.Append(string.Format("{0}{1}{0}{2}", quote, currentFieldPart, delimiter));
                }
            }

            // Return the fixed string 
            return fixedLine.ToString();
        }
    }
}
Answered 2016-12-13T21:20:36.420
0

Speaking from practice... In SQL Server 2017 you can supply a "text qualifier" of double quote, and it doesn't "supersede" your delimiter. I bulk insert several files that look just like the OP's example. My files are ".csv" and they have inconsistent text qualifiers that only appear when a value contains a comma. I have no idea in which version of SQL Server this feature/functionality started working, but I know it works in SQL Server 2017 Standard. Pretty easy.
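
That presumably maps to the FIELDQUOTE option that accompanies FORMAT = 'CSV' (my reading of "text qualifier" here, not something stated in this answer); a hedged sketch with placeholder names:

BULK INSERT dbo.CompanyStaging
FROM 'C:\import\companies.csv'
WITH
(
    FORMAT = 'CSV',     -- RFC 4180 parsing, so optionally quoted fields are handled
    FIELDQUOTE = '"',   -- this is also the default quote character
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)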

Answered 2018-04-06T15:54:40.717
-1

Create a VB.NET program to convert to a new delimiter using the .NET Framework 4.5 TextFieldParser. It handles text-qualified fields automatically.

Modified the code above to use the built-in TextFieldParser.

Module Module1

Sub Main()

    Dim arrArgs() As String = Command.Split(",")
    Dim i As Integer
    Dim obj As New ReDelimIt()
    Dim InputFile As String = ""
    Dim OutPutFile As String = ""
    Dim NewDelimiter As String = ""

    Console.Write(vbNewLine & vbNewLine)

    If Not IsNothing(arrArgs(0)) Then
        For i = LBound(arrArgs) To UBound(arrArgs)
            Console.Write("Parameter " & i & " is " & arrArgs(i) & vbNewLine)
        Next
        InputFile = arrArgs(0)
        If Not IsNothing(arrArgs(1)) Then
            If Not String.IsNullOrEmpty(arrArgs(1)) Then
                OutPutFile = arrArgs(1)
            Else
                OutPutFile = InputFile.Replace("csv", "pipe")
            End If
        Else
            OutPutFile = InputFile.Replace("csv", "pipe")
        End If
        If Not IsNothing(arrArgs(2)) Then
            If Not String.IsNullOrEmpty(arrArgs(2)) Then
                NewDelimiter = arrArgs(2)
            Else
                NewDelimiter = "|"
            End If
        Else
            NewDelimiter = "|"
        End If
        obj.ConvertCSVFile(InputFile,OutPutFile,NewDelimiter)

    Else
        Console.Write("Usage ChangeFileDelimiter <inputfile>,<outputfile>,<NewDelimiter>")
    End If
    obj = Nothing
    Console.Write(vbNewLine & vbNewLine)
    'Console.ReadLine()

End Sub

End Module

Public Class ReDelimIt

Public Function ConvertCSVFile(ByVal InputFile As String, ByVal OutputFile As String, Optional ByVal NewDelimiter As String = "|") As Integer
    Using MyReader As New Microsoft.VisualBasic.FileIO.TextFieldParser(InputFile)
        MyReader.TextFieldType = FileIO.FieldType.Delimited
        MyReader.SetDelimiters(",")
        Dim sb As New System.Text.StringBuilder
        Dim strLine As String = ""
        Dim currentRow As String()
        While Not MyReader.EndOfData
            Try
                currentRow = MyReader.ReadFields()
                Dim currentField As String
                strLine = ""
                For Each currentField In currentRow
                    'MsgBox(currentField)
                    If strLine = "" Then
                        strLine = strLine & currentField
                    Else
                        strLine = strLine & NewDelimiter & currentField
                    End If
                Next
                sb.AppendLine(strLine)
            Catch ex As Microsoft.VisualBasic.FileIO.MalformedLineException
                'MsgBox("Line " & ex.Message & "is not valid and will be skipped.")
                Console.WriteLine("Line " & ex.Message & "is not valid and will be skipped.")
            End Try
        End While
        SaveTextToFile(sb.ToString, OutputFile)
    End Using

    Return Err.Number

End Function

Public Function SaveTextToFile(ByVal strData As String, ByVal FullPath As String) As Boolean
    Dim bAns As Boolean = False
    Dim objReader As System.IO.StreamWriter
    Try
        If FileIO.FileSystem.FileExists(FullPath) Then
            Kill(FullPath)
        End If
        objReader = New System.IO.StreamWriter(FullPath, False, System.Text.Encoding.Default)
        objReader.Write(strData)
        objReader.Close()
        bAns = True
    Catch Ex As Exception
        Throw Ex
    End Try
    Return bAns
End Function

End Class

Answered 2013-10-28T21:33:14.597