64

I'm trying to import a correctly quoted CSV file, meaning data is only quoted if it contains a comma, e.g.:

41, Terminator, Black
42, "Monsters, Inc.", Blue

I observe that the first row imports correctly, but the second row errors in a manner that suggests the quoted comma was treated as a field separator.
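For reference, the kind of statement I am running is something like this (the table and file names here are just illustrative):

BULK INSERT dbo.Films
FROM 'C:\data\films.csv'
WITH (
    FIELDTERMINATOR = ',',   -- the comma inside "Monsters, Inc." is also treated as a terminator
    ROWTERMINATOR = '\n'
);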

I have seen suggestions such as this one

SQL Bulk import from CSV

to change the field terminator

FIELDTERMINATOR='","'

However, my CSV file only quotes fields that need it, so I do not believe that suggestion would work.

Can SQL Server's BULK INSERT statement import a correctly quoted CSV file? If so, how?


12 Answers

32

Unfortunately, SQL Server interprets the quoted comma as a delimiter. This applies to both BCP and BULK INSERT.

From http://msdn.microsoft.com/en-us/library/ms191485%28v=sql.100%29.aspx

If a terminator character occurs within the data, it is interpreted as a terminator, not as data, and the data after that character is interpreted as belonging to the next field or record. Therefore, choose your terminators carefully to make sure that they never appear in your data.

answered 2012-10-16T01:51:44.593
30

I know this is an old topic, but this feature has been implemented as of SQL Server 2017. The parameter you're looking for is FIELDQUOTE, which defaults to '"'. See more at https://docs.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-2017
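A minimal sketch of what that might look like for the file in the question; the table name and path are placeholders:

BULK INSERT dbo.Films
FROM 'C:\data\films.csv'
WITH (
    FORMAT = 'CSV',          -- SQL Server 2017+: parse the file as quoted CSV
    FIELDQUOTE = '"',        -- optional here, since '"' is the default
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);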

answered 2018-06-20T08:35:29.317
21

There is another solution for this.

Treat the quotes as part of the field delimiters by editing the .fmt (format) file.

You can check this out for more information:

http://blogs.msdn.com/b/sqlserverfaq/archive/2010/02/04/how-to-remove-unwanted-quotation-marks-while-importing-a-data-file.aspx

An extract of the link above:

The only way to remove the quotation marks would be to modify the column delimiters specified during the import operation. The only drawback here is that if you inspect the data to be inserted, you will very quickly realize that the column delimiters are different for each column (the delimiters are highlighted in the linked post).

So to specify different column delimiters for each column, you would need to use a format file if you plan to use Bulk Insert or BCP. If you generate a format file for the above table structure, it would be as follows:

9.0
3
1       SQLCHAR       0       5       "\t"     1     FName              SQL_Latin1_General_CP1_CI_AS
2       SQLCHAR       0       5       "\t"     2     LName              SQL_Latin1_General_CP1_CI_AS
3       SQLCHAR       0       50      "\r\n"   3     Company            SQL_Latin1_General_CP1_CI_AS

Modify the format file to represent the correct column delimiters for each column. The new format file to be used will look like this:

9.0
4
1       SQLCHAR       0       0     "\""      0     FIRST_QUOTE      SQL_Latin1_General_CP1_CI_AS
2       SQLCHAR       0       5     "\",\""   1     FNAME               SQL_Latin1_General_CP1_CI_AS
3       SQLCHAR       0       5     "\",\""   2     LNAME            SQL_Latin1_General_CP1_CI_AS
4       SQLCHAR       0       50    "\"\r\n"  3     COMPANY          SQL_Latin1_General_CP1_CI_AS
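With the modified format file saved as, say, quoted.fmt, a BULK INSERT along these lines should use it; the table, file, and path names below are placeholders:

BULK INSERT dbo.People
FROM 'C:\data\people.csv'
WITH (
    FORMATFILE = 'C:\data\quoted.fmt'   -- the delimiters in the format file strip the surrounding quotes
);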
answered 2014-12-12T09:59:33.487
7

Make sure you have enabled the TextQualified option and set the text qualifier to ".

answered 2016-03-18T00:02:22.667
5

I had this same problem, and I didn't want to go the SSIS route, so I found a PowerShell script that is easy to run and handles quoted fields that contain commas:

Source Code and DLL for the PowerShell Script: https://github.com/billgraziano/CsvDataReader

Here's a blog that explains the usage: http://www.sqlteam.com/article/fast-csv-import-in-powershell-to-sql-server

answered 2016-03-02T07:28:32.540
3

You could also look at using OPENROWSET with the CSV text file data provider.

This should be possible with any version of SQL Server >= 2005, although you need to enable the Ad Hoc Distributed Queries option first.

http://social.msdn.microsoft.com/forums/en-US/sqldataaccess/thread/5869d247-f0a0-4224-80b3-ff2e414be402
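A rough sketch of that approach, assuming the Microsoft ACE OLE DB provider is installed and the file sits in C:\data\; the file name and connection string details are placeholders:

-- One-time setup: allow ad hoc OPENROWSET queries
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

-- Read the CSV through the text driver; quoted fields are parsed by the provider
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Text;Database=C:\data\;HDR=YES;FMT=Delimited',
    'SELECT * FROM films.csv');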

answered 2013-03-07T23:07:04.193
3

I spent half a day on this problem. It's easiest to import using the SQL Server Import & Export Data wizard; there is a setting in that wizard which solves this problem. Detailed screenshots here: https://www.mssqltips.com/sqlservertip/1316/strip-double-quotes-from-an-import-file-in-integration-services-ssis/ Thanks

answered 2016-06-22T20:52:20.457
2

I had the same problem, with data that only occasionally double-quotes some text. My solution is to let BULK INSERT import the double quotes, then run a REPLACE on the imported data.

For example:

bulk insert CodePoint_tbl from "F:\Data\Map\CodePointOpen\Data\CSV\ab.csv" with (FIRSTROW = 1, FIELDTERMINATOR = ',', ROWTERMINATOR='\n');

update CodePoint_tbl set Postcode = replace(Postcode,'"','') where charindex('"',Postcode) > 0

To make it less painful to write the REPLACE script, just copy and paste what you need from the results of something like this:

-- Collect the column names of the target table (legacy system views; sys.columns/sys.objects also work)
select C.colid as ColID, C.[name] as ColumnName into #Columns
from syscolumns C
join sysobjects T on C.id = T.id
where T.[name] = 'User_tbl'
order by 1;

-- Char(39) is a single quote, used to build the quoted literals in the generated statements
declare @QUOTE char(1);
set @QUOTE = Char(39);

-- Generate one UPDATE ... REPLACE statement per column (skipping the first two columns)
select 'Update User_tbl set '+ColumnName+'=replace('+ColumnName+','
 + @QUOTE + '"' + @QUOTE + ',' + @QUOTE + @QUOTE + ');
GO'
from #Columns
where ColID > 2
order by ColID;
answered 2014-10-01T12:11:04.760
1

I had the same problem; however, it worked for me with the following settings:

bulk insert schema.table
from '\\your\data\source.csv'
with (
datafiletype = 'char'
,format = 'CSV'
,firstrow = 2
,fieldterminator = '|'
,rowterminator = '\n'
,tablock
)

My CSV-File looks like this:

"col1"|"col2"
"val1"|"val2"
"val3"|"val4"

My problem was that I had ROWTERMINATOR set to '0x0a' before, which did not work. Once I changed it to '\n', it started working.

answered 2020-11-21T07:55:21.417
0

Per the CSV format specification, I don't think it matters whether the data is quoted or not, as long as it adheres to the specification. Optional quotes should be handled by the parser, if it's properly implemented. FIELDTERMINATOR should be a comma and ROWTERMINATOR the line end; this denotes a standard CSV file. Did you try to import your data with these settings?

answered 2012-10-15T19:20:57.737
0

Been stung by the same :)

I wrapped this logic into a function to clean up data that had already been imported.

-- Strips a leading and trailing quote character, but only if both are present
DECLARE @str NVARCHAR(MAX);
DECLARE @quote_identifier NVARCHAR(MAX);

SET @quote_identifier = N'"';

SET @str = N'"quoted stuff"';

SELECT IIF(
           LEFT(@str, 1) = @quote_identifier
           AND RIGHT(@str, 1) = @quote_identifier,
           -- skip the leading quote and drop the trailing one
           SUBSTRING(@str, LEN(@quote_identifier) + 1, LEN(@str) - 2 * LEN(@quote_identifier)),
           @str);
answered 2020-09-18T18:48:28.677
0

It seems this now works on partially quoted CSV files as long as you include FORMAT = 'CSV' in the options. Once I added this, it worked as expected with a partially quoted file.

answered 2022-01-18T05:44:42.383