
Strip Quotes from Flat File in SSIS


mickeyj2

Programmer
Jun 7, 2007
Hi,

Ok, here’s my situation. I have a CSV flat file. When you look at it in a text editor, the format is like:

"Contact Name","Customer","Adr1","Adr2","Adr3","City","State","Zip","Ctry","Phone","Slsm","Email"

In other words, quotes surround all of my data.
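For reference, here is a minimal sketch of how a qualifier-aware parser treats a row in that layout (this uses Python's csv module purely as an illustration; the row values are invented):

import csv
from io import StringIO

# A made-up data row in the same quoted layout as the header above.
sample = ('"Jane Doe","Acme, Inc.","100 Main St","Suite 5","","Austin",'
          '"TX","78701","US","555-0100","JS","jane@example.com"\r\n')

# With " as the text qualifier, the comma inside "Acme, Inc." is data,
# not a delimiter, and the surrounding quotes are stripped on read.
reader = csv.reader(StringIO(sample), delimiter=',', quotechar='"')
print(next(reader))
# ['Jane Doe', 'Acme, Inc.', '100 Main St', 'Suite 5', '', 'Austin',
#  'TX', '78701', 'US', '555-0100', 'JS', 'jane@example.com']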

In SSIS, I have built a package “Customers.dtsx”.

Within the Flat File Source Data, I have the following properties.

Locale: English
Code Page: 1252 (ANSI – Latin I)
Format: Delimited
Text Qualifier: “
Header Row Delimiter: {CR}{LF}
Header Rows to Skip: 0
Column names in the first data row: {checked}
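As a cross-check outside SSIS, those settings map roughly onto the following (a sketch only, using Python's csv module; the file name is an assumption):

import csv

with open('Customers.csv', newline='', encoding='cp1252') as f:  # Code Page 1252
    reader = csv.reader(f, delimiter=',', quotechar='"')         # Text Qualifier: "
    header = next(reader)                                         # column names in first row
    print(header)
    for i, row in enumerate(reader, start=1):
        print(row)                                                # preview a few data rows
        if i == 3:
            break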

When I preview this in the Flat File Connection Manager Editor, the data looks EXACTLY as I would like for it to look. All of my column names are surrounded by quotes.

I add the Flat File Source to my Data Flow Task and connect it to my OLE DB Destination. The connection to my database table is fine.

Now, when I execute the package, I get the following error message on my Flat File Source:

“The column delimiter for column “”Email”” was not found.”
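As an aside, this error generally means the parser reached the end of the row, or of the file, before it found the delimiter it expected after the last column. One rough way to hunt for the row that trips it (the file name and the 12-column layout shown above are assumptions):

import csv

EXPECTED = 12   # columns in the header row shown above

with open('Customers.csv', newline='', encoding='cp1252') as f:
    for rownum, row in enumerate(csv.reader(f, delimiter=',', quotechar='"'), start=1):
        if len(row) != EXPECTED:
            print(f'row {rownum}: {len(row)} fields -> {row}')

Any row it flags is a good candidate for a stray quote or a missing delimiter.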

Ok, that’s the situation.

Here are some things I’ve tried to get this to work:

1) If I remove the Text Qualifier and leave it blank, the entire package executes and my files get loaded into the database, except that all of the data is surrounded by quotes (a possible preprocessing workaround for that is sketched after this list).

2) I could change the column names to NOT be surrounded by quotes (Contact Name, Customer, etc.), but I’ll still get the complaint about the missing column delimiter.
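For option 1, one possible preprocessing step (sketched here with Python's csv module; the file names are made up) is to strip the qualifier by rewriting the file before the package runs. Fields that genuinely contain a comma keep their quotes under QUOTE_MINIMAL, so this only removes the qualifier completely if no value contains the delimiter:

import csv

with open('Customers.csv', newline='', encoding='cp1252') as src, \
     open('Customers_noquotes.csv', 'w', newline='', encoding='cp1252') as dst:
    writer = csv.writer(dst, delimiter=',', quoting=csv.QUOTE_MINIMAL,
                        lineterminator='\r\n')   # keep the {CR}{LF} row delimiter
    for row in csv.reader(src, delimiter=',', quotechar='"'):
        writer.writerow(row)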

Any ideas? I have rebuilt this several times. One time, about a month ago, I got it to work, but I don’t have that package anymore.

Thanks,
Mickeyj2

 
Is the qualifier a double quote or two single quotes? Check both the data and the connection manager.
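One quick way to see exactly what is in the file (a rough sketch; the file name is an assumption) is to dump the raw bytes of the first line, which shows whether each field starts with one " or with two ' characters, and also shows the row delimiter:

# Print the first line exactly as stored on disk.
with open('Customers.csv', 'rb') as f:
    print(repr(f.readline()))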

Paul
---------------------------------------
Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
Was that the problem?

Paul
---------------------------------------
Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
It's probably not the problem. I had this problem a couple of years ago at a previous job, and I think I ended up getting rid of the text qualifier in the source files, as I had control over the output format for the source files. I believe I also changed the delimiter from a comma to a special character so that I could still correctly parse commas that may be part of the actual values. That is something to try if you can change the output format. One side benefit: I was working with really wide tables that often held small values, and getting rid of the text qualifier shrank the CSV files by a decent margin, which was a plus since the files were traveling a long way over the network.

I wouldn't be surprised if there is a more straightforward fix for this issue, but I am not aware of what it may be.
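If you do control the export, that approach looks roughly like this (a sketch; the file names are made up, and ASCII 31, the unit separator, is just one choice of character that is very unlikely to show up in the data):

import csv

UNIT_SEP = '\x1f'   # ASCII 31

# Re-export the quoted CSV with no text qualifier and a delimiter that
# cannot collide with commas inside the values.
with open('Customers.csv', newline='', encoding='cp1252') as src, \
     open('Customers.dat', 'w', newline='', encoding='cp1252') as dst:
    for row in csv.reader(src, delimiter=',', quotechar='"'):
        dst.write(UNIT_SEP.join(row) + '\r\n')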
 
I've got a process that imports 100 million+ rows from dozens of files that use text qualifiers, and I haven't encountered this. So I'm wondering if it is something within the data that is causing the issue. If you use double quotes as the text qualifier and the data has double quotes in it, then that could possibly cause an issue. I wonder what would happen if you used some obscure text qualifier such as }.
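A rough way to test that theory (the file name and the 12-column count are assumptions): with every field qualified and no stray quotes in the data, each line should contain exactly 24 double-quote characters, so lines with any other count are the suspects.

EXPECTED_QUOTES = 12 * 2   # two qualifier quotes per field

with open('Customers.csv', encoding='cp1252') as f:
    for lineno, line in enumerate(f, start=1):
        n = line.count('"')
        if n != EXPECTED_QUOTES:
            print(f'line {lineno}: {n} quote characters')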

Paul
---------------------------------------
Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 
Good call. I bet having double quotes in the data could be it; that actually rings a bell now that you mention it. The other thing I was thinking, although less likely, would be an SP issue, since I did encounter this a couple of years ago. You could even try something more obscure for either the qualifier or the delimiter. I think I used ASCII char 31 for my delimiter.
 
I had an app that used to strip MDX code from AS 2000 cubes and write it to a file, and a second app to load the MDX into a different cube.

If you know MDX, you know just how many special characters it uses. I think the delimiter I used to separate calculations was !|~

Paul
---------------------------------------
Shoot Me! Shoot Me NOW!!!
- Daffy Duck
 