
Updating SQL table with 3 separate CSV files


Cpreston

MIS
Mar 4, 2015
972
GB
Hi

We have 3 CSV files in exactly the same format, which I need to get into one SQL table (the same table for all three):

Trex.CSV
Samples.csv
Literature.csv

SQL table is called Samples

I have created a package that runs OK with just one file, Trex.csv, and it updates the SQL table correctly.
However, I cannot find a way to add the other 2 CSV files so it all runs in one package.

Is this possible, and if so, how?

Thanks
 
Hi,

Take a look at this link below - this should get you started on the correct path:

Link

Thanks
Michael
 
Hi

Wow, that seems very complex, and I cannot quite get my head around how it applies to getting the CSV data into the SQL table. I am sure someone with more experience could work it out, but with my limited knowledge I doubt I could.

Does anyone know of an easier method, if there is such a thing?

Thanks
 
Is this something that needs to run on a regular basis, or a once-off?

In SSIS, the Foreach Loop Container is what you use to process multiple files. It lets you grab a file, do something with it, then grab the next file and do the same, and so on.

If you read up on the Foreach Loop Container and try it out, I am confident you can get it to work for this scenario.
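In outline, what the Foreach Loop Container does is enumerate the files and run the same load task against each one. Here is a minimal Python sketch of that pattern (not from the thread: sqlite3 stands in for SQL Server, and the column names `Code`/`Description` are assumed for illustration):

```python
import csv
import os
import sqlite3
import tempfile

# Tiny sample files so the sketch runs standalone (file names from the thread,
# column names assumed).
tmp = tempfile.mkdtemp()
for name, code in [("Trex.csv", "T1"), ("Samples.csv", "S1"), ("Literature.csv", "L1")]:
    with open(os.path.join(tmp, name), "w", newline="") as f:
        w = csv.writer(f)
        w.writerow(["Code", "Description"])
        w.writerow([code, name])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Samples (Code TEXT, Description TEXT)")

def load_file(path):
    """One 'Data Flow Task': read a CSV file and insert its rows."""
    with open(path, newline="") as f:
        rows = [(r["Code"], r["Description"]) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO Samples VALUES (?, ?)", rows)

# The Foreach Loop: the same task, run once per file.
for name in ["Trex.csv", "Samples.csv", "Literature.csv"]:
    load_file(os.path.join(tmp, name))

print(conn.execute("SELECT COUNT(*) FROM Samples").fetchone()[0])  # prints 3
```

In the real package you configure this in the designer: a Foreach File Enumerator pointed at the folder, with the current file name mapped to a variable that the flat-file connection uses.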

Thanks
Michael

 
Hi

We will need to run it daily, so my plan was to run it through a SQL job. I will look into the Foreach Loop some more.

Thanks
 
Another option, if you don't want to use the Foreach Loop Container, is to have 3 separate Data Flow Tasks in one package - one for each file.

Thanks
Michael
 
Hi

OK, slight change of plan: I can now feed the SQL table from one Excel sheet, which simplifies the process.

However, I have come across a new issue. I have set the key field in the SQL table to NOT NULL and made it the primary key, as I do not want duplicate codes.
It imports OK, but if I run it and there is a duplicate key, the whole process fails.

Is there a way to get it to exclude the problem rows (like duplicates) and import only the good rows, so we always get the good rows in without fail?

Thanks
 
Hi,

The way I would do this is to import the data into a staging table - a table with no constraints, so the data gets imported 'as is'. I would then have a separate 'Execute SQL Task' in the package that identifies the valid records and inserts them into the final table with the constraints.
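The staging-table approach amounts to the following. This is a sketch, not the thread's actual code: sqlite3 stands in for SQL Server, and the table and column names (`SamplesStaging`, `Code`, `Description`) are assumed. In T-SQL the filtering query would look similar, typically using `NOT EXISTS` or `ROW_NUMBER()`:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Staging table: no constraints - the raw import lands here 'as is'.
conn.execute("CREATE TABLE SamplesStaging (Code TEXT, Description TEXT)")
# Final table: the primary key the thread describes.
conn.execute("CREATE TABLE Samples (Code TEXT PRIMARY KEY, Description TEXT)")

# Raw import, including a duplicate Code that would make a direct load fail.
conn.executemany("INSERT INTO SamplesStaging VALUES (?, ?)",
                 [("A1", "first"), ("A2", "second"), ("A1", "duplicate")])

# The 'Execute SQL Task': insert only valid rows - one row per Code
# (MIN(rowid) keeps the first occurrence) and no key already in the target.
conn.execute("""
    INSERT INTO Samples (Code, Description)
    SELECT s.Code, s.Description
    FROM SamplesStaging s
    WHERE s.rowid IN (SELECT MIN(rowid) FROM SamplesStaging GROUP BY Code)
      AND s.Code NOT IN (SELECT Code FROM Samples)
""")
print(conn.execute("SELECT COUNT(*) FROM Samples").fetchone()[0])  # prints 2
```

The bad rows stay behind in the staging table, so they can be inspected or logged rather than crashing the load.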

Thanks
Michael
 
Hi

I have managed to solve it using a Sort Transformation with the option ticked to exclude duplicates. It appears to be running OK.
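For reference, the deduplication the Sort Transformation performs here boils down to keeping one row per key value and dropping the rest before the insert. A minimal sketch of that logic (illustrative values only; note that when duplicates differ, which row survives is effectively arbitrary):

```python
# Rows as (Code, Description) pairs, with a duplicate key.
rows = [("A1", "first"), ("A2", "second"), ("A1", "dup")]

# Keep the first row seen for each Code; later duplicates are dropped.
seen = {}
for code, desc in rows:
    seen.setdefault(code, desc)

deduped = list(seen.items())
print(deduped)  # prints [('A1', 'first'), ('A2', 'second')]
```

This is fine when duplicate rows are identical; if they can differ, the staging-table approach gives more control over which row wins.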

Thanks
 