
Text files with different destinations?


desdaone (Programmer) · Oct 18, 2006 · US
Hello all,

I've been trying to figure this out for a while; it can't hurt to ask the experts
:D...

I have a folder that will periodically contain text files:

TABLE1.TXT
TABLE2.TXT
TABLE5.TXT

However, the files will likely be different each time; i.e., the next time the package is run, there might be

TABLE2.TXT
TABLE3.TXT
TABLE4.TXT

I'm trying to write a package that will:

1. For each file in the directory, get the name and determine the destination table. (I can do that with a Foreach Loop container and a Script Task, yay.)

2. Fire off a SQL query that looks like "truncate table ___". (I can hack my way into doing that by getting a connection object from the Dts.Connections collection and creating a command inside the Script Task.)

3. Import the text file into the right table. This is where I'm completely stuck. Any thoughts on how to build the Data Flow task? Should I somehow do the import from inside the Script Task? Is that even possible? All the files have different layouts, so a single Flat File connection (even with the file name set dynamically) isn't working. Is there a way to do column mappings in code? With variables? I'm sitting on a file name, a table name, and a SQL Server connection inside a Script Task inside a Foreach loop, and I can't get the data in.
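For what it's worth, steps 1 and 2 above (enumerate the files, derive the table name, build the truncate statement) can be sketched outside SSIS in plain code. This is a minimal Python sketch of the logic only — the `TABLE\d+` naming pattern, the helper names, and the hard-coded file list are assumptions for illustration, not part of the actual package:

```python
import os
import re

def derive_table_name(file_name: str) -> str:
    """Map a file like 'TABLE2.TXT' to its destination table name."""
    base, _ext = os.path.splitext(os.path.basename(file_name))
    # Assumed convention: files are always named TABLE<n>.TXT.
    if not re.fullmatch(r"TABLE\d+", base, flags=re.IGNORECASE):
        raise ValueError(f"Unexpected file name: {file_name}")
    return base.upper()

def build_truncate_sql(table_name: str) -> str:
    """Build the 'truncate table ___' statement for the derived table."""
    return f"TRUNCATE TABLE [{table_name}]"

# For each file in the drop folder, figure out where it goes:
for f in ["TABLE1.TXT", "TABLE2.TXT", "TABLE5.TXT"]:
    table = derive_table_name(f)
    print(build_truncate_sql(table))
```

The same name-parsing would live in the Script Task, with the truncate statement handed to an Execute SQL command over the package connection.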

Hope everyone has some thoughts... :D

D
 
My thought is to create package-scoped variables (seeded with dummy values) to hold the file names. Use your Foreach/Script Task combo to find your files and then populate those variables, overwriting the dummy values.

Then use conditional precedence constraints to route execution to the correct Data Flow task based on the values of those variables.

Of course, I'm assuming you have a finite number of file names/tables you'll be importing to. If the number is large, I'm not sure how you'd handle that.
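The routing described above — a finite set of pre-built Data Flow tasks, each gated by a precedence constraint on a variable's value — amounts to a dispatch table keyed by file name. A minimal sketch in Python (the handler names and bodies are illustrative assumptions; in the package, each entry corresponds to one pre-made Data Flow task):

```python
# One import routine per known file layout, just as each Data Flow
# task would sit behind its own precedence-constraint condition.
def import_table1(path: str) -> str:
    return f"imported {path} into TABLE1"

def import_table2(path: str) -> str:
    return f"imported {path} into TABLE2"

HANDLERS = {
    "TABLE1.TXT": import_table1,
    "TABLE2.TXT": import_table2,
    # ... one entry per known layout (15-16 in this case)
}

def dispatch(file_name: str) -> str:
    """Route a file to the import path built for its layout."""
    try:
        handler = HANDLERS[file_name.upper()]
    except KeyError:
        raise ValueError(f"No import path defined for {file_name}")
    return handler(file_name)
```

The upside of this design is that each layout's column mappings stay static and designable in the normal Data Flow editor; the only dynamic piece is which path fires.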



Catadmin - MCDBA, MCSA
"No, no. Yes. No, I tried that. Yes, both ways. No, I don't know. No again. Are there any more questions?"
-- Xena, "Been There, Done That"
 
Hmmm... so inside the Foreach container I'd have a premade Data Flow task for each possible file. There are probably 15-16 possibilities, but that sounds like it might work. Then a conditional precedence constraint from the task to each of them...

I've never used a conditional precedence constraint, but I'm sure I can figure it out.

I can't help but think there must be an easier way, but I sure don't know one... :D

Thanks for the help.
 
NP. Let us know if it works for you or not.


Catadmin - MCDBA, MCSA
 