Hello all,
I've been trying to figure this out for a while, so it can't hurt to ask the experts...
I have a folder that will periodically contain text files:
TABLE1.TXT
TABLE2.TXT
TABLE5.TXT
However, the files will likely be different each time; i.e., the next time the package runs, there might be:
TABLE2.TXT
TABLE3.TXT
TABLE4.TXT
I'm trying to write a package that will:
For each file in the directory, get the name and determine the target table. (I can do that with a Foreach Loop container and a script task, yay.)
Fire off a SQL query that looks like "TRUNCATE TABLE ___". (I can hack my way through that by getting a connection object from the Dts.Connections collection and creating a command, etc., while in the script task.)
Import the text file into the right table. This is where I'm completely stuck. Any thoughts on how to build the Data Flow Task? Should I somehow do the import from inside the script task, and is that even possible? All the files have different layouts, so a Flat File Connection (even one whose file name I set dynamically) isn't working. Is there a way to do column mappings in code, or with variables? I'm sitting inside a script task in a foreach container with a file name, a table name, and a SQL Server connection, but I can't get the data in.
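For what it's worth, the per-file logic the loop needs (derive the table name from the file name, then build the SQL to run against it) can be sketched outside SSIS. This is a minimal Python illustration, not SSIS code, and it assumes two things not confirmed above: the table name is the file name minus its extension, and the files are comma-delimited. It uses T-SQL BULK INSERT, which loads by column position into a table whose layout matches the file, as one possible way around needing design-time column mappings:

```python
from pathlib import PureWindowsPath

def plan_import(file_path: str) -> tuple[str, str, str]:
    """Derive the target table from a data file's name and build the
    TRUNCATE and BULK INSERT statements to execute for that file.
    Assumes TABLE1.TXT loads into table TABLE1, and so on."""
    table = PureWindowsPath(file_path).stem.upper()  # "TABLE1.TXT" -> "TABLE1"
    truncate_sql = f"TRUNCATE TABLE [{table}];"      # bracket-quote the identifier
    # BULK INSERT maps fields to columns by position, so no per-file
    # column mappings are needed if each table matches its file layout.
    # Delimiters here are assumptions; adjust to the real file format.
    bulk_sql = (
        f"BULK INSERT [{table}] FROM '{file_path}' "
        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
    )
    return table, truncate_sql, bulk_sql
```

Each statement could then be executed per iteration via the same command object already used for the truncate, rather than a Data Flow Task.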
Hope everyone has some thoughts...
D