Import columns from one table into another table with dissimilar field layout

eboughey1008

Programmer
Jan 6, 2009
31
US
I have a permanent template with zero records. Every time I get a new file from my client, I push the CSV file into SQL Server, and my end game is to import that file into my permanent template and then export back to CSV in the same field layout every time.

In FoxPro I just type APPEND FROM <table> and it finds all the matching field names automatically and populates the table. Can that be done in SQL? I can't find anything, but I'm sure I should be able to accomplish it. I don't want to type out 68 columns in an INSERT command. My client files change field layout all the time, too, so it's important to get the data into the right template before I start the data work, so my output is always in a uniform format.

Any help would be so much appreciated. Sorry to ask but after a full day and a half trying different things I'm ready to throw in the towel here!

 
When you do it now ‘by hand’, how do you know which ‘field’ from CSV file goes to which field in your ‘permanent template’ in SQL? Do your CSV files have a header row? If so, is there any correlation of the header row information to your fields’ names in your ‘permanent template’ in SQL?

---- Andy

"Hmm...they have the internet on computers now"--Homer Simpson
 
Hi Andy,

The field (column) names in both tables are the same, but my permanent template has fewer fields/columns (72 vs. 189).

The CSV does have a header row. I'm used to working in FoxPro, so I don't have any issue importing records from one DBF to another as long as the field names are the same in both DBF files.
 
What did you try?

BCP and format files should lead to a solution: first create a format file for your permanent import table, then create one for the current CSV format and map the fields. See the next-steps section about mapping fields.


Instead of manually rearranging the field rows or XML portions, you could write a generator. A rough sketch of what such a format file looks like is below.
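As a hedged sketch only (every file name, path, field count, and column name here is a made-up placeholder, not your actual layout): in an XML format file the RECORD part describes each field as it sits in the CSV, and the ROW part maps only the fields you want onto rowset columns; CSV fields left out of the ROW part are read but not imported.

Code:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <!-- one FIELD per CSV column, in the order they appear in the file -->
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR=","    MAX_LENGTH="100"/>
    <FIELD ID="2" xsi:type="CharTerm" TERMINATOR=","    MAX_LENGTH="100"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="100"/>
  </RECORD>
  <ROW>
    <!-- map only the fields you need; field 2 has no COLUMN entry, so it is skipped -->
    <COLUMN SOURCE="1" NAME="CustomerName" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="OrderDate"    xsi:type="SQLVARYCHAR"/>
  </ROW>
</BCPFORMAT>

One documented way to use such a file from T-SQL is OPENROWSET(BULK ...) with an INSERT ... SELECT, for example:

Code:
-- FIRSTROW = 2 skips the CSV header line; paths and names are placeholders.
INSERT INTO dbo.PermanentTemplate (CustomerName, OrderDate)
SELECT CustomerName, OrderDate
FROM OPENROWSET(BULK 'C:\import\client_file.csv',
                FORMATFILE = 'C:\import\client_file.xml',
                FIRSTROW = 2) AS src;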

Chriss
 
"The field (column) names in both tables are the same"
It looks to me like you have two tables: TableA and TableB.

"my permanent template has less fields/columns"
So, you have another table: "permanent template" table? (TableC...?)

And you also have a CSV file(s) from your clients as well, right?
[ponder]

---- Andy

"Hmm...they have the internet on computers now"--Homer Simpson
 
Just by the way.

eboughey1008 said:
In FoxPro I just type APPEND FROM <table> and it finds all the matching field names automatically and populates the table.

This only works from DBF to DBF, not when appending a CSV text file into a DBF.

So actually in VFP you would also need a mapping of field names. You can take the first line of the CSV file as a template for the names and the field types from your import staging table, but you either need a DBF for the CSV import whose fields are in the order of the CSV captions, or - and that only works with BCP in the SQL Server context - you need a format file that can map the fields of the CSV to the SQL Server table in a single step.

Chriss
 
Another thing SQL Server offers is SSIS (SQL Server Integration Services), formerly known (already about a decade ago) as DTS (Data Transformation Services). It lets you define data sources of any kind usable with any ODBC driver or OLEDB provider, and that includes CSV as a data source file. You might also get something going with an Excel OLEDB provider that treats the CSV like an Excel file: CSV isn't the real XLS(X) spreadsheet format, but Excel's import path supports CSV as a data source through its drivers, too, mainly for importing it.
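Just to make the OLEDB idea concrete, here is a minimal ad hoc sketch, assuming the Microsoft ACE OLE DB provider is installed on the server and 'Ad Hoc Distributed Queries' is enabled; the folder and file name are placeholders:

Code:
-- One-time setup (requires sysadmin rights):
EXEC sp_configure 'show advanced options', 1;  RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;  RECONFIGURE;

-- Read the CSV through the ACE text driver; HDR=YES takes the first line as column names.
SELECT *
FROM OPENROWSET(
       'Microsoft.ACE.OLEDB.12.0',
       'Text;Database=C:\import\;HDR=YES',
       'SELECT * FROM client_file.csv') AS csv;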

Well, and if all else fails, you could redo the import with VFP and then use SSIS with DBFs as a data source, or use a linked server on a FoxPro DBC or a directory of DBFs via the VFP ODBC or OLE DB driver. A sketch of the linked server route follows below.
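As a hedged sketch of that linked server route (server name, path, and table names are placeholders; note the VFP OLE DB provider, VFPOLEDB, only exists as 32-bit, so it needs a 32-bit SQL Server instance or some other bridge):

Code:
-- Register a linked server pointing at a VFP database container or a folder of free DBFs.
EXEC sp_addlinkedserver
     @server     = 'VFPDATA',
     @srvproduct = 'Visual FoxPro',
     @provider   = 'VFPOLEDB',
     @datasrc    = 'C:\data\imports\';   -- a .dbc file or a directory of DBFs

-- Query a DBF through the linked server and pull it into SQL Server.
SELECT *
INTO dbo.VfpImport
FROM OPENQUERY(VFPDATA, 'SELECT * FROM clientstaging');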

Then there are newer things like Azure Data Factory, if ou use SQL Server in the context of Azure in the cloud instead of on-premise database.


Chriss
 
And another idea: you could use the first line of the CSV file, which contains the column names, as the column list of a SELECT ... INTO <table> statement. That creates a new staging table with the columns in the order of the CSV file, ready for a simple bulk copy:

Code:
SELECT [fieldnames from 1st line of CSV]
INTO dbo.NewStagingtable
FROM dbo.yourpermanenttable

Your permanent table would be the template for the column data types of dbo.NewStagingtable, which is then created with all the column names in the order of the CSV file and could take in the CSV in a bulk copy step. But there's still one thing that makes this complicated, too: as you say, the CSV has many more fields than you actually have in your import table (dbo.yourpermanenttable), so such a SELECT column list would cause errors about non-existing columns. You can of course remove them, but it would be a tedious process to go through 189-72=117 error reports about missing column names to arrive at the list of fields your import table can actually provide the column definitions for. All unknown columns would need to be defined as varchar columns, for example, if you still import the CSV without a format file by simply importing all columns. A generator along the lines mentioned earlier could build that filtered column list for you; see the sketch below.
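Here is a hedged sketch of such a generator (it assumes SQL Server 2022 or later for the ordinal argument of STRING_SPLIT, and all table names plus the header literal are placeholders). It keeps only the header columns that really exist in the permanent table, in CSV order, so the generated SELECT ... INTO cannot fail on unknown column names; the actual bulk load then still needs a format file or a full-width staging table, as noted above.

Code:
-- @header holds the first line of the CSV (placeholder content).
DECLARE @header nvarchar(max) = N'CustomerName,OrderDate,SomeExtraColumn';
DECLARE @cols   nvarchar(max);

-- Keep only header columns that exist in dbo.yourpermanenttable, preserving CSV order.
SELECT @cols = STRING_AGG(CAST(QUOTENAME(c.COLUMN_NAME) AS nvarchar(max)), ', ')
                 WITHIN GROUP (ORDER BY h.ordinal)
FROM STRING_SPLIT(@header, ',', 1) AS h
JOIN INFORMATION_SCHEMA.COLUMNS AS c
  ON  c.TABLE_SCHEMA = 'dbo'
  AND c.TABLE_NAME   = 'yourpermanenttable'
  AND c.COLUMN_NAME  = LTRIM(RTRIM(h.value));

-- Build and run the SELECT ... INTO that clones just those columns' definitions.
DECLARE @sql nvarchar(max) =
    N'SELECT ' + @cols +
    N' INTO dbo.NewStagingtable FROM dbo.yourpermanenttable;';

EXEC sp_executesql @sql;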

It's not the first thought and likely not the best one anyway, so you still need to do some work per different CSV format you have to deal with.

Chriss
 