Text Import Specification Regardless Number of Fields

Status
Not open for further replies.

kjv1611

New member
Jul 9, 2003
10,758
US
This may be a simple answer, or it may just not be possible - at least the way I want to do it.

I have 92 worksheets in one workbook which I have been importing into Access, and combining into one table.

The first time I did this, it took a LOT of manual effort to accomplish. The reason: all the sheets have similar formats, and most are the same, but a few are varied. Besides that, they all have blank records, and some of them have a blank record as their first record/row.

So, I've been working through some VBA to take care of all the possible variations, or all that I can think of anyway.

For these variations, I would like to use an import specification that will import every field as text from every text file (I'm exporting as tab-delimited from Excel to get around a few of the field-formatting issues I've had).

The problem is that not every sheet/file/table has the same number of columns/fields.

So, is there a way to create just a generic "all text format" specification?

I'm hoping it's just a simple thing that I'm overlooking, and that it doesn't require further programming.

If necessary, though, I'll get all that information from Excel for each sheet, and use that to set the field formats...

--------

Of course, another thought I've had, altogether different, is to just import the data into the same tables each time, if the table exists, in order to get around that... however, I'll probably still need to keep the option for a "new" spreadsheet in case a new one is inserted one month.

Any thoughts/suggestions/references/examples greatly appreciated.

--

"If to err is human, then I must be some kind of human!" -Me
 
I think schema.ini is the answer to what you are after. I used it some time ago to export variable data from a legacy system into Access as an overnight batch routine. It's not an 'all text format', but if you are already defining the output, you can define the schema at the same time.

It's some time since I've used it, but you should find more info in this forum and also by surfing generally.
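For illustration, a schema.ini section that forces every column to text might look like this (the file name, column names, and column count here are invented for the example; the [FileName] header must match the text file's name, and schema.ini sits in the same folder as the text files, one section per file):

Code:
[Sheet1.txt]
Format=TabDelimited
ColNameHeader=True
Col1=Field1 Text
Col2=Field2 Text
Col3=Field3 Text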
Simon Rouse
 
Thanks! I'll look that up here shortly, and post back!

 
DrSimon,

I've looked up the schema.ini file, and found one option to create one with the ODBC applet within Control Panel.

However, it appears you have to create one of these for each text file you want to open, and I'm looking to do this automatically based on the file itself. That way it covers any future changes. It may simply not be an option, but I'll keep digging around. That was just the first reference I actually looked at. I'm about to look at the MSDN reference now.
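For what it's worth, here's a rough, untested sketch of building a schema.ini section from a file's own header row, so it adapts to however many columns the file has (the Sub name and paths are made up, and it assumes tab-delimited files with a header row):

Code:
'Rough sketch: append an all-text section to schema.ini
'for one tab-delimited text file with a header row.
Sub WriteSchemaSection(strFolder As String, strFile As String)
    Dim intIn As Integer, intOut As Integer
    Dim strHeader As String
    Dim varCols As Variant
    Dim i As Integer

    'Read just the header row to get the column names
    intIn = FreeFile
    Open strFolder & "\" & strFile For Input As #intIn
    Line Input #intIn, strHeader
    Close #intIn
    varCols = Split(strHeader, vbTab)

    'Append a section forcing every column to Text
    intOut = FreeFile
    Open strFolder & "\schema.ini" For Append As #intOut
    Print #intOut, "[" & strFile & "]"
    Print #intOut, "Format=TabDelimited"
    Print #intOut, "ColNameHeader=True"
    For i = 0 To UBound(varCols)
        Print #intOut, "Col" & (i + 1) & "=" & varCols(i) & " Text"
    Next i
    Close #intOut
End Sub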

 
You can build a schema.ini, or you could build a table on the fly, based on the column count of the Excel file.
 
So far, it looks to me that I can use the method found in the first link I posted to create one for just a certain file.

Then I can perhaps go in and edit the file, and maybe add more columns, I'm guessing. I think I'll give that a try...

 
Remou, that's another thought I had earlier. I thought that I suppose I could either:
1. Create a "reference" table that lists the "new" table names along with their fields and field formats.
2. Or better, I suppose, would be to create each table programmatically, and then import the data into those fields. That way, I could easily just specify that they are all text.

Now which one would be the best programmatically?
[ponder]

 
I think that choosing the best would depend on what you intend to do next. It would be easy enough to read in the first line of a csv file and use that to create a new table. Split would create an array that could be used to create an SQL string. A reference table could provide additional information, or you could simply use a length of 255 for all fields.
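An untested sketch of that first-line idea (the file name, table name, and 255 length are placeholders):

Code:
'Sketch: create an all-text table from a csv header row
Dim intFile As Integer
Dim strHeader As String, strSQL As String
Dim varFields As Variant
Dim i As Integer

intFile = FreeFile
Open "C:\Docs\Sheet1.csv" For Input As #intFile
Line Input #intFile, strHeader
Close #intFile

varFields = Split(strHeader, ",")
strSQL = "CREATE TABLE tblSheet1 ("
For i = 0 To UBound(varFields)
    If i > 0 Then strSQL = strSQL & ", "
    strSQL = strSQL & "[" & Trim(varFields(i)) & "] TEXT(255)"
Next i
strSQL = strSQL & ")"
CurrentDb.Execute strSQL, dbFailOnError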

 
Well, here's my long term goal, whole deal, so maybe you (or anyone really) can offer suggestions based on that:

On a monthly basis, I (or if replaced, my replacement) will receive an Excel Workbook containing appx 94 spreadsheets.

Of the 94 spreadsheets, 92 are actual data, the first 2 are just informational for the reader - nothing I'm concerned with.

I want to take those 92 worksheets and import them into an Access database where I already have code for combining all of them into one table (once they are imported accurately into their own tables).

Then, once there, I bump that up against data in a SQL Server table to find any possible matching records there.

Then I return all those results back to an Excel workbook (one sheet) that displays the info from the original worksheets along with a couple fields - or at least one field - from the SQL Server table.

I did this manually last time, and boy was it fun [blue][mild sarcasm, 'cause I do like challenges in this stuff as much as I hate to admit it.][/blue]

With this database, I'm setting up a user form (at least one) for fiddling with all this info. And with that database, I'm going to keep a couple different historical tables as well - not keeping every single piece of data imported... at least I hadn't planned on that, but the idea keeps popping into my mind.

The reason that I may just keep all the data is that the workbook I am receiving (based on getting it now twice) seems to always have the most recent completed month of data, as well as all data during the current year. So, the first one I got was January through April... the second was January through May.

Of course, that piece of it (I think) probably doesn't affect all this too much... I guess.


The process I'm planning within Access so far is this:
[OL]
[LI]Import data from Excel - was planning on exporting from Excel to text files and then to Access[/LI]
[LI]Combine data from separate tables into one master table[/LI]
[LI]Delete the individual tables, as I no longer need them[/LI]
[LI]Clear out empty records from the master table - or actually from the initial tables before combining them[/LI]
[LI]Take the data in the new "master" table, and bump that up against the SQL Server table, comparing at least 1 or 2, possibly a handful of fields eventually - last month I used a total of 2 or 3 fields if memory serves me correctly (not looking at the SQL script right now)[/LI]
[LI]Return the records from the original data that actually matched a record in our SQL Server table back to an Excel worksheet for further research/action[/LI]
[/OL]

The more I look at it, the more I get to thinking: maybe I'm putting too many pieces into this whole puzzle. Of course, though, what I've done so far is to break it into as many pieces as possible for bug tracking, and then if/when I get it working flawlessly, I can just put it into one combine process that runs 100% on its own.

Okay, any further thoughts/suggestions/criticisms (no, I don't mean I've been criticized, but I'm saying if someone thinks I'm making it too hard or choosing a bad method, please advise.)? [wink]



 
I KNEW I'd goof if I didn't preview first! Well, I'm sure you all can figure out that is a list. [wink]

 
Here is another piece of information that might be of help to anyone thinking about this one:

I am keeping all of that data stored in a location that is accessible by everyone working in the same department (it's a small department), so if someone else had to do this if they replaced me, or if I were out for an extended time for any reason, then they can run it from there.

The user form idea is so that it'll be that much simpler for someone else to do my work (assuming it doesn't break) in my absence.

 
How about linking the excel sheets, examining the data and then writing it to the main table?
 
Can you expand on that thought?

Do you mean:
[OL][LI]Use "Link Tables" in Access[/LI]
[LI]Select the Excel Worksheets from there[/LI]
[LI]Compare the data in Access between the tables[/LI]
[LI]Combine the data based on the comparison results in Access[/LI][/OL]
?

Just want to make sure I understand you correctly on that.

 
Hi kjv1611
Can you tell us something about the columns in the sheets? If you're combining them into one table they must have some columns in common.
 
Yes. Something like:

Code:
'Requires reference to the Microsoft Excel x.x Object Library

Dim strFileName As String
Dim objXL As Excel.Application
Dim wkb As Excel.Workbook
Dim wks As Excel.Worksheet

Set objXL = New Excel.Application
objXL.Visible = True

strFileName = "C:\Docs\ltd.xls"
Set wkb = objXL.Workbooks.Open(strFileName)

'Link each worksheet as its own table, named after the sheet
For Each wks In wkb.Worksheets
    DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel9, wks.Name, strFileName, True, wks.Name & "$"
Next

'Clean up the Excel instance
wkb.Close SaveChanges:=False
objXL.Quit
Set wkb = Nothing
Set objXL = Nothing

If nothing else, it would allow you to get the field names.

another possibility is ADO.
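By way of illustration, the ADO route could look something like this (untested; the path and sheet name are examples, and IMEX=1 asks the Jet driver to treat mixed-type columns as text, which touches on the all-text question):

Code:
'Requires reference to the Microsoft ActiveX Data Objects x.x Library
Dim cnn As ADODB.Connection
Dim rst As ADODB.Recordset

Set cnn = New ADODB.Connection
cnn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
         "Data Source=C:\Docs\ltd.xls;" & _
         "Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1"""

Set rst = New ADODB.Recordset
rst.Open "SELECT * FROM [Sheet1$]", cnn, adOpenForwardOnly, adLockReadOnly

Do While Not rst.EOF
    'Write rst.Fields(...) to the master table here
    rst.MoveNext
Loop

rst.Close
cnn.Close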
 
Thanks. I think I'll give that a try as well....

Of course, I may just end up doing it again like last month, and then tinkering with my methods afterwards, so at least I can get the data back for now. [wink]

What I had come up with took a while to get, and then (b/c I didn't back up the database), I lost a big chunk of my work the other day and had to recreate it - that was fun, doing it all from memory again. [smile]

 
Here's a crazy idea I just thought about:
What if I programmatically just went through and added double-quotes to all the data, and then used that as the "text qualifier"?
Would that work, or would it just be overkill on such a process and end up bogging the thing down?

As for a record size, I think the total is no more than around ten or twenty thousand for a year, or so it seems so far.

 
I do not think that that is a good idea. You can open an excel sheet directly with ADO and write to a table, which would be a better way to go.

Why do linked excel sheets not suit? There are a number of advantages, not the least of which is the ability to show the table when all else fails.
 
The more I think about it, the more I'm liking the idea of linking all the worksheets and then going from there. I can link the worksheets, and then use SQL to import all non-blank data, of course making sure all the fields line up with the "master" table...
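For one linked sheet, that append might be something like this (untested; the table and field names are placeholders):

Code:
'Append non-blank rows from one linked sheet to the master table
Dim strSQL As String
strSQL = "INSERT INTO tblMaster (Field1, Field2) " & _
         "SELECT Field1, Field2 FROM [Sheet1] " & _
         "WHERE Field1 Is Not Null"
CurrentDb.Execute strSQL, dbFailOnError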

I'll work on that some more in the morning...

 