I am currently working on a project which requires me to copy records (200,000+) from a SQL Server database to a temporary MS Access database. Can somebody please help?
In SQL Server go to Tools -> Data Transformation Services -> Export
Then follow the wizard to export your data from SQL Server to Access. On the last step, choose to save the package to SQL Server. Once you have done this you'll see the DTS package saved under your DB's Data Transformation Services folder. Then go back to Visual Basic, add a reference to the Microsoft DTSPackage Object Library, and run the package like this:
Dim oPkg As New DTS.Package2
' Load the saved package from SQL Server using a trusted connection
oPkg.LoadFromSQLServer "MyServerName", , , _
    DTSSQLStgFlag_UseTrustedConnection, , , , _
    "MyDTSPackageName"
oPkg.FailOnError = True   ' raise a VB error if any step in the package fails
oPkg.Execute
oPkg.UnInitialize
Set oPkg = Nothing
Really sorry, I failed to mention that I'm doing all this at run time. The purpose of this is to create a local copy of the database for each user to work on. The temporary MS Access db is created/refreshed each time the user starts the system.
It's the reference source that everyone who uses SQL should have!
If the data is already in SQL Server, why would you want to repeatedly copy it back to MS Access? The common approach would be to create stored procedures in SQL Server to minimize the network traffic; I would see copying the whole db as increasing the net traffic. Using a "local" copy doesn't appear to be a good way to do data entry, as it accomplishes little (nothing?) useful except (perhaps) local checking for PK/unique field values, which can easily be done via the stored procedures. About the only use for a LARGE block of records I can see is the possible use of MS Access's Crosstab query on a recordset, and even there I think it would still be more efficient to get the recordset from SQL Server and do the crosstab on that.
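To illustrate the stored-procedure idea, here is a minimal sketch; the table, column, and procedure names (dbo.People, FullName, usp_FindByName) are hypothetical, not from the original post:

-- Return only the rows matching what the user typed,
-- rather than copying the whole table down to the client.
CREATE PROCEDURE dbo.usp_FindByName
    @Name varchar(100)
AS
    SELECT PersonID, FullName
    FROM dbo.People
    WHERE FullName LIKE @Name + '%'
GO

The client then calls the procedure with the search term and receives only the handful of matching rows, instead of all 200K.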
johnwm, vbSun & nigelrivett,
Thank you guys for the references!
MichaelRed,
The reason why I'm creating a temp copy of the SQL Server db in an Access db is to facilitate easy searching of a certain field (fullname). I've created a module, a sort of autocomplete thing (it populates a DataList control with the field coming from the Access db), for fast searching. I've tried connecting the DataList control directly to the SQL db (200k records) and it loaded slowly and searched slowly.
Am I taking the right approach here? Or is there a better way?
"The right stuff" is, as usual, relative. Only you, being in the situation, can deal with the set of variables handed to you. I would not think to even attempt to use something like the autolookup feature on a 200K+ recordset, even if it were local. As to suggesting an alternative, the closest I could come would be to submit the entire name and do a stored procedure / query, perhaps based on the 'soundex' approach, returning the set which are close / match, and having the user select from the list (or decide that this is a new entry). Even there, I would maintain the lookup field (soundex) and the name in a separate related table. Again, I certainly do not know enough about the specifics to assure you that this will be a good solution, but it is my best guess of what might (probably would) be my first attempt.
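As a minimal sketch of the soundex idea, again using a hypothetical People table: T-SQL has a built-in SOUNDEX function, so a procedure can return the near-spellings for the user to pick from. In practice you would store the pre-computed SOUNDEX value in an indexed column (or in the separate related table suggested above) to avoid scanning all 200K rows on every call.

-- Match on SOUNDEX so near-spellings of the submitted name are returned.
CREATE PROCEDURE dbo.usp_FindSimilarNames
    @Name varchar(100)
AS
    SELECT PersonID, FullName
    FROM dbo.People
    WHERE SOUNDEX(FullName) = SOUNDEX(@Name)
GO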
I would additionally suggest that "whole name" is generally a poor choice for a field. There are too many variables in the data entry process to expect consistency. You will surely have [Mr. | Mrs. | Ms. | Dr. | Esq. | Jr. | III | etc.] included somewhere within the table, and that will hamper any search.