
EXPORT Help needed ASAP 2


cyberdyne

Programmer
May 1, 2001
173
IN
I have created a project with Oracle 8 as the backend and VB as the frontend.
I need some guidance / solutions for the case explained below.

I have the same project running at two different places: one is the Head Office and the other is a Regional Office. The table structures at both places are the same. The Regional Office enters data for its region only, and the Head Office enters its own data. Now the Head Office wants the Regional Office's data on its server as well. This is where I am stuck.

I need a way to extract data from some tables (not all) in the Regional Office server database and append it to the Head Office server database. This will be done on a regular basis, say monthly or fortnightly.

There is no index on any of the tables. There are some views, though these do not need to be extracted or imported. I tried to do this with the imp80/exp80 commands, but exp80 extracts the full database, and when it is imported into the HO server it creates duplicate entries in some tables.

Please help on this ASAP.

I can be contacted at cyberdyne@softhome.net

 
Did you try exporting just those particular tables? If you look up the Import/Export commands, there is a TABLES parameter you can use to import/export only certain tables.
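To make that concrete, a table-limited export can be driven from a parameter file. This is only a sketch; the username, password, connect string, and table names below are placeholders you would replace with your own:

```
userid=app_user/app_pass@regional
tables=(customers, orders)
file=regional.dmp
rows=y
```

Save that as, say, regional.par and run `exp80 parfile=regional.par` on the regional server; at HO, `imp80` with the same TABLES list (and IGNORE=Y to skip "table already exists" errors) loads the dump. Note that imp80 does not deduplicate rows, so repeated loads of overlapping data still need handling on the HO side.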

Hope that helps... Terry M. Hoey
th3856@txmail.sbc.com
While I don't mind e-mail messages, please post all questions in these forums for the benefit of all members.
 
There are many possible solutions for your situation. If these servers are connected, you could establish a database link and just pull the data you want. I personally don't recommend links because of network traffic issues. You could also create a stored procedure that exports just the tables you want. Or you could create a separate schema for your regional office in the HO database, import your extracted data into it, and then write some PL/SQL to combine the regional data with your HO data.
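If the servers were networked, the database-link option could look like the sketch below; REGIONAL (the TNS alias), app_user, customers, and cust_id are all hypothetical names:

```sql
-- Run at HO; all names are hypothetical
CREATE DATABASE LINK regional_link
  CONNECT TO app_user IDENTIFIED BY app_pass
  USING 'REGIONAL';

-- Pull only rows not already present, so repeated runs don't duplicate data
INSERT INTO customers
SELECT *
FROM   customers@regional_link r
WHERE  NOT EXISTS (SELECT 1 FROM customers c WHERE c.cust_id = r.cust_id);
```

The NOT EXISTS guard is what the plain imp80 approach lacks, which is why that approach produced duplicate entries.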

Lots of options! You will have to decide which is best for your environment.
 
OK friends, thanks for your replies.

One thing I forgot to mention is that these servers are far apart, in different parts of the country, so no direct network is available. The only channel is email.
I would like a list of commands with which I can extract data from some tables. Can you give the commands?
I am somewhat new to Oracle.

I also want to know whether there is a way to ensure that data, once extracted, is not extracted again the next time.
 
If you wrote a query that created a fixed-width text file, you could use SQL*Loader to load the emailed data. It is hard to give an example without knowing the structure of your tables.

Terry M. Hoey
th3856@txmail.sbc.com
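As a rough illustration of this approach, assuming a hypothetical CUSTOMERS table with a numeric ID, a name, and a date column: a SQL*Plus script spools the fixed-width file at the regional office, and a SQL*Loader control file loads it at HO.

```sql
-- extract.sql (run at the regional office; all names hypothetical)
SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SPOOL customers.dat
SELECT RPAD(cust_id, 10) || RPAD(cust_name, 30) || TO_CHAR(created_on, 'YYYYMMDD')
FROM   customers;
SPOOL OFF
```

```
-- customers.ctl (SQL*Loader control file; load at HO with sqlldr80)
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE customers
( cust_id    POSITION(1:10)  INTEGER EXTERNAL,
  cust_name  POSITION(11:40) CHAR,
  created_on POSITION(41:48) DATE "YYYYMMDD" )
```

The column positions must of course match the widths used in RPAD/TO_CHAR; the emailed customers.dat file is plain text, so it travels fine as an attachment.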
 
Dear tcoffey,
Thanks for your suggestion.
Will you please explain those many possible methods for recurring data transfers?
My servers are a few km apart, so there is no chance of a physical connection; email is the only option. Is there a method to send the expdata.dmp file directly via email as an attachment to HO, where we can import it back into the HO server?

Thanks for help.
 
I think Terry's SQL*Loader approach will be the most flexible and reliable. I've worked on a similar problem, and this is how we solved it. BTW, it took us a while to write all the code.

1. At a certain time of day, a client job ran a query that created text files of all the data we needed. Each record was preceded by a two-digit 'record type', which told us which table the data had to go into. Each file had a '99' record at the end containing a count of the records in the file, and a '00' record at the start containing info such as the date the file was for. This extra info is important in case the email gets scrambled or cut off mid-transmission; if that happens, you don't want to be loading bad data.
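For illustration, a file in this scheme might look like the fragment below; the layout and record types are made up, you would define your own:

```
00 20010501
01 1001 SMITH  R07
01 1002 JONES  R07
02 1001 500.00
99 3
```

Here the '00' header carries the date the file is for, '01' and '02' rows load into two different tables, and the '99' trailer carries the count of data records (3). A load job can refuse any file whose trailer count does not match the rows actually received.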

2. The client had a Unix job that placed the data on a secure FTP server, in a specified directory, with a file name that was basically a file sequence number.

3. On our Unix box, shell scripts ran from cron, went to the FTP server, and looked for new files. If they found any, they FTPed them back to the Unix box and looked at the end of each file. If a file didn't end with the '99' record, we assumed the client was still writing it and did not use it. (The shell script checked every 10 minutes or so.) It kept a log of the files it was trying; if it tried more than 12 times (two hours' worth), it assumed the client's FTP had failed and raised an alert (an automatic page to our support).
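The completeness test at the heart of step 3 can be sketched in a few lines of shell; the file name and record layout here are hypothetical:

```shell
#!/bin/sh
# Create a tiny sample file that ends in the '99' trailer record
# (hypothetical layout, matching the record-type scheme described above).
printf '00 20010501\n01 1001 SMITH\n99 1\n' > sample.dat

# A file is considered fully transmitted only if its last line
# starts with the '99' trailer record type.
file_complete() {
    last=$(tail -n 1 "$1" | cut -c1-2)
    [ "$last" = "99" ]
}

if file_complete sample.dat; then
    echo "sample.dat: complete"    # prints: sample.dat: complete
fi
```

A cron wrapper would call a check like this on each fetched file, retry on failure, and raise the alert after the retry limit.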

4. If the file was transmitted OK, we SQL*Loaded it into staging tables in the DB. If we encountered any errors, the process paused and alerts were raised.

5. We then had PL/SQL packages/procedures that migrated this info into the tables proper. If any data looked dirty, it was not migrated, nor was any related info; it was marked as 'bad' in the staging tables and alerts were raised.
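Step 5 might be sketched as below, assuming a hypothetical staging table STG_CUSTOMERS with a STATUS column and a target CUSTOMERS table keyed on CUST_ID:

```sql
-- All table and column names are hypothetical
BEGIN
  -- Move clean, not-yet-seen rows into the real table;
  -- the NOT EXISTS guard prevents the duplicate entries
  -- a plain import would create.
  INSERT INTO customers (cust_id, cust_name)
  SELECT s.cust_id, s.cust_name
  FROM   stg_customers s
  WHERE  s.status IS NULL
  AND    NOT EXISTS (SELECT 1 FROM customers c WHERE c.cust_id = s.cust_id);

  -- Mark processed rows; dirty rows would instead be set to 'BAD'
  -- and left in staging for investigation.
  UPDATE stg_customers SET status = 'MIGRATED' WHERE status IS NULL;
  COMMIT;
END;
/
```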

6. Finally, the job sent out a few emails saying what was loaded, including one back to the client, so that when they came in in the morning they knew whether everything had gone OK the previous night.


 