
Database Migration


tayoo
Mar 14, 2007
Hello All,

I am planning to migrate a database from one location to another.

I have prepped my target database; all I need to do now is export/import the dump files.

However, this source database is being updated continuously, and there is little chance of shutting down the source, migrating, and switching users to the target.

There are two ways to make sure the databases stay in sync:
1) Use exp/imp on day one. Use export/import again on day two, exporting only the tables that have been updated since day one by means of a WHERE clause in the query.
2) Use advanced replication to create a materialized view; the materialized view log on the source database will record all the changes that have occurred.

Please advise on the simpler option, because I have not used option 2, though I have used option 1 without the query clause.

Thanks.
 
Tayoo said:
this source database is being updated continuously, and there is little chance of shutting down the source, migrating, and switching users to the target.
Even if you can "auto-magically" get the data on the target machine, what is your plan for "cutting over" your users from the source machine (that is always in use) to the target machine?

It seems to me that even under the best of conditions, unless you have a "seamless-cutover" strategy fully tested on a test machine and in place, there will be at least a second or two during cutover when the users are not connected to the source database.

There certainly are methods for a (relatively) seamless cutover: 1) Standby database and 2) Replication. Neither of these options is for the faint-of-heart. You must practise with either of these methods...I would never recommend performing a cut-over of a heart-of-the-business, high-availability system with having fully tested the features on a non-production environment first.
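For the replication route, the basic building blocks are a materialized view log on each source table and a fast-refreshable materialized view on the target, pulled over a database link. A minimal sketch, with made-up object and link names, would look something like this:

-- On the source database: record changes to the table in a materialized view log
CREATE MATERIALIZED VIEW LOG ON exch.orders WITH PRIMARY KEY;

-- On the target database: a fast-refreshable copy pulled over a database link
CREATE MATERIALIZED VIEW exch.orders_mv
REFRESH FAST ON DEMAND
AS SELECT * FROM exch.orders@source_link;

-- Each on-demand refresh then applies only the changes recorded in the log
EXEC DBMS_MVIEW.REFRESH('EXCH.ORDERS_MV', 'F');

Multiply that by every table involved, plus the scheduling of refreshes, and you can see why it needs rehearsal.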

[santa]Mufasa
(aka Dave of Sandy, Utah, USA)
[I provide low-cost, remote Database Administration services: www.dasages.com]
 
Perhaps there is a little window (during off hours) that we can use to shut down the source database, refresh the target database with all the new changes, and switch users to the new database via new connection strings.

I might be inclined to use the standby database.

Tayo
 
If your source and destination O/S are the same, standby would probably be best. At the moment I am looking at migrating a 1 TB database from Linux to AIX with minimal downtime, and because the endian formats are different, I am using Data Pump to create the destination database and Oracle Streams to capture any changes that occur while the import is taking place.
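The key is making the Data Pump export consistent to a known SCN, so that Streams can start capturing from that same point and nothing is lost or applied twice. Very roughly, and with placeholder names throughout:

-- On the source: note the current SCN; the export will be made consistent to it
SELECT dbms_flashback.get_system_change_number FROM dual;

-- Data Pump export consistent to that SCN (credentials, SCN and file names are placeholders)
expdp system/password full=y flashback_scn=1234567 dumpfile=mig_%U.dmp directory=DATA_PUMP_DIR logfile=mig_exp.log

Streams capture and apply are then configured to start from that same SCN, so any changes made on the source while the import is running get replayed on the destination afterwards.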
 
...and, of course, I wouldn't notice until several hours later what a goofy wording mistake I made in my earlier post:
Mufasa's earlier post said:
I would never recommend performing a cut-over of a heart-of-the-business, high-availability system with having fully tested the features on a non-production environment first.
Hopefully, it is obvious that I meant to say:
Correction said:
I would never recommend performing a cut-over of a heart-of-the-business, high-availability system without having fully tested the features on a non-production environment first.
...Apologies.[blush]

[santa]Mufasa
(aka Dave of Sandy, Utah, USA)
[I provide low-cost, remote Database Administration services: www.dasages.com]
 
I just got word that we don't even have enough space for a standby database.

A standby database would require approximately the same amount of space as the source database. Also, the source is on Windows while the target is on Solaris.

This gives me the idea of using the following steps:

1) Do a full database export from the source on day one.
2) Import the full database into the target database.
3) Take advantage of the short window each night by doing another export of the source, this time exporting only changed objects, one schema at a time.
4) Switch users to the target once the refresh in step 3 is completed.

To accomplish step 3, I am planning to export a subset of a schema. I only want to copy the objects that have been changed since yesterday (for example). Please note the query clause.

exp parfile=J:\EXPORTS\EXCH.parfile

EXCH.parfile:
userid=system/mydba1
full=N
buffer=200000000
log=J:\EXPORTS\EXCH.log
OWNER=(EXCH)
file=(J:\EXPORTS\DUMPFILES\EXCH_01.dmp,
J:\EXPORTS\DUMPFILES\EXCH_02.dmp,
J:\EXPORTS\DUMPFILES\EXCH_03.dmp,
J:\EXPORTS\DUMPFILES\EXCH_04.dmp)
query=\"where changedate=20070316\"
filesize=2000000000
feedback=100000
direct=y
COMPRESS=N
GRANTS=Y
INDEXES=Y
ROWS=Y
RECORDLENGTH=65535
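One catch I need to work around: exp's QUERY parameter is valid only for table-mode exports (TABLES=...), not for OWNER= mode, and it cannot be combined with DIRECT=y. The same WHERE clause is also applied to every table listed, so each of them must have the changedate column. So the nightly run would really have to look more like this (the table names below are just placeholders):

# nightly "changed rows" export: table mode, conventional path
userid=system/mydba1
tables=(EXCH.ORDERS,EXCH.ORDER_LINES)
query=\"where changedate = 20070316\"
direct=n
file=(J:\EXPORTS\DUMPFILES\EXCH_CHG_01.dmp)
log=J:\EXPORTS\EXCH_CHG.log
feedback=100000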
 
Tayoo,

Your plan sounds workable. I highly recommend a test run of your plan, since the target-database migration does not require an actual cutover, nor does it impact the source database in any fashion.

The "standby database" would not work unless the two machines have identical file system structures (which Solaris and Windows do not).

Let us know how things go for you.

[santa]Mufasa
(aka Dave of Sandy, Utah, USA)
[I provide low-cost, remote Database Administration services: www.dasages.com]
 
If you are using Enterprise Edition, why not investigate using Oracle Streams to replicate the data?
 
Why not simply use a consistent export? This will perform a point-in-time export that is consistent across all tables. However, make sure your rollback segments are big enough to hold all the changes made while the export is running. The following is from the export manual.

CONSISTENT
Default: N
Specifies whether or not Export uses the SET TRANSACTION READ ONLY statement to ensure that the data seen by Export is consistent to a single point in time and does not change during the execution of the export command. You should specify CONSISTENT=Y when you anticipate that other applications will be updating the target data after an export has started.
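In practice that would just mean one more line in your full-export parameter file, along these lines (paths are only illustrative):

# full, point-in-time consistent export
userid=system/mydba1
full=Y
consistent=Y
file=J:\EXPORTS\DUMPFILES\FULL_CONSISTENT.dmp
log=J:\EXPORTS\FULL_CONSISTENT.log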

Bill
Oracle DBA/Developer
New York State, USA
 
After all the preparation, I was able to migrate the database this past weekend.

It was mostly successful except for two issues. Note that my target database is 9i on Linux. Here are my outstanding issues:

1) In the target database, I have a missing datafile. Note that this database was already in partial use. During my prepping, I accidentally deleted a datafile that I thought I would not be needing, because the client had another DBA create a user that uses the tablespace belonging to this datafile.
I can safely recover this datafile by shutting down the database, starting up in mount mode, taking the datafile offline, and issuing RECOVER DATAFILE 'XXXX' (a rough sketch of the commands is below). This brings me to my second issue.
2) While I was loading via import, I noticed that my oraarchive directory was filling up badly and the database was hanging. I quickly deleted the xxxxxx.arc files to make room, all the while not making sure that my backup was up to date.
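For issue 1, the sequence I have in mind is roughly the following, run from SQL*Plus as SYSDBA (the datafile name is only a placeholder, and it assumes a restorable copy of the file and the needed archive logs are still available):

SHUTDOWN IMMEDIATE
STARTUP MOUNT
ALTER DATABASE DATAFILE '/u02/oradata/TARGET/exch_data01.dbf' OFFLINE;
RECOVER DATAFILE '/u02/oradata/TARGET/exch_data01.dbf';
ALTER DATABASE DATAFILE '/u02/oradata/TARGET/exch_data01.dbf' ONLINE;
ALTER DATABASE OPEN;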

Now, after all the work, I wanted to back up the database and it was complaining about the missing archive logs.

Please, how can I back up the whole database without the missing archive log files?

Your help will be highly appreciated.

Tayo O
 
I take it you're using RMAN to do the backup? If so, perhaps consider using an alternative cold backup procedure that doesn't 'know' or 'care' about archive logs.
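A minimal sketch of that idea (the path is a placeholder): shut the database down cleanly, copy the files at the operating-system level, then bring it back up. No archive logs are involved at all.

-- from SQL*Plus as SYSDBA
SHUTDOWN IMMEDIATE
-- with the instance down, copy all datafiles, control files and online redo logs
-- (e.g. everything under /u02/oradata/TARGET) to the backup destination at OS level
STARTUP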

I want to be good, is that not enough?
 
