
Oracle Table Space HELP PLEASE!!!!


jgrobler (Programmer), Jul 18, 2002
I have an 8 GB .DMP file which I cannot recreate.
I have a server with Oracle 8 Server and 45 GB of free space to play with.
I have tried to create a database and import the file. That is where it all goes wrong: it has problems allocating initial extents for some tables. I am not a DBA, but I need this database really badly. The database that I got the dump file from is around 12 GB in size. Is there a way that I can create a 24 GB database and import the file so that it will not have any problems allocating space during the import?
This is only going to be a temp database, so I don't care if it isn't pretty; I just need it to work.
Any help would be great.
 
If the export was done with parameter FULL=Y, you may try to import it using parameter FULL=Y.
--> This should recreate the tablespaces as they were in the exported database.
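A minimal sketch of that full export/import pair, assuming the dump really was taken with FULL=Y; the paths and the system password here are placeholders, not values from the thread:

```shell
# Hypothetical paths/passwords; adjust for your environment.
# A full export would have been taken like this:
exp system/manager FULL=Y FILE=/backup/full.dmp LOG=/backup/full_exp.log

# Importing with FULL=Y recreates tablespaces and users
# as they existed in the source database:
imp system/manager FULL=Y FILE=/backup/full.dmp LOG=/backup/full_imp.log
```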
 
Thanks for the help. The export was not done using that parameter. I don't suppose you know if it defaults to that value, do you?
Do I still need to create the tablespaces before trying to import, or do I just import and that creates them?
I think the problem stems from the fact that the export was done with COMPRESS=Y, which has lumped all the extents together and come up with initial extents of over 4 GB for some tables!
 
Do you have largefile support configured on your system? When the dump was created with the COMPRESS=Y parameter, the initial extent size determines the minimum size of the datafile for the tablespace. For example, if the initial segment size is 4 GB, then you must have at least one datafile that is large enough (e.g. 4 GB) to hold the initial extent.

HTH
 
If your export does not contain too many objects, you should create the objects you have problems with before you import the data.
1/ Import the object definitions only: use parameter ROWS=N in your imp script, and make a log.
2/ Read the log and verify which objects could not be created due to an initial extent problem.
3/ Get the creation script of these objects by using IMP with parameter SHOW=Y.
4/ Manually create these objects using SQL*Plus.
5/ Import the data.

--> There may be simpler ways. Does anyone have any other ideas?
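The steps above can be sketched as imp invocations; the file paths, password, and log names are placeholders, not from the thread:

```shell
# 1/ Import definitions only, no rows, and capture a log.
imp system/manager FILE=/backup/aia.dmp ROWS=N LOG=/backup/defs.log

# 2/ Scan the log for initial-extent failures
#    (typically ORA-01658, "unable to create INITIAL extent").
grep -i "ORA-01658" /backup/defs.log

# 3/ Dump the DDL for the failing objects; SHOW=Y writes the
#    statements to the log without executing anything.
imp system/manager FILE=/backup/aia.dmp SHOW=Y LOG=/backup/ddl.log

# 4/ Edit the extracted CREATE statements (smaller INITIAL extents)
#    and run them in SQL*Plus; then:
# 5/ Import the data, ignoring "object already exists" errors.
imp system/manager FILE=/backup/aia.dmp IGNORE=Y LOG=/backup/data.log
```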
 
It would help to know the EXPORT statement that was used to generate the DMP file, but I will make some assumptions without it.

Assuming the file does not contain objects owned by SYS or SYSTEM, or from the SYSTEM tablespace, a non-pretty solution would be:

Create a large tablespace called "DataLoader" with enough space to hold the incoming data.
CREATE TABLESPACE DATALOADER
DATAFILE
'<file path/name_1>' SIZE 2000M,
'<file path/name_2>' SIZE 2000M,
'<file path/name_3>' SIZE 2000M,
'<file path/name_4>' SIZE 2000M,
'<file path/name_5>' SIZE 2000M,
'<file path/name_6>' SIZE 2000M,
'<file path/name_7>' SIZE 2000M;

Create a user called "DataLoader", point that user to the "DataLoader" tablespace, and give that user UNLIMITED access to the DataLoader tablespace:
create user dataloader
identified by <password>
default tablespace dataloader
temporary tablespace <temp tablespace>
quota unlimited on dataloader
quota unlimited on <temp tablespace>;
grant connect to dataloader;

** Make sure your 'temp' tablespace is as large as possible, to hold the temporary segments that are created during the import process....

Do you know what user account was used when the data was exported? (SYS/SYSTEM or a regular user?)

Import the data using the SYS account (do you know the SYS or SYSTEM passwords ?) and do a FROMUSER/TOUSER to load the data as DataLoader user:
$ imp sys/<sys password> buffer=10485760 file=<dmp file> log=<logfile> FROMUSER=<users that were exported> TOUSER=dataloader commit=y

If there are multiple schemas in the dmp file, you might want to run this separately for each, varying the FROMUSER parameter for the specific user/schema name. This will break the load into a set of smaller, more manageable loads.

This process basically loads the entire dmp file into a single tablespace, with all objects being owned by the DataLoader user. Some of the sizes may not be adequate; I sized up based on your comments. This might take you a few tries with adjustments along the way. You can drop the "DataLoader" tablespace, rebuild it with different sizes, and retry as long as you need to.



=================================
Thomas V. Flaherty Jr.
Birch Hill Technology Group, Inc.
 
Thanks to all of you for your help.
The export was done using SYSTEM and the export parameter file looked like this:
FILE=e:\nib\proddmp\AIAPROD.dmp
STATISTICS=NONE
COMPRESS=Y
ROWS=Y
LOG=e:\nib\proddmp\AIAPROD.LOG
owner=AIAPROD
BUFFER=6000000
DIRECT=Y

AIAPROD is the schema owner and all the tables belong to it.
I have managed to get it to work by creating a user tablespace that started off at 9 GB and an index tablespace that started at 4 GB.
Does anyone know if there is any way to speed this up?
Is there a better way of making a copy of a database, rather than having to export to dump files and then import them?

Thanks again to everyone.
 
If the original database is on a different server than the one you are trying to create, you could perform a cold backup of the original, move the files to the new server, and restore it there.

Shut down the original database (immediate or normal).
Copy all database files, online redo logs, and control files to the other server.
Copy the init<sid>.ora and config.ora (if you have one).

To make things really easy, recreate the same file-structure locations from the original server (where your data, redo, and control files were located); if you followed OFA standards, this should be easy to do.

Once the files are placed in the same named locations on the new server, you can start the database.
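A rough sketch of that cold-backup copy, assuming a Unix-like server with ORACLE_SID=PROD and OFA-style paths; the hostname and file locations are placeholders, not from the thread:

```shell
# On the source server: clean shutdown so the datafiles are consistent.
svrmgrl <<'EOF'
connect internal
shutdown immediate
exit
EOF

# Copy datafiles, online redo logs, control files, and the init.ora
# to the same paths on the new server (placeholder locations).
scp /u01/oradata/PROD/*.dbf newhost:/u01/oradata/PROD/
scp /u01/oradata/PROD/*.log newhost:/u01/oradata/PROD/
scp /u01/oradata/PROD/*.ctl newhost:/u01/oradata/PROD/
scp $ORACLE_HOME/dbs/initPROD.ora newhost:$ORACLE_HOME/dbs/

# On the new server: with identical paths, just start it up.
svrmgrl <<'EOF'
connect internal
startup
exit
EOF
```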




=================================
Thomas V. Flaherty Jr.
Birch Hill Technology Group, Inc.
 
Good point, but keep in mind that the tablespace must be "self-contained" (i.e. not having any objects that reference objects in other tablespaces). You also need Oracle8i Enterprise to set up a transport set, and you have to temporarily make the tablespaces read-only to generate the set (does your environment allow you to do this?). If this is a data warehouse, transportable tablespaces are a feature to check out; otherwise it might be too much trouble.

jgrobler: I am not sure what you plan to do with the copied database, but you could also configure a "standby" database. Using ARCHIVELOG backup and recovery features, a standby database can be kept in sync with its source. Many folks use this to copy a production database to development on another server.

Some standby requirements:
The source database has to be configured for ARCHIVELOG mode.
The source and copy servers need the same version/patch level of the operating system, and *should* have the same version/patch level of ORACLE.
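A quick sketch of checking, and if necessary enabling, ARCHIVELOG mode on the source, using Server Manager as was usual for Oracle 8 (the exact tooling on your server is an assumption):

```shell
# Check the current log mode, then restart into ARCHIVELOG mode.
# ARCHIVELOG can only be enabled while the database is mounted, not open.
svrmgrl <<'EOF'
connect internal
archive log list
shutdown immediate
startup mount
alter database archivelog;
alter database open;
exit
EOF
```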

The complete ORACLE document for standby databases (technet):

Tom
=================================
Thomas V. Flaherty Jr.
Birch Hill Technology Group, Inc.
 
Thanks to all of you for your help. It is now up and running! But now the server network card has blown! Oh well, that's not my job.
 