
How do I export a table that has data > 2GB?


bbanner
Programmer
Aug 3, 2001
I recently inherited a database that I need to back up/export. There is one table with ~36 million records, each holding ~8000 bytes of data, so I've found I can't use the exp utility to export the data: the resulting file is bigger than 2GB.

Unfortunately, I don't think the table was partitioned when it was created so I can't specify a table partition with the exp command.

Does anybody have any suggestions on how I can most effectively export this data to back it up?
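
For reference, this is roughly the kind of straight export that fails once the dump file crosses 2GB (the username/password and the table and file names here are just placeholders):
Code:
exp system/manager tables=big_owner.big_table file=big_table.dmp log=big_table.log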

Thanks,
Banner
 
You can try using the
Code:
ROWS=N and INDEXFILE=filename
parameters to see what is actually in the export file, then see if, by using the
Code:
TABLES= or FROMUSER=
parameters, you can reduce and 'split' the import data.
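
A minimal sketch of that approach, assuming a dump file named expdat.dmp and a schema called SCOTT (both placeholders). Note that ROWS=N and INDEXFILE= are imp parameters, used here to inspect the dump without loading any rows:
Code:
# Write the DDL in the dump file to contents.sql without importing any data
imp system/manager file=expdat.dmp full=y rows=n indexfile=contents.sql

# Then pull in just a subset, one schema or table list at a time
imp system/manager file=expdat.dmp fromuser=scott touser=scott tables=(emp,dept)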


hth
 
Your best chance to get this to work is if you can get the export to write directly to tape.
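
If a tape drive is available, exp can write to the device directly and sidestep the 2GB filesystem limit entirely; a sketch along these lines, where the device name and the VOLSIZE value are assumptions for your platform:
Code:
# Write the dump straight to tape; VOLSIZE caps what goes on each volume
exp system/manager tables=big_owner.big_table file=/dev/rmt/0 volsize=4GB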

Another possibility is to redesign the table to make it partitioned.

As a last resort you might have to abandon your intention to export the table and rely on tablespace backups instead. That would make some recovery scenarios more complicated, but the advantage is that you would be backing up individual datafiles that presumably are more reasonable in size.
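
For the tablespace-backup route, a minimal hot-backup sketch (this assumes the database runs in ARCHIVELOG mode; the tablespace and path names are placeholders):
Code:
alter tablespace big_data begin backup;
-- copy the tablespace's datafiles at the OS level, e.g.
-- cp /u01/oradata/PROD/big_data01.dbf /backup/
alter tablespace big_data end backup;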
 
Hi,
I misread the original post, so I did not deal with the export issue, just an import one (a supermodel moment, obviously [smile]).

The basic idea is the same, however: just export in stages
(user by user), then do the same when importing.
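
A rough sketch of the staged approach, with the schema names as placeholders:
Code:
# Export each schema to its own, smaller dump file
exp system/manager owner=scott file=scott.dmp log=scott_exp.log
exp system/manager owner=hr file=hr.dmp log=hr_exp.log

# Later, import each one back the same way
imp system/manager fromuser=scott touser=scott file=scott.dmp
imp system/manager fromuser=hr touser=hr file=hr.dmp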

 
Unfortunately, exporting in stages won't work in this case. The problem is specifically one huge table of 36 million rows with an average of 8000 bytes per row. If these numbers are accurate, just this single table takes up about 270 gigabytes (36,000,000 rows × 8,000 bytes ≈ 288 × 10⁹ bytes), making it pretty hopeless to try exporting in stages.
 
I have to admit that I am a novice when it comes to Oracle DBA work... What exactly is required to "redesign the table to make it partitioned"? Would that mean creating a new partitioned table and then selecting into it? Or are there ways to modify the existing table's structure in order to partition it (without having to actually move all the data)?
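
For example, is it something along these lines that's needed (the table, column, and partition names here are just my guesses)?
Code:
create table big_table_part
partition by range (created_date)
( partition p_old values less than (to_date('2001-01-01','YYYY-MM-DD')),
  partition p_new values less than (maxvalue) )
as select * from big_table;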

The data in the table (based on the data types of all the fields and the number of records) is close to 288GB. I'm trying to do this as efficiently as possible so I'm not waiting days/weeks for the backup...
 
Ah, it's possible you are greatly exaggerating the size of your table. I thought that 8000 bytes per row looked suspiciously high.

It's usually not accurate to estimate size based on the maximum size of a row. Varchar2 data, in particular, rarely occupies anywhere close to its defined maximum.

The simplest way to find actual table size is to query dba_segments. That will give you the actual space allocation. Your export size may be much lower, because it won't contain any free space, which your table probably does.

Code:
select owner, segment_name, bytes/1024/1024/1024 as gb
from dba_segments
where owner = 'SCHEMA_OWNER'
  and segment_name = 'YOUR_TABLE';
 
I would also have to suggest that you take a close look at the system you inherited. What backup method was being used before you arrived? I mention this because normally exports are a secondary backup method. It's good if you want to supplement regular online or cold backups with this export. It's very dangerous, however, to rely exclusively on exports. If you have to recover, you can't forward recover from an export. You will also probably have synchronization problems. An export this large figures to take a considerable amount of time. How do you plan to handle updates that happen while the export is in progress?
 
Well, the problem is that no backup method has been implemented yet. Further, the system was hacked, and we are unsure how the hackers got in (since we didn't build the system). The goal is to back everything up, wipe the disk and reinstall the OS, Oracle, etc., then import the data.

The good news is that this is a "read-only" table that will not experience updates while exporting, so there won't be any synchronization issues. I looked at the actual table size and it looks like it is more like 17GB (lots of free space).

So, that being the case, do you think it still makes sense to redesign the table to make it partitioned? If so, what's the best method to do that?

Thanks for all the help thus far!
 
I see no reason why you can't just make a cold backup of all your datafiles, control files, redo logs, and init.ora. Then you can restore these files and be ready to go after cleaning your system.
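
A sketch of how you might enumerate everything the cold backup needs to include (run these, shut the instance down cleanly, then copy the listed files plus init.ora at the OS level):
Code:
-- List every file that belongs in the cold backup
select name from v$datafile;
select member from v$logfile;
select name from v$controlfile;

-- Then stop the instance before copying
shutdown immediate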

Do you have any reason to believe the hackers compromised your database in a way that forces you to import to a completely new instance?

Since this is a one time situation, I would avoid partitioning. You want a quick fix, not added complexity.
 