My .TAR predicament. Help me please!


Methodh2k (Technical User), May 8, 2005
Hello, the problem is...
I moved servers... I have my whole site in a .tar.
It comes to 60 GB... YEP!

OK...

Problem is, when it extracts it runs out of space :/ 60 + 60 = 120, and I only have a 120 GB drive on my dedicated server!

So...

I'm stuck :( I have a tar file that I can't extract because I run out of space, and my host tells me I should get a larger HD...

But is there a better alternative that doesn't cost more money?

I'm not that experienced when it comes to tars on Linux...

But is there a way of extracting half of the tar... then deleting that content from the tar... then extracting the other half?

Or maybe a feature that deletes the files as they are extracted from the tar...

Or maybe you have another idea...

I only run out of space by about 100 MB...

Hope someone can give me some pointers, I'm really stuck!

This is a Linux server running:
CentOS
SSH
FTP
Apache 2
cPanel 10
Telnet

Many thanks in advance!

Ryan
 
You can extract only a portion of the tar, based upon pathname(s).

# tar xvf tarfile ./pathname
(the exact path depends on how the files were written into the archive)
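
If you're not sure what the stored pathnames look like, you can list them first and then pull out one directory at a time. A minimal sketch, assuming the archive is called tarfile as above (./public_html is only an example name):

# tar tf tarfile | less
(lists the stored pathnames so you know exactly what to ask for)
# tar xvf tarfile ./public_html
(extracts just that one directory)

Whether the entries start with ./ or not depends on how the archive was created, which is why listing first helps.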

If you can extract just what you need, then you probably have enough space. However, I don't know of any way to have the tar shrink as it's being extracted.
If you have access to another system, you can drop the tarfile onto that system and untar to the new box over the network. You'd have to establish "user equivalency" on both systems (even if only temporarily) to accomplish this.
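
A rough sketch of that over-the-network idea, using ssh rather than old-style rsh user equivalency (user@helper, /tmp and the archive name site.tar are placeholders):

# scp site.tar user@helper:/tmp/
(parks the archive on the other box)
# rm site.tar
(frees the ~60 GB the archive was occupying locally)
# ssh user@helper 'cat /tmp/site.tar' | tar xvf -
(streams it back and extracts straight from stdin)

Done this way, the full archive and the extracted tree never have to sit on the dedicated server's disk at the same time.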
 
Have you tried compressing or gzipping the tar file? If not, do so and then you can use the following:

zcat <compressed tarfile> | tar xvf -

Might work in your circumstances.
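
A minimal sketch of that, with site.tar standing in for whatever your archive is called:

# gzip site.tar
(produces site.tar.gz and removes the original .tar)
# zcat site.tar.gz | tar xvf -
(decompresses on the fly; tar reads the stream from stdin)

Because zcat streams the decompressed data straight into tar, only the compressed copy has to sit on disk while the files are being extracted.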
 
I'm guessing that a 60 GB site will have a lot of pictures in it (otherwise it's a HUGE amount of text). JPEGs are already compressed, so you may not get much help from gzip...
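
One quick way to gauge whether gzip will buy you anything is to compress a sample of the archive and see how much it shrinks. A rough check, again assuming the archive is called site.tar:

# dd if=site.tar bs=1M count=100 | gzip -c | wc -c
(compresses the first ~100 MB and counts the output bytes)

If the number comes back close to 104857600, the contents are already compressed and gzipping the whole archive probably won't save you much.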
 