
tarring files that are in several directories

Status
Not open for further replies.

bi

Technical User
Apr 13, 2001
1,552
US
I need help with a very simple script. I'm trying to tar up more than 1,300 files scattered across a directory tree. Here is what I do:

cd <rootdir>
find . -name "*.wdb" > /tmp/out.txt
cat /tmp/out.txt | xargs tar -cvf /tmp/out.tar

Although the last command makes it appear that all 1300+ files are being tarred up, when I untar the file, only about the last 512 files are untarred. The size of the tar file is 2.5 MB.

Does tar have a limit on the number of files you can have? If so, what are my alternatives other than tar? pax? cpio?

If tar doesn't have a limit on the number of files in a tar, what am I doing wrong?

thanks for any help.
 
Hi Bi,
I did a quick scan and took the following information from the man pages for tar, pax, and cpio: “Because of industry standards and interoperability goals, tar, pax, and cpio does not support the archival of files larger than 2GB or files that have user/group IDs greater than 60K. Files with user/group IDs greater than 60K are archived and restored under the user/group ID of the current process.”

The only suggestion I can offer is to split /tmp/out.txt into two separate files and then tar each half separately. You could use wc -l together with head and tail to read and tar each half of /tmp/out.txt. Just make sure you account for an odd number of files. I don't know enough about tar, pax, or cpio to tell you whether this will work or whether there are any potential problems, but I hope it helps.
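The split-and-tar idea above can be sketched roughly like this. Everything here is illustrative (the demo directory, file names, and archive names are made up), and it assumes a tar that can read a list of names from a file with -T (GNU tar; AIX tar spells the option -L):

```shell
#!/bin/sh
# Sketch: split a file list in half and tar each half separately.
set -e
demo=$(mktemp -d)
cd "$demo"
for f in a b c d e; do echo data > "$f.wdb"; done   # 5 sample files
find . -name "*.wdb" > list.txt

total=$(wc -l < list.txt)
half=$(( (total + 1) / 2 ))           # round up so an odd count is covered

head -n "$half" list.txt            > part1.txt
tail -n +"$(( half + 1 ))" list.txt > part2.txt

# Each half is now small enough to pass on one command line
# (these demo names contain no spaces, so $(cat ...) is safe here).
tar -cvf part1.tar $(cat part1.txt)
tar -cvf part2.tar $(cat part2.txt)
```

Note the drawback Bobby mentions: you end up with two archives, and both must be restored.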

I hope someone else can offer an actual solution. In the meantime you may want to read the man pages for any suggestions that may help, as well as any other size restrictions. Also, if you are dealing with HP-UX, check your service package level. If you have the HP-UX Gold Package you can call HP without any additional charges and enlist their help in getting a solution. I don't know the coverage levels offered on other packages.

When you get it to work, please post your solution. Thanks.
-Bobby s-)
bwgunn@icqmail.com
 
Thanks, Birbone.

None of the files were bigger than 2 GB and the UID/GID were both 1004.

Here is the solution I found. It isn't too elegant, but it works:

First, I create a tar file and tar a dummy readme file into the tar file. We'll call the tar file /tmp/wdb.tar. (Just touching a file in /tmp wouldn't allow for the -r option in the tar command.)

Then, I cd into the root directory and type this:

find . -name "*.wdb" -exec tar -rvf /tmp/wdb.tar {} \;

Using the -c option with find -exec would re-create the archive for every file, leaving only the last file in the tar. That's why I had to create a small tar file first and then use the -r option.
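bi's workaround can be sketched end to end like this (all names are illustrative; the README is the dummy file used to seed the archive, since on AIX tar -r wants an existing, valid archive):

```shell
#!/bin/sh
# Sketch: seed a valid tar archive, then append to it with -r.
set -e
demo=$(mktemp -d)
cd "$demo"
mkdir -p sub
echo x > a.wdb
echo y > sub/b.wdb
echo placeholder > README

tar -cvf wdb.tar README                            # seed a valid archive
find . -name "*.wdb" -exec tar -rvf wdb.tar {} \;  # append one file per exec
```

One caveat: this runs a separate tar process for every file, which is slow for 1,300+ files, but unlike -c, each -r invocation only appends, so nothing is lost.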

Other variations of the cat solution I tried earlier just never seemed to work on the number of files I had to tar up.

Thanks for your help.



 
bi

I think trying to tar up with xargs didn't work because you were using:

xargs tar -cvf x.tar

Let's say you have 5 files: a, b, c, d, e. Now xargs may run:

tar -cvf x.tar a b c d e

But xargs could also have run twice, like:

tar -cvf x.tar a b c
tar -cvf x.tar d e

The -v option would have reported all files being added to x.tar, but each -c invocation wipes the previous version of x.tar! My guess is that xargs on your system passes at most 512 arguments per invocation. To verify this, use:

xargs -t <command>

Which should echo each constructed command before it is invoked.
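You can also demonstrate the clobbering directly by forcing xargs to split its input into small batches with the standard -n option (the file names and archive name below are made up for the demo):

```shell
#!/bin/sh
# Demonstrate why "xargs tar -cvf" loses files when xargs splits its input:
# each batch re-creates the archive, wiping the previous batch.
set -e
demo=$(mktemp -d)
cd "$demo"
for f in a b c d e; do echo data > "$f"; done

# Force xargs to run tar with at most 3 arguments per invocation:
#   tar -cf out.tar a b c      (first batch)
#   tar -cf out.tar d e        (second batch overwrites the first)
printf 'a\nb\nc\nd\ne\n' | xargs -n 3 tar -cf out.tar

tar -tf out.tar    # only the last batch (d and e) survives
```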

To get the tar working for xargs, just replace -c with -r.

Hope this helps ;-)

Cheers, Neil
 
I tried that. The -r option adds the file to an existing tar. If the tar isn't already there, it won't do anything but give you an error: "A file or directory in the path name doesn't exist." If you touch a file and give it a .tar extension and try to tar to that file, you get "Unexpected end of file while reading from the storage media."

But, here's something I discovered, too. The solution isn't quite as good as I originally thought:

It searches for the extension all the way down the directory tree and then it starts back at root.

I'm new at this and wonder what I'm doing wrong. I want to create a relative tar file so the contents can be untarred on any system.

 
Most versions of tar have a file-list option; see your local man pages.

I would try:

cd <rootdir>
find . -name "*.wdb" > /tmp/out.txt
tar -cvf /tmp/out.tar -T /tmp/out.txt

or

tar -cvf /tmp/out.tar -F /tmp/out.txt
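A self-contained sketch of the file-list approach (the demo directory and names are illustrative; -T is the GNU tar spelling, and as noted below AIX uses -L). Because a single tar invocation reads every path from the file, there is no argument-count limit and -c is safe:

```shell
#!/bin/sh
# Sketch: one tar invocation reading all file names from a list file.
set -e
demo=$(mktemp -d)
cd "$demo"
mkdir -p x/y
echo 1 > x/a.wdb
echo 2 > x/y/b.wdb

find . -name "*.wdb" > list.txt
tar -cvf out.tar -T list.txt    # GNU tar; on AIX use -L list.txt

tar -tf out.tar   # archive keeps the relative ./x/... paths
```

Since find is run from the tree's root, the archive stores relative paths, so it can be untarred anywhere, which also addresses bi's earlier question about making the tar relative.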



Tony ... aka chgwhat
tony_b@technologist.com

When in doubt,,, Power out...
 
Thanks. I thought there was such an option, but couldn't find it in the man page. In AIX, the option is -L.
 