
Building a tar file dynamically

Status
Not open for further replies.

ddrillich (Technical User), Jun 11, 2003
Good Day,

When running the following -

Code:
tar cvf alldocs.tar `find /opt/autonomy/Wordnik/tmp/23 -name SB*.djm`

we run into the 'bash: /bin/tar: Argument list too long' error.

Any ideas?

Regards,
Dan
 
There is a limit on command-line length on most systems, and you are exceeding it. There are two ways around it: one is xargs; the other is the -T (or --files-from) option, if it is available on your version of tar.

like:

Code:
find /opt/autonomy/Wordnik/tmp/23 -name 'SB*.djm' -print | tar cvf alldocs.tar --files-from -

so it reads the list from stdin.
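For GNU tar the same idea works, and a null-terminated variant (a sketch, using the path from the question) also protects against whitespace in filenames:

```shell
# GNU find/tar: -print0 and --null pair each filename with a NUL
# terminator, so spaces or newlines in names cannot split list entries;
# --null must appear before -T so it applies to the list being read.
find /opt/autonomy/Wordnik/tmp/23 -name 'SB*.djm' -print0 \
  | tar -cvf alldocs.tar --null -T -
```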
 
Scratch that; Solaris tar does not support --files-from.

xargs would be one way to do it.
 
No, xargs is no good either, because it would execute tar more than once, creating multiple archives with the same name and therefore overwriting all but the last.

Solaris tar does support the -I option though, which is equivalent to --files-from, but it doesn't understand that "-" means to read from stdin... so this is a situation where Process Substitution is useful, e.g.

Code:
tar cvf alldocs.tar -I <(find /opt/autonomy/Wordnik/tmp/23 -name 'SB*.djm')

I left out the -print since it's the default behaviour, and quoted the expression to prevent accidental matches by the shell in the current directory.

Annihilannic.
 
I'm actually on Linux, but I like this forum ;-).

elgrandeperro's suggestion works!

Thank you both.

Regards,
Dan
 
I thought you could do this (let me say, I don't know xargs as well as I should):

Code:
find /opt/autonomy/Wordnik/tmp/23 -name 'SB*.djm' -print | xargs tar cvf alldocs.tar

Or am I missing something?
 
The way xargs gets around the "command-line too long" problem is to call the specified command once for each batch of standard-input lines, where the batch size is chosen to fit within the operating system's command-line limit (unless the xargs behaviour is modified with switches). So if the find command produces more than, say, 32768 characters of output, xargs will run tar more than once, and each later tar c invocation clobbers the archive written by the one before it.
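The batching is easy to see by capping the arguments per invocation with xargs -n (a contrived sketch, not from the thread):

```shell
# xargs -n 2 passes at most 2 arguments per echo invocation,
# so echo runs 3 times for 5 inputs and prints 3 lines.
printf '%s\n' a b c d e | xargs -n 2 echo
```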

By way of example, let's say we have a script that just echoes all of the command line arguments into a file, where the file is the first parameter:

Code:
#!/usr/bin/ksh
file=$1
shift
echo "$@" > "$file"

If we generate the numbers 1 to 9999 and run the above script with xargs:

Code:
$ i=1; while ((i<10000)) ; do echo $i; ((i=i+1)); done | xargs ./echo_into_file testfile
$ cat testfile
9918 9919 9920 9921 9922 9923 9924 9925 9926 9927 9928 9929 9930 9931 9932 9933 9934 9935 9936 9937 9938 9939 9940 9941 9942 9943 9944 9945 9946 9947 9948 9949 9950 9951 9952 9953 9954 9955 9956 9957 9958 9959 9960 9961 9962 9963 9964 9965 9966 9967 9968 9969 9970 9971 9972 9973 9974 9975 9976 9977 9978 9979 9980 9981 9982 9983 9984 9985 9986 9987 9988 9989 9990 9991 9992 9993 9994 9995 9996 9997 9998 9999
$

You can see that the resulting file only contains the last chunk.

Annihilannic.
 
Thanks, my test case was not large enough. Tar with the concatenate option would have worked because that would allow tar to append to the archive.


 
Yep, that's a good idea! I think you mean the -r (or --append in GNU) option though; --concatenate seems to be for joining existing tar archives together.

Annihilannic.
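A sketch of that append approach with GNU tar (the find path is the one from the question): create an empty archive once, then let every xargs batch append with r instead of recreating with c:

```shell
# GNU tar: an empty member list from /dev/null yields an empty archive;
# each subsequent tar r invocation launched by xargs then appends to it
# instead of overwriting the archive created by the previous batch.
tar -cf alldocs.tar --files-from /dev/null
find /opt/autonomy/Wordnik/tmp/23 -name 'SB*.djm' -print \
  | xargs tar -rvf alldocs.tar
```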
 