littleIdiot
MIS
Hi,
I have been presented with a large (200GB) tar file, and I am trying to reduce it to manageable (2GB) chunks without corrupting the data contained within.
I could use "split" to split it up, but then each individual chunk is useless on its own, and reassembly relies on a future OS being able to create a 200GB file - which is not guaranteed.
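To illustrate the split approach I'm wary of - a minimal sketch, with made-up file names (big.tar, chunk_), using a small random file as a stand-in for the 200GB archive:

```shell
# Round trip: split a file into chunks, then reassemble with cat.
set -e
dd if=/dev/urandom of=big.tar bs=1024 count=512 2>/dev/null  # stand-in for the real archive
split -b 100k big.tar chunk_       # for the real case: split -b 2G
cat chunk_* > rejoined.tar         # reassembly needs room for the whole 200GB file
cmp big.tar rejoined.tar && echo "identical"
```

This works because split's default suffixes (aa, ab, ...) sort lexicographically, so the `chunk_*` glob concatenates them in the right order - but the catch is exactly as described: you must be able to hold the full rejoined file.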
I have looked through the man page for tar and see the "-k size" option, so I'm thinking of untarring the data, then re-tarring it using this -k option.
My question is this: when tarring into specified sizes, will a file that would straddle two adjacent tar volumes be split across them (and thus potentially corrupted), or is tar smart enough to leave out a file that would take the current volume over the size limit and put it into the next volume instead?
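For comparison, GNU tar's multi-volume mode (-M with --tape-length/-L) does split a file's data across adjacent volumes, but it records continuation metadata so extraction reassembles the file losslessly. A sketch, assuming GNU tar is available (the "-k size" syntax may belong to a different tar implementation) and using made-up names:

```shell
# GNU tar multi-volume sketch: -M = multi-volume, -L 150 = 150 KiB per volume.
# Repeating -f gives tar the volumes in order, so it never prompts interactively.
set -e
mkdir -p demo
dd if=/dev/urandom of=demo/data.bin bs=1024 count=300 2>/dev/null  # 300 KiB test file
tar -c -M -L 150 -f vol1.tar -f vol2.tar -f vol3.tar demo          # data.bin spans volumes
rm -r demo
tar -x -M -f vol1.tar -f vol2.tar -f vol3.tar                      # tar rejoins the pieces itself
```

So with GNU tar at least, the answer would be: yes, individual files are split between volumes, but no, that is not corruption - the extract step stitches them back together. Note that a multi-volume set, like split chunks, is only useful as a complete set.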
I hope this makes *some* sense to *someone*!
I guess I could run it and see, but just wanted some tek-tips advice!
Cheers,
littleIdiot