
failed to copy 1 huge file


ady2007 (IS-IT--Management), Oct 3, 2007
I had a problem copying one huge file (more than 70 GB) from a USB disk (ext3) to a Linux desktop. The copy gets through less than 10% of the file and then fails.
Any suggestions?

 
 
The failure to copy could be due to the power requirements of the USB disk.
If possible, connect it to an external PSU (dual USB connectors are not always good enough).
 
I'll second checking ulimit - I have had problems with that in the past, but it has been so long since I had to change it that I don't recall what the default is anymore.
 
# ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
file size (blocks, -f) unlimited
pending signals (-i) 1024
max locked memory (kbytes, -l) unlimited
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
stack size (kbytes, -s) unlimited
cpu time (seconds, -t) unlimited
max user processes (-u) 397824
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
Any suggestions?
Thanks
 
So ulimit is not the problem. Are there any disk quotas?
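
If quotas are a possibility, something like this would show them (the username is just an example, and these only report anything if quotas are actually enabled on the filesystem):

# quota -s -u ady2007    # per-user usage and limits, in human-readable sizes
# repquota -a            # summary for every filesystem that has quotas enabled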
 
Hey, this is a really dumb answer, but when I set up my systems I intentionally partition certain key areas into their own space so that if someone is naughty it doesn't bring down the whole system. For example, my /var and /var/ftp areas have their own partitions of a fixed size dedicated just to them.

When you set up your disk, if you made a separate partition for the users' /home directories, how big did you make that partition? Is it possible that there simply isn't enough space in that particular partition to hold a file that big? On my system I only have 20 GB budgeted for users' home directories because I don't expect any users other than me to have anything stored in them (see the df check below)....

Just another thought....
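
As a quick sanity check, something along these lines would show how much room the destination really has (the paths are only examples, point them at wherever you are copying to):

# df -h /home                    # free space on the filesystem holding /home
# df -h /path/to/destination     # same check for the actual directory you are copying into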
 
What exactly is the failure?
Is anything being logged in /var/log/messages?

Is it a read error? I would run "wc -l THEUSBFILE".
This is a simple test; wc has to read every byte to do its counting, so it verifies that the file can be read.

But first, in the directory you are writing to, run "df -k ."
to verify you have enough space and that you are writing to
the right device.
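
Roughly like this (the mount points are just examples, adjust them to your system):

# cd /path/where/you/are/writing
# df -k .                          # free space, and which device/filesystem this directory really lives on
# wc -l /media/usbdisk/hugefile    # forces every byte to be read, so a clean run means the source is readable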
 
Free hard disk space was not an issue,
because I had 174 GB on the local hard disk and a NAS attached.
The error message is "Input/Output Error".
Thanks
 
What sort of filesystem are you writing to? Does it have or require a largefiles mount option? How big was the partial file that was created before the copy failed (assuming one was left behind)?
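
For what it's worth, on Linux ext3 it is normally the large_file filesystem feature rather than a largefiles mount option; something like this should show the relevant bits (the device name is just an example):

# mount                                                          # mount options currently in use for each filesystem
# tune2fs -l /dev/sda1 | grep -i -e large_file -e 'block size'   # large_file feature flag and block size of an ext3 volume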

Annihilannic.
 
The USB hard disk uses ext3, and the local hard disk on the Linux workstation also uses ext3.
 
another thing comes to mind...

remove the USB drive from its casing and physically attach it to a controller that is built into the PC you are trying to copy it to...

and then copy it from there; this would take the USB controller and USB drivers out of the equation...



Ben
"If it works don't fix it! If it doesn't use a sledgehammer..."
How to ask a question, when posting them to a professional forum.
Only ask questions with yes/no answers if you want "yes" or "no"
 

Yeah, I am still not sure whether you are writing to the NAS or the hard disk. Do the df thing to make absolutely sure you know which physical device you are writing to. You said a "NAS" is attached; is this an automount?

If you did the "wc -l", that eliminates the USB side because it reads all the bytes.
 
Yes, I used NFS, so I can copy files to the Network Attached Storage.
 
So we know the reading is from the USB device.

What I am unclear about is where you are writing. Use
"df directory_that_you_are_writing" to figure out which physical device you are writing to. Is it NFS or
an ext3 device? If ext3, what is the block size?
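
Something like the following should settle it (the directory is just a placeholder):

# df -T directory_that_you_are_writing      # the Type column shows nfs vs ext3, plus the underlying device
# stat -f directory_that_you_are_writing    # filesystem type and block size as the kernel sees them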
 
BigDog,

the OP clearly states what he is attempting to do in the very first post, "from a USB disk (ext3) to a Linux desktop", and in a subsequent post "Free hard disk space was not an issue, because I had 174 GB on the local hard disk and a NAS attached."...

so where is the confusion?

Neither the receiving HDD's block size nor the source HDD's block size should matter here, as the FS is capable of handling a file that size. It is my opinion that the problem lies in the USB protocol and/or driver timing out, hence my suggestion to take the drive out of the USB enclosure and attach it physically to the mainboard's controller...
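
Either way, that "Input/Output Error" should leave a trail in the kernel log; right after a failed copy attempt, something along these lines is worth a look:

# dmesg | tail -n 50                                  # recent kernel messages; USB resets and timeouts show up here
# grep -i -e usb -e 'i/o error' /var/log/messages     # the same sort of thing from the syslog side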



Ben
"If it works don't fix it! If it doesn't use a sledgehammer..."
How to ask a question, when posting them to a professional forum.
Only ask questions with yes/no answers if you want "yes" or "no"
 
I'm not going to argue my point, but 174 GB of free hard disk space does not mean that the drive does not have multiple filesystems/partitions, and the partition where /home resides may only have 20 or 30 GB of free disk space.

Some distros, when set up, will choose your drive configuration for you, and the partition defaults are not always the best choice for every setup.... Usually there is at minimum a / partition, a swap partition, a /usr partition, and a "/everything else" partition if the disk is that big...
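
If in doubt, the layout is easy to confirm (fdisk needs root, and the device name is just an example):

# cat /etc/fstab       # which partitions are mounted where
# fdisk -l /dev/sda    # partition table and sizes of the first disk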

 