I have a simple program that basically reads in, say, 15,000 lines and prints them back out again (eg: "print DF $rec" in a loop)...
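For context, here's a simplified sketch of the kind of loop I mean (the filenames, the DF-style handle, and the error checks are illustrative, not my exact code):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read all the records into memory, then write them back out.
    # "records.dat" / "records.new" are placeholder filenames.
    open(my $in, '<', 'records.dat') or die "can't open for read: $!";
    my @recs = <$in>;                # slurp roughly 15,000 lines
    close($in) or die "close after read failed: $!";

    open(my $df, '>', 'records.new') or die "can't open for write: $!";
    for my $rec (@recs) {
        # a failed print here would explain a truncated file
        print {$df} $rec or die "print failed: $!";
    }
    # buffered write errors only surface on close
    close($df) or die "close after write failed: $!";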
Over 99% of the time it works an absolute treat, but every now and then it seems to go wrong and only manages to write back a fraction of the records (eg: only 5,000-odd), therefore losing all the records towards the end.
It's almost as if it goes wrong or crashes part way through writing out the new version of the file!
Now, the software is run on a Unix box by hundreds of people, so I never see the problem happen myself, just the result: a massive number of the later records are missing from the file.
Is it likely that when Perl is dealing with a large (over 5 MB) file in memory it may go wrong randomly and simply give up while writing the adjusted file back out?
Could this account for my partially written-back file?
Or is Perl rock solid, meaning the problem must be in my code? My code is pretty simple, though, and runs hundreds of times a day with no problem.
Thanks!