And the question is?.... ;^)

A little more info please. For example, what do your original files look like, and how do you want to divide them?

Thanks,

Andy
OK! The file is a data file made of characters, with line feed and carriage return line endings. I want to split it into many files: take 66 lines from the original file and write them out as a new file, and do that for every 66 lines of data. I'm not a beginner in programming, but I'm just starting with Perl, and I need help learning how I can do this. P.S. I'm running under DOS.
OK, I've got an idea for a quick and dirty script that should do what you need, but I don't want to post it yet until I've had a chance to try it myself. (There's nothing worse than feeding a beginner bad information :^)

I'll have a look at it later and try to get it posted tomorrow. That way you should get to see it before the world ends at midnight on New Year's Eve ;^)

Bear with me, and we should be able to sort something out.
Heck, got bored waiting for a backup to finish, so here's the script.

WARNING: It's been written and tested using perl V4.036 (I know, I know, but I can't upgrade to v5 on the box I was using) under Unix. Things like file names, etc., will need changing. Also, it worked on the quick and dirty test that I did, but you might need to amend things a little before you use it in a production environment.

Script begins below:

#!/usr/gnu/bin/perl

# The above line will need changing to point to your local perl binary. If
# running under DOS, run the script through pl2bat to generate a batch file.
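# For example, assuming you save this script as split66.pl (that name is just
# an illustration) and pl2bat came with your Perl port, something like
#     pl2bat split66.pl
# should produce a split66.bat that you can run from the DOS prompt.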

# Divide an input file into an arbitrary number of output files, each
# containing a fixed number of lines.

$Input_File = "testfile";           # Name of the input file.
$Output_File_Prefix = "output";     # Names of all output files
                                    # will start with this.

$Number_Of_Lines = 6;               # Number of lines to split the file at
                                    # (set this to 66 for your files).

# Set up some counters.
$Output_File_Counter = 0;           # Will be incremented for each file.
$Input_Line_Counter = 0;            # Will be incremented as each line is read in.

# Open the input file for reading.
open(INPUT_FILE, "<$Input_File") || die "Cannot open $Input_File: $!\n";

# Now, loop around each line of the input file. As each line is read in, place
# it into an associative array (a hash) which is keyed on line number. Also,
# each line is read into a variable, so that we don't suffer from any $_ side
# effects.
while ($Current_Line = <INPUT_FILE>) {
    # Increment line number counter at the beginning of the loop.
    $Input_Line_Counter++;

    # Place the current line into the hash.
    $Output_Hash{$Input_Line_Counter} = $Current_Line;

    # Have we reached our number of lines counter?
    if ( $Input_Line_Counter == $Number_Of_Lines ) {
        # We have.
        # Open the output file for writing. This will overwrite any
        # existing files of the same name.
        open(OUTPUT_FILE, ">${Output_File_Prefix}.${Output_File_Counter}") ||
            die "Cannot open output file: $!\n";

        # Output the lines. What this little bit of code does is read the
        # keys of %Output_Hash and sort them numerically (a plain string sort
        # would put "10" before "2"), so the lines are written out in the same
        # order as they were read in.
        foreach $Key (sort { $a <=> $b } keys %Output_Hash) {
            print OUTPUT_FILE $Output_Hash{$Key};
        }

        # Close the output file.
        close(OUTPUT_FILE);

        # Increase the file counter.
        $Output_File_Counter++;

        # Reset the line number counter to 0 and empty the hash, ready for
        # the next batch of lines.
        $Input_Line_Counter = 0;
        %Output_Hash = ();
    }
}

# If the file didn't divide exactly, write out whatever lines are left over.
if ( $Input_Line_Counter > 0 ) {
    open(OUTPUT_FILE, ">${Output_File_Prefix}.${Output_File_Counter}") ||
        die "Cannot open output file: $!\n";
    foreach $Key (sort { $a <=> $b } keys %Output_Hash) {
        print OUTPUT_FILE $Output_Hash{$Key};
    }
    close(OUTPUT_FILE);
}

close(INPUT_FILE);

Script ends above.

Comments in the file should, hopefully, explain what's going on.

You should be able to copy & paste it to test. Sorry if the formatting's gone a little out of whack during my posting. (Note to self: Must learn how to put formatting codes into Tek-Tips posts.)

Hope this helps.
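Incidentally, the hash isn't strictly necessary; you could buffer each batch of lines in an ordinary list and skip the sort altogether. Here's a rough, untested sketch of that approach, in case you find it easier to follow (the file names and the write_chunk subroutine name are just placeholders):

#!/usr/gnu/bin/perl

# Rough sketch: same job as the script above, but buffering each batch of
# lines in a list instead of a hash.

$Input_File = "testfile";            # Change to suit.
$Output_File_Prefix = "output";      # Change to suit.
$Number_Of_Lines = 66;               # 66 lines per output file.
$Output_File_Counter = 0;

open(INPUT_FILE, "<$Input_File") || die "Cannot open $Input_File: $!\n";
while ($Current_Line = <INPUT_FILE>) {
    push(@Buffer, $Current_Line);
    &write_chunk if @Buffer == $Number_Of_Lines;
}
&write_chunk if @Buffer;    # Flush any lines left over at the end.
close(INPUT_FILE);

# Write out whatever is currently in @Buffer and empty it.
sub write_chunk {
    open(OUTPUT_FILE, ">${Output_File_Prefix}.${Output_File_Counter}") ||
        die "Cannot open output file: $!\n";
    print OUTPUT_FILE @Buffer;
    close(OUTPUT_FILE);
    $Output_File_Counter++;
    @Buffer = ();
}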