
Problem opening a file


rensai (MIS), Jul 27, 2005
Hi everyone,

I've been wracking my brain on this problem, so any help is greatly appreciated.

The problem I am having is with reading in text files for parsing. I have created a script to open and parse several thousand log files. Of those thousands, roughly 90 percent open and parse just fine; the other ten percent give me the following error:

Error! File: '/audit/logs/computer1.log' had the following error: No such file or directory

Here's the code that generates that.
Code:
open (LOGFILE, "'$LogPath/$DefaultLog'") or $Check=1 and my $ErrVar1=$!; 
  if ($Check == 1) {
   $SQL="INSERT INTO errLog (errFile, errDesc) VALUES ('$LogPath/$DefaultLog','$ErrVar1')";
   &Do_SQL;
   print "Error! File: '$LogPath/$DefaultLog' had the following error: $ErrVar1 \n"; 
 }

Now, the output from $! indicates the file or directory cannot be found, but I have checked and rechecked the path, file names, and permissions, and am absolutely positive they are all correct. Keep in mind this error only shows up on a relatively small number of log files; the rest work fine.

It may be helpful to know that these log files are audits generated on Windows boxes by secaudit. They are then dumped onto a Samba share (on a Red Hat Enterprise box running Perl 5.8.5), where the Perl script parses them and outputs pertinent info into a MySQL database.

I would greatly appreciate any suggestions on either correcting this code or other ways to read in those files.

Thanks,

Dustin
 
I know it sounds stupid, but I think the error tells you the story: Perl cannot read that file.
You say the files are on a Samba share, but is there a chance they are still being created when this code runs? I am thinking of transient issues here.

In my experience, even when you're absolutely sure that the file is there and readable, there is often something you've overlooked that makes the machine see it as missing or unreadable.
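One quick check: print the exact string you're handing to open, between delimiters, and run the file-test operators on it. A sketch (substitute one of your failing paths):
Code:
# Diagnostic sketch: the brackets expose stray quotes or whitespace
# that would make the name look correct to a human but not to open().
my $name = "/audit/logs/computer1.log";   # one of the failing files
print "open target: [$name]\n";
print "exists:   ", (-e $name ? "yes" : "no"), "\n";
print "readable: ", (-r $name ? "yes" : "no"), "\n";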

Have a think and come back to us.


Trojan.
 
Hey Trojan,

Thanks for the response. I certainly agree with you in general, and in fact I really hope it is something simple like that that I've overlooked, but we can rule out the files still being created. The script never actually tries to access them on the Samba share, partly to prevent exactly what you are describing.

Here's a more in-depth explanation of my process. The workstations are automatically audited and generate a new log file every couple of weeks. The logs are moved to the Samba share and deleted from the Windows boxes. On the Linux server, a cron job runs frequently and moves the files from the Samba share into a private folder. Once in that private folder, NOTHING besides my script ever accesses them. The cron job sets all files to 666 permissions.
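For what it's worth, the script could also cheaply guard against catching a file mid-move. A sketch; the five-minute threshold is just an assumption to tune against the cron interval:
Code:
# Skip files modified too recently, in case the move is still in
# flight. -M returns the file's age in days.
foreach my $File (@Files) {
    next unless -M "$LogPath/$File" > 5 / (24 * 60);  # older than ~5 min
    # ... parse as usual ...
}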

Any other ideas?

Thanks,

Dustin
 
I should also have added that, right before running the script, I manually opened several of the error-causing files with vi, copying and pasting the path and file name exactly as Perl shows them in the error output, and I had no trouble opening any of the files I tested.
 
Can you cut your code down to just the section you showed us and run that?
Does it still fail?
A six-line test script is much easier to debug.


Trojan.
 
That's a great idea. I created a new file with just the open piece in it and hardcoded it to check a single log file that has the problem. I put them both into a new folder, and it still generates the error!

Here's the code:
Code:
$TestFile = "./kernlr.audit2.log";
open (FILE,"'$TestFile'") or $Check=1 and my $ErrVar=$!;  
 if ($Check == 1) {
    print "Error! File: '$TestFile' had the following error: $ErrVar \n";
 }
 else {
   print "everything A-OK. Printing File: \n\n";
   print <FILE>;
 }

That's the entire script, and it runs in the same directory as the log file. I also set the permissions on the log file to 777, just to be safe.

What do you think?
 
Not sure if this is the problem (I doubt it), but this line just isn't correct:

Code:
open (FILE,"'$TestFile'") or $Check=1 and my $ErrVar=$!;

Without going into a long explanation, it should probably be written something like this:

Code:
my $ErrVar = 'some default value';
my $Check = 0;
unless (open FILE, $TestFile) {$Check = 1; $ErrVar=$!}; 
if ($Check == 1) {
...
}
else {
...
}

The use of double quotes and single quotes here, "'$TestFile'", is just confusing, and I am not sure what effect it has on the open function, if any. Just don't use any quotes at all.
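You can see exactly what open receives by printing the interpolated string; the apostrophes survive interpolation and become part of the filename:
Code:
# Demo: the single quotes inside the double-quoted string are literal.
my $TestFile = "./kernlr.audit2.log";
print "open receives: ", "'$TestFile'", "\n";
# prints: open receives: './kernlr.audit2.log'
# No file is literally named that (quotes included), hence
# "No such file or directory".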
 
The apostrophes look to be the problem to me, although I agree with KevinADC about your use of short-circuit operators. Try this:

Code:
#!/usr/bin/perl -w
use strict;
use diagnostics;
my $TestFile = "./kernlr.audit2.log";
if(open (FILE,"$TestFile")) {
   print "everything A-OK. Printing File: \n\n";
   print <FILE>;
} else {
    my $ErrVar=$!;  
    print "Error! File: '$TestFile' had the following error: $ErrVar \n";
}


Trojan.
 
Also, you will note that by using the more normal if/else construct, you no longer need the $Check variable. Indeed, you could remove $ErrVar as well, since it's only used in the error message; you could use $! in the print statement directly.

With respect to the short-circuit operators, it's not unreasonable to do it that way; I am just concerned that you were confusing yourself with the issues here. You were also introducing a scoping issue at the same time: declaring my $ErrVar inside a conditional expression means it may never be initialised, which is a well-known Perl trap.
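Putting both points together, the test script shrinks to something like this (a sketch):
Code:
#!/usr/bin/perl -w
use strict;

my $TestFile = "./kernlr.audit2.log";
if (open FILE, $TestFile) {
    print "everything A-OK. Printing File: \n\n";
    print <FILE>;   # print every line to STDOUT
    close FILE;
}
else {
    print "Error! File: '$TestFile' had the following error: $! \n";
}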



Trojan.
 
Trojan is right about dumping $Check but, in the interests of understanding: were you ever resetting $Check to zero?
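To see why it matters, here's a sketch of the failure mode you get when the flag is set once and never cleared: every file after the first bad one reports an error, whether it opened or not.
Code:
# Sketch: $Check goes to 1 on the first bad file and stays there.
my $Check = 0;
foreach my $File (@Files) {
    open(FILE, "$LogPath/$File") or $Check = 1;
    if ($Check == 1) {   # still 1 on every later iteration
        print "Error! File: '$LogPath/$File'\n";
        next;
    }
    # ... parse FILE ...
    close FILE;
}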

f

"As soon as we started programming, we found to our surprise that it wasn't as easy to get programs right as we had thought. Debugging had to be discovered. I can remember the exact instant when I realized that a large part of my life from then on was going to be spent in finding mistakes in my own programs."
--Maurice Wilkes
 
Thanks for the input, everyone. Kevin, you were right on about the quotes. I removed them and the file opened fine with the small ten-line script. However, when I made the same change in my full script and ran it, it still fails :(

In the full script the filename is not hardcoded but pulled in from an array. I'd appreciate it if you all would take a look at that piece of code and see if you can spot any problems.

I've got this piece here that reads in all the log file names into an array.
Code:
opendir(DIR,"$LogPath") or die "Could not open log directory \n";  # open log dir to get list of log files or die
while (readdir(DIR)) {
 @Files = grep !/^\.\.?$/, readdir DIR; #reading in log file names and throws out . and .. directories
 }
 closedir DIR;

Then the open-file piece looks like this:
Code:
sub process_log_1 {

my $Check=0; # counter
my $Line = undef; #line to check
my @SecArr= undef; #holds log data

foreach my $File (@Files) {
  if ($Debug == 1) {
    print "\nLog1 processing: $File \n";
  }
open (FILE,$LogPath."/".$DefaultFile) or $Check=1 and my $ErrVar=$!;

 if ($Check == 1) {
   $SQL="INSERT INTO errLog (errFile, errDesc) VALUES ('$LogPath/$File','The file could not be opened.')";
   &Do_SQL;
  if ($Debug == 1) {
    print "\nError! File: ";
    print "'$LogPath/$DefaultFile' had the following error: $ErrVar \n";
  }
   $CorrectParse = 1;
   return 1;
 }
 else {
  while (<FILE>) {
   s/[^\w\s\\\/)]//g; #parsing out extra crap that secedit writes to the log
   chop($_);
   chop($_);
   push (@SecArr,$_);
  }
 close (FILE);
 }
}
}

There. Those are the pieces that read in the file names and then open and read the files. I can't think of any other part of the full script that could affect opening a file. Oh, and as far as the $Check and $ErrVar vars go, I like all your suggestions, but they will require reworking some other pieces that use those variables to create SQL entries for errors and such. I'll probably clean it up and bring it closer to your recommendations when I have time, but my first priority is just getting it functional.
 
I changed the open line to closely match Kevin's suggestion. It's now:
Code:
unless (open FILE,$LogPath."/".$DefaultFile) {$Check = 1; $ErrVar=$!};

Made no difference that I can see.
 
Bah, I really need an edit button. While I've been playing with this I've had a couple of versions open, and I had changed the $File var to $DefaultFile, which I hardcoded for testing. Please just read $DefaultFile as if it were $File.
 
Well, this block of code:

Code:
opendir(DIR,"$LogPath") or die "Could not open log directory \n";  # open log dir to get list of log files or die
while (readdir(DIR)) {
 @Files = grep !/^\.\.?$/, readdir DIR; #reading in log file names and throws out . and .. directories
 }
 closedir DIR;

can be written more simply:

Code:
opendir(DIR,"$LogPath") or die "Could not open log directory: $!";
@Files = grep !/^\.\.?$/, readdir DIR;
closedir DIR;

No need to wrap the readdir function in a while loop. Worse, the readdir in the while condition consumes one directory entry before the readdir inside the loop slurps the rest, so whichever entry happens to come back first never makes it into @Files. As for the rest of the code, it's hard to tell what's going on.
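And since you're on 5.8.5, you could go a step further and use a lexical directory handle, which avoids the global DIR bareword entirely (a sketch):
Code:
opendir(my $dh, $LogPath) or die "Could not open log directory: $!";
my @Files = grep { !/^\.\.?$/ } readdir $dh;
closedir $dh;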
 
Code:
sub process_log_1 {
  foreach my $File (@Files) {
    $Debug && print "\nLog1 processing: $File\n";
    local *FILE;
    open (FILE,$LogPath."/".$File) or do {
      $Debug && print "\nError! File: '$LogPath/$File': $!\n";
      $SQL="INSERT INTO errLog (errFile, errDesc) VALUES
        ('$LogPath/$File','The file could not be opened.')";
      &Do_SQL;
      return $CorrectParse = 1;
    };

    push @SecArr, map {
      s/[^\w\s\\\/)]//g; # strip the extra characters secedit writes
      chop; chop;        # drop the trailing CR/LF pair
      $_;                # return the cleaned line, not chop's return value
    } <FILE>;
  }
}

I've lost my @SecArr because, if it's local to the sub, there's no point in writing to it. Since you do write to it, you must be intending to refer to one defined elsewhere.

I've localised *FILE so that the rest of the program gets to use its own FILE filehandle without interfering with yours (or vice versa). You also get it closed automatically when you leave the sub.
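A lexical filehandle would get you the same scoping with less glob magic; a sketch of the loop body in that style (here skipping a bad file rather than returning):
Code:
foreach my $File (@Files) {
    # Lexical handle: closed automatically when $fh goes out of scope.
    open(my $fh, $LogPath . "/" . $File) or do {
        print "\nError! File: '$LogPath/$File': $!\n";
        next;   # skip this file, keep processing the rest
    };
    while (my $line = <$fh>) {
        $line =~ s/[^\w\s\\\/)]//g;   # same cleanup as above
        chop $line; chop $line;       # drop the trailing CR/LF pair
        push @SecArr, $line;
    }
}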

I've got rid of redundant and unused variables as well as some syntactic noise, so it should be much easier to read and to verify or debug.

Try renaming your sub and dropping this in verbatim. I've stared at it for a bit and I'm sure it does what yours was trying to do ;)

f

 
PS - your global, @Files, scares me. It's good practice to avoid globals without very good reason, and @Files is far too common a name. It's just asking to get clobbered accidentally somewhere else.

Instead of
Code:
@Files = (...
process_log_1();
...
sub process_log_1 {
  foreach ( @Files ) {
  ...
consider
Code:
my @Files = (...
process_log_1( @Files );
...
sub process_log_1 {
  foreach ( @_ ) {
  ...
or (more efficient but less legible)
Code:
my @Files = (...
process_log_1( \@Files );
...
sub process_log_1 {
  my ($files) = @_;   # the reference arrives in @_
  foreach ( @$files ) {
  ...
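In full, in case the fragments are unclear (the file names here are illustrative):
Code:
# Self-contained sketch of the pass-by-reference version.
my @Files = ('computer1.log', 'computer2.log');   # illustrative
process_log_1( \@Files );

sub process_log_1 {
    my ($files) = @_;   # unpack the array reference
    foreach my $File (@$files) {
        print "processing: $File\n";
    }
}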

f

 
Thanks again for all the help, everyone. I cleaned up the sub to match what fishiface recommended fairly closely, and it seems to be working now.

Thanks!
 