I have a txt file containing a list of filenames (with full paths) that need to be deleted from a server. Currently, I have to manually run a delete script as user 'xyz' using the become command on Linux:
roc01[phase1]:/u02/Code> become xyz
dug's Password:
***manually enter my password, then...
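Once the become step is handled, the delete pass itself can be very small. Here's a minimal sketch, assuming the list file is named delete_list.txt (a hypothetical name) with one full path per line:

use strict;
use warnings;

# delete_list.txt: hypothetical name, one full path per line
open(my $list, '<', 'delete_list.txt') or die "can't open list: $!\n";
while (my $path = <$list>) {
    chomp $path;
    next unless length $path;                          # skip blank lines
    unlink $path or warn "could not delete $path: $!\n";
}
close $list;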
Thanks for your response, brigmar... I'm working on a hash solution now, actually a hash of hashes of hashes :) But I'd like to see what you come up with. Here are the data files (pipe-delimited):
File1 (lookup key):
RABBI
MR RABBI
RADM
REV
SONR
File2 (lookup key + key2 + key3):
RABBI|RB|1
MRABI|RB|2...
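For the join itself, here's a rough sketch assuming the files are named file1.txt and file2.txt (hypothetical names) and that the appended output should stay pipe-delimited; a single hash keyed on the file2 lookup key is usually enough:

use strict;
use warnings;

# build a lookup from file2: key => [key2, key3]
my %lookup;
open(my $f2, '<', 'file2.txt') or die "can't open file2: $!\n";
while (<$f2>) {
    chomp;
    my ($key, $key2, $key3) = split /\|/;
    $lookup{$key} = [ $key2, $key3 ];
}
close $f2;

# append key2 and key3 to each file1 key that matches
open(my $f1, '<', 'file1.txt') or die "can't open file1: $!\n";
while (my $key = <$f1>) {
    chomp $key;
    if (exists $lookup{$key}) {
        print join('|', $key, @{ $lookup{$key} }), "\n";
    } else {
        print "$key\n";    # no match: pass the key through unchanged
    }
}
close $f1;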
I'm doing a string compare on two variables in two different files.
File1: contains a lookup key
File2: contains the lookup key, key2, and key3 fields
If the file1 key matches a key in file2, I need to append the key2 and key3 fields to file1.
My current script is working properly, but...
All,
I have a script that calculates an nth value. I need to use this value to assign a number to every record in my file.
For example, if my datafile contains 2000 records and I want to keep 100 nth'd records, my value will be 16 (based on a formula in my script). Now, I need to print each...
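Since the post is cut off, here's just one reading of it: a minimal sketch that keeps every nth record, with $nth assumed to come from the existing formula and datafile.txt as a hypothetical file name:

use strict;
use warnings;

my $nth   = 16;    # assumed to come from the existing formula
my $count = 0;

open(my $in, '<', 'datafile.txt') or die "can't open datafile: $!\n";
while (my $rec = <$in>) {
    $count++;
    print $rec if $count % $nth == 0;    # keep every nth record
}
close $in;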
What syntax can be used if I want to sort a 900-byte fixed-length file on bytes 40-60 (may or may not be numeric)?
I'd like to develop a Perl script with an option to allow users to specify a sort range.
Thanks
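One way to let the user pass the range on the command line is to pull the key out with substr and do a plain string sort on it; a rough sketch (the script name and argument order here are just assumptions):

use strict;
use warnings;

# hypothetical usage: sort_fixed.pl <start> <length> <file>
#   e.g. sort_fixed.pl 39 21 data.txt   # bytes 40-60, zero-based offset 39
my ($start, $len, $file) = @ARGV;
die "usage: $0 start length file\n" unless defined $file;

open(my $in, '<', $file) or die "can't open $file: $!\n";
my @records = <$in>;
close $in;

# plain string compare on the chosen byte range, numeric or not
print sort { substr($a, $start, $len) cmp substr($b, $start, $len) } @records;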
Thanks Miller... I ran a test on both solutions. Opening the file twice seems to be much faster.
Opening Twice:
real 0m2.41s
user 0m1.70s
sys 0m0.71s
Tie::File:
real 1m48.53s
user 1m40.45s
sys 0m7.98s
Thanks for the solution...i learned something new.
Sorry boss...appreciate the help.
This also works, but like you said, I'm sure there's a more efficient way to do it:
use strict;
use warnings;

my $cnt  = 0;
my %hash = ();

open(F1, "input1.txt") or die "can't open file: $!\n";
while(<F1>)
{
    chomp $_;
    # list code sits at a fixed offset in the fixed-length record
    my $list_code = substr($_,525,5)...
I'll need all the data + total count:
02222 name1 10
02222 name2 10
02222 name3 10
02222 name4 10
But sometimes the list code may not be the first five bytes...it could be in the middle.
Thanks travs...works beautifully, except one thing:
I need the total count printed on each row; your way is cool, but it groups them by list code.
So, if list 02222 appears 10 times in the file, the output needs to be:
02222 10
02222 10
02222 10
02222 10
02222 10
02222 10
02222 10...
Sample data (first 5 bytes are the List Code):
00003 first name1
04352 first name2
99343 first name3
04352 first name4
99343 first name5
99343 first name6
output (w/ new total count field):
00003 first name1 1
04352 first name2 2
99343 first name3 3
04352 first name4...
I apologize for that explanation...when I re-read it, it didn't make sense to me. What I'm trying to do (see the sketch after this list) is:
1. read a fixed length file
2. create a hash with LIST_CODE as key and an incremented count as the value.
02346=>34
04532=>1
49445=>397
etc...
3. Now, go back through the fixed length...
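The two-pass idea in that list looks roughly like this, assuming the file is named datafile.txt (hypothetical) and the list code sits in the first five bytes (adjust the substr offset if not):

use strict;
use warnings;

my $file = 'datafile.txt';    # hypothetical name for the fixed-length file

# pass 1: count how many times each list code appears
my %count;
open(my $in, '<', $file) or die "can't open $file: $!\n";
while (<$in>) {
    my $list_code = substr($_, 0, 5);    # adjust offset if the code isn't at the start
    $count{$list_code}++;
}
close $in;

# pass 2: re-read the file and append the total count to every record
open($in, '<', $file) or die "can't reopen $file: $!\n";
while (<$in>) {
    chomp;
    my $list_code = substr($_, 0, 5);
    print "$_ $count{$list_code}\n";
}
close $in;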
Guys/Gals
I've created a hash with List Code as the key and the count as the value:
open(F1, "file1") || die ...
while(<F1>) {
    $hash{$LIST}++;    # count occurrences of each List Code
}
Now, I'm bumping this against data in another file and need to print the count if the List Code matches:
open(F2, "file2") || die ...
my $LIST2 =...
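The second pass could look roughly like this, reusing the %hash built above (the five-byte substr offset is just an assumption; use whatever already extracts $LIST2 in the real script):

open(F2, "file2") || die "can't open file2: $!\n";
while(<F2>) {
    chomp;
    my $LIST2 = substr($_, 0, 5);                           # assumed position of the List Code
    print "$_ $hash{$LIST2}\n" if exists $hash{$LIST2};     # print count on matching rows
}
close(F2);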
I got it, fellas. Wasn't difficult after I got some sleep :)
The code snippet above was simplified from my script. I was working with a hash of a hash...this worked:
$hash{$LIST_NUM}{FileSeqNum}=$FileSeqNum;
A delimited flat file contains two fields: List Number and Seq Number.
1|1
1|2
1|3
1|4
1|5
2|1
2|2
2|3
2|4
2|5
I need to find the min and max value for each list number. Seems easy enough...but I'm struggling.
Thanks All.
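One straightforward way is a pair of hashes keyed on the list number, updated as you read; a minimal sketch, assuming the file is named lists.txt (hypothetical) and is pipe-delimited as shown:

use strict;
use warnings;

my (%min, %max);
open(my $in, '<', 'lists.txt') or die "can't open lists.txt: $!\n";
while (<$in>) {
    chomp;
    my ($list, $seq) = split /\|/;
    $min{$list} = $seq if !exists $min{$list} or $seq < $min{$list};
    $max{$list} = $seq if !exists $max{$list} or $seq > $max{$list};
}
close $in;

for my $list (sort { $a <=> $b } keys %min) {
    print "$list: min=$min{$list} max=$max{$list}\n";
}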
All,
I have a script scheduled to run once every Friday night. Originally, it was to find files matching the current date in MMDDYY format. Now it needs to match the following Friday's date.
This Friday will be 040607...when the script runs on this date, it needs to find files with 041307 in...
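Since the script already runs on Fridays, the simplest thing is probably to add seven days and format that; a sketch using core POSIX::strftime (with the usual caveat about runs that straddle a DST change right around midnight):

use strict;
use warnings;
use POSIX qw(strftime);

# the script runs on a Friday, so next Friday is exactly 7 days out
my $next_friday = time() + 7 * 24 * 60 * 60;
my $mmddyy = strftime('%m%d%y', localtime($next_friday));

print "$mmddyy\n";    # e.g. 041307 when run on 040607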