I'm looking for a quick way to parse access logs and identify the originating ISP based on the IP address. I can script something up that does a whois query, but these logs are rather large and that seems a little 'dirty'.
Does anybody by any chance know a more efficient way?
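For reference, the best I've come up with so far is to deduplicate the IPs first so whois only runs once per unique address (a rough sketch; the log file name, the field position of the IP, and the whois field names are all assumptions):

awk '{print $1}' access.log | sort -u | while read -r ip; do
    # Grab the first org/netname line from the whois output
    # (which field is present varies by registry).
    org=$(whois "$ip" | grep -iE 'orgname|netname' | head -n 1)
    echo "$ip  $org"
done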
Thx!
Wow... that's a beautiful sight after all the mangled text I've seen.
Instead of search/replace, could AWK do it after every fourth comma?
./FM
PS - I'd still love to see this in SED if it could be done...
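A sketch of the AWK approach, assuming plain comma-separated input (the file name is hypothetical):

awk -F',' '{
    for (i = 1; i <= NF; i++)
        # Keep each comma, but follow every fourth one with a newline.
        printf "%s%s", $i, (i == NF ? "\n" : (i % 4 ? "," : ",\n"))
}' file

And for the PS: GNU sed treats \n in the replacement as a newline, so this should break after every fourth comma as well:

sed 's/\(\([^,]*,\)\{4\}\)/\1\n/g' file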
I have a single line file that I'm trying to insert some line breaks into. I seem to be failing miserably as I continue to end up with a single line.
In short:
sed -e 's/%,/%CR/g' file
I want to find each occurrence of "%," and replace it with "%" followed by a newline.
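The catch is that "CR" in the replacement is taken literally. With GNU sed, \n in the replacement does what I'm after:

sed 's/%,/%\n/g' file

Portable sed needs an actual newline in the replacement, escaped with a backslash:

sed 's/%,/%\
/g' file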
In exchange for...
sed -e '/word/q' filename
The above deletes everything after the line containing 'word'. How do I change it so that it also deletes the line containing 'word'?
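One candidate (a sketch, untested here): use an address range from the match through end-of-file, which removes the matching line as well:

sed '/word/,$d' filename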
Thx,
./FM
I can't believe I racked my brain on this. Lynx was the solution. I used the -dump option. The end result is that I get one nice long page instead of the double-column page that existed. This makes it much easier to strip and parse.
On a side note, I did discover html2text. It's a wonderful utility...
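For reference, the invocation is just this (the URL is a placeholder):

lynx -dump http://example.com/page.html > page.txt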
I'm working with some web pages that contain a lot of tables. Apart from writing some extensive regexes, does anybody know of any tools or scripts out there for parsing tables out of web pages? Perl has a module for it, but it's not really up my alley, so to speak.
Thx,
FM
I have a variable $result (for the sake of discussion) that contains a bunch of HTML. I'm trying to grep out a particular section to assign to another variable. My HTML looks like the following:
***************************
<br>TEXT: data<br><br></body>
***************************
I simply want to put...
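Presumably something like this would do it (a minimal sketch, assuming the TEXT line appears only once and the value contains no '<'):

text=$(printf '%s\n' "$result" | sed -n 's/.*<br>TEXT: \([^<]*\)<br>.*/\1/p')
echo "$text"    # prints: data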
I have to ask for help before I go insane :-P
*********************************************
I have the following curl script (works great) which I'm trying to convert to PHP/Curl. Any pointers on what I'm doing wrong would be great.
curl -s -u username:password -k -F userid=username...
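For comparison, a rough PHP equivalent of those flags (the URL and any remaining -F fields are placeholders, since the command above is truncated):

<?php
$ch = curl_init('https://example.com/endpoint');        // URL is a placeholder
curl_setopt($ch, CURLOPT_USERPWD, 'username:password'); // curl -u
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);        // curl -k
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
// Passing an array makes PHP send multipart/form-data, like curl -F
curl_setopt($ch, CURLOPT_POSTFIELDS, array('userid' => 'username'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);         // capture instead of echoing
$response = curl_exec($ch);
curl_close($ch);
?>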
I have a simple expect script that I want to pass a variable to via a web form. Any suggestions on how this can be done? One method I've seen is to dynamically generate the expect script by echoing it from bash and then executing it. Is there a way to pass the variable to expect...
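One sketch of the direct route: have the CGI/wrapper pass the value as an argument and read it with argv inside expect (script and variable names are hypothetical):

#!/usr/bin/expect -f
# called from the wrapper as: ./myscript.exp "$formvalue"
set value [lindex $argv 0]   ;# the value handed over by the web form
send_user "received: $value\n"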
Yes, $quotaHOLD is a file. I use the following to pipe a list of values into it (that part works ok):
ldapscript2 $i | grep Quota | awk '{print $2}' >> $quotaHOLD
And I want to go straight down the values in $currentHold and compare each one with the matching line in $quotaHOLD. Logically, I'd think it...
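Roughly this shape is what I have in mind (a sketch; whether the comparison should be >, <, or some arithmetic is still an open question):

# Walk both files in lockstep, one value per line.
paste "$currentHold" "$quotaHOLD" | while read -r current quota; do
    if (( current > quota )); then
        echo "over: $current > $quota"
    fi
done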
Well, at least I get a different error now!
limits.sh: line 17: ((: [ 0<hold2.17721 ] : syntax error: operand expected (error token is "[ 0<hold2.17721 ] ")
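The message suggests a [ ... ] test ended up inside a (( )) arithmetic expression, and that a file name (hold2.17721) is being compared rather than its contents. The two syntaxes, for contrast (line 17 itself isn't shown, so this is a guess at the fix):

value=0
limit=5
if (( value < limit )); then          # arithmetic context: bare names, < is numeric
    echo "under"
fi
if [ "$value" -lt "$limit" ]; then    # test command: -lt, never < (that's a redirect)
    echo "under"
fi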
FM
Ok... I've fought with this for a while now. I'm starting to understand why they call it BASH, as that's what I'm starting to do with my head on the desk.
Here is what I've got so far:
*********************************************************
#!/bin/bash
declare -a userHOLD
declare -a currentHOLD...
I have 3 arrays I have created from 3 different files. One file is a list of text and the other two are numeric in nature. I need to do a math equation all the way down the two numeric arrays (i.e., compare A[1] with B[1], A[2] with B[2], etc.) and then dump the output to a file with the...
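The shape I'm after is roughly this (file names and the actual arithmetic are stand-ins; mapfile needs bash 4, and a while-read loop does the same on older shells):

#!/bin/bash
mapfile -t userHOLD    < users.txt     # text labels
mapfile -t currentHOLD < current.txt   # numeric
mapfile -t quotaHOLD   < quota.txt     # numeric
for i in "${!currentHOLD[@]}"; do
    # operate element-by-element: A[i] against B[i]
    echo "${userHOLD[i]} $(( currentHOLD[i] - quotaHOLD[i] ))"
done > results.txt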
Wow...that 'while' works just great! Thx a million. Also, thx for the tips with Awk.
Can you explain the statement to me? I'm not sure I understand:
while IFS="/"
"done" I understand. done< I haven't seen before either....
Best Regards,
FM
I'm going batty with this seemingly simple array. What I'm trying to do is cat a file and do some ldap lookups.
What I've done is this:
**************************
wUSERS1=`cat usersample | awk -F "/" '{print $5}'\n`;
for h in "${wUSERS1[@]}"
do
echo "$h"
done...
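A corrected sketch: the \n inside the backticks is taken literally, and a plain backtick assignment never produces an array. Wrapping the command substitution in parentheses does (assuming the fifth field contains no whitespace):

wUSERS1=( $(awk -F "/" '{print $5}' usersample) )
for h in "${wUSERS1[@]}"; do
    echo "$h"
done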