
executing commands within awk


sinebubble (IS-IT--Management) · Jun 13, 2002
I'm stuck trying to execute a command within an awk statement. I've pieced together a string with awk, but now I need to execute that string as a command ($lynx_url). Suggestions?

#!/bin/sh

HOST=HTTP=http://
LYNXCMD="/usr/local/bin/lynx"
LYNX="$LYNXCMD -dump -head"

while read LINE
do
    node=`echo $LINE | sed 's/=/./' | cut -f 3 -d "."`
    markets=`echo $LINE | cut -f 2 -d "="`
    URL="${HTTP}${HOST}/${node}";

    echo $markets | awk -F"," '{
        for (x=1;x<=NF;x+=1) {
            lynx_url="/usr/local/bin/lynx -dump -head '$URL'"$x"/report.txt\n";

            # How to get this next line to work?
            $lynx_url > /tmp/HTTP.$$ 2>&1;

        }
        printf "\n"; }'
done < awk.txt
 
BTW, I did try
eval `echo $lynx_url`;
but no love.
 
Sinebubble:

Have you tried the awk system command, (from a man page):

system(cmd-line) Execute the command cmd-line, and return the exit status. (This may not be available on non-POSIX systems.)

Regards,

Ed
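Ed's `system()` suggestion in runnable form. A minimal sketch: the command string is built inside awk exactly as in the original loop, then handed to `system()`, which runs it through /bin/sh. Here `echo` stands in for the lynx invocation and example.com is a placeholder, so the sketch needs no network:

```shell
# Build a command line inside awk and execute it with system().
# echo stands in for lynx; example.com and the output path are placeholders.
out=/tmp/awk_system_demo.txt
: > "$out"
echo "east,west" | awk -F"," -v out="$out" '{
    for (x = 1; x <= NF; x++) {
        cmd = "echo http://example.com/" $x "/report.txt >> " out
        if (system(cmd) != 0)    # system() returns the command exit status
            print "command failed: " cmd
    }
}'
cat "$out"
```

Checking the return value of `system()` is what makes this useful for an availability test: a fetcher that exits non-zero flags the URL as dead.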
 
I don't think I can use the lynx_url variable inside awk. I think that is where my real problem exists.
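The variable is usable: the clean way to get a shell value like $URL into awk is the `-v` option, which avoids the fragile quote-splicing (`'$URL'`) in the original. A sketch, again with `echo` standing in for lynx and a placeholder URL:

```shell
# Pass the shell variable in with -v; inside awk, lynx_url is then an
# ordinary awk string holding a command line. Running it with
# "cmd | getline" captures the command's first line of output.
URL="http://example.com/node1"    # placeholder value
markets="east,west"
echo "$markets" | awk -F"," -v url="$URL" '{
    for (x = 1; x <= NF; x++) {
        lynx_url = "echo " url "/" $x "/report.txt"
        lynx_url | getline head   # or: system(lynx_url) to just run it
        close(lynx_url)           # close the pipe so it can be reused
        print head
    }
}'
```

`cmd | getline` is the right choice when the command's output is needed back inside awk (e.g. to grep the HTTP status line); `system()` is enough when only the exit status matters.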
 
sinebubble,
What you are trying is doable with gawk 3.1, but there is probably an easier way to do it with gawk 3.1's network capability. Grabbing webpages and storing their contents is pretty simple.

Here is a brief script:
#urlsuck.awk

# Check that we are running gawk >= 3.1, which is needed for the |&
# coprocess operator and the /inet networking files.
function sanity() {
    cmd = "awk -W version | sed -n '1p'"    # first line, e.g. "GNU Awk 3.1.0"
    cmd | getline val
    close(cmd)
    if (match(val, /[0-9]+\.[0-9]+/)) {
        split(substr(val, RSTART, RLENGTH), arr, ".")
        if (arr[1] < 3 || (arr[1] == 3 && arr[2] < 1)) {
            printf "Forget it..."
            exit
        } else {
            return "1"
        }
    }
    return "0"
}

BEGIN {
    printf "Host: "
    getline host < "-"
    printf "Port: "
    getline port < "-"
    printf "Url to retrieve: "
    getline url < "-"
    printf "Destination file: "
    getline dfile < "-"
    sock = "/inet/tcp/0/" host "/" port

    sanity()

    ## send request ##
    print "GET " url |& sock
    print "Request buffered for", url

    ## read response into the destination file ##
    while ((sock |& getline) > 0) {
        print $0 > dfile
    }
}

Usage example:
> awk -f urlsuck.awk
Host: ourhouse.ho
Port: 80
Url to retrieve:
Destination file: /home/me/scrap.html
Output:
Request buffered for
 
Well, I'm trying to parse a file that contains nodes on a site and then use lynx to make sure they are available. The problem is parsing the file and pulling out the information that I use to form a URL. I'll look into gawk, but if I could find some way of executing lynx_url, that would fix my issues.
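For that goal, the fetcher's exit status is all that's needed, and the whole loop can stay in plain sh. A sketch under stated assumptions: the input format ("node=market1,market2") and example.com host are hypothetical, and check_url is a stand-in function whose body would be the real lynx call:

```shell
#!/bin/sh
# Sketch: parse each input line, build the per-market URLs, and use the
# fetcher's exit status to decide whether each one is reachable.
# In real use, replace the body of check_url with:
#   /usr/local/bin/lynx -dump -head "$1" >/dev/null 2>&1
check_url() {
    case $1 in */good/*) return 0 ;; *) return 1 ;; esac
}

# Sample input in place of the real awk.txt (hypothetical format).
cat > /tmp/nodes_demo.txt <<'EOF'
good=east,west
down=south
EOF

: > /tmp/nodes_result.txt
while IFS='=' read -r node markets; do
    for m in $(echo "$markets" | tr ',' ' '); do
        url="http://example.com/${node}/${m}/report.txt"
        if check_url "$url"; then
            echo "OK   $url" >> /tmp/nodes_result.txt
        else
            echo "DOWN $url" >> /tmp/nodes_result.txt
        fi
    done
done < /tmp/nodes_demo.txt
cat /tmp/nodes_result.txt
```

This sidesteps the awk quoting problem entirely: awk built the string but couldn't run it, while the shell both builds and runs it.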
 