
Output Data During Process


TamedTech (IS-IT--Management)
May 3, 2005
Hello Guys,

I'm somewhat of a scripting amateur when it comes to this kind of stuff and need a little help with something.

I've got a piece of software that I can run; it polls network devices that it finds in range and then outputs their details onto the screen (command prompt).

Now, I've been using the old

Code:
MyCommand >> file.txt

to have it output the results into a file, which I can then read and use for other operations.

Now, the issue I'm having is that the initial operation can take several minutes to complete, and this is a long time to wait. If you watch the operation run manually, it lists the devices on the screen as it finds them; however, when outputting to a file it waits until it completes before it writes the output.

Any ideas on how to have it write the results to the file as they come, rather than waiting until the end?

Let me know if there is anything more I can tell you.

Thanks,

Rob
 
Why do you think it is waiting until the end to output the records? Have you tried looking at the file as it is being created?
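For example, in another window you could follow the file while the command runs (assuming the file name from your first post):

Code:
tail -f file.txt

If lines really only show up at the very end, then something is buffering the output.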
 
Hello Buddy,

Yes, I've checked the file during the process and it doesn't appear to be writing to it as it goes; it waits until the end of the process before it outputs.

I'll have my colleague on-site double-check that, but I'm pretty sure that's the case.

Thanks,

Rob
 
Hmmm,

Now you mention it, I have it running like this ...

Code:
MyCommand | grep... | cut ... >> myFile.txt

So maybe that's what is causing the issue.

I'll try putting the basic output into a file like that and see what happens.

Thanks,

Rob
 
OK, after testing it without the grep and cuts, it still waits until the process is finished before writing the output to the file.

Any ideas?

Thanks,

Rob
 
The system tries to group as many output lines as it can into one disk write operation (I/O buffering). That is why you only see the output file's contents once the script/program is finished. Depending on the amount of data produced, you might see some data (4 or 8 KB's worth) while the script/program is still running.
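One rough way to see this for yourself (adjust the file name to whatever you redirect to) is to watch the file size while the program runs:

Code:
while sleep 5; do wc -c file.txt; done

With block-buffered output the size tends to jump in 4 or 8 KB steps, or stays at zero until the very end if the total output is smaller than one buffer.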


HTH,

p5wizard
 
Thank you p5wizard,

Does this mean there is little I can do to force the application to output lines of data as they come?

Like I say, when outputting to the screen it works just fine: as it finds a device it throws its details onto the screen. It's just when placing the output into a file that it becomes an issue, presumably due to this I/O buffering that you're talking about.

Thanks,

Rob
 
Well, you could possibly force your script to write out to a file unbufferedly, but then you'd probably have to redirect the output of each command in the script to that file:

Something like this:
Code:
OUTFILE=/tmp/tst.out
echo "\c" >$OUTFILE
echo one >>$OUTFILE
sleep 1
echo two >>$OUTFILE
sleep 2
echo three >>$OUTFILE
sleep 3
echo four >>$OUTFILE
sleep 4
echo five >>$OUTFILE

instead of
Code:
echo one
sleep 1
echo two
sleep 2
echo three
sleep 3
echo four
sleep 4
echo five

and redirecting stdout like you are doing now.

HTH,

p5wizard
 
Thanks for that p5,

I'll give that a run through in the morning.

My script only has one command and runs in a continuous loop, but that command can take a few minutes to complete.

So what's the "\c" for? Does that set the output to be unbuffered, or does it just create an empty file?

Also, I'm loving the term 'unbufferedly'... I can see myself having that printed on a t-shirt...

"I output my commands unbufferedly"

:-D

Thanks,

Rob
 
Yes,
echo "\c" >$OUTFILE
just creates an empty file; the \c suppresses echo's trailing newline, so nothing is written and you start with a freshly truncated file to append to. It has nothing to do with buffering.

Note: once you start piping through filters, you fall back into buffered I/O, unless you can also instruct the filters to work unbuffered with special flags. Google for more info (possible keywords: unix redirect filter buffered IO ...).
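For example, GNU grep has a --line-buffered flag (check your grep's man page, not every version has it) that flushes after each matching line:

Code:
MyCommand | grep --line-buffered pattern >> myFile.txt

GNU sed has a similar -u (unbuffered) option.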

HTH,

p5wizard
 
Thanks for that P5, I'll have a look around and see what I can find.

After speaking with a business partner this morning (who is a little more knowledgeable with Unix and Linux based systems), he said he's heard of people 'flushing' a command on every line, which would create the desired effect.

Is this something you've heard of before?

I'll get my nose on Google and see if I can find anything.

Thanks,

Rob
 
Yes, "flushing" after every printed line essentially makes your output "unbuffered"/ it send the waiting bytes from the output buffer to the device.

Not sure if you can use flush from a shell script. If the program you are running is a C-program, and you can modify/recompile it, then that may be your solution. Look at the fflush() system call.
Code:
fprintf(stdout, "one line\n");
fflush(stdout);   /* push the buffered line out immediately */
fprintf(stdout, "next line\n");
fflush(stdout);
Again, once you start grep-ing or otherwise filtering the data, you may fall back into buffered output...
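If a filter in the middle of the pipeline is the part doing the buffering, one shell-level trick (assuming your awk supports fflush(), as gawk and most modern awks do) is to do the filtering in awk and flush after every output line:

Code:
MyCommand | awk '/pattern/ { print; fflush() }' >> myFile.txt

This only cures awk's own buffering; if MyCommand itself buffers its output when writing to a pipe, the data will still arrive in chunks.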

HTH,

p5wizard
 
Thanks for that P5,

The app I'm using is SDP Tool; it's part of the Linux BlueZ stack, so I'm not sure if I'm going to be able to modify it.

I understand that if I add my greps in it will kick back into buffering, but the idea would be to run my greps as a separate loop: I can just have this sdptool output into a 'pool' file, which I can then read with my grep loops and pull out any appropriate information.
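Roughly what I have in mind (just a sketch; the sdptool arguments and the grep pattern are placeholders):

Code:
# collector loop: keep appending scan results to a shared 'pool' file
while :; do
    sdptool browse ... >> /tmp/pool.txt
done

# reader, in another shell: follow the pool file and filter lines as
# they arrive (GNU grep's --line-buffered keeps the filter itself
# from re-buffering the matches)
tail -f /tmp/pool.txt | grep --line-buffered 'pattern' | cut ...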

Thanks,

Rob
 