
Using fflush to force pipe output

Status: Not open for further replies.

paulobrads (Programmer)
Jul 13, 2006
I have the following awk command, followed by a pipe, in a shell script. I've been advised that fflush() can force data to be piped immediately rather than buffered.

Code:
awk '{
    gsub(/"/, "\\\"", $0);
    for (i = 7; i <= NF; i++) {
        if (substr($6, 1, 6) != "GetRes") {
            if (substr($6, 1, 2) != "C=")
                printf "%s %s\n", $i, $6;
            else if ($(i + 1) != "")
                printf "%s %s\n", $(i + 1), $7;
        }
    }
    fflush();
}' | perl lookup.pl


How can I actually use it in this instance, though? All the recommendations I see online are for fflush(filename), but here it's this command's output I want to flush, not some other file.

The fflush() as shown seems to make no difference.

Cheers.
 
I presume you're using gawk since fflush() doesn't seem to be widely available in awk.

From man gawk:

fflush([file])
    Flush any buffers associated with the open output file or pipe
    file. If file is missing, then standard output is flushed. If
    file is the null string, then all open output files and pipes
    have their buffers flushed.
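In other words, fflush() with no argument flushes awk's standard output, which is exactly the stream feeding the pipe in your script. A minimal sketch (a toy pipeline, not your actual script):

```shell
# Sketch: fflush() with no argument flushes awk's stdout, so each record
# reaches the downstream reader as soon as it is printed, instead of
# waiting for awk's stdio buffer to fill up (typically several KB when
# stdout is a pipe rather than a terminal).
seq 1 3 | awk '{ print "line", $1; fflush() }' | cat
```

The output content is identical with or without fflush(); only the timing of when each line crosses the pipe changes.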

Personally I have had more luck making the process that is reading the pipe unbuffered:

Code:
# From the open() section of the perlfunc man page.
# Makes output unbuffered.
select(STDOUT); $| = 1;

It seems illogical to me that making STDOUT unbuffered improves the response, when you would expect STDIN to be at fault, but it seems to do the trick.

Annihilannic.
 
