I have two separate processes: a C program that outputs comma separated values followed by a newline every second, and a Perl program that accepts data (in the same format) and processes this data.
The C program outputs (via printf) one line per second, like so:

1, 2, 3, 4, 5, 6
7, 8, 9, 10, 11, 12
...
The Perl program sits in an infinite loop waiting on a line basis for STDIN in order to process this data:
while ($line = <STDIN>) {
    chomp($line);
    # Line should now read '1,2,3,4,5,6'
    # Process data
}
I want these two processes to communicate in real time. Standard bash pipes do not work (e.g. process1 | process2) because the Perl program waits for the first program to finish before processing the input.
Does anyone have any ideas, suggestions, or insight into a solution to this problem? Thank you in advance!
Pipes should be fine for this; you just need to control when the output of your C program is flushed so that it becomes available to the Perl script incrementally. You can do this in the C program using fflush(), which forces the buffered output of your C program to be pushed out so the Perl program can read it.

There is nothing inherent about pipes that would cause the Perl program to wait for the C program to finish writing before processing its output. Your Perl program is written so that it processes STDIN one line at a time: <> in this context reads one line from STDIN, but if one is not available it will block until one is. A call to fflush() from the C program will make a line available.

Take a look at the Wikipedia article on Pipelines. The implementation section gives a brief description of how pipes are buffered, which should help you understand how your processes communicate. Pipes do allow concurrency between processes, and processes reading from and writing to pipes are managed by the scheduler just like other processes. Your problem here is with the buffering.