I am writing a Java application that needs to drive an external command-line application through the Apache Commons Exec library. The application I need to run has a fairly long loading time, so it would be preferable to keep one instance alive instead of creating a new process every time. The way the application works is very simple: once started, it waits for new input and generates some data as output, both over the application's standard I/O.
So the idea is to execute the CommandLine and then use a PumpStreamHandler with three separate streams (output, error and input) to interact with the application. So far I have had this working in basic scenarios where there is one input, one output, and the application then shuts down. But as soon as I try a second transaction, something goes wrong.
After creating my CommandLine, I create my Executor and launch it like so:
this.executor = new DefaultExecutor();

// Pipe ends handed to the PumpStreamHandler: the pumpers copy the process's
// stdout and stderr into the first two, and feed the process's stdin from the third.
PipedOutputStream stdout = new PipedOutputStream();
PipedOutputStream stderr = new PipedOutputStream();
PipedInputStream stdin = new PipedInputStream();
PumpStreamHandler streamHandler = new PumpStreamHandler(stdout, stderr, stdin);
this.executor.setStreamHandler(streamHandler);

// My side of each pipe: read the process's output and error, write its input.
this.processOutput = new BufferedInputStream(new PipedInputStream(stdout));
this.processError = new BufferedInputStream(new PipedInputStream(stderr));
this.processInput = new BufferedOutputStream(new PipedOutputStream(stdin));

this.resultHandler = new DefaultExecuteResultHandler();
this.executor.execute(cmdLine, resultHandler);
I then launch three different threads, each one handling a different stream. I also have three SynchronousQueues: one (inputQueue) feeding the input stream, one (IOQueue) informing the output thread that a new command has been sent, and one (outputQueue) carrying the output back. For example, the input stream thread looks like this:
while (!killThreads) {
    String input = inputQueue.take();     // wait for the next command to send
    processInput.write(input.getBytes()); // forward it to the process
    processInput.flush();
    IOQueue.put(input);                   // tell the output thread a command went out
}
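The output thread follows the same pattern: it waits on IOQueue until a command has actually been sent, reads the application's answer from processOutput and puts it on outputQueue. Simplified, it is along these lines (the readLine call is just an illustration; how the end of an answer is detected depends on the application itself):

BufferedReader reader = new BufferedReader(new InputStreamReader(processOutput));
while (!killThreads) {
    IOQueue.take();                     // wait until a command has been sent
    String answer = reader.readLine();  // read the application's reply
    outputQueue.put(answer);            // hand the result back to the caller
}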
If I remove the while loop from the input thread and execute its body just once, everything seems to work perfectly. Obviously, if I try executing it again, the PumpStreamHandler throws an exception because it has been accessed by two different threads.
The issue is that processInput does not seem to be truly flushed until the thread ends. When debugging, the command-line application only receives its input once the thread ends, and it never gets it if the while loop is kept. I've tried many different things to get processInput to flush, but nothing seems to work.
Has anyone attempted anything similar before? Is there anything I am missing? Any help would be greatly appreciated!
I ended up figuring out a way to make this work. Looking inside the code of the Commons Exec library, I noticed that the StreamPumpers used by the PumpStreamHandler do not flush every time new data comes in. This is why the code worked when I executed it just once: the stream was automatically flushed and closed at the end. So I created classes that I called AutoFlushingStreamPumper and AutoFlushingPumpStreamHandler. The latter is the same as a normal PumpStreamHandler but uses AutoFlushingStreamPumpers instead of the usual ones. The AutoFlushingStreamPumper does the same as a standard StreamPumper, but flushes its output stream every time it writes something to it.
I've tested it pretty extensively and it seems to work well. Thanks to everyone who has tried to figure this out!
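Roughly, the idea looks like this. This is a minimal sketch rather than my exact classes, and it assumes the commons-exec version in use exposes the protected createPump(InputStream, OutputStream, boolean) hook on PumpStreamHandler; check that against the version you have:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.commons.exec.PumpStreamHandler;

// Copies everything it reads straight through and flushes after every write,
// so the child process sees each command as soon as it is sent.
class AutoFlushingStreamPumper implements Runnable {
    private final InputStream in;
    private final OutputStream out;
    private final boolean closeWhenExhausted;

    AutoFlushingStreamPumper(InputStream in, OutputStream out, boolean closeWhenExhausted) {
        this.in = in;
        this.out = out;
        this.closeWhenExhausted = closeWhenExhausted;
    }

    @Override
    public void run() {
        byte[] buffer = new byte[1024];
        try {
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
                out.flush();              // this is the step the stock StreamPumper skips
            }
        } catch (IOException e) {
            // stream closed, process gone; nothing left to pump
        } finally {
            if (closeWhenExhausted) {
                try { out.close(); } catch (IOException ignored) { }
            }
        }
    }
}

// Same as PumpStreamHandler, but wires in the auto-flushing pumper.
class AutoFlushingPumpStreamHandler extends PumpStreamHandler {
    AutoFlushingPumpStreamHandler(OutputStream out, OutputStream err, InputStream in) {
        super(out, err, in);
    }

    @Override
    protected Thread createPump(InputStream is, OutputStream os, boolean closeWhenExhausted) {
        Thread pump = new Thread(new AutoFlushingStreamPumper(is, os, closeWhenExhausted),
                "AutoFlushingStreamPumper");
        pump.setDaemon(true);
        return pump;
    }
}

With that in place, the only change to the setup code from the question is to construct an AutoFlushingPumpStreamHandler instead of the plain PumpStreamHandler.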
For my purposes, it turned out I only needed to implement ExecuteStreamHandler. Here is my solution, which captures stderr into a StringBuilder and allows you to stream things to stdin and receive things from stdout:
class SendReceiveStreamHandler implements ExecuteStreamHandler
You can see the whole class as a gist on GitHub here.
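In outline, the handler just stores the three streams commons-exec hands it and drains stderr on a background thread. Here is a simplified sketch of that shape (not the gist verbatim; the accessor names and the line-based stderr capture are only illustrative):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;

import org.apache.commons.exec.ExecuteStreamHandler;

class SendReceiveStreamHandler implements ExecuteStreamHandler {

    private final StringBuilder errorText = new StringBuilder();
    private OutputStream processStdin;   // write here to send data to the process
    private InputStream processStdout;   // read the process's replies from here
    private InputStream processStderr;   // drained into errorText by a background thread
    private Thread errorPump;

    // commons-exec calls these three before the process is started
    @Override
    public void setProcessInputStream(OutputStream os) {
        this.processStdin = os;
    }

    @Override
    public void setProcessOutputStream(InputStream is) {
        this.processStdout = is;
    }

    @Override
    public void setProcessErrorStream(InputStream is) {
        this.processStderr = is;
    }

    @Override
    public void start() {
        // Drain stderr in the background so the child never blocks on a full pipe.
        errorPump = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(processStderr))) {
                String line;
                while ((line = r.readLine()) != null) {
                    errorText.append(line).append('\n');
                }
            } catch (IOException ignored) {
                // stream closed; the process has finished
            }
        });
        errorPump.setDaemon(true);
        errorPump.start();
    }

    @Override
    public void stop() {
        try {
            if (errorPump != null) {
                errorPump.join(1000);   // give the stderr pump a moment to drain
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    // accessors used by the calling code (names are illustrative)
    OutputStream getStdin()     { return processStdin; }
    InputStream  getStdout()    { return processStdout; }
    String       getErrorText() { return errorText.toString(); }
}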