Non-blocking Python subprocess

I am working on a pet project to compress a terabyte of video into a slimmer format. While I was able to automate working with ffmpeg, I didn't like that I couldn't follow along with the subprocess running ffmpeg.

I tried a few different ideas for watching ffmpeg's output while keeping the script from blocking, because I wanted to be able to time and monitor its progress.

import subprocess

process = subprocess.Popen(["ffmpeg", "-i", "input.mp4", "output.mkv"],  # placeholder file names
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# communicate() blocks until the process is finished
stdout, stderr = process.communicate()

process.stdout.readline() and process.stderr.readline() will both block until there is data to read. In ffmpeg's case there is never any stdout output, so reading stdout would block indefinitely.
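For example, with the process opened above, this call never returns, because ffmpeg sends all of its progress output to stderr:

process.stdout.readline()  # blocks forever: ffmpeg writes nothing to stdout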

https://gist.github.com/devdave/9b8553d63e24ef19eea7e56f7cb95c78

By using a thread to feed a queue.Queue and constantly polling the process, I can watch the output as fast as it comes in without worrying about the main process blocking; only the reader threads do.
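Here is a minimal sketch of the idea, assuming ffmpeg-style output on stderr; watch() and the 0.1-second poll interval are my stand-ins, not the gist's actual Runner:

import queue
import subprocess
import threading

def watch(args):
    # Stand-in for the gist's Runner: a background thread does the
    # blocking readline() calls and feeds a queue that the main
    # loop can poll without ever blocking for long.
    proc = subprocess.Popen(args, stdout=subprocess.DEVNULL,
                            stderr=subprocess.PIPE)
    lines = queue.Queue()

    def pump():
        for line in iter(proc.stderr.readline, b""):  # blocks, but only in this thread
            lines.put(line)
        lines.put(None)  # EOF marker: the process closed stderr

    threading.Thread(target=pump, daemon=True).start()

    while True:
        try:
            line = lines.get(timeout=0.1)  # poll; free to do other work on timeout
        except queue.Empty:
            continue
        if line is None:
            break
        yield line

The EOF marker means the loop still drains every buffered line even after the process exits.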

A further improvement on the idea would be to have two threads (one each for stdout and stderr), with every queue item tagged by its stream, e.g. queue.put((STDERR, line_from_stderr)) and a matching STDOUT tag, so the consumer knows where each line came from; a sketch follows.
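Something like this (again a sketch; the STDOUT/STDERR tags, pump(), and the placeholder command are mine):

import queue
import subprocess
import threading

STDOUT, STDERR = "STDOUT", "STDERR"

def pump(pipe, tag, q):
    # One pump thread per stream; every line carries its origin tag
    for line in iter(pipe.readline, b""):
        q.put((tag, line))
    q.put((tag, None))  # EOF marker for this stream
    pipe.close()

proc = subprocess.Popen(["some_long_running_process"],  # placeholder command
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
q = queue.Queue()
threading.Thread(target=pump, args=(proc.stdout, STDOUT, q), daemon=True).start()
threading.Thread(target=pump, args=(proc.stderr, STDERR, q), daemon=True).start()

open_streams = 2
while open_streams:
    try:
        tag, line = q.get(timeout=0.1)  # poll, never block for long
    except queue.Empty:
        continue
    if line is None:
        open_streams -= 1  # that stream hit EOF
    else:
        print(tag, line)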

To use:

r = Runner(["some_long_running_process", "-arg1", "arg1 value"])

for stdout, stderr in r.start():
    print("STDOUT", stdout)
    print("STDERR", stderr)