I see a ton of info about piping a raspivid stream directly to FFMPEG for encoding, muxing, and restreaming, but these use cases are mostly from bash, along the lines of:
raspivid -n -w 480 -h 320 -b 300000 -fps 15 -t 0 -o - | ffmpeg -i - -f mpegts udp://192.168.1.2:8090
I'm hoping to utilize the functionality of the Picamera library so I can do concurrent processing with OpenCV and similar while still streaming with FFMPEG. But I can't figure out how to properly open FFMPEG as a subprocess and pipe video data to it. I have seen plenty of attempts, unanswered posts, and people claiming to have done it, but none of it seems to work on my Pi.
Should I create a video buffer with Picamera and pipe that raw video to FFMPEG? Can I use camera.capture_continuous() and pass FFMPEG the bgr24 images I'm using for my OpenCV calculation?
I've tried all sorts of variations and I'm not sure whether I'm misunderstanding how to use the subprocess module, misunderstanding FFMPEG, or simply missing a few settings. I understand the raw stream won't have any metadata, but I'm not completely sure what settings I need to give FFMPEG for it to understand what I'm giving it.
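For what it's worth, my current understanding (which may be wrong, hence the question): a raw bgr24 stream has no container or headers, so FFMPEG has to be told the geometry, pixel format, and frame rate on the command line, and every frame I write must be exactly width × height × 3 bytes:

```python
def bgr24_frame_bytes(width, height):
    """Size of one packed bgr24 frame: 3 bytes (B, G, R) per pixel."""
    return width * height * 3

# At 640x480 each frame should be 921600 bytes on the pipe
print(bgr24_frame_bytes(640, 480))  # 921600
```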
I have a Wowza server I'll eventually be streaming to, but I'm currently testing by streaming to a VLC server on my laptop. I've currently tried this:
import subprocess as sp
import picamera
import picamera.array
import numpy as np

npimage = np.empty((480, 640, 3), dtype=np.uint8)

with picamera.PiCamera() as camera:
    camera.resolution = (640, 480)
    camera.framerate = 24
    camera.start_recording('/dev/null', format='h264')
    command = [
        'ffmpeg',
        '-y',
        '-f', 'rawvideo',
        '-video_size', '640x480',
        '-pix_fmt', 'bgr24',
        '-framerate', '24',
        '-an',
        '-i', '-',
        '-f', 'mpegts', 'udp://192.168.1.54:1234']
    pipe = sp.Popen(command, stdin=sp.PIPE,
                    stdout=sp.PIPE, stderr=sp.PIPE, bufsize=10**8)
    if pipe.returncode != 0:
        output, error = pipe.communicate()
        print('Pipe failed: %d %s %s' % (pipe.returncode, output, error))
        raise sp.CalledProcessError(pipe.returncode, command)
    while True:
        camera.wait_recording(0)
        for i, image in enumerate(
                camera.capture_continuous(
                    npimage,
                    format='bgr24',
                    use_video_port=True)):
            pipe.stdout.write(npimage.tostring())
    camera.stop_recording()
I've also tried writing the stream to a file-like object that simply creates the FFMPEG subprocess and writes to its stdin (camera.start_recording() accepts an object like this when you start recording):
class PipeClass():
    """Start pipes and load ffmpeg."""

    def __init__(self):
        """Create FFMPEG subprocess."""
        self.size = 0
        command = [
            'ffmpeg',
            '-f', 'rawvideo',
            '-s', '640x480',
            '-r', '24',
            '-i', '-',
            '-an',
            '-f', 'mpegts', 'udp://192.168.1.54:1234']
        self.pipe = sp.Popen(command, stdin=sp.PIPE,
                             stdout=sp.PIPE, stderr=sp.PIPE)
        if self.pipe.returncode != 0:
            raise sp.CalledProcessError(self.pipe.returncode, command)

    def write(self, s):
        """Write to the pipe."""
        self.pipe.stdin.write(s)

    def flush(self):
        """Flush pipe."""
        print("Flushed")
usage:
(...)
with picamera.PiCamera() as camera:
    p = PipeClass()
    camera.start_recording(p, format='h264')
(...)
Any assistance with this would be amazing!
I have been able to stream PiCamera output to ffmpeg with something like the following:
import picamera
import subprocess

# start the ffmpeg process with a pipe for stdin
# I'm just copying to a file, but you could stream to somewhere else
ffmpeg = subprocess.Popen([
    'ffmpeg', '-i', '-',
    '-vcodec', 'copy',
    '-an', '/home/pi/test.mpg',
], stdin=subprocess.PIPE)

# initialize the camera
camera = picamera.PiCamera(resolution=(800, 480), framerate=25)

# start recording to ffmpeg's stdin
camera.start_recording(ffmpeg.stdin, format='h264', bitrate=2000000)
Or is that not what you're looking for?
Two problems that I see at first glance:

1. In your first example, you're writing your data into the subprocess's stdout instead of its stdin. That definitely doesn't work, and probably causes a hang.

2. In both examples, you're starting the process with stdin=sp.PIPE, stderr=sp.PIPE and then never reading from those pipes. That means that as soon as ffmpeg writes enough output to fill the pipe buffer, it will block and you'll have a deadlock. Use the defaults stdout=None, stderr=None to let ffmpeg's output go to your process's stdout and stderr, or connect them to a filehandle opened to /dev/null to discard the output. Or use the communicate method to get the output each time you write some input, and do something useful with it (like monitor the status of the streaming).