I'm very new to GStreamer, but after a lot of research I've managed to create my own working pipeline streaming a webcam over a network from a Raspberry Pi Zero to a PC via UDP transport. I'm pleased with my progress! :)
But I'm struggling to create a TCP transport...
This pipeline works perfectly over UDP (note: simplified here to use a test video source and JPEG encoding):
Server UDP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! udpsink host=192.168.2.13 port=7001
Client UDP (192.168.2.13):
gst-launch-1.0 udpsrc port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
...but when I use a TCP sink/source with exactly the same elements I receive nothing but errors.
The modified pipeline using tcpserversink and tcpclientsrc:
Server TCP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! tcpserversink port=7001
Client TCP (192.168.2.13):
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
Attempt 1: tcpserversink port=7001
ERROR: Failed to connect to host '192.168.2.1:7001': No connection could be made because the target machine actively refused it.
Attempt 2: tcpserversink host=localhost port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 3: tcpserversink host=127.0.0.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 4: tcpserversink host=192.168.2.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
Attempt 5: tcpserversink host=0.0.0.0 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
I figured I should be able to replace the src and sink elements without the pipeline breaking, so I must just be missing something.
I would be grateful for any light you could shed on this.
You can solve it in one of (at least) two ways. The first is to add the rtpstreampay element after the RTP payloader for your media type. TCP is a byte stream with no packet boundaries, so the receiver can't tell where one RTP packet ends and the next begins; rtpstreampay prefixes each packet with its length so the stream can be re-framed on the receiving side with rtpstreamdepay.
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-rtpstreampay.html
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! rtpjpegpay \
! rtpstreampay \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! application/x-rtp-stream,encoding-name=JPEG \
! rtpstreamdepay \
! rtpjpegdepay \
! jpegdec \
! autovideosink
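If you'd rather embed the rtpstreampay variant in an application than run gst-launch-1.0, a minimal Python sketch of the server side (assuming PyGObject and the GStreamer Python bindings are installed) could look like this; it simply parses the same launch string shown above:
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# Same server pipeline as above, built from a launch string.
pipeline = Gst.parse_launch(
    'videotestsrc is-live=true '
    '! jpegenc '
    '! rtpjpegpay '
    '! rtpstreampay '
    '! tcpserversink port=7001'
)
pipeline.set_state(Gst.State.PLAYING)

# Quit the main loop on error or end-of-stream.
loop = GLib.MainLoop()
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect('message::error', lambda bus, msg: loop.quit())
bus.connect('message::eos', lambda bus, msg: loop.quit())

try:
    loop.run()
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)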
The second way is to use a muxer rather than an RTP payloader, something like matroskamux, which is pretty generic. A container format like Matroska provides its own framing, so it can be sent straight over a plain byte stream.
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! matroskamux \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! matroskademux \
! jpegdec \
! autovideosink
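If you just want to verify that the TCP connection is carrying data before adding decoding, one option is to dump the muxed stream to a file on the client and inspect it afterwards (capture.mkv is just an example filename):
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! filesink location=capture.mkv
After stopping both pipelines, the file should open in most Matroska-aware players.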
You might also want to look into GstRtspServer if you want to do client/server RTP connections. A simple Python script like this will act as the server.
rtspserver.py
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GLib, GstRtspServer

Gst.init(None)

mainloop = GLib.MainLoop()

server = GstRtspServer.RTSPServer()

# The factory builds the pipeline for each client; the payloader must be
# named pay0 so the server knows which element produces the RTP stream.
factory = GstRtspServer.RTSPMediaFactory()
factory.set_launch((
    'videotestsrc is-live=true '
    '! jpegenc '
    '! rtpjpegpay name=pay0 pt=26'
))
# allow multiple connections
factory.set_shared(True)

mounts = server.get_mount_points()
mounts.add_factory('/live', factory)

server.attach(None)
mainloop.run()
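Run it with python3 rtspserver.py. With the defaults above, the stream is served at rtsp://<server-ip>:8554/live (8554 is GstRtspServer's default port); if the script runs on the Pi, replace localhost in the viewing pipeline below with the Pi's address, e.g. 192.168.2.1.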
And you can use a pipeline like this to view the output.
gst-launch-1.0 \
rtspsrc location=rtsp://localhost:8554/live latency=100 \
! rtpjpegdepay \
! jpegdec \
! autovideosink