
FFServer streaming H.264 from Logitech C920 without re-encoding

I'm trying to broadcast the native H.264 feed of a Logitech C920 webcam in real time from an Odroid device (a robot), via ffserver running on a separate server (CentOS 7.1), to users' browsers without re-encoding the H.264 video feed.

Having a real-time video feed in the browser is a challenge in itself, so for now I'm just trying to get the Logitech C920 webcam on the Odroid to stream its native H.264 real-time video feed as mp4 via ffserver to users, without re-encoding the video in the process. Obviously I want to avoid re-encoding, as that would take too much CPU time and would kill the real-time video feed. Later I might need to change the container to .flv or rtp so it can be played from the browser in a real-time fashion. I'm using the Logitech C920 webcam because it can do H.264 encoding in hardware. (I have verified this by saving a file directly from the camera; it works, apart from the well-known 'jerkiness' issue caused by a Linux kernel bug: http://sourceforge.net/p/linux-uvc/mailman/message/33164469/ , but that is a different story.)
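
For reference, the capture test mentioned above looked roughly like the following (a minimal sketch, not the exact command I ran; the resolution, duration and output file name are placeholders):

$ ffmpeg -f v4l2 -input_format h264 -video_size 1920x1080 -i /dev/video0 -c:v copy -t 10 c920_test.mkv

Here -input_format h264 asks video4linux2 for the camera's hardware-encoded stream and -c:v copy writes it to the output untouched, so no re-encoding happens on the Odroid.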

The problem is that however I set up ffmpeg and ffserver, as soon as ffserver is in the picture the feed gets re-encoded - even from h264 (native) to h264 (libx264) - taking up 100% of the CPU on the Odroid device and introducing a huge delay in the video feed.

Below are my ffmpeg and ffserver settings.

ffmpeg on the Odroid device streaming the H.264 feed to ffserver:

$ ffmpeg -s 1920x1080 -f v4l2 -vcodec h264 -i /dev/video0 -copyinkf -vcodec copy http://xxxyyyy.com:8090/feed1.ffm
ffmpeg version N-72744-g653bf3c Copyright (c) 2000-2015 the FFmpeg developers
  built with gcc 4.8 (Ubuntu/Linaro 4.8.2-19ubuntu1)
  configuration: --prefix=/home/odroid/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/odroid/ffmpeg_build/include --extra-ldflags=-L/home/odroid/ffmpeg_build/lib --bindir=/home/odroid/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-nonfree
  libavutil      54. 27.100 / 54. 27.100
  libavcodec     56. 41.100 / 56. 41.100
  libavformat    56. 36.100 / 56. 36.100
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 16.101 /  5. 16.101
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.100 /  1.  2.100
  libpostproc    53.  3.100 / 53.  3.100
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 6581.606726, bitrate: N/A
    Stream #0:0: Video: h264 (Constrained Baseline), yuvj420p(pc), 1920x1080 [SAR 1:1 DAR 16:9], -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 60 tbc
[swscaler @ 0x11bf0b0] deprecated pixel format used, make sure you did set range correctly
No pixel format specified, yuvj420p for H.264 encoding chosen.
Use -pix_fmt yuv420p for compatibility with outdated media players.
[libx264 @ 0x12590e0] using SAR=64/45
[libx264 @ 0x12590e0] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x12590e0] profile High, level 1b
Output #0, ffm, to 'http://robo-car.int.thomsonreuters.com:8090/feed1.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf56.36.100
    Stream #0:0: Video: h264 (libx264), yuvj420p(pc), 160x128 [SAR 64:45 DAR 16:9], q=-1--1, 64 kb/s, 30 fps, 1000k tbn, 5 tbc
    Metadata:
      encoder         : Lavc56.41.100 libx264
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
^Cav_interleaved_write_frame(): Immediate exit requested00 bitrate=N/A dup=0 drop=97    
    Last message repeated 2140 times
frame= 3723 fps=301 q=-1.0 Lsize=     396kB time=00:12:14.20 bitrate=   4.4kbits/s dup=3699 drop=103    
video:321kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 23.500496%

And the /etc/ffserver.conf on the server running ffserver:

HTTPPort 8090                      # Port to bind the server to
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 10000             # Maximum bandwidth per client
                               # set this high enough to exceed stream bitrate
CustomLog -

<Feed feed1.ffm>         # This is the input feed where FFmpeg will send
   File ./feed1.ffm            # video stream.
   FileMaxSize 1G              # Maximum file size for buffering video
</Feed>

<Stream test.mp4>
  Feed feed1.ffm
  Format mp4
  NoAudio
</Stream>

As you can see above in the ffmpeg output, re-encoding is happening on the Odroid device, maxing out its CPUs:

Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))

I have already tried setting the VideoCodec value in the ffserver config directly to libx264, tried the -re option in ffmpeg, tried different ffmpeg syntax, etc. Nothing helps. The re-encoding is always there, so I can't get ffmpeg and ffserver to simply broadcast the video stream as-is.
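
One such attempt looked roughly like this (a sketch from memory; the exact VideoFrameRate and VideoSize values are illustrative):

<Stream test.mp4>
  Feed feed1.ffm
  Format mp4
  VideoCodec libx264       # explicitly naming the codec still results in a transcode
  VideoFrameRate 30
  VideoSize 1920x1080
  NoAudio
</Stream>

ffserver still transcodes the feed instead of passing the camera's H.264 stream through untouched.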

Both ffmpeg builds (on the Odroid and on the server) were compiled from source yesterday (2015-06-09), so they are the latest (and the same) version.

Any ideas?

EDIT: In summary, the issue is that I cannot find a way to get ffserver to broadcast the h264 (native) feed coming from the Logitech C920 webcam without re-encoding it.

asked Jun 10 '15 by Zoltan Fedor




1 Answer

Well, it isn't really an answer, but I managed to do this by switching to VLC. Unfortunately I haven't managed to make ffserver accept the incoming H.264 stream as-is, without re-encoding it, and even if I had, I would still have been hit by the ffmpeg-C920-Linux kernel regression: http://sourceforge.net/p/linux-uvc/mailman/message/33164469/

As such it seemed reasonable to abandon the ffmpeg/ffserver route and try VLC.

In case anybody else is interested: with VLC I managed to achieve non-re-encoded distribution of the C920 webcam's native H.264 feed by running the following.

On the Odroid device, this picks up the H.264 stream from the camera and streams it via HTTP as MPEG-TS:

cvlc v4l2:///dev/video0:chroma=h264:width=1920:height=1080 --sout '#standard{access=http,mux=ts,dst=[ip of odroid]:8080,name=stream,mime=video/ts}' -vvv

On the CentOS 7 server, the following takes the stream from the Odroid and re-streams it, so consumers connect to the server instead of the Odroid device, which has much more limited bandwidth (WiFi):

vlc http://[ip of odroid]:8080 --sout '#standard{access=http,mux=ts,dst=[ip of centos server]:8080,name=stream,mime=video/ts}' -vvv

Now I can play this stream in real time in the VLC player on any device:

http://[ip of centos server]:8080

But yes, this isn't really a solution to the original ffmpeg/ffserver problem, rather a workaround that uses VLC to achieve the same goal.

answered Nov 14 '22 by Zoltan Fedor