
Streaming (live-) video from Server to Client using .NET Core Blazor

I want to build a video-streaming application using .NET Core Blazor. A simple use case would be embedding a live webcam feed into a website. So far my solution is based on OpenCV's video capture (OpenCvSharp4.Windows 4.3) and an async method that constantly re-renders the image source by converting each frame into a Base64 string. It is important that the video comes from the server. Using OpenCV is optional. This is my component:

@page "/webcam"

@using OpenCvSharp

<div class="top-row">
    <button type="button" class="btn btn-dark" @onclick="CaptureWebCam">Capture</button>
</div>

<img src="@_imgSrc" />

@code {
    private string _imgSrc;

    // Start task for video capture
    protected async Task CaptureWebCam()
    {
        // 0 = default camera (usually the built-in webcam)
        using (var capture = new VideoCapture(0))
        {
            // continuously set the latest captured frame as the image source
            using (var image = new Mat())
            {
                while (true)
                {
                    capture.Read(image);

                    // encode the frame (PNG by default) and embed it as a Base64 data URI
                    string base64 = Convert.ToBase64String(image.ToBytes());
                    _imgSrc = $"data:image/png;base64,{base64}";

                    await Task.Delay(1);
                    StateHasChanged();
                }
            }
        }
    }
}

My problem is:

Using a Base64 string for every image is not at all performant! From what I have researched, I should convert the raw video into an MP4 container (using ffmpeg, for example) and then stream it to an HTML <video> element using the HLS or RTSP protocol: DIY: Video Streaming Server
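To make that concrete, the pipeline I have in mind looks roughly like the sketch below: ffmpeg transcodes the capture device into a rolling HLS playlist under wwwroot, the static file middleware serves the playlist and segments, and the page points a <video> element at it. This is untested; the device name, ffmpeg arguments and paths are placeholders, it assumes ffmpeg is on the PATH, and it uses the .NET 6 minimal hosting model for brevity:

// Untested sketch: ffmpeg writes a rolling HLS playlist into wwwroot/hls,
// ASP.NET Core serves it as static files, and a <video> element plays it.
// "Integrated Camera" and all ffmpeg arguments are placeholders.
using System.Diagnostics;
using System.IO;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddRazorPages();
builder.Services.AddServerSideBlazor();

var app = builder.Build();
app.UseStaticFiles();            // serves wwwroot, including wwwroot/hls
app.MapBlazorHub();
app.MapFallbackToPage("/_Host");

// Make sure the output folder exists, then start transcoding the webcam.
// (Stopping ffmpeg again on shutdown is omitted here.)
Directory.CreateDirectory("wwwroot/hls");
Process.Start(new ProcessStartInfo
{
    FileName = "ffmpeg",
    Arguments = "-f dshow -i \"video=Integrated Camera\" " +
                "-c:v libx264 -preset veryfast -tune zerolatency " +
                "-f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments " +
                "wwwroot/hls/stream.m3u8",
    UseShellExecute = false
});

app.Run();

The component then only needs <video controls autoplay muted src="hls/stream.m3u8"></video> (Safari plays HLS natively, other browsers need hls.js). That feels very hand-rolled, though.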

This is where I am stuck, because I can't seem to find a good framework/library/API for streaming MP4 to a website using .NET Core Blazor. Any suggestions?

I found options like VLC and WebRTC, but I am not sure they will work. I also found that streaming over SignalR in combination with WebRTC could work: Is it possible to stream video with SignalR?
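From what I can tell, SignalR at least supports server-to-client streaming via IAsyncEnumerable hub methods, so pushing individual JPEG frames should be possible (that is still frame-by-frame, not real video streaming). An untested sketch, reusing OpenCvSharp from above with a made-up hub name:

// Untested sketch: a SignalR streaming hub method that yields one JPEG
// frame per iteration until the client cancels. Hub and method names
// are placeholders.
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;
using OpenCvSharp;

public class WebcamHub : Hub
{
    public async IAsyncEnumerable<byte[]> StreamFrames(
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        using var capture = new VideoCapture(0);   // 0 = default camera
        using var frame = new Mat();

        while (!cancellationToken.IsCancellationRequested)
        {
            if (capture.Read(frame) && !frame.Empty())
            {
                // JPEG is far smaller per frame than the PNG/Base64 data URI above
                yield return frame.ToBytes(".jpg");
            }

            await Task.Delay(33, cancellationToken);   // roughly 30 FPS pacing
        }
    }
}

The component would call hubConnection.StreamAsync<byte[]>("StreamFrames") and swap the <img> source for every received frame, which would avoid pushing the Base64 string through the Razor render tree, but it still sends one image at a time.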

Eddy N. asked Nov 06 '22

1 Answer

I used an MJPEG stream and an <img> tag. The stream is served by an ApiController that produces a sequence of JPEG frames. This worked for 800x600 video at 30 FPS on the local network.

[ApiController]
[Route("api/[controller]")]
public class CameraController : ControllerBase, IDisposable
{
    // ... fields such as _stream, _cameras and Id are elided ...

    // Returns an endless MJPEG stream for the requested camera channel.
    [HttpGet("{id}/[action]")]
    public async Task<FileStreamResult> Live(string id, string channel)
    {
        _stream = new SmartMjpegStream(channel);
        _cameras.Add(Id = id, this);
        return await Task.FromResult(new FileStreamResult(_stream, SmartMjpegStream.ContentType));
    }

    // ... Dispose and the rest of the controller are elided ...
}

and then in CameraView.razor:

<img src="camera/live?channel=rtsp://admin:[email protected]/axis-media/media.amp" alt="..." />
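SmartMjpegStream is a custom class that I have not included here. The essence of MJPEG over HTTP is a multipart/x-mixed-replace response in which every part is a complete JPEG frame. If you prefer not to write a custom Stream for FileStreamResult, a rough sketch of the same idea that writes directly to the response body (using OpenCvSharp as the frame source instead of an RTSP camera) could look like this:

// Rough sketch (not the actual SmartMjpegStream): an endless
// multipart/x-mixed-replace response where each part is one JPEG frame.
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using OpenCvSharp;

[ApiController]
[Route("api/[controller]")]
public class MjpegController : ControllerBase
{
    private const string Boundary = "frame";

    [HttpGet("live")]
    public async Task Live(CancellationToken cancellationToken)
    {
        Response.ContentType = $"multipart/x-mixed-replace; boundary={Boundary}";

        using var capture = new VideoCapture(0);
        using var frame = new Mat();
        var crlf = Encoding.ASCII.GetBytes("\r\n");

        while (!cancellationToken.IsCancellationRequested)
        {
            if (capture.Read(frame) && !frame.Empty())
            {
                byte[] jpeg = frame.ToBytes(".jpg");
                byte[] header = Encoding.ASCII.GetBytes(
                    $"--{Boundary}\r\nContent-Type: image/jpeg\r\nContent-Length: {jpeg.Length}\r\n\r\n");

                // One multipart part per frame; the browser replaces the previous image.
                await Response.Body.WriteAsync(header, cancellationToken);
                await Response.Body.WriteAsync(jpeg, cancellationToken);
                await Response.Body.WriteAsync(crlf, cancellationToken);
                await Response.Body.FlushAsync(cancellationToken);
            }

            await Task.Delay(33, cancellationToken);   // roughly 30 FPS pacing
        }
    }
}

The <img> src then points at api/mjpeg/live, and the browser keeps replacing the displayed image as new parts arrive; when the client disconnects, the cancellation token ends the loop.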
user15826789 answered Nov 15 '22