How to implement uploading in broadcast upload extension (iOS)?

Does anybody know whether it is possible to upload frame buffers from a Broadcast Upload Extension to the host app, or should I upload them directly to the back-end? My goal is to intercept frame buffers from ReplayKit, send them to my application, and broadcast the video through my application using WebRTC. I will appreciate any help. Thanks in advance.

Seliver asked Sep 22 '16

3 Answers

Only the Broadcast Upload Extension and the Broadcast UI Extension are loaded when the broadcast starts. And as far as I know, there is no programmatic way of launching your host app and streaming any data to it in the background.

But you can implement the whole logic in the Broadcast Upload Extension. Your RPBroadcastSampleHandler implementation is fed with video CMSampleBuffers. All post-processing and upload logic is up to that implementation, so you can unpack and process frames and then upload them to your server in any suitable way. If you need any configuration or authorization details, you can simply set them in the Broadcast UI Extension, or even in your host app, and store them in shared storage.
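For example, an App Group container lets the host app hand credentials over to the upload extension via a shared UserDefaults suite. A minimal sketch; the group identifier, key, and function names here are placeholders, and the App Group capability must be enabled for both the host app and the extension targets:

```swift
import Foundation

// Hypothetical App Group identifier -- must match the group enabled in
// Signing & Capabilities for both the host app and the extension.
let appGroupID = "group.com.example.broadcast"

// In the host app (or Broadcast UI Extension): store the auth details.
func saveBroadcastToken(_ token: String) {
    UserDefaults(suiteName: appGroupID)?.set(token, forKey: "broadcastAuthToken")
}

// In the Broadcast Upload Extension: read them back before uploading.
func loadBroadcastToken() -> String? {
    UserDefaults(suiteName: appGroupID)?.string(forKey: "broadcastAuthToken")
}
```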

There is not much information about it on the internet or in the Apple documentation. But you still can:

  • Watch the WWDC 2016 video Go Live with ReplayKit
  • Read the RPBroadcastSampleHandler documentation
  • Read this quite useful blog post (in Chinese): http://blog.lessfun.com/blog/2016/09/21/ios-10-replaykit-live-and-broadcast-extension/
  • Play around with the stub implementation of the upload extension (simply create the target in Xcode)
Eugene answered Sep 17 '22

I tried exactly the same thing with a ReplayKit and WebRTC combination. The fundamental issue with WebRTC on iOS is that it can't handle a video stream once the app goes to the background. So you can stream your own app's screen over WebRTC while your video chat app is in the foreground, but to stream another app, the moment your app goes to the background you can no longer handle the video stream, only voice over WebRTC.

You'd better upload it to the server from the upload extension. I've already wasted too much time trying to connect the upload extension to the host app; there is absolutely no control over the upload extension.

user3806731 answered Sep 17 '22


I have some code for you. I've already implemented it in my project and discussed it on Google Groups: https://groups.google.com/d/msg/discuss-webrtc/jAHCnB12khE/zJEu1vyUAgAJ

I will copy the code here for the next generations:

First of all, I created an additional class in the broadcast extension to manage the WebRTC-related code; I call it PeerManager.

Set up the video track with the local stream. Be careful: you should do this before generating the local offer.

private func setupVideoStreaming() {
    localStream = webRTCPeer.peerConnectionFactory.mediaStream(withStreamId: "\(personID)_screen_sharing")
    videoSource = webRTCPeer.peerConnectionFactory.videoSource()
    videoCapturer = RTCVideoCapturer(delegate: videoSource)
    videoSource.adaptOutputFormat(toWidth: 441, height: 736, fps: 15)
    let videoTrack = webRTCPeer.peerConnectionFactory.videoTrack(with: videoSource, trackId: "screen_share_track_id")
    videoTrack.isEnabled = true
    localStream.addVideoTrack(videoTrack)
    // Remove any previously attached local streams before adding the new one.
    for localStream in webRTCPeer.localPeerConnection.peerConnection.localStreams {
        webRTCPeer.localPeerConnection.peerConnection.remove(localStream)
    }
    webRTCPeer.localPeerConnection.peerConnection.add(localStream)
}

I get a callback from the system that provides a CMSampleBuffer; I convert it to an RTCVideoFrame and feed it to the videoSource (emulating a video capturer):

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case RPSampleBufferType.video:
        // Handle the video sample buffer.
        guard peerManager != nil, let imageBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            break
        }
        let pixelFormat = CVPixelBufferGetPixelFormatType(imageBuffer) // kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
        let timeStampNs: Int64 = Int64(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000)
        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: imageBuffer)
        let rtcVideoFrame = RTCVideoFrame(buffer: rtcPixelBuffer, rotation: ._0, timeStampNs: timeStampNs)
        peerManager.push(videoFrame: rtcVideoFrame)
    case RPSampleBufferType.audioApp:
        break
    case RPSampleBufferType.audioMic:
        break
    @unknown default:
        break
    }
}

Code from PeerManager; it is the implementation of the push function used above. Nothing strange here: we emulate the capturer behavior using the delegate.

func push(videoFrame: RTCVideoFrame) {
    guard isConnected, videoCapturer != nil, isProcessed else {
        return
    }
    videoSource.capturer(videoCapturer, didCapture: videoFrame)
}

Now you are ready to generate the local offer, send it, and transfer data however you want. Check your local offer: if you did everything right, you should see a=sendonly in the offer.
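That check can be done right after the offer is generated. A sketch, assuming the GoogleWebRTC framework; the function name is hypothetical and the peer connection comes from your own setup code:

```swift
import WebRTC

// Generate an offer and verify the SDP advertises a send-only stream.
func verifyOffer(on peerConnection: RTCPeerConnection) {
    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: nil)
    peerConnection.offer(for: constraints) { sdp, error in
        guard let sdp = sdp, error == nil else { return }
        // With only a local video track attached (no remote tracks),
        // the video m-section should contain "a=sendonly".
        if sdp.sdp.contains("a=sendonly") {
            print("offer is send-only, as expected")
        }
    }
}
```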

P.S. As VladimirTechMan suggested, you can also check the sample code of the broadcast extension in the AppRTCMobile demo app. I found the link for you; it is an Objective-C example: https://webrtc.googlesource.com/src/+/358f2e076051d28b012529d3ae6a080838d27209 You should be interested in the ARDBroadcastSampleHandler.m/.h and ARDExternalSampleCapturer.m/.h files. Don't forget that you can build it yourself following the instructions at https://webrtc.org/native-code/ios/

Bws Sluk answered Sep 18 '22