 

iOS SWIFT - WebRTC change from Front Camera to back Camera

Tags: ios, swift3, webrtc

WebRTC video uses the front camera by default, which works fine. However, I need to switch it to the back camera, and I have not been able to find any code to do that. Which part do I need to edit? Is it the localView, the localVideoTrack, or the capturer?

Asked Mar 10 '17 by mrnobody

3 Answers

Swift 3.0

A peer connection can have only one RTCVideoTrack for sending a video stream.

To switch between the front and back cameras, first remove the current video track from the peer connection. Then create a new RTCVideoTrack for the camera you need and set it on the peer connection.

I used these methods:

func swapCameraToFront() {
    guard let localStream = peerConnection?.localStreams.first as? RTCMediaStream else { return }
    // Remove the current (back-camera) track before adding the new one.
    if let currentTrack = localStream.videoTracks.first as? RTCVideoTrack {
        localStream.removeVideoTrack(currentTrack)
    }
    if let localVideoTrack = createLocalVideoTrack() {
        localStream.addVideoTrack(localVideoTrack)
        delegate?.appClient(self, didReceiveLocalVideoTrack: localVideoTrack)
    }
    // Re-adding the stream makes the peer connection pick up the new track.
    peerConnection?.remove(localStream)
    peerConnection?.add(localStream)
}

func swapCameraToBack() {
    guard let localStream = peerConnection?.localStreams.first as? RTCMediaStream else { return }
    // Remove the current (front-camera) track before adding the new one.
    if let currentTrack = localStream.videoTracks.first as? RTCVideoTrack {
        localStream.removeVideoTrack(currentTrack)
    }
    if let localVideoTrack = createLocalVideoTrackBackCamera() {
        localStream.addVideoTrack(localVideoTrack)
        delegate?.appClient(self, didReceiveLocalVideoTrack: localVideoTrack)
    }
    peerConnection?.remove(localStream)
    peerConnection?.add(localStream)
}
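The two methods above differ only in which helper creates the replacement track, so a thin wrapper can remember the current position and dispatch to the right one. A minimal sketch of that bookkeeping, with the swapCameraToFront/swapCameraToBack calls injected as closures so the toggle logic can run without WebRTC (CameraSwitcher and CameraPosition are hypothetical names, not part of the WebRTC API):

```swift
// Which camera is currently in use.
enum CameraPosition {
    case front, back

    // The position to switch to on the next toggle.
    var opposite: CameraPosition {
        self == .front ? .back : .front
    }
}

final class CameraSwitcher {
    private(set) var position: CameraPosition = .front

    // Hook points for the real swap methods from the answer above;
    // injected so the switching logic is testable without WebRTC.
    var onSwitchToFront: () -> Void = {}
    var onSwitchToBack: () -> Void = {}

    func toggle() {
        position = position.opposite
        switch position {
        case .front: onSwitchToFront()
        case .back: onSwitchToBack()
        }
    }
}
```

In a real app, onSwitchToFront/onSwitchToBack would call the swapCameraToFront()/swapCameraToBack() methods shown above.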
Answered Oct 23 '22 by Sergey Balashov


For now I only have this answer in Objective-C (in response to Ankit's comment below); I will convert it to Swift later.

You can check the code below:

- (RTCVideoTrack *)createLocalVideoTrack {
    RTCVideoTrack *localVideoTrack = nil;

    // Find the localized name of the front camera.
    NSString *cameraID = nil;
    for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == AVCaptureDevicePositionFront) {
            cameraID = [captureDevice localizedName];
            break;
        }
    }

    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
    localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];

    return localVideoTrack;
}

- (RTCVideoTrack *)createLocalVideoTrackBackCamera {
    RTCVideoTrack *localVideoTrack = nil;

    // Same as above, but look for the back camera.
    NSString *cameraID = nil;
    for (AVCaptureDevice *captureDevice in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (captureDevice.position == AVCaptureDevicePositionBack) {
            cameraID = [captureDevice localizedName];
            break;
        }
    }

    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
    RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
    RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
    localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];

    return localVideoTrack;
}
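Since the answer promises a Swift conversion, here is a hedged sketch of just the device-selection loop translated to Swift. Device and Position are plain stand-ins for AVCaptureDevice and its position property so the sketch runs anywhere; in the real code you would iterate the AVFoundation device list instead:

```swift
// Stand-ins for AVCaptureDevice.Position and AVCaptureDevice,
// so the selection logic can run without AVFoundation.
enum Position { case front, back }
struct Device {
    let localizedName: String
    let position: Position
}

// Mirrors the Objective-C loop above: return the name of the first
// device at the requested position, or nil if none is attached.
func cameraID(for position: Position, in devices: [Device]) -> String? {
    devices.first { $0.position == position }?.localizedName
}
```

The rest of the Objective-C methods (capturer, video source, track creation) translate mechanically once this lookup is in place.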
Answered Oct 23 '22 by SULEMAN BAWA


If you decide to use the official Google build, here is the explanation:

First, you must configure your camera before calling start. The best place to do that is the ARDVideoCallViewDelegate method didCreateLocalCapturer.

- (void)startCapture:(void (^)(BOOL succeeded))completionHandler {
    AVCaptureDevicePosition position = _usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
    AVCaptureDevice *device = [self findDeviceForPosition:position];
    if ([device lockForConfiguration:nil]) {
        if ([device isFocusPointOfInterestSupported]) {
            [device setFocusModeLockedWithLensPosition:0.9 completionHandler:nil];
        }
        // Balance the lock once configuration is done.
        [device unlockForConfiguration];
    }
    AVCaptureDeviceFormat *format = [self selectFormatForDevice:device];
    if (format == nil) {
        RTCLogError(@"No valid formats for device %@", device);
        NSAssert(NO, @"");
        return;
    }
    NSInteger fps = [self selectFpsForFormat:format];
    [_capturer startCaptureWithDevice:device
                               format:format
                                  fps:fps
                    completionHandler:^(NSError *error) {
                        NSLog(@"%@", error);
                        // Report failure as well as success.
                        completionHandler(error == nil);
                    }];
}

Don't forget that enabling the capture device is asynchronous; it is often better to use the completion handler to be sure everything finished as expected.
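That asynchrony is the whole point of the completion handler: switching cameras is a stop followed by a start, and the new session should only begin once the old one has reported completion. A minimal Swift sketch of the chaining, where Capturer is a hypothetical synchronous stand-in for RTCCameraVideoCapturer, not the real class:

```swift
// Stand-in for RTCCameraVideoCapturer: exposes stop/start with
// completion handlers, which is the shape that forces the chaining.
final class Capturer {
    private(set) var running = false

    func stopCapture(completion: @escaping () -> Void) {
        running = false
        completion()
    }

    func startCapture(completion: @escaping (Bool) -> Void) {
        running = true
        completion(true)
    }
}

// Only start the new capture session once the old one has fully
// stopped, and surface the result to the caller.
func switchCamera(_ capturer: Capturer, done: @escaping (Bool) -> Void) {
    capturer.stopCapture {
        capturer.startCapture { succeeded in
            done(succeeded)
        }
    }
}
```

Starting the new session before the stop completes is a common source of black-frame glitches when toggling cameras.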

Answered Oct 23 '22 by GalaevAlexey