AVMutableVideoComposition output video shrunk

I'm new to Swift. I'm trying to add a watermark to a video, based on code I found on Stack Overflow. My original video resolution is 1280 x 720, but the output video is a shrunken version.

Here are the before and after screenshots:

Before: [screenshot of the original video]

After: [screenshot of the shrunken output video]

Here is my function to create a watermark.

private func watermark(video videoAsset:AVAsset, watermarkText text : String!, image : CGImage!, saveToLibrary flag : Bool, completion : ((_ status : AVAssetExportSessionStatus?, _ session: AVAssetExportSession?, _ outputURL : URL?) -> ())?) {
    DispatchQueue.global(qos: DispatchQoS.QoSClass.default).async {

        let mixComposition = AVMutableComposition()

        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        let clipVideoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
        }
        catch {
            print(error.localizedDescription)
        }

        let videoSize = clipVideoTrack.naturalSize

        print("Video size", videoSize.height) //720
        print("Video size", videoSize.width) //1280

        let parentLayer = CALayer()
        let videoLayer = CALayer()
        parentLayer.frame = CGRect(x: 0.0,
                                   y: 0.0,
                                   width: videoSize.width,
                                   height: videoSize.height)
        videoLayer.frame = CGRect(x: 0.0,
                                  y: 0.0,
                                  width: videoSize.width,
                                  height: videoSize.height)
        parentLayer.addSublayer(videoLayer)

        if text != nil {
            let titleLayer = CATextLayer()
            titleLayer.backgroundColor = UIColor.red.cgColor
            titleLayer.string = text
            titleLayer.font = "Helvetica" as CFTypeRef
            titleLayer.fontSize = 15
            titleLayer.alignmentMode = kCAAlignmentCenter
            titleLayer.bounds = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
            parentLayer.addSublayer(titleLayer)
        } else if image != nil {
            let imageLayer = CALayer()
            imageLayer.contents = image

            let width: CGFloat = (self.imageView.image?.size.width)!
            let height: CGFloat = (self.imageView.image?.size.height)!

            print("Watermark image size", height) // 720
            print("Watermark image size", width) // 1280

            imageLayer.frame = CGRect(x: 0.0, y: 0.0, width: width, height: height)
            imageLayer.opacity = 0.65
            parentLayer.addSublayer(imageLayer)
        }

        let videoComp = AVMutableVideoComposition()
        videoComp.renderSize = videoSize
        videoComp.frameDuration = CMTimeMake(1, Int32(clipVideoTrack.nominalFrameRate))
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
        _ = mixComposition.tracks(withMediaType: AVMediaTypeVideo)[0] as AVAssetTrack

        let layerInstruction = self.videoCompositionInstructionForTrack(track: compositionVideoTrack, asset: videoAsset)

        instruction.layerInstructions = [layerInstruction]
        videoComp.instructions = [instruction]

        let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
        let dateFormatter = DateFormatter()
        dateFormatter.dateStyle = .long
        dateFormatter.timeStyle = .short
        let date = dateFormatter.string(from: Date())
        let url = URL(fileURLWithPath: documentDirectory).appendingPathComponent("watermarkVideo-\(date).mov")

        let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
        exporter?.outputURL = url
        exporter?.outputFileType = AVFileTypeQuickTimeMovie
        exporter?.shouldOptimizeForNetworkUse = true
        exporter?.videoComposition = videoComp

        exporter?.exportAsynchronously() {
            DispatchQueue.main.async {

                if exporter?.status == AVAssetExportSessionStatus.completed {
                    let outputURL = exporter?.outputURL
                    if flag {

                        if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputURL!.path) {
                            PHPhotoLibrary.shared().performChanges({
                                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
                            }) { saved, error in
                                if saved {
                                    completion!(AVAssetExportSessionStatus.completed, exporter, outputURL)
                                }
                            }
                        }
                    } else {
                        completion!(AVAssetExportSessionStatus.completed, exporter, outputURL)
                    }

                } else {
                    // Error
                    completion!(exporter?.status, exporter, nil)
                }
            }
        }
    }
}

While the watermark image comes out at the correct size, the video itself is shrunk.

— asked by Rishabh Bhatia, Nov 14 '17

2 Answers

Can you try this function?

private func watermark(video videoAsset: AVAsset, watermarkText text: String!, image: CGImage!, saveToLibrary flag: Bool, completion: ((_ status: AVAssetExportSessionStatus?, _ session: AVAssetExportSession?, _ outputURL: URL?) -> ())?) {
  DispatchQueue.global(qos: DispatchQoS.QoSClass.default).async {

    let mixComposition = AVMutableComposition()

    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    let clipVideoTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo)[0] as AVAssetTrack
    do {
      try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
    }
    catch {
      print(error.localizedDescription)
    }

    let videoSize = clipVideoTrack.naturalSize

    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(x: 0.0,
      y: 0.0,
      width: videoSize.width,
      height: videoSize.height)
    videoLayer.frame = CGRect(x: 0.0,
      y: 0.0,
      width: videoSize.width,
      height: videoSize.height)
    parentLayer.addSublayer(videoLayer)

    //            if text != nil {
    //                let titleLayer = CATextLayer()
    //                titleLayer.backgroundColor = UIColor.red.cgColor
    //                titleLayer.string = text
    //                titleLayer.font = "Helvetica" as CFTypeRef
    //                titleLayer.fontSize = 15
    //                titleLayer.alignmentMode = kCAAlignmentCenter
    //                titleLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    //                parentLayer.addSublayer(titleLayer)
    //            } else
    if image != nil {
      let imageLayer = CALayer()
      imageLayer.contents = image

      let width: CGFloat = (self.imageView.image?.size.width)!
      let height: CGFloat = (self.imageView.image?.size.height)!

      print("Watermark image size", height)
      print("Watermark image size", width)

      imageLayer.frame = CGRect(x: 0, y: 0, width: width, height: height)

      //                imageLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)

      imageLayer.opacity = 1
      parentLayer.addSublayer(imageLayer)
    }

    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = videoSize
    videoComp.frameDuration = CMTimeMake(1, Int32(clipVideoTrack.nominalFrameRate))
    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    let videotrack = mixComposition.tracks(withMediaType: AVMediaTypeVideo)[0] as AVAssetTrack
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)

    //            let layerInstruction = self.videoCompositionInstructionForTrack(track: compositionVideoTrack, asset: videoAsset)

    instruction.layerInstructions = [layerInstruction]
    videoComp.instructions = [instruction]

    let documentDirectory = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let dateFormatter = DateFormatter()
    dateFormatter.dateStyle = .long
    dateFormatter.timeStyle = .short
    let date = dateFormatter.string(from: Date())
    let url = URL(fileURLWithPath: documentDirectory).appendingPathComponent("watermarkVideo-\(date).mp4")

    guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.videoComposition = videoComp
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = url

    exporter.exportAsynchronously() {
      DispatchQueue.main.async {

        if exporter.status == AVAssetExportSessionStatus.completed {
          let outputURL = exporter.outputURL
          if flag {
            // Save to library
            if UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputURL!.path) {
              PHPhotoLibrary.shared().performChanges({
                PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL!)
              }) { saved, error in
                if saved {
                  completion!(AVAssetExportSessionStatus.completed, exporter, outputURL)
                }
              }
            }

          } else {
            completion!(AVAssetExportSessionStatus.completed, exporter, outputURL)
          }

        } else {
          // Error
          completion!(exporter.status, exporter, nil)
        }
      }
    }
  }
}
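
Compared with the question's code, this version builds the layer instruction directly with AVMutableVideoCompositionLayerInstruction(assetTrack:) on the composition's video track, uses full opacity for the watermark layer, and exports to MP4 (AVFileTypeMPEG4) instead of a QuickTime movie.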
— answered by Rishabh Bhatia

The watermarking code above does not seem to be the reason for the smaller output resolution.

Problem

The output resolution depends on which kind of AVAsset is passed into the watermark method.

Example: frequently a UIImagePickerController is used, which has the delegate method

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) 

There, one often sees something like this:

let url = info[UIImagePickerControllerMediaURL] as? URL
let videoAsset = AVAsset(url: url!)
self.watermark(video: videoAsset, watermarkText: nil, image: self.imageView.image?.cgImage ...

But with the lines above, a downsized, transcoded version of the video is used: instead of a 1920x1080 video, you get a reduced 1280x720 one.
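
To verify this, one could print the natural size of the picked video's track (a quick diagnostic sketch, assuming the info dictionary from didFinishPickingMediaWithInfo):

if let url = info[UIImagePickerControllerMediaURL] as? URL {
    let pickedAsset = AVAsset(url: url)
    if let track = pickedAsset.tracks(withMediaType: AVMediaTypeVideo).first {
        // With the media URL from the picker, this typically prints the
        // transcoded size (e.g. 1280.0 x 720.0), not the original size.
        print("Picked video size:", track.naturalSize)
    }
}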

Solution

A method for determining the AVAsset from the PHAsset could look like this:

private func videoAsset(for asset: PHAsset, completion: @escaping (AVAsset?) -> Void) {
    let requestOptions = PHVideoRequestOptions()
    requestOptions.version = .original
    PHImageManager.default().requestAVAsset(forVideo: asset, options: requestOptions, resultHandler: {
        (avAsset, avAudioMix, info) in
        completion(avAsset)
    })
}
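
Note that requestOptions.version = .original requests the unedited original; the default .current would return the asset with any edits applied. Also, the resultHandler may be called on a background queue, which is why the quick test below dispatches back to the main queue.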

And where does the PHAsset come from? It can also be retrieved in the didFinishPickingMediaWithInfo method, via the UIImagePickerControllerPHAsset key (available since iOS 11):

let asset = info[UIImagePickerControllerPHAsset] as? PHAsset
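
This key is typically populated only when the app is authorized to access the photo library (which requires an NSPhotoLibraryUsageDescription entry in Info.plist). A minimal sketch for requesting authorization before presenting the picker:

import Photos

PHPhotoLibrary.requestAuthorization { status in
    DispatchQueue.main.async {
        guard status == .authorized else { return }
        // Present the UIImagePickerController here so that the
        // UIImagePickerControllerPHAsset key is filled in the delegate callback.
    }
}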

Quick Test

For a quick test one could use:

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    if let asset = info[UIImagePickerControllerPHAsset] as? PHAsset {
        picker.dismiss(animated: true, completion: { [weak self] in
            self?.videoAsset(for: asset, completion: { (avAsset) in
                if let videoAsset = avAsset {
                    DispatchQueue.main.async {
                        self?.watermark(video: videoAsset, watermarkText: nil, image: self?.imageView.image?.cgImage, saveToLibrary: true) { (exportStat: AVAssetExportSessionStatus?, session: AVAssetExportSession?, url: URL?) in
                            print("url: \(String(describing: url?.debugDescription))")
                        }
                    }
                }
            })
        })
    }
}

The result is a video at the original resolution with the watermark in the lower left; see the screenshot of the resulting video:

[Screenshot: watermarked video at the original resolution]

— answered by Stephan Schlecht