 

Swift: Merge audio and video files into one video


I wrote a program in Swift. I want to merge a video with an audio file, but I get this error:

"failed Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped" UserInfo=0x17da4230 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The operation is not supported for this media.}"

Code:

func mergeAudio(audioURL: NSURL, moviePathUrl: NSURL, savePathUrl: NSURL) {
    var composition = AVMutableComposition()
    let trackVideo: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
    let trackAudio: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    let option = NSDictionary(object: true, forKey: "AVURLAssetPreferPreciseDurationAndTimingKey")
    let sourceAsset = AVURLAsset(URL: moviePathUrl, options: option as [NSObject : AnyObject])
    let audioAsset = AVURLAsset(URL: audioURL, options: option as [NSObject : AnyObject])

    let tracks = sourceAsset.tracksWithMediaType(AVMediaTypeVideo)
    let audios = audioAsset.tracksWithMediaType(AVMediaTypeAudio)

    if tracks.count > 0 {
        let assetTrack: AVAssetTrack = tracks[0] as! AVAssetTrack
        let assetTrackAudio: AVAssetTrack = audios[0] as! AVAssetTrack

        let audioDuration: CMTime = assetTrackAudio.timeRange.duration
        let audioSeconds: Float64 = CMTimeGetSeconds(assetTrackAudio.timeRange.duration)

        trackVideo.insertTimeRange(CMTimeRangeMake(kCMTimeZero, audioDuration), ofTrack: assetTrack, atTime: kCMTimeZero, error: nil)
        trackAudio.insertTimeRange(CMTimeRangeMake(kCMTimeZero, audioDuration), ofTrack: assetTrackAudio, atTime: kCMTimeZero, error: nil)
    }

    var assetExport: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    self.tmpMovieURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true
    assetExport.exportAsynchronouslyWithCompletionHandler { () -> Void in
        switch assetExport.status {
        case AVAssetExportSessionStatus.Completed:
            let assetsLib = ALAssetsLibrary()
            assetsLib.writeVideoAtPathToSavedPhotosAlbum(savePathUrl, completionBlock: nil)
            println("success")
        case AVAssetExportSessionStatus.Failed:
            println("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            println("cancelled \(assetExport.error)")
        default:
            println("complete")
        }
    }
}

My guess is that a media type like MPEG-4 is wrong here. Where is the problem? What am I missing?

asked Aug 13 '15 by Kei Maejima




2 Answers

Improved code (building on Govind's answer) with some additional features:

  1. Merges the video's own audio with the external audio (the initial answer dropped the video's sound).
  2. Flips the video horizontally if needed (I personally use this when the user captures with the front camera; Instagram flips it too).
  3. Applies preferredTransform correctly, which fixes videos that were saved rotated (external video: captured by another device or generated by another app).
  4. Removes some unused VideoComposition code.
  5. Adds a completion handler so the method can be called from a different class.
  6. Updates to Swift 4.
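Item 2 boils down to composing a negative-x scale with a translation so the mirrored frame stays inside the render area. A minimal sketch of that idea (the helper name and its single-translation form are my own simplification, not the exact code from the answer):

```swift
import AVFoundation

// Hypothetical helper: builds a horizontal-flip transform for a video
// track with the given natural width.
func horizontalFlipTransform(naturalWidth: CGFloat) -> CGAffineTransform {
    // Scaling x by -1 mirrors the frame around x = 0, which pushes it
    // off-screen to the left; translating by -width (in the flipped
    // coordinate space) brings it back into view.
    var t = CGAffineTransform(scaleX: -1.0, y: 1.0)
    t = t.translatedBy(x: -naturalWidth, y: 0.0)
    return t
}
```

The full function below applies this transform to the composition track's preferredTransform when shouldFlipHorizontally is true.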

Step 1.

import UIKit
import AVFoundation
import AVKit
import AssetsLibrary

Step 2.

/// Merges video and sound while keeping the sound of the video too
///
/// - Parameters:
///   - videoUrl: URL to video file
///   - audioUrl: URL to audio file
///   - shouldFlipHorizontally: pass true if the video was recorded using the front camera, otherwise pass false
///   - completion: completion of saving: error or URL of the final video
func mergeVideoAndAudio(videoUrl: URL,
                        audioUrl: URL,
                        shouldFlipHorizontally: Bool = false,
                        completion: @escaping (_ error: Error?, _ url: URL?) -> Void) {

    let mixComposition = AVMutableComposition()
    var mutableCompositionVideoTrack = [AVMutableCompositionTrack]()
    var mutableCompositionAudioTrack = [AVMutableCompositionTrack]()
    var mutableCompositionAudioOfVideoTrack = [AVMutableCompositionTrack]()

    // start merge

    let aVideoAsset = AVAsset(url: videoUrl)
    let aAudioAsset = AVAsset(url: audioUrl)

    let compositionAddVideo = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
    let compositionAddAudio = mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio,
                                                             preferredTrackID: kCMPersistentTrackID_Invalid)
    let compositionAddAudioOfVideo = mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio,
                                                                    preferredTrackID: kCMPersistentTrackID_Invalid)

    let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
    let aAudioOfVideoAssetTrack: AVAssetTrack? = aVideoAsset.tracks(withMediaType: AVMediaTypeAudio).first
    let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracks(withMediaType: AVMediaTypeAudio)[0]

    // Default transformation must be applied
    compositionAddVideo.preferredTransform = aVideoAssetTrack.preferredTransform

    if shouldFlipHorizontally {
        // Flip video horizontally
        var frontalTransform: CGAffineTransform = CGAffineTransform(scaleX: -1.0, y: 1.0)
        frontalTransform = frontalTransform.translatedBy(x: -aVideoAssetTrack.naturalSize.width, y: 0.0)
        frontalTransform = frontalTransform.translatedBy(x: 0.0, y: -aVideoAssetTrack.naturalSize.width)
        compositionAddVideo.preferredTransform = frontalTransform
    }

    mutableCompositionVideoTrack.append(compositionAddVideo)
    mutableCompositionAudioTrack.append(compositionAddAudio)
    mutableCompositionAudioOfVideoTrack.append(compositionAddAudioOfVideo)

    do {
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero,
                                                                            aVideoAssetTrack.timeRange.duration),
                                                            of: aVideoAssetTrack,
                                                            at: kCMTimeZero)

        // In my case my audio file is longer than the video file, so I took the
        // videoAsset duration instead of the audioAsset duration
        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero,
                                                                            aVideoAssetTrack.timeRange.duration),
                                                            of: aAudioAssetTrack,
                                                            at: kCMTimeZero)

        // adding the audio of the video (if it exists) to the final composition
        if let aAudioOfVideoAssetTrack = aAudioOfVideoAssetTrack {
            try mutableCompositionAudioOfVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero,
                                                                                       aVideoAssetTrack.timeRange.duration),
                                                                       of: aAudioOfVideoAssetTrack,
                                                                       at: kCMTimeZero)
        }
    } catch {
        print(error.localizedDescription)
    }

    // Exporting
    let savePathUrl: URL = URL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")
    do { // delete old video
        try FileManager.default.removeItem(at: savePathUrl)
    } catch {
        print(error.localizedDescription)
    }

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true

    assetExport.exportAsynchronously { () -> Void in
        switch assetExport.status {
        case AVAssetExportSessionStatus.completed:
            print("success")
            completion(nil, savePathUrl)
        case AVAssetExportSessionStatus.failed:
            print("failed \(assetExport.error?.localizedDescription ?? "error nil")")
            completion(assetExport.error, nil)
        case AVAssetExportSessionStatus.cancelled:
            print("cancelled \(assetExport.error?.localizedDescription ?? "error nil")")
            completion(assetExport.error, nil)
        default:
            print("complete")
            completion(assetExport.error, nil)
        }
    }
}
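The answer adds a completion handler so the method can be called from another class, but doesn't show a call site. A usage sketch might look like this (the bundled file names and force-unwraps are assumptions for illustration):

```swift
// Assumed: "SampleVideo.mp4" and "SampleAudio.mp3" exist in the app bundle,
// and mergeVideoAndAudio(videoUrl:audioUrl:shouldFlipHorizontally:completion:)
// is accessible from the calling class.
let videoUrl = URL(fileURLWithPath: Bundle.main.path(forResource: "SampleVideo", ofType: "mp4")!)
let audioUrl = URL(fileURLWithPath: Bundle.main.path(forResource: "SampleAudio", ofType: "mp3")!)

mergeVideoAndAudio(videoUrl: videoUrl,
                   audioUrl: audioUrl,
                   shouldFlipHorizontally: true) { error, url in
    // The completion runs after the asynchronous export finishes.
    if let error = error {
        print("merge failed: \(error.localizedDescription)")
    } else if let url = url {
        print("merged video saved at \(url)")
    }
}
```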

Again, thanks to @Govind's answer! It helped me a lot!

Hope this update helps someone too:)

answered Sep 17 '22 by Tung Fam


The same error in the question above is caused by a wrong savePathUrl: the destination URL must include the name of the new video file, as in the code below.
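For illustration, a valid destination URL points at a concrete output file, not just a directory, and any file already at that path should be removed first since the export session won't overwrite it (the directory and file name here are only examples, written in the same Swift 2-era style as this answer):

```swift
import Foundation

// Destination must name the output file, e.g. ".../Documents/newVideo.mp4".
// Pointing the export at a directory, or at an existing file, makes it fail.
let savePathUrl: NSURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")

// Remove any previous output before exporting again.
if let path = savePathUrl.path {
    if NSFileManager.defaultManager().fileExistsAtPath(path) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(path)
        } catch {
            print("could not remove old file: \(error)")
        }
    }
}
```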

I was looking for code to merge audio and video files into one video but couldn't find it anywhere, so after spending hours reading the Apple docs I wrote this code.

NOTE: This code is tested and works 100% for me.

Step 1: Import these modules in your view controller.

import UIKit
import AVFoundation
import AVKit
import AssetsLibrary

Step 2: Add this function to your code.

func mergeFilesWithUrl(videoUrl: NSURL, audioUrl: NSURL) {
    let mixComposition: AVMutableComposition = AVMutableComposition()
    var mutableCompositionVideoTrack: [AVMutableCompositionTrack] = []
    var mutableCompositionAudioTrack: [AVMutableCompositionTrack] = []
    let totalVideoCompositionInstruction: AVMutableVideoCompositionInstruction = AVMutableVideoCompositionInstruction()

    // start merge

    let aVideoAsset: AVAsset = AVAsset(URL: videoUrl)
    let aAudioAsset: AVAsset = AVAsset(URL: audioUrl)

    mutableCompositionVideoTrack.append(mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid))
    mutableCompositionAudioTrack.append(mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid))

    let aVideoAssetTrack: AVAssetTrack = aVideoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let aAudioAssetTrack: AVAssetTrack = aAudioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]

    do {
        try mutableCompositionVideoTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aVideoAssetTrack, atTime: kCMTimeZero)

        // In my case my audio file is longer than the video file, so I took the
        // videoAsset duration instead of the audioAsset duration
        try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration), ofTrack: aAudioAssetTrack, atTime: kCMTimeZero)

        // Use this instead of the line above if your audio file's and video file's
        // playing durations are the same:
        // try mutableCompositionAudioTrack[0].insertTimeRange(CMTimeRangeMake(kCMTimeZero, aAudioAssetTrack.timeRange.duration), ofTrack: aAudioAssetTrack, atTime: kCMTimeZero)
    } catch {
    }

    totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration)

    let mutableVideoComposition: AVMutableVideoComposition = AVMutableVideoComposition()
    mutableVideoComposition.frameDuration = CMTimeMake(1, 30)
    mutableVideoComposition.renderSize = CGSizeMake(1280, 720)

    // find your video at this URL
    let savePathUrl: NSURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/newVideo.mp4")

    let assetExport: AVAssetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
    assetExport.outputFileType = AVFileTypeMPEG4
    assetExport.outputURL = savePathUrl
    assetExport.shouldOptimizeForNetworkUse = true

    assetExport.exportAsynchronouslyWithCompletionHandler { () -> Void in
        switch assetExport.status {
        case AVAssetExportSessionStatus.Completed:
            // Uncomment this if you want to store your video in the photo library
            // let assetsLib = ALAssetsLibrary()
            // assetsLib.writeVideoAtPathToSavedPhotosAlbum(savePathUrl, completionBlock: nil)
            print("success")
        case AVAssetExportSessionStatus.Failed:
            print("failed \(assetExport.error)")
        case AVAssetExportSessionStatus.Cancelled:
            print("cancelled \(assetExport.error)")
        default:
            print("complete")
        }
    }
}

Step 3: Call the function where you need it, like this:

let videoUrl: NSURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("SampleVideo", ofType: "mp4")!)
let audioUrl: NSURL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("SampleAudio", ofType: "mp3")!)

mergeFilesWithUrl(videoUrl, audioUrl: audioUrl)

Hope this helps you and saves you time.

answered Sep 19 '22 by Govind Prajapati