I need to apply "slow motion" to part of a video file (between certain frames), including its audio, and store the ramped video as a new video.
Ref: http://www.youtube.com/watch?v=BJ3_xMGzauk (watch from 0 to 10s)
From my analysis, I've found that the AVFoundation framework can be helpful.
Ref: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/00_Introduction.html
Copied and pasted from the above link:
" Editing AV Foundation uses compositions to create new assets from existing pieces of media (typically, one or more video and audio tracks). You use a mutable composition to add and remove tracks, and adjust their temporal orderings. You can also set the relative volumes and ramping of audio tracks; and set the opacity, and opacity ramps, of video tracks. A composition is an assemblage of pieces of media held in memory. When you export a composition using an export session, it's collapsed to a file. On iOS 4.1 and later, you can also create an asset from media such as sample buffers or still images using an asset writer.
"
Questions: Can I apply "slow motion" to the video/audio file using the AVFoundation framework? Or is there any other framework available? If I want to handle audio and video separately, please guide me on how to do that.
Update: Code for the AVAssetExportSession:
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *outputURL = paths[0];
NSFileManager *manager = [NSFileManager defaultManager];
[manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];

// Remove existing file
[manager removeItemAtPath:outputURL error:nil];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:self.inputAsset
                                                                       presetName:AVAssetExportPresetLowQuality];
exportSession.outputURL = [NSURL fileURLWithPath:outputURL]; // output path
exportSession.outputFileType = AVFileTypeQuickTimeMovie;

[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        [self writeVideoToPhotoLibrary:[NSURL fileURLWithPath:outputURL]];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputURL]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"Video could not be saved");
            }
        }];
    } else {
        NSLog(@"error: %@", [exportSession error]);
    }
}];
Open a video shot in Slo-mo mode, then tap Edit. Drag the white vertical bars beneath the frame viewer to set where the video is played in slow motion.
Adjust speed: With your project open, tap a video clip in the timeline to reveal the inspector at the bottom of the screen. Tap the Speed button. A yellow bar appears at the bottom of the clip, with range handles at each end.
You can scale video using the AVFoundation and CoreMedia frameworks. Take a look at this AVMutableCompositionTrack method:
- (void)scaleTimeRange:(CMTimeRange)timeRange toDuration:(CMTime)duration;
Sample:
AVURLAsset *videoAsset = nil; // self.inputAsset;

// Create mutable composition
AVMutableComposition *mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack =
    [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *videoInsertError = nil;
BOOL videoInsertResult =
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero
                                     error:&videoInsertError];
if (!videoInsertResult || nil != videoInsertError) {
    // Handle error
    return;
}

// Slow down the whole video by a factor of 2.0
double videoScaleFactor = 2.0;
CMTime videoDuration = videoAsset.duration;

[compositionVideoTrack scaleTimeRange:CMTimeRangeMake(kCMTimeZero, videoDuration)
                           toDuration:CMTimeMake(videoDuration.value * videoScaleFactor, videoDuration.timescale)];

// Export
AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                     presetName:AVAssetExportPresetLowQuality];
(The audio track from videoAsset should probably also be added to mixComposition and scaled by the same factor, so that the audio stays in sync. A sketch of that is shown below.)
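As a rough Swift sketch of that idea, the audio track could be inserted into the same composition and stretched by the same factor; the asset path and scale factor here are illustrative, and the composition/track names are assumptions:

import AVFoundation

// Illustrative setup; in your code the asset and composition already exist.
let asset = AVAsset(url: URL(fileURLWithPath: "input.mov"))   // hypothetical source file
let mixComposition = AVMutableComposition()

let videoScaleFactor: Int64 = 2   // 2x slower, matching the video scaling above
let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
let scaledDuration = CMTimeMake(asset.duration.value * videoScaleFactor,
                                asset.duration.timescale)

// Insert the original audio, then stretch it by the same factor so it stays in sync with the slowed video.
if let audioTrack = asset.tracks(withMediaType: AVMediaType.audio).first {
    let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio,
                                                               preferredTrackID: kCMPersistentTrackID_Invalid)
    try? compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: kCMTimeZero)
    compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledDuration)
}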
I have tried this and was able to slow down the asset.
compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
did the trick.
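Note that `timeRange` and `scaledVideoDuration` are not defined in that snippet; a minimal sketch of how they might be computed (the source path and scale factor are illustrative):

import AVFoundation

let asset = AVAsset(url: URL(fileURLWithPath: "input.mov"))  // hypothetical source file
let scale: Int64 = 2                                         // 2x slower

// Scale the whole asset; pass a shorter CMTimeRange to slow down only a section.
let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)
let scaledVideoDuration = CMTimeMake(asset.duration.value * scale,
                                     asset.duration.timescale)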
I made a class which will help you generate a slower video from an AVAsset. A plus point is that you can also make it faster, and another plus point is that it handles the audio too.
Here is my custom class sample:
import UIKit
import AVFoundation

enum SpeedoMode {
    case Slower
    case Faster
}

class VSVideoSpeeder: NSObject {

    /// Singleton instance of `VSVideoSpeeder`
    static var shared: VSVideoSpeeder = {
        return VSVideoSpeeder()
    }()

    /// Range is b/w 1x, 2x and 3x. Nothing happens if the scale is out of range.
    /// Exporter will be nil in case the url is invalid or an asset instance can't be made.
    func scaleAsset(fromURL url: URL, by scale: Int64, withMode mode: SpeedoMode, completion: @escaping (_ exporter: AVAssetExportSession?) -> Void) {

        /// Check the valid scale
        if scale < 1 || scale > 3 {
            /// Can not proceed, invalid range
            completion(nil)
            return
        }

        /// Asset
        let asset = AVAsset(url: url)

        /// Video Tracks
        let videoTracks = asset.tracks(withMediaType: AVMediaType.video)
        if videoTracks.count == 0 {
            /// Can not find any video track
            completion(nil)
            return
        }

        /// Get the scaled video duration
        let scaledVideoDuration = (mode == .Faster) ?
            CMTimeMake(asset.duration.value / scale, asset.duration.timescale) :
            CMTimeMake(asset.duration.value * scale, asset.duration.timescale)
        let timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration)

        /// Video track
        let videoTrack = videoTracks.first!

        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: kCMPersistentTrackID_Invalid)

        /// Audio Tracks
        let audioTracks = asset.tracks(withMediaType: AVMediaType.audio)
        if audioTracks.count > 0 {
            /// Use audio if the video contains an audio track
            let compositionAudioTrack = mixComposition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: kCMPersistentTrackID_Invalid)

            /// Audio track
            let audioTrack = audioTracks.first!
            do {
                try compositionAudioTrack?.insertTimeRange(timeRange, of: audioTrack, at: kCMTimeZero)
                compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)
            } catch _ {
                /// Ignore audio error
            }
        }

        do {
            try compositionVideoTrack?.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)
            compositionVideoTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)

            /// Keep original transformation
            compositionVideoTrack?.preferredTransform = videoTrack.preferredTransform

            /// Initialize exporter now
            let outputFileURL = URL(fileURLWithPath: "/Users/thetiger/Desktop/scaledVideo.mov")
            /// Note:- Please use a documents directory path if you are testing on a device.
            if FileManager.default.fileExists(atPath: outputFileURL.path) {
                try FileManager.default.removeItem(at: outputFileURL)
            }
            let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
            exporter?.outputURL = outputFileURL
            exporter?.outputFileType = AVFileType.mov
            exporter?.shouldOptimizeForNetworkUse = true
            exporter?.exportAsynchronously(completionHandler: {
                completion(exporter)
            })

        } catch let error {
            print(error.localizedDescription)
            completion(nil)
            return
        }
    }
}
I took 1x, 2x and 3x as valid scales. The class contains the proper validation and handling. Below is a sample of how to use this function.
let url = Bundle.main.url(forResource: "1", withExtension: "mp4")!
VSVideoSpeeder.shared.scaleAsset(fromURL: url, by: 3, withMode: SpeedoMode.Slower) { (exporter) in
    if let exporter = exporter {
        switch exporter.status {
        case .failed: do {
            print(exporter.error?.localizedDescription ?? "Error in exporting..")
        }
        case .completed: do {
            print("Scaled video has been generated successfully!")
        }
        case .unknown: break
        case .waiting: break
        case .exporting: break
        case .cancelled: break
        }
    } else {
        /// Error
        print("Exporter is not initialized.")
    }
}
This line handles the audio scaling:
compositionAudioTrack?.scaleTimeRange(timeRange, toDuration: scaledVideoDuration)