I have an NSURL that contains a video, and I want to capture a frame of that video ten times per second. I have code that will capture an image of my player, but I'm having trouble setting it up to capture 10 frames per second. I'm trying something like this, but it returns the same initial frame of the video, just the correct number of times. Here is what I have:
AVAsset *asset = [AVAsset assetWithURL:videoUrl];
CMTime vidLength = asset.duration;
float seconds = CMTimeGetSeconds(vidLength);
int frameCount = 0;

for (float i = 0; i < seconds;) {
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CMTime time = CMTimeMake(i, 10);
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    [UIImagePNGRepresentation(thumbnail) writeToFile:pngPath atomically:YES];

    frameCount++;
    i = i + 0.1;
}
But instead of getting the frame at the current time i of the video, I just get the initial frame. How can I get a frame of the video 10 times per second?
Thanks for the help :)
You are getting the initial frame because you are trying to create a CMTime from a float value:
CMTime time = CMTimeMake(i, 10);
Since the CMTimeMake function takes an int64_t value as its first parameter, your float value is truncated to an integer, and you get an incorrect result.
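For example (an illustrative snippet, not part of your original code), every fractional value below 1 collapses to zero:

CMTime t1 = CMTimeMake(0.9, 10); // 0.9 is truncated to 0, so this is 0/10 = 0.0 s
CMTime t2 = CMTimeMake(1.7, 10); // 1.7 is truncated to 1, so this is 1/10 = 0.1 s

So for roughly the first ten iterations of your loop the requested time is always 0 seconds, and with the default (infinite) time tolerances the generator is free to hand back the same nearby frame again and again.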
Let's change your code a bit:
1) First, find the total number of frames you need to extract from the video. You wrote that you need 10 frames per second, so:
int requiredFramesCount = seconds * 10;
2) Next, compute the value by which to advance your CMTime on each step (a worked example with concrete numbers follows step 3):
int64_t step = vidLength.value / requiredFramesCount;
3) Finally, set requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero so that you get a frame at the precise requested time:
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;
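To make the arithmetic concrete (the numbers below are just an assumed example, not taken from your asset): for a 20-second video whose duration uses a timescale of 600, vidLength.value is 12000, requiredFramesCount is 20 * 10 = 200, and step is 12000 / 200 = 60, so each iteration advances the requested time by 60/600 = 0.1 seconds.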
Here is how your code will look:
CMTime vidLength = asset.duration;
float seconds = CMTimeGetSeconds(vidLength);
int requiredFramesCount = seconds * 10;
int64_t step = vidLength.value / requiredFramesCount;

// Create the generator once and ask for exact frame times
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;

int64_t value = 0;
for (int i = 0; i < requiredFramesCount; i++) {
    // Request the frame at value / timescale seconds
    CMTime time = CMTimeMake(value, vidLength.timescale);
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", i];
    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    [UIImagePNGRepresentation(thumbnail) writeToFile:pngPath atomically:YES];

    value += step;
}
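As a side note (this is an alternative I am suggesting, not something from the code above), AVAssetImageGenerator can also batch all of the requests with generateCGImagesAsynchronouslyForTimes:completionHandler:, which avoids blocking the calling thread once per frame. A rough, untested sketch reusing the same step logic:

NSMutableArray *times = [NSMutableArray arrayWithCapacity:requiredFramesCount];
for (int i = 0; i < requiredFramesCount; i++) {
    [times addObject:[NSValue valueWithCMTime:CMTimeMake(i * step, vidLength.timescale)]];
}

[imageGenerator generateCGImagesAsynchronouslyForTimes:times
                                     completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        // UIImage retains the CGImageRef, so it is safe to keep past this block
        UIImage *thumbnail = [UIImage imageWithCGImage:image];
        // ... write thumbnail to disk as in the loop above ...
    }
}];

The handler is called once per requested time and may not run on the main thread, so dispatch back to the main queue before touching any UI.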
With CMTimeMake(A, B) you store a rational number, an exact fraction of A / B seconds, but the first parameter of this function takes an int64_t value, so your float i gets truncated. For a 20-second video, the last iteration of your loop will capture the frame at time ((int) 19.9) / 10 = 1.9 seconds. Use the CMTimeMakeWithSeconds(i, NSEC_PER_SEC) function to fix this time issue.
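Applied to the loop from the question, that would look something like this (a minimal sketch; it assumes imageGenerator is created once outside the loop with both tolerances set to kCMTimeZero, as in the answer above):

for (float i = 0; i < seconds; i += 0.1f) {
    // CMTimeMakeWithSeconds keeps the fractional part of i instead of truncating it
    CMTime time = CMTimeMakeWithSeconds(i, NSEC_PER_SEC);
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    // ... save thumbnail exactly as before ...
}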