App Description: Speedometer. It has a needle dial, and the animated needle is overlaid on the video. I render the needle animation onto the video via post-processing: I use AVAssetExportSession and construct an AVComposition containing my animated layers along with the video and audio tracks from the recording. This works fine; the video shows and the needle animates.
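For reference, here is a simplified sketch of that export setup in Swift (URLs, the needle layer, and the completion handling are placeholders, not my actual code):

```swift
import AVFoundation
import QuartzCore

// Sketch: build a composition from the recorded asset, attach the needle layer
// via AVVideoCompositionCoreAnimationTool, and export with AVAssetExportSession.
func exportWithNeedleOverlay(sourceURL: URL, outputURL: URL, needleLayer: CALayer,
                             completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: sourceURL)
    guard let srcVideo = asset.tracks(withMediaType: .video).first else {
        completion(false); return
    }
    let composition = AVMutableComposition()
    let timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    // Copy the video and audio tracks into the composition.
    let dstVideo = composition.addMutableTrack(withMediaType: .video,
                                               preferredTrackID: kCMPersistentTrackID_Invalid)!
    try? dstVideo.insertTimeRange(timeRange, of: srcVideo, at: .zero)
    if let srcAudio = asset.tracks(withMediaType: .audio).first {
        let dstAudio = composition.addMutableTrack(withMediaType: .audio,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid)!
        try? dstAudio.insertTimeRange(timeRange, of: srcAudio, at: .zero)
    }

    // Parent/video layers for the animation tool; the needle layer sits on top of the video.
    let renderSize = srcVideo.naturalSize
    let videoLayer = CALayer()
    videoLayer.frame = CGRect(origin: .zero, size: renderSize)
    let parentLayer = CALayer()
    parentLayer.frame = videoLayer.frame
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(needleLayer)

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = renderSize
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
    videoComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = timeRange
    instruction.layerInstructions = [AVMutableVideoCompositionLayerInstruction(assetTrack: dstVideo)]
    videoComposition.instructions = [instruction]

    let exporter = AVAssetExportSession(asset: composition,
                                        presetName: AVAssetExportPresetHighestQuality)!
    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    exporter.videoComposition = videoComposition
    exporter.exportAsynchronously { completion(exporter.status == .completed) }
}
```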
Currently, to replay the animation during post-processing, I save off every change in speed along with the time since the "recording" of the video began. During post-processing I then fire timers based on the saved time/speed data to animate the needle to the next speed.
Problem: The resulting video/animation pair is not completely accurate, and there is often a mismatch between the speed displayed when the video was taken and the speed shown after compositing (usually the needle is ahead of the video). This is because the compositing/compression during export is not necessarily real-time, so the timers don't line up with the exported frames.
Question: Is there a way I can embed the speed information into the video stream as it is recorded, and then access it during export, so that the video and the speedometer stay temporally matched?
It would be nice to get a callback at specific times during export that contains my speed data.
As always...thanks!
Instead of using timers to animate your needle, create a keyframe animation based on the speed data you recorded.
Timers and Core Animation generally don't mix well, at least not in the way I infer from your description.
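A minimal sketch of that approach in Swift, assuming a hypothetical `SpeedSample` type and an angle-mapping closure for your dial (both are placeholders for whatever format you recorded):

```swift
import AVFoundation
import QuartzCore

// Hypothetical sample type: a speed plus the time (seconds) since recording began.
struct SpeedSample {
    let time: CFTimeInterval
    let speed: Double
}

// Build one keyframe animation that rotates the needle layer through the recorded
// speeds over the full duration of the video.
func needleAnimation(samples: [SpeedSample],
                     videoDuration: CFTimeInterval,
                     angle: (Double) -> CGFloat) -> CAKeyframeAnimation {
    let animation = CAKeyframeAnimation(keyPath: "transform.rotation.z")
    animation.values = samples.map { angle($0.speed) }
    // keyTimes are normalized to 0...1 across the animation's duration.
    animation.keyTimes = samples.map { NSNumber(value: $0.time / videoDuration) }
    animation.duration = videoDuration
    animation.calculationMode = .linear
    // Important for AVVideoCompositionCoreAnimationTool: a beginTime of 0 would be
    // replaced with CACurrentMediaTime(), so use AVCoreAnimationBeginTimeAtZero.
    animation.beginTime = AVCoreAnimationBeginTimeAtZero
    animation.isRemovedOnCompletion = false
    animation.fillMode = .forwards
    return animation
}

// Usage: attach the animation to the needle layer before starting the export, e.g.
// needleLayer.add(needleAnimation(samples: recordedSamples,
//                                 videoDuration: duration,
//                                 angle: angleForSpeed), forKey: "needle")
```

Because the animation tool evaluates the keyframes against composition time rather than the wall clock, the needle stays in sync with the video frames no matter how fast or slow the export runs.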