I can get individual frames from the iPhone's cameras just fine. What I need is a way to package them up with sound for streaming to the server. Sending the files once I have them isn't much of an issue; it's the generation of the files for streaming that I'm having problems with. I've been trying to get FFmpeg to work without much luck.
Does anyone have any ideas on how I can pull this off? I'd like either a known working API or instructions on getting FFmpeg to compile properly in an iPhone app.
You could divide your recording into separate files with a length of, say, 10 seconds, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output change, you shouldn't drop any frames between the files. This has many advantages over frame-by-frame upload.
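For what it's worth, here is a minimal Swift sketch of that segmented approach using AVCaptureMovieFileOutput, assuming the capture session's camera and microphone inputs are set up elsewhere; maxRecordedDuration caps each file at roughly 10 seconds, and uploadSegment is a hypothetical placeholder for whatever upload code you already have.

```swift
import AVFoundation

// A minimal sketch, assuming the AVCaptureSession already has (or will get)
// camera and microphone inputs attached. `uploadSegment(_:)` is a hypothetical
// placeholder for your existing upload code.
final class SegmentedRecorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    let session = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    func configure() {
        session.beginConfiguration()
        // ... add AVCaptureDeviceInput objects for the camera and microphone here ...
        if session.canAddOutput(movieOutput) {
            session.addOutput(movieOutput)
        }
        // Cap each file at roughly 10 seconds; recording stops automatically at the limit.
        movieOutput.maxRecordedDuration = CMTime(seconds: 10, preferredTimescale: 600)
        session.commitConfiguration()
        session.startRunning()
    }

    func startNextSegment() {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mov")
        movieOutput.startRecording(to: url, recordingDelegate: self)
    }

    // Fires when a segment hits maxRecordedDuration (or stopRecording() is called).
    // Reaching the duration limit is reported as an error, but the file is still usable.
    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        startNextSegment()            // kick off the next segment right away
        uploadSegment(outputFileURL)  // hypothetical: send the finished file, then delete it
    }

    func uploadSegment(_ url: URL) {
        // Your existing "send the file to the server" code goes here.
    }
}
```

Call configure() once and then startNextSegment() to begin; each time a segment finishes, the delegate callback immediately starts the next file and hands the previous one off for upload.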