I would like to create a virtual camera device using GStreamer on OS X.
On *nix systems I can create one with:
gst-launch-1.0 -v videotestsrc pattern=snow ! video/x-raw,width=1280,height=720 ! v4l2sink device=/dev/video1
On OS X the same command fails with an erroneous-pipeline error, because the v4l2 elements (including v4l2sink) are Linux-only and do not exist there.
I would like to be able to pipe the output into applications such as Skype.
Any pointers in this direction? Should I use Syphon?
The way to go is the CoreMediaIO framework. With it you can implement a virtual capture device (a DAL plug-in) that applications like Skype will enumerate as a camera, and feed any stream into it through a wrapper. Some pointers to existing wrappers can be seen here.