I'm trying to make myself a QML camera item which has more functions and also provides a source to the VideoOutput element, such as this:
VideoOutput {
    source: mycamera
}

MyCustomCamera {
    id: mycamera
}
The documentation says:
If you are extending your own C++ classes to interoperate with VideoOutput, you can either provide a QObject based class with a mediaObject property that exposes a QMediaObject derived class that has a QVideoRendererControl available, or you can provide a QObject based class with a writable videoSurface property that can accept a QAbstractVideoSurface based class and can follow the correct protocol to deliver QVideoFrames to it.
I have tried giving my object a mediaObject property that exposes a QCamera, but it looks like QCamera does not have a QVideoRendererControl available (or it's my fault for not knowing how to do it correctly).
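Roughly, my attempt looked like this (a simplified sketch, not my exact code; the class name is just illustrative):

#include <QObject>
#include <QCamera>

// Sketch of my attempt: expose the QCamera through a mediaObject
// property so that VideoOutput can pick it up.
class MyCustomCamera : public QObject
{
    Q_OBJECT
    Q_PROPERTY(QObject* mediaObject READ mediaObject CONSTANT)

public:
    explicit MyCustomCamera(QObject *parent = nullptr)
        : QObject(parent), mCamera(new QCamera(this)) {}

    // QCamera derives from QMediaObject, so this is what I expected
    // VideoOutput to accept.
    QObject *mediaObject() const { return mCamera; }

private:
    QCamera *mCamera;
};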
I need to achieve the effect shown at the beginning; any way of doing it is welcome.
Otherwise, can anyone give me a short example of what is meant by "a writable videoSurface property that can accept a QAbstractVideoSurface and follow the correct protocol"?
I can't help you with your main concern, but I can give you an example usage of the writable videoSurface property. My example consists of three main steps: writing an adapter class that holds the surface, exposing an instance of it to QML, and hooking it up to a VideoOutput.
videoadapter.h
#ifndef VIDEOADAPTER_H
#define VIDEOADAPTER_H

#include <QObject>
#include <QAbstractVideoSurface>
#include <QVideoSurfaceFormat>
#include <QImage>
#include <QTimer>

// Exposes a writable videoSurface property so that a QML VideoOutput can hand
// over its QAbstractVideoSurface, to which frames are then presented.
class VideoAdapter : public QObject
{
    Q_OBJECT
    Q_PROPERTY(QAbstractVideoSurface* videoSurface READ videoSurface WRITE setVideoSurface NOTIFY signalVideoSurfaceChanged)

public:
    explicit VideoAdapter(QObject *parent = nullptr);

    QAbstractVideoSurface *videoSurface() const;
    void setVideoSurface(QAbstractVideoSurface *videoSurface);

signals:
    void signalVideoSurfaceChanged();

private slots:
    void slotTick();

private:
    void startSurface();

    QAbstractVideoSurface *mVideoSurface;
    QVideoSurfaceFormat *mSurfaceFormat;
    QImage *mImage;
    QTimer mTimer;
};

#endif // VIDEOADAPTER_H
videoadapter.cpp
#include "videoadapter.h"
#include <QDebug>
VideoAdapter::VideoAdapter(QObject *parent)
: QObject(parent), mVideoSurface(nullptr), mSurfaceFormat(nullptr)
{
mTimer.setInterval(1000);
connect(&mTimer, &QTimer::timeout, this, &VideoAdapter::slotTick);
}
QAbstractVideoSurface *VideoAdapter::videoSurface() const
{
return mVideoSurface;
}
void VideoAdapter::setVideoSurface(QAbstractVideoSurface *videoSurface)
{
if(videoSurface != mVideoSurface)
{
mVideoSurface = videoSurface;
emit signalVideoSurfaceChanged();
startSurface();
// This is the test timer that will tick for us to present the image
// on the video surface
mTimer.start();
}
}
void VideoAdapter::slotTick()
{
QVideoFrame frame(*mImage);
mVideoSurface->present(frame);
}
void VideoAdapter::startSurface()
{
mImage = new QImage("../resources/images/test.jpg");
auto pixelFormat = QVideoFrame::pixelFormatFromImageFormat(mImage->format());
mSurfaceFormat = new QVideoSurfaceFormat(mImage->size(), pixelFormat);
if(!mVideoSurface->start(*mSurfaceFormat))
{
qDebug() << "Surface couldn't be started!";
}
}
This class only loads an image file and displays it with the help of a timer, but in your case you will have a real frame source, so you can change this to suit your needs. If you can convert your frame to a QImage or a QVideoFrame, you can display it like this.
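For example, if your source hands you raw 32-bit RGB frames, a presentation helper along these lines could be added to the adapter. This is only a sketch under that assumption; presentRawFrame and its parameters are illustrative, not part of the class above:

// Sketch: present a raw 32-bit RGB buffer coming from your own frame source.
// (data, width and height are whatever your source provides.)
void VideoAdapter::presentRawFrame(const uchar *data, int width, int height)
{
    QImage image(data, width, height, QImage::Format_RGB32);
    QVideoFrame frame(image.copy());   // copy() detaches from the caller's buffer

    if(mVideoSurface && mVideoSurface->isActive())
        mVideoSurface->present(frame);
}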
You have to make this class usable from QML. In my case, I created an object and made it visible to QML by setting it as a context property.
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QQmlContext>
#include <QtQml>

#include "videoadapter.h"

int main(int argc, char *argv[])
{
    QCoreApplication::setAttribute(Qt::AA_EnableHighDpiScaling);

    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;
    QQmlDebuggingEnabler enabler;

    VideoAdapter adapter;
    // setContextProperty() makes this object visible to the QML context
    // under the given name.
    engine.rootContext()->setContextProperty("videoAdapter", &adapter);

    const QUrl url(QStringLiteral("qrc:/main.qml"));
    QObject::connect(&engine, &QQmlApplicationEngine::objectCreated,
                     &app, [url](QObject *obj, const QUrl &objUrl) {
        if (!obj && url == objUrl)
            QCoreApplication::exit(-1);
    }, Qt::QueuedConnection);
    engine.load(url);

    return app.exec();
}
You then give this object to the VideoOutput as its source in QML:
import QtQuick 2.12
import QtQuick.Window 2.12
import QtMultimedia 5.12

Window {
    visible: true
    width: 640
    height: 480
    color: "black"
    title: qsTr("Video Player")

    VideoOutput {
        id: videoPlayer
        anchors.fill: parent
        // videoAdapter is the context property set in main()
        source: videoAdapter
    }
}
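If you would rather instantiate the adapter declaratively as its own QML element (closer to the MyCustomCamera syntax in the question), you could register the type instead of using a context property. A minimal sketch, assuming the module name and version are up to you:

// Sketch: register VideoAdapter as a QML type before loading main.qml.
// It can then be declared in QML, e.g.
//     import MyCamera 1.0
//     VideoAdapter { id: adapter }
//     VideoOutput { anchors.fill: parent; source: adapter }
qmlRegisterType<VideoAdapter>("MyCamera", 1, 0, "VideoAdapter");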
As I said, this is a simple example that only loads an image and displays that one image periodically.
This question is an old one and you have probably moved on, but I hope this can at least help other people.