The aim is to implement a custom MFT for video processing and synchronization with an external application. The details are not important. What I would like to achieve as a first step is to get the MFT up and running using DXVA video processing or DXVA-HD. I wasn't able to do so.
Here's what I did: I started by building a topology with a source input node (my webcam), an MFT (the MFT_Grayscale example) and the EVR, and wrapped this in a small application. The topology worked and I could see the monochrome stream from the camera. Now I want to change the code of the MFT_Grayscale example so that it supports DXVA video processing and can use the hardware acceleration provided by the VideoProcessBlt method. The Microsoft documentation gives bits and pieces of information, but I wasn't able to get a running MFT.
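For reference, the topology is built roughly like this (simplified sketch; error handling and cleanup are stripped, and the grayscale MFT instance is assumed to be created beforehand with CoCreateInstance):

// Rough sketch of the topology: webcam source -> Grayscale MFT -> EVR.
// hr should really be checked after every call.
HRESULT BuildTopology(IMFMediaSource* pSource,
                      IMFPresentationDescriptor* pPD,
                      IMFStreamDescriptor* pSD,
                      IMFTransform* pGrayscaleMFT,   // created via CoCreateInstance
                      HWND hwndVideo,
                      IMFTopology** ppTopology)
{
    IMFTopology*     pTopology    = NULL;
    IMFTopologyNode* pSrcNode     = NULL;
    IMFTopologyNode* pMftNode     = NULL;
    IMFTopologyNode* pOutNode     = NULL;
    IMFActivate*     pEvrActivate = NULL;

    HRESULT hr = MFCreateTopology(&pTopology);

    // Source node for the webcam stream.
    hr = MFCreateTopologyNode(MF_TOPOLOGY_SOURCESTREAM_NODE, &pSrcNode);
    hr = pSrcNode->SetUnknown(MF_TOPONODE_SOURCE, pSource);
    hr = pSrcNode->SetUnknown(MF_TOPONODE_PRESENTATION_DESCRIPTOR, pPD);
    hr = pSrcNode->SetUnknown(MF_TOPONODE_STREAM_DESCRIPTOR, pSD);
    hr = pTopology->AddNode(pSrcNode);

    // Transform node for the grayscale MFT.
    hr = MFCreateTopologyNode(MF_TOPOLOGY_TRANSFORM_NODE, &pMftNode);
    hr = pMftNode->SetObject(pGrayscaleMFT);
    hr = pTopology->AddNode(pMftNode);

    // Output node for the EVR.
    hr = MFCreateVideoRendererActivate(hwndVideo, &pEvrActivate);
    hr = MFCreateTopologyNode(MF_TOPOLOGY_OUTPUT_NODE, &pOutNode);
    hr = pOutNode->SetObject(pEvrActivate);
    hr = pTopology->AddNode(pOutNode);

    // Connect source -> MFT -> EVR.
    hr = pSrcNode->ConnectOutput(0, pMftNode, 0);
    hr = pMftNode->ConnectOutput(0, pOutNode, 0);

    *ppTopology = pTopology;
    return hr;
}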
What I did so far:
In GetAttributes I indicate that this MFT is MF_SA_D3D_AWARE.
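Roughly like this (CGrayscale is the class from the sample, m_pAttributes is a member of my MFT class):

// Sketch: advertise D3D awareness so the pipeline can pass the device manager.
HRESULT CGrayscale::GetAttributes(IMFAttributes** ppAttributes)
{
    if (ppAttributes == NULL)
    {
        return E_POINTER;
    }

    HRESULT hr = S_OK;
    if (m_pAttributes == NULL)
    {
        hr = MFCreateAttributes(&m_pAttributes, 1);
        if (SUCCEEDED(hr))
        {
            hr = m_pAttributes->SetUINT32(MF_SA_D3D_AWARE, TRUE);
        }
    }
    if (SUCCEEDED(hr))
    {
        *ppAttributes = m_pAttributes;
        (*ppAttributes)->AddRef();
    }
    return hr;
}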
In ProcessMessage I handle the MFT_MESSAGE_SET_D3D_MANAGER message to obtain a device handle, an IDirect3DDeviceManager9 and an IDirectXVideoProcessorService.
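The relevant part looks approximately like this (member names are mine):

// Sketch: called from ProcessMessage for MFT_MESSAGE_SET_D3D_MANAGER.
// ulParam carries the IDirect3DDeviceManager9 supplied by the pipeline.
HRESULT CGrayscale::OnSetD3DManager(ULONG_PTR ulParam)
{
    if (m_pVPService)     { m_pVPService->Release();     m_pVPService = NULL; }
    if (m_pDeviceManager) { m_pDeviceManager->Release(); m_pDeviceManager = NULL; }

    if (ulParam == 0)
    {
        return S_OK;    // the device manager is being removed
    }

    IUnknown* pUnk = reinterpret_cast<IUnknown*>(ulParam);

    HRESULT hr = pUnk->QueryInterface(IID_PPV_ARGS(&m_pDeviceManager));
    if (SUCCEEDED(hr))
    {
        hr = m_pDeviceManager->OpenDeviceHandle(&m_hDevice);
    }
    if (SUCCEEDED(hr))
    {
        // Ask the device manager for the DXVA2 video processor service.
        hr = m_pDeviceManager->GetVideoService(m_hDevice,
                                               IID_PPV_ARGS(&m_pVPService));
    }
    return hr;
}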
In SetInputType I use the methods described here https://msdn.microsoft.com/en-us/library/windows/desktop/ms694235(v=vs.85).aspx to fill a DXVA2_VideoDesc structure and follow this code https://msdn.microsoft.com/en-us/library/windows/desktop/cc307964(v=vs.85).aspx to create a video processing device. Additionally, I create the surfaces using IDirectXVideoProcessorService->CreateSurface.
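The processor and surface creation look roughly like this (NUM_SURFACES and the members are mine; for now I use the standard progressive device and an X8R8G8B8 render target):

// Sketch: create the video processor and a small pool of render-target surfaces
// from the DXVA2_VideoDesc built in SetInputType.
HRESULT CGrayscale::CreateDxva2Resources(const DXVA2_VideoDesc& desc)
{
    HRESULT hr = m_pVPService->CreateVideoProcessor(
        DXVA2_VideoProcProgressiveDevice,   // standard progressive device
        &desc,
        D3DFMT_X8R8G8B8,                    // render-target format
        0,                                  // no sub-streams
        &m_pVideoProcessor);

    if (SUCCEEDED(hr))
    {
        // Creates BackBuffers + 1 = NUM_SURFACES surfaces into m_pSurfaces[].
        hr = m_pVPService->CreateSurface(
            desc.SampleWidth,
            desc.SampleHeight,
            NUM_SURFACES - 1,
            D3DFMT_X8R8G8B8,
            D3DPOOL_DEFAULT,
            0,
            DXVA2_VideoProcessorRenderTarget,
            m_pSurfaces,                    // array of IDirect3DSurface9*
            NULL);
    }
    return hr;
}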
In GetOutputStreamInfo the dwFlags variable looks like this:

pStreamInfo->dwFlags =
    MFT_OUTPUT_STREAM_PROVIDES_SAMPLES |
    MFT_OUTPUT_STREAM_WHOLE_SAMPLES |
    MFT_OUTPUT_STREAM_SINGLE_SAMPLE_PER_BUFFER |
    MFT_OUTPUT_STREAM_FIXED_SAMPLE_SIZE;
Everything seems to be fine up to this point. Now my questions (I'm sorry that I cannot be more specific):
Do I have to adapt the GetOutputAvailableType/SetOutputType methods?
In the ProcessInput method I get the IMFSample and extract an IMFMediaBuffer. According to my checks, the buffer does not manage an IDirect3DSurface9. Do I have to memcpy the data of the buffer to a Direct3D surface?
In the ProcessOutput method, as a starting point, I want to forward the incoming frame to the output; VideoProcessBlt should make a 1:1 blit from input to output. The documentation says: "Get an available surface that is not currently in use." How can I determine whether a surface is in use? And how am I supposed to output the surface: should I use MFCreateVideoSampleFromSurface or MFCreateDXSurfaceBuffer?
Unfortunately I am really lost and unable to make any progress using the documentation.
The situation now is that I do not see any video output (the window has its default Windows background color) and the webcam stops capturing frames after the first frame (its LED switches off). Beyond that nothing happens; the application just keeps running without showing anything.
I hope that somebody can help me. I would also appreciate it if someone could direct me to sample code for an MFT using DXVA video processing or DXVA-HD. I was not able to find anything...
Thanks
Do I have to adapt the GetOutputAvailableType/SetOutputType methods?
Yes. You need to provide attributes as usual, with no special requirements.
Do I have to memcpy the data of the buffer to a Direct3D surface?
Yes. You need to take care with IDirect3DDeviceManager9 and LockDevice when doing this, because the EVR could be using the surface at the same time.
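Something along these lines (untested sketch, assuming a 32bpp RGB format; member names are placeholders):

// Sketch: copy the contents of the input IMFMediaBuffer into one of the DXVA2
// surfaces. LockDevice guards against the EVR using the device at the same time.
HRESULT CGrayscale::CopyBufferToSurface(IMFMediaBuffer* pBuffer,
                                        IDirect3DSurface9* pSurface,
                                        UINT width, UINT height)
{
    BYTE* pSrc = NULL;
    DWORD cbSrc = 0;
    HRESULT hr = pBuffer->Lock(&pSrc, NULL, &cbSrc);
    if (FAILED(hr))
    {
        return hr;
    }

    HANDLE hDevice = NULL;
    IDirect3DDevice9* pDevice = NULL;
    hr = m_pDeviceManager->OpenDeviceHandle(&hDevice);
    if (SUCCEEDED(hr))
    {
        hr = m_pDeviceManager->LockDevice(hDevice, &pDevice, TRUE);
    }

    D3DLOCKED_RECT lockedRect;
    if (SUCCEEDED(hr))
    {
        hr = pSurface->LockRect(&lockedRect, NULL, D3DLOCK_NOSYSLOCK);
    }
    if (SUCCEEDED(hr))
    {
        // Copy row by row; the surface pitch may be larger than the frame stride.
        const UINT stride = width * 4;    // assumes 32 bits per pixel
        BYTE* pDst = (BYTE*)lockedRect.pBits;
        for (UINT y = 0; y < height; y++)
        {
            memcpy(pDst + y * lockedRect.Pitch, pSrc + y * stride, stride);
        }
        pSurface->UnlockRect();
    }

    if (pDevice)
    {
        m_pDeviceManager->UnlockDevice(hDevice, FALSE);
        pDevice->Release();
    }
    if (hDevice)
    {
        m_pDeviceManager->CloseDeviceHandle(hDevice);
    }
    pBuffer->Unlock();
    return hr;
}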
How can I determine whether a surface is in use?
You especially have to keep track of when a surface is free for reuse. Your MFT needs to implement the IMFAsyncCallback interface. After you create a sample with MFCreateVideoSampleFromSurface, you query it for the IMFTrackedSample interface and call SetAllocator. The Invoke method will tell you when the surface is free.
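The pattern looks roughly like this (sketch only; m_FreeSamples is a placeholder for whatever free-list you use, and your class also has to implement IMFAsyncCallback::GetParameters):

// Sketch: register for a notification when the sample is no longer in use.
HRESULT CGrayscale::TrackSample(IMFSample* pSample)
{
    IMFTrackedSample* pTracked = NULL;
    HRESULT hr = pSample->QueryInterface(IID_PPV_ARGS(&pTracked));
    if (SUCCEEDED(hr))
    {
        // When the last external reference to the sample is released,
        // Invoke is called on this object.
        hr = pTracked->SetAllocator(this, NULL);
        pTracked->Release();
    }
    return hr;
}

// IMFAsyncCallback::Invoke is called when the sample (and thus its surface)
// is no longer in use downstream.
STDMETHODIMP CGrayscale::Invoke(IMFAsyncResult* pResult)
{
    IUnknown* pUnk = NULL;
    IMFSample* pSample = NULL;

    HRESULT hr = pResult->GetObject(&pUnk);
    if (SUCCEEDED(hr))
    {
        hr = pUnk->QueryInterface(IID_PPV_ARGS(&pSample));
    }
    if (SUCCEEDED(hr))
    {
        // Mark the sample/surface as free again; the list keeps the
        // reference obtained from QueryInterface (m_FreeSamples is a placeholder).
        m_FreeSamples.push_back(pSample);
    }
    if (pUnk)
    {
        pUnk->Release();
    }
    return S_OK;
}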
How am I supposed to output the surface? Should I use MFCreateVideoSampleFromSurface or MFCreateDXSurfaceBuffer?
Because you create the surfaces with IDirectXVideoProcessorService->CreateSurface, MFCreateVideoSampleFromSurface is the right choice. You output the surface wrapped in an IMFSample.
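Putting the pieces together, the 1:1 forwarding path could look roughly like this (untested sketch, progressive frames only, no deinterlacing; m_* members and the helper name are placeholders):

// Sketch: blit the input surface to a free render-target surface, then wrap
// the render target in an IMFSample for delivery downstream.
HRESULT CGrayscale::BltAndDeliver(IDirect3DSurface9* pSrcSurface,
                                  IDirect3DSurface9* pDstSurface,
                                  REFERENCE_TIME rtStart,
                                  REFERENCE_TIME rtEnd,
                                  IMFSample** ppOutSample)
{
    RECT rcFrame = { 0, 0, (LONG)m_width, (LONG)m_height };

    DXVA2_VideoSample sample = {};
    sample.Start        = rtStart;
    sample.End          = rtEnd;
    sample.SampleFormat = m_videoDesc.SampleFormat;   // from the DXVA2_VideoDesc
    sample.SrcSurface   = pSrcSurface;
    sample.SrcRect      = rcFrame;
    sample.DstRect      = rcFrame;
    sample.PlanarAlpha  = DXVA2FloatToFixed(1.0f);

    DXVA2_VideoProcessBltParams blt = {};
    blt.TargetFrame = rtStart;
    blt.TargetRect  = rcFrame;
    blt.Alpha       = DXVA2FloatToFixed(1.0f);
    blt.DestFormat.SampleFormat = DXVA2_SampleProgressiveFrame;

    // 1:1 blit from the input surface to the render-target surface.
    HRESULT hr = m_pVideoProcessor->VideoProcessBlt(pDstSurface, &blt, &sample, 1, NULL);

    if (SUCCEEDED(hr))
    {
        // Wrap the render target in an IMFSample and hand it downstream.
        hr = MFCreateVideoSampleFromSurface(pDstSurface, ppOutSample);
    }
    if (SUCCEEDED(hr))
    {
        (*ppOutSample)->SetSampleTime(rtStart);
        (*ppOutSample)->SetSampleDuration(rtEnd - rtStart);
    }
    return hr;
}

In ProcessOutput you then return the created sample in pOutputSamples[0].pSample, which is allowed because your output stream sets MFT_OUTPUT_STREAM_PROVIDES_SAMPLES.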
Check this project: MFNode. Under MFTDxva2Decoder and MFSkDxva2Renderer you will find some DXVA2 processing.
The easiest way to get the Media Foundation samples is to download and install 'Microsoft Windows SDK for Windows 7 and .NET Framework 4' from http://www.microsoft.com/en-us/download/details.aspx?id=8279. Install the Samples and then locate them at 'v7.1/Samples/Multimedia/MediaFoundation'. The Media Foundation samples are supposedly on Microsoft's Code Gallery site, but I've been unable to locate them there. The samples are no longer included with the Windows SDK releases.
The samples that are directly relevant to your question are probably Decoder, DXVA_HD, DXVA2_VideoProc, EVRPresenter, MPEG1Source, and topoedit.
I can't provide any more help as I'm currently struggling with some of the same issues.