Rendering Windows screenshot capture bitmap as DirectX texture

I'm making progress developing a '3D desktop' DirectX 11 app that needs to display the current contents of a desktop window (e.g. "Calculator") as a 2D texture on a rectangular surface. I'm so close but really struggling with the screenshot BMP -> Texture2D step. I have screenshot -> HBITMAP and DDS file -> rendered texture each working on their own, but I can't complete screenshot -> rendered texture.

So far I have the 'capture the window as a screenshot' bit working:

RECT user_window_rectangle;
HWND user_window = FindWindow(NULL, TEXT("Calculator"));
GetClientRect(user_window, &user_window_rectangle);
HDC hdcScreen = GetDC(NULL);
HDC hdc = CreateCompatibleDC(hdcScreen);
UINT screenshot_width = user_window_rectangle.right - user_window_rectangle.left;
UINT screenshot_height = user_window_rectangle.bottom - user_window_rectangle.top;
HBITMAP hbmp = CreateCompatibleBitmap(hdcScreen, screenshot_width, screenshot_height);
SelectObject(hdc, hbmp);
PrintWindow(user_window, hdc, PW_CLIENTONLY);

At this point I have the window bitmap referenced by HBITMAP hbmp.

Also working is my code to render a DDS file as a texture on a Direct3D 11 rectangle:

ID3D11Device *dev;
ID3D11DeviceContext *dev_context;
...
dev_context->PSSetShaderResources(0, 1, &shader_resource_view);
dev_context->PSSetSamplers(0, 1, &tex_sampler_state);
...
DirectX::TexMetadata tex_metadata;
DirectX::ScratchImage image;

hr = LoadFromDDSFile(L"Earth.dds", DirectX::DDS_FLAGS_NONE, &tex_metadata, image);
hr = CreateShaderResourceView(dev, image.GetImages(), image.GetImageCount(), tex_metadata, &shader_resource_view);

Pixel shader is:

Texture2D ObjTexture : register(t0);         // bound via PSSetShaderResources slot 0
SamplerState ObjSamplerState : register(s0); // bound via PSSetSamplers slot 0

float4 PShader(float4 pos : SV_POSITION, float4 color : COLOR, float2 tex : TEXCOORD) : SV_TARGET
{
    return ObjTexture.Sample( ObjSamplerState, tex );
}

The sampler state is below; note that ZeroMemory leaves Filter at 0, which is D3D11_FILTER_MIN_MAG_MIP_POINT (point, not linear, filtering):

D3D11_SAMPLER_DESC sampler_desc;
ZeroMemory(&sampler_desc, sizeof(sampler_desc));
sampler_desc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
sampler_desc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
sampler_desc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
sampler_desc.MinLOD = 0;
sampler_desc.MaxLOD = D3D11_FLOAT32_MAX;

hr = dev->CreateSamplerState(&sampler_desc, &tex_sampler_state);
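
For actual linear filtering, Filter must be set explicitly before the CreateSamplerState call:

sampler_desc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR; // linear min/mag/mip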

Question: how do I replace the LoadFromDDSFile bit with some equivalent that takes the HBITMAP from the Windows screen capture and ends up with it on the graphics card as ObjTexture?

Below is my best shot at bridging from the screenshot HBITMAP hbmp to the shader resource screenshot_texture, but it gives a memory access violation from the graphics driver (I think due to my "data.pSysMem = &bmp.bmBits", but no idea really):

BITMAP bmp;
GetObject(hbmp, sizeof(BITMAP), &bmp);

D3D11_TEXTURE2D_DESC screenshot_desc = CD3D11_TEXTURE2D_DESC(DXGI_FORMAT_R8G8B8A8_UNORM, bmp.bmWidth, bmp.bmHeight, 1,
    1,
    D3D11_BIND_SHADER_RESOURCE
    );

int bytes_per_pixel = 4;

D3D11_SUBRESOURCE_DATA data;
ZeroMemory(&data, sizeof(D3D11_SUBRESOURCE_DATA));
data.pSysMem = &bmp.bmBits; //pixel buffer
data.SysMemPitch = bytes_per_pixel * bmp.bmWidth;// line size in byte
data.SysMemSlicePitch = bytes_per_pixel * bmp.bmWidth * bmp.bmHeight;// total buffer size in byte

hr = dev->CreateTexture2D(
    &screenshot_desc, //texture format
    &data,          // pixel buffer use to fill the texture
    &screenshot_texture  // created texture
    );

::::::::::::::::::::::::: SOLUTION :::::::::::::::::::::::::

The main issue was that using &bmp.bmBits directly as the pixel buffer passes the address of the BITMAP structure's bmBits field rather than the pixel data itself (and for a bitmap created with CreateCompatibleBitmap, bmBits is NULL anyway), which caused memory access violations in the graphics driver. This was resolved by using malloc to allocate an appropriately sized block of memory for the pixel data and filling it with GetDIBits. Thanks to Chuck Walbourn for helping with my poking around in the dark to work out how the pixel data is actually stored (it was 32 bits per pixel by default). It's still possible some of this code relies on luck to read the pixel data correctly, but it's been improved with Chuck's input.

My basic technique was:

  • FindWindow to get the target window on the desktop
  • CreateCompatibleBitmap, SelectObject and PrintWindow to get an HBITMAP of the snapshot
  • malloc to allocate the correct amount of space for a (BYTE*) pixel buffer
  • GetDIBits to populate the (BYTE*) pixel buffer from the HBITMAP
  • CreateTexture2D to build the texture buffer
  • CreateShaderResourceView to map the texture to the graphics pixel shader

So working code to screenshot a Windows desktop window and pass that as a texture to a Direct3D app is:

RECT user_window_rectangle;

HWND user_window = FindWindow(NULL, TEXT("Calculator"));    // the window must not be minimized
if (user_window == NULL)
{
    MessageBoxA(NULL, "Can't find Calculator", "Camvas", MB_OK);
    return;
}
GetClientRect(user_window, &user_window_rectangle);
//create
HDC hdcScreen = GetDC(NULL);
HDC hdc = CreateCompatibleDC(hdcScreen);
UINT screenshot_width = user_window_rectangle.right - user_window_rectangle.left;
UINT screenshot_height = user_window_rectangle.bottom - user_window_rectangle.top;
HBITMAP hbmp = CreateCompatibleBitmap(hdcScreen, screenshot_width, screenshot_height);

SelectObject(hdc, hbmp);

//Print to memory hdc
PrintWindow(user_window, hdc, PW_CLIENTONLY);

BITMAPINFOHEADER bmih;
ZeroMemory(&bmih, sizeof(BITMAPINFOHEADER));
bmih.biSize = sizeof(BITMAPINFOHEADER);
bmih.biPlanes = 1;
bmih.biBitCount = 32;
bmih.biWidth = screenshot_width;
bmih.biHeight = -(LONG)screenshot_height;   // negative height = top-down DIB (origin at top-left)
bmih.biCompression = BI_RGB;
bmih.biSizeImage = 0;

int bytes_per_pixel = bmih.biBitCount / 8;

BYTE *pixels = (BYTE*)malloc(bytes_per_pixel * screenshot_width * screenshot_height);

BITMAPINFO bmi = { 0 };
bmi.bmiHeader = bmih;

int row_count = GetDIBits(hdc, hbmp, 0, screenshot_height, pixels, &bmi, DIB_RGB_COLORS); // copy the HBITMAP pixels into the buffer as 32-bpp rows

D3D11_TEXTURE2D_DESC screenshot_desc = CD3D11_TEXTURE2D_DESC(
    DXGI_FORMAT_B8G8R8A8_UNORM,     // format
    screenshot_width,               // width
    screenshot_height,              // height
    1,                              // arraySize
    1,                              // mipLevels
    D3D11_BIND_SHADER_RESOURCE,     // bindFlags
    D3D11_USAGE_DYNAMIC,            // usage
    D3D11_CPU_ACCESS_WRITE,         // cpuaccessFlags
    1,                              // sampleCount
    0,                              // sampleQuality
    0                               // miscFlags
    );

D3D11_SUBRESOURCE_DATA data;
ZeroMemory(&data, sizeof(D3D11_SUBRESOURCE_DATA));
data.pSysMem = pixels; // texArray; // &bmp.bmBits; //pixel buffer
data.SysMemPitch = bytes_per_pixel * screenshot_width;// line size in byte
data.SysMemSlicePitch = bytes_per_pixel * screenshot_width * screenshot_height;

hr = dev->CreateTexture2D(
    &screenshot_desc, //texture format
    &data,          // pixel buffer use to fill the texture
    &screenshot_texture  // created texture
    );

D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc;
srvDesc.Format = screenshot_desc.Format;
srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
srvDesc.Texture2D.MostDetailedMip = 0;
srvDesc.Texture2D.MipLevels = screenshot_desc.MipLevels;

dev->CreateShaderResourceView(screenshot_texture, &srvDesc, &shader_resource_view);
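
Not shown above: once the texture exists on the GPU, the CPU-side pixel buffer and GDI objects should be released (a minimal cleanup sketch):

free(pixels);
DeleteObject(hbmp);
DeleteDC(hdc);
ReleaseDC(NULL, hdcScreen);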
Asked Nov 26 '14 by Bambam
1 Answer

You are making a lot of assumptions here that the BITMAP returned is actually in 32-bit RGBA form. It is likely not in that format at all, and in any case you need to validate that bmPlanes is 1 and bmBitsPixel is 32 if you are assuming 4 bytes per pixel. You should read more about the BMP format.
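
A minimal sketch of that validation (hypothetical code, not from the original post), done before building the texture description:

BITMAP bmp = {};
if (GetObject(hbmp, sizeof(BITMAP), &bmp) == 0 ||
    bmp.bmPlanes != 1 || bmp.bmBitsPixel != 32)
{
    // not the single-plane 32-bpp layout assumed here;
    // convert first (see the cases below) or bail out
}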

BMPs use BGRA order, so you can use DXGI_FORMAT_B8G8R8A8_UNORM for the case of bmBitsPixel being 32.

Secondly, you need to derive the pitch from bmWidthBytes, not bmWidth:

data.pSysMem = &bmp.bmBits; //pixel buffer
data.SysMemPitch = bmp.bmWidthBytes;// line size in byte
data.SysMemSlicePitch = bmp.bmWidthBytes * bmp.bmHeight;// total buffer size in byte

If bmBitsPixel is 24, there is no DXGI format equivalent. You have to copy the data to a 32-bit format such as DXGI_FORMAT_B8G8R8X8_UNORM.
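
A sketch of that copy, as a hypothetical helper that assumes the source rows use the GDI pitch bmWidthBytes:

// Expand 24-bpp BGR rows into a 32-bpp BGRX buffer suitable for
// DXGI_FORMAT_B8G8R8X8_UNORM. Pitches are in bytes.
void ExpandBGR24ToBGRX32(const BYTE *src, UINT src_pitch,
                         BYTE *dst, UINT dst_pitch,
                         UINT width, UINT height)
{
    for (UINT y = 0; y < height; ++y)
    {
        const BYTE *s = src + y * src_pitch;
        BYTE *d = dst + y * dst_pitch;
        for (UINT x = 0; x < width; ++x)
        {
            d[0] = s[0]; // B
            d[1] = s[1]; // G
            d[2] = s[2]; // R
            d[3] = 0xFF; // X padding byte, ignored by the format
            s += 3;
            d += 4;
        }
    }
}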

If bmBitsPixel is 15 or 16, you can use DXGI_FORMAT_B5G5R5A1_UNORM on a system with Direct3D 11.1, but remember that 16-bit DXGI formats are not always supported depending on the driver. Otherwise you'll have to convert this data to something else.
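
Driver support can be queried at runtime with ID3D11Device::CheckFormatSupport (a sketch):

UINT support = 0;
if (SUCCEEDED(dev->CheckFormatSupport(DXGI_FORMAT_B5G5R5A1_UNORM, &support))
    && (support & D3D11_FORMAT_SUPPORT_TEXTURE2D))
{
    // safe to create a Texture2D in this format
}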

For bmBitsPixel values of 1, 2, 4, or 8 you have to convert them as there are no DXGI texture formats that are equivalent.
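
Worth noting: the GetDIBits route in the solution above sidesteps these depth cases entirely, because GDI itself converts from the bitmap's native depth to whatever layout the BITMAPINFOHEADER requests:

bmih.biBitCount = 32;                      // ask GDI for 32-bpp output...
bmih.biHeight = -(LONG)screenshot_height;  // ...top-down, whatever the source depth
GetDIBits(hdc, hbmp, 0, screenshot_height, pixels, &bmi, DIB_RGB_COLORS);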

Answered by Chuck Walbourn