I'm writing some WebGPU code with the Rust wgpu library and compiling it to WASM to run in a browser.
From what I've found, it seems I can use WebGPU to draw on an HTML canvas from JS, and I can also draw on the canvas if I pass the canvas element to the Rust WASM.
I'm interested in doing part of the graphics processing in JS and part of it in WASM, but I can't find a way to pass a WebGPU texture between JS and Rust.
In contrast, on Android I can create a texture with OpenGL, do whatever processing I like, and convert it in Rust to a wgpu texture using create_texture_from_hal (which is unavailable on WASM).
Is there any similar way to pass WebGPU textures between JS and Rust WASM?
Passing image data from JS into a wgpu texture is the easy direction, since wgpu provides a method for it:
// An ImageBitmap handed over from the JS side.
let image_bitmap: web_sys::ImageBitmap = ...;
let size = wgpu::Extent3d { ... };
let desc = wgpu::TextureDescriptor { ... };
let texture = device.create_texture(&desc);

// Destination of the copy: the freshly created wgpu texture.
let copy_texture = wgpu::TexelCopyTextureInfo {
    aspect: wgpu::TextureAspect::All,
    texture: &texture,
    mip_level: 0,
    origin: wgpu::Origin3d::ZERO,
};

// Copy the browser-side image into the texture.
queue.copy_external_image_to_texture(
    &wgpu::CopyExternalImageSourceInfo {
        source: wgpu::ExternalImageSource::ImageBitmap(image_bitmap),
        origin: wgpu::Origin2d::ZERO,
        flip_y: false,
    },
    copy_texture.to_tagged(wgpu::PredefinedColorSpace::Srgb, true),
    size,
);
See copy_external_image_to_texture
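If it helps, here is a minimal sketch of the wasm-bindgen glue for getting the ImageBitmap from JS into Rust in the first place. GpuState and upload_bitmap are made-up names for a wrapper that already owns the device and queue, and the texture descriptor values are only illustrative:

use wasm_bindgen::prelude::*;

// Illustrative wrapper that already owns the wgpu handles; the names here
// are not from wgpu, they're just an example.
#[wasm_bindgen]
pub struct GpuState {
    device: wgpu::Device,
    queue: wgpu::Queue,
}

#[wasm_bindgen]
impl GpuState {
    /// Called from JS, e.g. `state.upload_bitmap(await createImageBitmap(canvas))`.
    pub fn upload_bitmap(&self, image_bitmap: web_sys::ImageBitmap) {
        let size = wgpu::Extent3d {
            width: image_bitmap.width(),
            height: image_bitmap.height(),
            depth_or_array_layers: 1,
        };
        // Destination texture; copy_external_image_to_texture requires
        // COPY_DST | RENDER_ATTACHMENT usage on the destination.
        let _texture = self.device.create_texture(&wgpu::TextureDescriptor {
            label: Some("texture-from-js-bitmap"),
            size,
            mip_level_count: 1,
            sample_count: 1,
            dimension: wgpu::TextureDimension::D2,
            format: wgpu::TextureFormat::Rgba8UnormSrgb,
            usage: wgpu::TextureUsages::COPY_DST
                | wgpu::TextureUsages::RENDER_ATTACHMENT
                | wgpu::TextureUsages::TEXTURE_BINDING,
            view_formats: &[],
        });
        // ... then run self.queue.copy_external_image_to_texture exactly as in
        //     the snippet above, with this texture as the destination ...
    }
}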
The other direction, getting hold of a texture that the Rust side created so you can use it from JS, is harder: nobody seems to have requested this yet, so wgpu hasn't added a helper method for it. Please do open an issue on their repository if you actually find yourself needing this, and tell them what for and why. :)
Until then, here's the rough workaround:
Define this TypeScript function and call it whenever the Rust side is about to create a texture. I haven't looked into how to call it elegantly, but wasm-bindgen probably has something for that (see the sketch after the function below).
function captureNextTexture(): Promise<GPUTexture> {
  return new Promise((resolve) => {
    const originalCreateTexture = GPUDevice.prototype.createTexture;
    // The Rust bindings will call this, since they go through the browser API
    GPUDevice.prototype.createTexture = function (
      descriptor: GPUTextureDescriptor
    ) {
      // Create the underlying texture
      const texture = originalCreateTexture.apply(this, [descriptor]);
      // Tell the promise about it.
      // Be a bit careful if you want to access the Rust side,
      // since the Rust side is in the middle of a WebGPU function call.
      resolve(texture);
      // Restore this function, we're done capturing
      GPUDevice.prototype.createTexture = originalCreateTexture;
      // Finish the createTexture call
      return texture;
    };
  });
}
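On the Rust side, the import could look roughly like the following. This is a minimal sketch assuming captureNextTexture is exposed on the page's global scope (for example via window.captureNextTexture = captureNextTexture); the Rust function names are illustrative, and web_sys's WebGPU types may need the web_sys_unstable_apis cfg flag:

use wasm_bindgen::prelude::*;
use wasm_bindgen::JsCast;
use wasm_bindgen_futures::JsFuture;

#[wasm_bindgen]
extern "C" {
    // Import of the TypeScript function above, assumed to be reachable on
    // the global scope.
    #[wasm_bindgen(js_name = captureNextTexture)]
    fn capture_next_texture() -> js_sys::Promise;
}

// Creates a texture through wgpu and also returns the raw GPUTexture handle
// that the JS side captured for it.
async fn create_texture_and_capture(
    device: &wgpu::Device,
    desc: &wgpu::TextureDescriptor<'_>,
) -> (wgpu::Texture, web_sys::GpuTexture) {
    // Arm the capture *before* creating the texture, so the patched
    // createTexture intercepts the very next call.
    let promise = capture_next_texture();

    // wgpu's WebGPU backend goes through GPUDevice.createTexture, which is
    // currently the patched function; it resolves the promise with the
    // underlying GPUTexture.
    let texture = device.create_texture(desc);

    // The promise has already been resolved by now, so this completes on the
    // next microtask tick.
    let js_texture = JsFuture::from(promise).await.unwrap();
    (texture, js_texture.unchecked_into::<web_sys::GpuTexture>())
}

Note that this relies on wgpu actually running on the browser's WebGPU backend; with the WebGL fallback there is no GPUDevice.createTexture call to intercept.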