In my scenario I fetch a screen capture from a device (it only produces TIFF images), convert it to JPEG, and send it over the network to the client (the client only supports JPEG).
Java code:
public byte[] getscreen() throws IOException {
    /*
     * logic for fetching the TIFF image from the device,
     * which produces a BufferedImage named tiffImage
     */
    if (tiffImage == null) {
        return null;
    }
    ByteArrayOutputStream byteOutput = new ByteArrayOutputStream();
    ImageIO.write(tiffImage, "jpeg", byteOutput);  // re-encode as JPEG
    return byteOutput.toByteArray();               // return the JPEG bytes
}
The device takes 10 ms to 1 s to produce the image, depending on its resolution (please note that nothing can be changed on that side; it produces only TIFF), and the image size is 3 MB to 12 MB, again depending on resolution.
Converting the image to JPEG also takes time. My question is: can I use the GPU to convert the image from TIFF to JPEG so that I get an improved FPS on the client side?
P.S.: The application runs on various machines with different graphics cards (NVIDIA, AMD, Intel HD Graphics). I want to know whether this can be done and, if so, how to approach the solution.
MPEG encoding is roughly just that: a long series of JPEG encoding operations, plus some logic involving frame differences for P-frames, etc. I once wrote a simple GPU-based MPEG encoder, which gave some speedup (I don't remember the exact factor). That said, to answer your question properly: yes, there might be some time difference, but for a single picture that difference is probably negligible once you include the overhead of offloading the picture data to the GPU and reading the result back.
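Before reaching for the GPU, it may be worth tuning the CPU path first. Below is a minimal sketch, not taken from the question's code, that uses the standard javax.imageio ImageWriter API with an explicit compression quality instead of the ImageIO.write defaults; the class name JpegEncoder and the quality value are placeholders of my own. A lower quality both speeds up encoding and shrinks the payload sent to the client.

import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Iterator;

public class JpegEncoder {

    // Encode a BufferedImage to JPEG bytes with an explicit quality setting.
    // Lower quality (e.g. 0.6f) encodes faster and yields a smaller payload;
    // 1.0f is the slowest and largest.
    public static byte[] encodeJpeg(BufferedImage image, float quality) throws IOException {
        // The JPEG writer rejects images with an alpha channel, so convert
        // to plain RGB first if necessary.
        BufferedImage rgb = image;
        if (image.getColorModel().hasAlpha()) {
            rgb = new BufferedImage(image.getWidth(), image.getHeight(), BufferedImage.TYPE_INT_RGB);
            rgb.getGraphics().drawImage(image, 0, 0, null);
        }

        Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpeg");
        if (!writers.hasNext()) {
            throw new IllegalStateException("No JPEG ImageWriter available");
        }
        ImageWriter writer = writers.next();

        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);

        ByteArrayOutputStream byteOutput = new ByteArrayOutputStream();
        MemoryCacheImageOutputStream imageOutput = new MemoryCacheImageOutputStream(byteOutput);
        try {
            writer.setOutput(imageOutput);
            writer.write(null, new IIOImage(rgb, null, null), param);
        } finally {
            writer.dispose();
            imageOutput.close();
        }
        return byteOutput.toByteArray();
    }
}

Usage would look something like byte[] jpeg = JpegEncoder.encodeJpeg(tiffImage, 0.7f); whether this is faster than your current ImageIO.write call depends on the image and the installed JPEG plugin, so measure it on your own machines.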