 

Detect SwiftShader WebGL renderer in Chrome 18

I have a 2D HTML5 game engine (www.scirra.com) and really want to detect if WebGL is going to render with Chrome 18's 'SwiftShader' software renderer. If so, we would much prefer to fall back to the ordinary canvas 2D context, as happens in other browsers. Lots of people out there have low-end machines with weak CPUs that turn the game into a slideshow under software rendering, and I think in many cases the 2D canvas would have been hardware accelerated. However, WebGL context creation never fails in Chrome and there is no obvious way to detect SwiftShader.

Things I've tried:

// Always returns "WebKit WebGL" regardless of SwiftShader
gl.getParameter(gl.RENDERER)

// Always returns "WebKit" regardless of SwiftShader
gl.getParameter(gl.VENDOR)

I could try taking into account things like the maximum texture size or the other MAX_* properties, but how do I know they don't vary between machines even with SwiftShader? And since I guess SwiftShader aims to mimic common hardware, that approach might still produce a lot of false positives.
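
For reference, here's a minimal sketch of how that kind of fingerprinting might be collected (the parameter choice is just illustrative, and as noted the values may well match real hardware anyway):

// Sketch only: gather renderer strings and a few MAX_* limits.
// SwiftShader aims to mimic common hardware, so these values alone
// are probably not a reliable signal.
var canvas = document.createElement('canvas');
var gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');

if (gl) {
    var fingerprint = {
        renderer: gl.getParameter(gl.RENDERER),             // "WebKit WebGL" either way
        vendor: gl.getParameter(gl.VENDOR),                 // "WebKit" either way
        maxTextureSize: gl.getParameter(gl.MAX_TEXTURE_SIZE),
        maxRenderbufferSize: gl.getParameter(gl.MAX_RENDERBUFFER_SIZE),
        maxVertexAttribs: gl.getParameter(gl.MAX_VERTEX_ATTRIBS)
    };
    console.log(fingerprint);
}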

I don't want to write a startup performance test, because:

  • We just make an engine, not any particular game, so I don't know how we'd write a fair test that works in the general case for any game of any performance profile with a high degree of accuracy
  • A good test would probably need a second or two to finish running, which could interrupt the user experience or force them to watch some squares being shifted around or whatever
  • It could create new complications, such as: if we cache the result, what happens if the user updates their drivers and fixes the problem?

I don't want to flat out disable WebGL on Chrome, because with hardware-accelerated WebGL performance can be over twice as fast as canvas 2D! If we did that, everyone loses.

I don't want to have to add in-game switches or a user setting, because how many users care about that? If the game is slow they'll just quit and most likely not search for a solution. "This game sucks, I'll go somewhere else." I think only a minority of users would bother reading instructions like "by the way, if this game is slow, try changing this setting to 'canvas 2D'..."

My current best guess is to use gl.getSupportedExtensions(). I have found that SwiftShader reports the following extensions:

OES_texture_float,OES_standard_derivatives,WEBKIT_WEBGL_lose_context

...but a real hardware-accelerated context reports:

OES_texture_float,OES_standard_derivatives,WEBKIT_WEBGL_lose_context,WEBKIT_WEBGL_compressed_textures

Note the addition of WEBKIT_WEBGL_compressed_textures. Some quick research indicates that this may or may not be widely supported. See this support table - both GL_EXT_texture_compression_s3tc and GL_ARB_texture_compression appear widely supported on desktop cards. Also the table only seems to list reasonably old models, so I could hazard a guess that all modern desktop graphics cards would support WEBKIT_WEBGL_compressed_textures... therefore my detection criteria for SwiftShader would be:

  • Windows OS
  • Google Chrome browser
  • WebGL context does not support WEBKIT_WEBGL_compressed_textures
  • Result: fall back to Canvas 2D

Of course, if SwiftShader adds compressed texture support in future, this breaks again. But I can't see the advantage of compressed textures with a software renderer! Also, it will still get lots of false positives if there are many real working video cards out there that don't support WEBKIT_WEBGL_compressed_textures!
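
For what it's worth, that heuristic could be sketched roughly like this (the user-agent checks and the function name are my own, purely for illustration):

// Rough sketch of the fallback heuristic described above.
// The UA sniffing here is an assumption about how to detect
// "Windows + Chrome"; adjust as needed.
function shouldFallBackToCanvas2D(gl) {
    var ua = navigator.userAgent;
    var isWindows = ua.indexOf("Windows") !== -1;
    var isChrome = ua.indexOf("Chrome/") !== -1;
    if (!isWindows || !isChrome)
        return false;    // only apply the heuristic to Chrome on Windows
    var exts = gl.getSupportedExtensions() || [];
    // SwiftShader (as observed above) does not report compressed texture support
    return exts.indexOf("WEBKIT_WEBGL_compressed_textures") === -1;
}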

Is there not a better way to detect SwiftShader?

asked May 04 '12 by AshleysBrain


1 Answer

Go to http://code.google.com/p/angleproject/wiki/ExtensionSupport and look at the EGL extension EGL_ANGLE_software_display. If it is available, it is because there's a SwiftShader backend.
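
If that extension string is actually surfaced through the WebGL extension list (I haven't verified this; it is an EGL-level extension rather than a WebGL one), a check might look something like:

// Sketch only: assumes EGL_ANGLE_software_display shows up in the
// WebGL extension list, which may not be the case in practice.
function looksLikeSwiftShader(gl) {
    var exts = gl.getSupportedExtensions() || [];
    return exts.join(",").indexOf("EGL_ANGLE_software_display") !== -1;
}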

answered Oct 23 '22 by Chiguireitor