
What happens during a display mode change?

What happens during a display mode change (resolution, color depth) on an ordinary computer (typical desktops and laptops)?

It might not be so trivial since video cards are so different, but one thing is common to all of them:

  • The screen goes black (understandable since the signal is turned off)
  • It takes many seconds for the signal to return with the new mode

and if it is under D3D or GL:

  • The graphics device is lost and all VRAM objects must be reloaded, making the mode change take even longer

Can someone explain the underlying nature of this, and specifically why a display mode change is not a trivial reallocation of the backbuffer(s) and takes such a "long" time?

asked Jun 10 '11 15:06 by Thomas Rudtgerkick



1 Answer

The only thing that actually changes are the settings of the so-called RAMDAC (a Digital-to-Analog Converter directly attached to the video RAM); today, with digital connections, it's more like a "RAMTX" (a DVI/HDMI/DisplayPort transmitter attached to the video RAM). DOS graphics programmer veterans probably remember the fights between the RAMDAC, the specification, and the woes of one's own code.

It actually doesn't take seconds until the signal returns. Reprogramming is a rather quick process, but most display devices take their time to synchronize with the new signal parameters. With well-written drivers the change happens almost immediately, between two vertical blanks. A few years ago, when displays were, err, dumber and analogue, one could see the picture go berserk for a short moment after a mode change, until the display resynchronized (maybe I should take a video of this while I still own equipment capable of it).

Since what is actually going on is just a change of RAMDAC settings, no data needs to be lost as long as the basic parameters stay the same: number of bits per pixel, number of components per pixel, and pixel stride. And in fact OpenGL contexts usually don't lose their data on a video mode change. Of course the visible framebuffer layout changes, but that also happens when moving a window around.

DirectX Graphics is a bit of a different story, though. There is device-exclusive access, and whenever you switch between Direct3D fullscreen mode and the regular desktop, all graphics objects are swapped out, which is why DirectX Graphics is so laggy when switching between a game and the Windows desktop.

If the pixel data format changes, it usually requires a full reinitialization of the visible framebuffer, but today's GPUs are exceptionally good at mapping heterogeneous pixel formats onto a target framebuffer, so no delays are necessary there either.

answered Nov 16 '22 00:11 by datenwolf