How can I output an HDMI 1.4a-compatible stereoscopic signal from an OpenGL application to a 3DTV?

I have an OpenGL application that outputs stereoscopic 3D video to off-the-shelf TVs via HDMI, but it currently requires the display to support the pre-1.4a method of manually choosing the right format (side-by-side, top-bottom, etc.). Now, however, I need to support a device that ONLY accepts HDMI 1.4a 3D signals, which as I understand it involves a packet sent to the display that tells it what format the 3D video is in.

I'm using an NVIDIA Quadro 4000. Is it possible to output my video (or tell the video card to output it) so that a standard 3DTV detects the correct format automatically, the way it would from a 3D Blu-ray player or other 1.4a-compatible device, without the user having to select a 3D mode manually?

asked Aug 11 '11 by bparker

2 Answers

I don't see a direct answer to the question yet, so:

HDMI 1.4a defines metadata (a Vendor Specific InfoFrame) that describes the 3D format. An HDMI_Video_Format value of 010 signals that the stream is 3D, and the 3D_Structure field then selects the layout: 0000 = frame packing, 0110 = top-and-bottom, 1000 = side-by-side.

But if the driver doesn't expose an API for setting this InfoFrame, you would need to change the driver code itself (assuming it's open source or you otherwise have access to it).
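To make the field values above concrete, here is a minimal sketch of how the 3D-signalling portion of the Vendor Specific InfoFrame body could be assembled. The byte layout (IEEE OUI first, then HDMI_Video_Format in the top three bits, then 3D_Structure in the top nibble) follows my reading of the HDMI 1.4a spec; the function name is hypothetical and this is an illustration of the packet fields, not actual driver code:

```c
#include <stdint.h>
#include <stddef.h>

/* 3D_Structure values from the HDMI 1.4a Vendor Specific InfoFrame */
enum hdmi_3d_structure {
    HDMI_3D_FRAME_PACKING = 0x0, /* 0000 */
    HDMI_3D_TOP_BOTTOM    = 0x6, /* 0110 */
    HDMI_3D_SIDE_BY_SIDE  = 0x8  /* 1000 */
};

/* Fill `buf` with the first bytes of the Vendor Specific InfoFrame body
   signalling 3D video; returns the number of bytes written. */
static size_t build_3d_infoframe(uint8_t structure, uint8_t *buf)
{
    buf[0] = 0x03;           /* IEEE OUI 0x000C03, least significant byte first */
    buf[1] = 0x0C;
    buf[2] = 0x00;
    buf[3] = 0x2 << 5;       /* HDMI_Video_Format = 010 -> 3D format present */
    buf[4] = structure << 4; /* 3D_Structure in the high nibble */
    return 5;
}
```

In a real driver this body would be wrapped in an InfoFrame header (type 0x81, version, length) plus a checksum before being handed to the HDMI encoder hardware.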

answered Nov 12 '22 by user1477675

If your drivers allow it, you can create a quad-buffer stereo rendering context. This context has two back buffers and two front buffers, one pair for the left eye and one pair for the right. You render to one back buffer (GL_BACK_LEFT), then the other (GL_BACK_RIGHT), then swap them with the standard swap function.

Creating a quad-buffer stereo context requires platform-specific code. On Windows, you need to pick a pixel format that has the stereo flag set.
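On Windows this comes down to requesting `PFD_STEREO` in the pixel format and then rendering each eye to its own back buffer. A sketch (assumes an existing window `HDC`, error handling trimmed; the function names here are illustrative, but `ChoosePixelFormat`, `DescribePixelFormat`, `glDrawBuffer`, and `SwapBuffers` are the real APIs):

```c
#include <windows.h>
#include <GL/gl.h>

BOOL setup_stereo_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL
                   | PFD_DOUBLEBUFFER | PFD_STEREO;  /* quad-buffer request */
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 24;
    pfd.cDepthBits = 24;

    int fmt = ChoosePixelFormat(hdc, &pfd);
    if (fmt == 0) return FALSE;

    /* Drivers without quad-buffer support silently drop PFD_STEREO,
       so check what actually came back before proceeding. */
    DescribePixelFormat(hdc, fmt, sizeof(pfd), &pfd);
    if (!(pfd.dwFlags & PFD_STEREO)) return FALSE;

    return SetPixelFormat(hdc, fmt, &pfd);
}

/* Per frame: draw each eye into its own back buffer, then swap both. */
void draw_stereo_frame(HDC hdc)
{
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    /* ... set left-eye projection and draw the scene ... */

    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    /* ... set right-eye projection and draw the scene ... */

    SwapBuffers(hdc); /* swaps the left and right pairs together */
}
```

The `DescribePixelFormat` check matters: on most consumer GPUs the request succeeds but the stereo flag is quietly ignored, which is exactly the "drivers may not allow it" caveat below.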

This is only possible if your drivers allow it. They may not. And if they don't there is nothing you can do.

answered Nov 12 '22 by Nicol Bolas