10-bit Color with Nvidia GeForce

On Windows, Nvidia’s GeForce cards support 10-bit color only for programs that use full-screen DirectX output (source: nvidia.com). Starting with the 1000 series (Pascal architecture), GeForce cards are said to support 10 bits for exclusive full-screen OpenGL output as well (source: anandtech.com).

In other words, on Windows 10-bit color output with a GeForce card works only with DirectX or OpenGL in full-screen mode (e.g. for madVR with “automatic full-screen exclusive mode” and “Direct3D for presentation” enabled). Windowed programs, such as Adobe Photoshop or madVR in windowed mode, still require professional Quadro cards. Also, 10-bit color is not compatible with Windows Aero (a graphical interface, not just a desktop theme).

Under Linux there is neither DirectX nor an “exclusive full-screen” mode. Fortunately, on Linux Nvidia’s GeForce cards seem to be free of the restrictions they have on Windows. 10-bit output has been supported since the 8 series (8800 et al.) and can be enabled by setting the X server’s default depth to 30. However, it works only over DisplayPort and VGA connections; if a monitor is connected via HDMI, DVI or LVDS, the 10-bit output will be dithered down to 8 bits (source: nvidia.com). For monitors without a DisplayPort input this limitation can be worked around with an active DP-to-HDMI adapter.

The free Spears & Munsil 10-bit Quantization Test pattern can help check the output: if 10-bit works, the test will show clearly visible banding in the left “8-bit” square.

How to set up 10-bit output with Nvidia GPU under Linux

It is surprisingly easy to get 10-bit color under Linux (at least with the proprietary driver).

Any modern Nvidia GeForce GPU can be considered capable of outputting 10-bit visuals without any further ado.

All that is needed is to set the DefaultDepth option in the Screen section of xorg.conf to 30, as below:

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth   30
    SubSection     "Display"
        Depth      30
    EndSubSection
EndSection

Adjust the other options to your needs, restart the X server, and you are done.
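
To confirm that the new depth actually took effect, you can query the running X server. A minimal check, assuming the standard xdpyinfo utility is installed:

# The root window should report 30 planes after the restart
xdpyinfo | grep "depth of root window"
# Expected output: depth of root window:    30 planes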

Meanwhile, from one driver release to the next, Nvidia’s documentation keeps saying:

Devices connected via DVI or HDMI, as well as laptop internal panels connected via LVDS, will be dithered to 8 or 6 bits per pixel.

However, the output seems to pass the S&M test even when a 10-bit-capable monitor is connected via HDMI. Maybe Nvidia’s manual is simply not up to date on this point; the question needs further research.
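
One way to investigate further is to ask the driver what dithering it is actually applying. A sketch, assuming the proprietary driver’s nvidia-settings tool is available (the CurrentDithering and CurrentDitheringDepth attributes exist in recent driver versions, but their availability may vary between releases):

# Query the dithering state the driver currently applies to each display
nvidia-settings --query CurrentDithering
nvidia-settings --query CurrentDitheringDepth

If the HDMI display reports that dithering is disabled, the driver is presumably not reducing the signal to 8 bits, despite what the manual says.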

Also, X.Org Server 1.20 is expected to arrive soon, and hopefully it will bring better support for Deep Color and HDR.