Sunday, April 14, 2013

NVIDIA 300 series Linux drivers - worst functionality regression ever

For a long time, I've been extraordinarily happy with both NVIDIA graphics hardware and the vendor-supplied binary drivers. Functionality, stability, speed. However, things are changing and I'm frustrated. Let me tell you why.

Part of my job involves teaching and presentations. I have a trusty ThinkPad with a VGA output which can in principle supply just about every projector with a decent signal. Most of these projectors cannot display the native 1920x1200 resolution of the built-in display, so if you configure the second display to clone the first, you end up seeing only part of the screen. In the past, I solved this with nvidia-settings: set the built-in display to a lower resolution supported by the projector (nvidia-settings told me which ones I could use) and then let it clone things. Not elegant, but everything worked fine, and this amount of fiddling can still be done at the front of a seminar room while someone is introducing you and the audience grows impatient.

Now consider my surprise when, after a driver upgrade, the built-in display was suddenly glued to its native resolution. Only setting possible: 1920x1200. The first time I saw this I was completely clueless about what to do; starting the talk took a bit longer than expected. A simple but completely crazy workaround exists: disable the built-in display and enable only the projector output. Your X session is then displayed there and resized accordingly. You'll have to look at the silver screen while talking, but that's not such a problem. A bigger pain is that you may have to leave the podium in a hurry and then have no video output at all...
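For the record, that "crazy" workaround boils down to two xrandr calls. A minimal sketch, assuming the built-in panel shows up as LVDS-0 and the VGA port as VGA-0 (the actual names vary by driver and machine; check with `xrandr -q` first):

```shell
# Sketch of the workaround: turn the panel off, drive only the projector.
# LVDS-0 / VGA-0 are assumed output names -- verify yours with `xrandr -q`.
PANEL=LVDS-0
BEAMER=VGA-0
RES=1024x768    # a mode the projector actually accepts

# Dry-run: print the command instead of executing it, so this sketch is
# safe to paste anywhere.  Drop the leading `echo` to really switch.
echo xrandr --output "$PANEL" --off --output "$BEAMER" --mode "$RES"
```

Reversing it afterwards (`--output $PANEL --auto --output $BEAMER --off`) is easy, as long as you still have a screen you can see to type it on.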

Now, googling. Obviously a lot of other people have the same problem. Hacks like this one just don't work; I ended up with nice random screen distortions. Here's a thread on the NVIDIA DevTalk forum from which I can quote: "The way it works now is more "correct" than the old behavior, but what the user sees is that the old way worked and the new does not." It seems NVIDIA now expects each application to handle any mode switching internally. My use case does not even exist from their point of view. Here's another thread, and in general users are not happy about it.

Finally, I found this link where the following reply is given: "The driver supports all of the scaling features that older drivers did, it's just that nvidia-settings hasn't yet been updated to make it easy to configure those scaling modes from the GUI." Just great.

Gentlemen, this is a serious annoyance. Please fix it. Soon. Not everyone is willing to read up on xrandr command line options and fiddle with ViewPortIn, ViewPortOut, MetaModes and other technical stuff. Especially while the audience is waiting.
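For those who do want to fiddle: as far as I can tell from the forum replies, the scaling the old driver did can be reproduced by assigning a MetaMode with explicit ViewPortIn/ViewPortOut. A hedged sketch, with assumed output names (LVDS-0 for the panel, VGA-0 for the projector; query the real ones with `xrandr -q`) and an assumed 1920x1200 panel:

```shell
# Build a MetaMode string that renders a 1024x768 desktop (ViewPortIn),
# scales it onto the full 1920x1200 panel (ViewPortOut), and clones the
# same 1024x768 desktop to the projector.  Output names are assumptions.
PANEL=LVDS-0
BEAMER=VGA-0
RES=1024x768

METAMODE="$PANEL: nvidia-auto-select { ViewPortIn=$RES, ViewPortOut=1920x1200+0+0 }, $BEAMER: $RES +0+0"
echo "$METAMODE"

# Apply it with (commented out so the sketch stays harmless):
# nvidia-settings --assign CurrentMetaMode="$METAMODE"
```

Whether this is faster than waiting for the nvidia-settings GUI to catch up is another question, especially with an audience watching.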


  1. Well, ever since nVidia implemented the newer RandR support, things have been great. I always hated nvidia-settings, being separate from every other config utility and not having an easy-to-use CLI.

    Look at using plain xrandr, or the display settings available from your favourite DE, and forget about the ugly hack that nvidia-settings was...

    1. I'm not sure what software you are using, but the KDE configuration panel (in 4.10.2) exhibits exactly the same problem, which means it pulls its configuration information from the same source as nvidia-settings. And while I'm normally not averse to command-line utilities, I'm not willing to spend a long time fiddling with them. Especially since I normally do not know the properties of the projector beforehand.

  2. I'm a Lenovo user as well; however, the video card is different ( Intel(R) Ivybridge Mobile (GT2) ) and it just works. The Xinerama flag helps to set up two independent screens.
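    In case it helps anyone: the Xinerama flag mentioned above usually goes into the ServerLayout section of xorg.conf. A minimal fragment, with assumed identifiers (Layout0, Screen0, Screen1):

    ```
    Section "ServerLayout"
        Identifier "Layout0"
        Screen     0 "Screen0"
        Screen     1 "Screen1" RightOf "Screen0"
        Option     "Xinerama" "1"
    EndSection
    ```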

    For KDE users: I've also migrated to kscreen (0.0.81).

    Just nice and solid.

    "F%CK you, nvidia" (C) linus.

  3. What about nouveau? I just returned to it with the NVIDIA card inside my Lenovo T410 because the nvidia driver somehow does not allow me to adjust the backlight. And: have you ever switched to the console without ANY delay?

    Ah, if you want to try it, use the stable gentoo-sources; everything above the 3.7 series breaks suspend...

  4. Just for completeness, I've flipped the switch and am now running the laptop with integrated Intel graphics (luckily the ThinkPad BIOS lets me hard-select which graphics hardware to run: Intel, NVIDIA, or hybrid Optimus). So far everything works perfectly, and even KDE desktop effects are fine.