Nvidia lightboost on a 144hz monitor, need help with xorg.conf
The 4 custom modelines/modes do work. However, the ghosting is present with both _120lb and _100lb.
Ideally I want to use the 1920x1080_120lb mode, the highest refresh rate compatible with LightBoost, which should give a clearer/sharper image than 144Hz in terms of blur/motion.
There really doesn't seem to be a lot of information about this online relating to Linux. I did initially try out the LightBoost feature under Windows, and once set up, the monitor was crystal clear without any blur/ghosting at all. Running at 120Hz with LightBoost enabled on Windows was better than pure 144Hz.
Does anyone have any ideas? It's hard for me to show the problem, as you can't capture the ghosting with a camera/screen recorder.
AFAIK, the Modelines in my xorg.conf are required on any monitor to enable the LightBoost feature. I suspect they may not be suitable for my own monitor, but I have no idea how to adjust them. Each time I try adjusting the Modeline values, the option for that mode doesn't work and isn't listed.
These days, I would at least try without any xorg.conf and write a file in xorg.conf.d instead, plus anything else that needs it. X interrogates the monitor and asks, "What can we do here?" It may find a monitor & video card that suck (I have those), are reasonable, or even ostentatiously flashy. Copy and paste the video section only into 20-video.conf in xorg.conf.d. Remove the Modelines, as X will usually sort that out. Put in a Virtual setting and the Monitor, Device, and Screen sections, also the ServerLayout section. See what you get. There is a PreferredMode option which lets you take the guesswork out.
I set up once for a laptop screen (1600x900), HDMI (1920x1080), a projector on the HDMI (1280x720), and anything else it found on the VGA plug. Setting the appropriate Virtual setting was critical, as I had a left/right arrangement going on with the projector, to keep my laptop screen off the projected stuff. The projector saw the right screen, and my screen saw the left one. It worked. Every time X boots, it configures what's there and ignores what is not. Getting the sound right was harder.
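For what it's worth, a minimal sketch of the kind of xorg.conf.d drop-in described above (the file path, identifiers, and mode name here are illustrative assumptions, not taken from either poster's actual setup):

```
# /etc/X11/xorg.conf.d/20-video.conf -- illustrative sketch only
Section "Monitor"
    Identifier "Monitor0"
    # Let X auto-detect modes; just state a preference
    Option     "PreferredMode" "1920x1080"
EndSection

Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Monitor    "Monitor0"
    SubSection "Display"
        Virtual 1920 1080
    EndSubSection
EndSection
```

X merges files from xorg.conf.d at startup, so this can coexist with auto-detection for everything not mentioned.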
Yeah, I had dabbled in that approach before, using the xorg.conf.d/ files, specifically for forcing the FullCompositionPipeline settings. I typically just run nvidia-xconfig and use the result as my entire xorg.conf.
However, the LightBoost feature seems confusing, so I've merged it all into a single file for now. The thing is, LightBoost works. My screen becomes dimmer and the colours are a little washed out, as expected, when switching to the 120Hz mode (which enables LightBoost). It's just that something seems out of sync, mostly on the vertical refresh, which is noticeable when scrolling or moving a sprite quickly across the screen. It leaves ghost trails. In Windows this isn't a problem, so I've ruled out limitations of the monitor itself, be that speeds or faults.
The odd thing is, this monitor is supposed to have a 1ms response time and be capable of the 120Hz LightBoost mode, which should be crystal clear with zero blur/ghosting. I just don't know what to look for to eliminate this effect with X on Linux.
I am tempted to install some other distribution on a second hard drive and see if this is related to certain versions of the X server.
Also, xorg.conf is required for enabling the LightBoost mode through these specific lines:
Without these, LightBoost doesn't enable (the monitor's OSD options are disabled in this case). Also, without these Modelines set, the 120Hz and 100Hz options just work as any other refresh rate would; however, this also causes some blurring, due to LightBoost being disabled.
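For reference, a LightBoost-style Modeline pair of the kind being described might look like the following. This is illustrative only: the vertical totals (1149/1138) are the values quoted later in this thread, while the horizontal timings and pixel clocks are assumptions derived from them, not the poster's actual lines.

```
# Illustrative sketch -- not the poster's actual xorg.conf lines.
# The final vertical value (the vertical total) is what triggers LightBoost.
Section "Monitor"
    Identifier "Monitor0"
    # 120 Hz: vertical total raised from the default 1144 to 1149
    Modeline "1920x1080_120lb" 286.79 1920 1968 2000 2080 1080 1083 1088 1149 +hsync -vsync
    # 100 Hz: vertical total raised from the default 1133 to 1138
    Modeline "1920x1080_100lb" 236.70 1920 1968 2000 2080 1080 1083 1088 1138 +hsync -vsync
EndSection
```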
Blurring and ghosting are weird. Persistence of the picture is key to this. If you write a second picture before the first one is gone, you're in a mess.
I personally feel a 144Hz refresh is unnecessary, and I would get a workable picture at a lower frequency. Your eyes will merge things that fast, never mind full speed. What works best? At 144Hz and 1ms per refresh, 14.4% of the time is spent refreshing. Faster, IMHO, is not always better. Over HDMI (1920x1080@60Hz), your refresh time on the same monitor is 6%. Persistence (the period the picture remains after the refresh) could be responsible for blurring, and is certainly a factor in ghosting if you move things rapidly. I don't think you'll notice anything above an 80Hz refresh. You will have to find an optimal refresh; don't presume the max is best.
If they're offering a LightBoost mode, they're trying to cover over the limitations of the high frequency by giving a general brightness boost. Not a picture quality boost, just masking a problem.
Over here, people watch PAL TVs. They have a 15.625kHz horizontal rate and 50Hz vertical. Further, the picture is divided into 25 potentially red-shifted lines interleaved with 25 potentially blue-shifted ones, with maybe a little extra green all round to compensate. NTSC TVs are worse; SECAM is a little bit better, but not great. The shifts come from incoming signal ghosting.
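The refresh-time percentages above can be checked with a quick calculation, assuming (as stated) a fixed 1ms spent drawing each frame:

```python
# Fraction of each frame period spent refreshing, assuming a fixed
# 1 ms draw time per frame (the figure quoted above).
def refresh_fraction(rate_hz, draw_time_s=1e-3):
    frame_period = 1.0 / rate_hz             # seconds per frame
    return draw_time_s / frame_period * 100  # as a percentage

print(refresh_fraction(144))  # 14.4% of the time at 144 Hz
print(refresh_fraction(60))   # 6.0% of the time at 60 Hz
```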
That's why LightBoost works with the 120Hz and 100Hz rates.
144Hz is certainly noticeable on anything with vsync enabled; it's like night and day compared to 60Hz on an LED screen.
Using this test on Windows: https://www.testufo.com/ghosting whilst LightBoost is enabled and running at 120Hz, the sprite never blurs or ghosts in the slightest, even while moving the window around the screen rather fast. It's certainly noticeable.
People often claim that nothing higher than 24fps is ever needed, because of Hollywood films etc., but this is not the case; even 144Hz is easily distinguishable from < 100Hz.
The idea with 120Hz and LightBoost is to give a non-blurred image when panning fast within a game, completely eliminating motion blur, just as a CRT would behave.
The thing is, though, I am currently running Windows 10, experimenting with some games using 120Hz/LightBoost, and it works absolutely fine here; even scrolling the forum listings on LQ very fast shows none of the blur that is present even on a 60Hz monitor. The problem is that, using the same configuration via Modelines in xorg.conf, there is highly noticeable ghosting, even from just scrolling a web page quickly.
So the LightBoost mode does work; it's just that something with X isn't carrying out the same behaviour. I'm not sure if it's related to Nvidia's driver itself (I have no idea if LightBoost can be used with nouveau, but nouveau isn't exactly recommended for any gaming as it is).
I must confess that games are one use I didn't consider.
I imagine that the 'on' period of each frame is excessive. I would try raising the issue on Nvidia's driver forums, filing bugs, etc., and see if they'll fix it. It must be down to some spec on the PC you have. I am frankly amazed anyone gets updates at 144Hz, so that each frame's information is new; I thought most games were too CPU-intensive for that.
Last edited by business_kid; 05-25-2018 at 03:24 AM.
This poster mentions Windows-related software to set it up:
Quote:
Change the vertical total for 100Hz (should default to 1133) to 1138.
Change the vertical total for 120Hz (should default to 1144) to 1149 (some people said 1147 works).
This makes sense now, when generating Modelines with `cvt`.
Then I'd change the last vertical value manually from 1144 to 1149 (this is the value that tells the monitor to use LightBoost for 120Hz). Then I'd assume the pixel clock has to increase a little bit to account for the change from 1144 to 1149?
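The pixel clock increase can be worked out from the modeline relationship refresh = pixel_clock / (horizontal_total × vertical_total). A quick sketch, assuming a horizontal total of 2080 pixels (a typical CVT value for 1920-wide modes; it is not stated in this post):

```python
# Scale the pixel clock so the refresh rate stays at 120 Hz after
# raising the vertical total from 1144 to 1149.
# refresh = pixel_clock / (horizontal_total * vertical_total)
H_TOTAL = 2080      # assumed horizontal total in pixels; not stated above
TARGET_HZ = 120.0

def required_clock_mhz(v_total, h_total=H_TOTAL, refresh=TARGET_HZ):
    return h_total * v_total * refresh / 1e6

for v_total in (1144, 1149):
    print(f"vtotal={v_total}: pixel clock ~ {required_clock_mhz(v_total):.2f} MHz")
```

So the clock only needs to rise by the ratio 1149/1144, a fraction of a percent.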
Well, you'd need a rather powerful GPU to do that for most recent games (as long as the CPU isn't throttling the fps), but consider my case, with a GTX 1060 combined with an AMD FX-8320 CPU. If I play Rise of the Tomb Raider on this monitor in the 144Hz mode, despite the fact that it rarely achieves 144fps, in some cases dropping to 30fps, it's still much better with vsync disabled, and each frame that is rendered appears so much smoother compared to 60Hz.
The only downside is that there is a bit of ghosting/blur, whereas the 120Hz mode along with LightBoost is supposed to completely eliminate this.
Blurring and ghosting on a digital monitor are caused by the fact that it takes a finite time for light to fade from a screen after you write to it. A screen is not rubbed out; it's just overwritten. So your ghosting is caused by something being in a slightly different place the next time it is written.
Now, here's the rub. If you're only refreshing at a lower speed, THAT SHOULD NOT BE HAPPENING. Maybe every second, third, or even fourth time, but the effect should be noticeably less. Otherwise, the artifact should be visible on a slower monitor.
Is ghosting & blurring proportional to game refresh rate?
The ghosting/blurring happens when simply scrolling a web page in Firefox, whereas in Windows this problem is not present. As I said before, the monitor is capable of eliminating this; under Linux, the ghosting/blurring seems to be related to Linux / X / Nvidia's Linux driver, which should not be happening.
If I lower the rate down to 100Hz or even 60Hz, the problem becomes even worse.
Being worse at a lower frequency contradicts what I would have expected. Are you saying that if you run another screen at 60 or 80Hz, the blurring and ghosting are bad? That's a faulty monitor, IMHO. You need to try this with another screen and return that monitor if it's faulty. I presume it's new, as you're setting this up for the first time, by the sound of it.
There's a very simple fact here: my video card could not do 144Hz, or close to it, but the monitor should run on my video card as well as yours.
I don't think you understand. If I hook up a Windows hard drive to this monitor, there are no ghosting/blur issues. This problem is only present while booted into Linux, which suggests a software problem in the Linux environment.
Now, I would guess this can be adjusted to better suit my monitor, despite this pixel clock and vertical total (1149) being the working values under Windows.
This Modeline without LightBoost has a vertical total of 1144. When you change this to 1149, LightBoost triggers on the monitor and becomes accessible through the OSD.
The vertical total can be anywhere from 1149-1180. I'm not sure what value it should be set at; however, when increasing this value, the pixel clock needs to be increased as well, but I have no idea by how much.
Say I take that value of 1149 and change it to 1180: the refresh rate for that mode actually becomes 117Hz, and it ends up an unsynced, choppy mess.
I'm surprised there isn't really any information on this relating to Linux.
The string beginning 1080 1083 1088 1149 dictates the vertical timings: the bottom of the visible area, the start and end of the vertical sync pulse, and the total number of lines per frame. In the days of CRT monitors this was a big deal (flyback pulses, etc.). With a digital monitor it's not necessary, or certainly not as much of it. The 286.7 is the pixel clock in MHz.
Changing the 1149 to 1180 has the effect of slowing the vertical refresh. The pixel clock dictates the speed at which the dots are written, and together with the horizontal total it sets the length of one horizontal line; the number of lines then controls the refresh rate. The refresh rate and dotclock are recalculated by X every time, and any Modelines are sanity-checked. If they're invalid, X simply deletes them.
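Holding the pixel clock fixed while raising the vertical total illustrates the ~117Hz figure mentioned above. A quick check, assuming a horizontal total of 2080 pixels (not quoted in the thread):

```python
# Refresh rate = pixel_clock / (horizontal_total * vertical_total).
# Holding the 286.7 MHz clock fixed while raising the vertical total
# from 1149 to 1180 drops the refresh rate below 120 Hz.
PIXEL_CLOCK_HZ = 286.7e6
H_TOTAL = 2080   # assumed horizontal total; not quoted in the thread

def refresh_hz(v_total, clock=PIXEL_CLOCK_HZ, h_total=H_TOTAL):
    return clock / (h_total * v_total)

for v_total in (1149, 1180):
    print(f"vtotal={v_total}: {refresh_hz(v_total):.1f} Hz")
```

With vtotal at 1180 the mode lands at roughly 117Hz, matching the choppy result described above.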
You could achieve the same effect a much simpler way: remove xorg.conf and be clever about the few settings in the video.conf, like a VertRefresh range between 120 and 144Hz, which would save all the black magic in the xorg.conf file. X is inclined to ask the monitor what it can do and use those settings if they make sense. (Overdriving CRT monitors could be catastrophic.)
It's probable, from what you say, that the monitor decides to turn LightBoost on based on the dotclock (the speed at which the dots are written). That's determined by the other settings, and X notes it in Xorg.0.log. X computes a lot and checks that the settings are sane. Instead of fiddling with the passé xorg.conf, I would rather set up the simpler xorg.conf.d and quit trawling the internet for clues. If you follow too many people's advice, you'll end up in a mess, whatever you're trying to sort.