LinuxQuestions.org
Old 10-13-2009, 02:04 PM   #1
scheme
LQ Newbie
 
Registered: May 2009
Distribution: Ubuntu, Ubuntu Studio, Debian, Arch
Posts: 24

Rep: Reputation: 1
Setting up Gentoo, KDE4, X and Nvidia


I recently installed Gentoo with KDE4 on it, and now I'm trying to make the nvidia drivers work (the computer will be Teh-gaming-machina) and make the desktop all pretty, flashy etc.

I succeeded in emerging kdebase-startkde and it starts fine. What I'd like to do first is change the resolution, but X simply discards my proposed modes at startup.
Any ideas?

Here's my xorg.conf and Xorg.0.log from right after I run startx.

Code:
Section "ServerLayout"
	Identifier     "X.org Configured"
	Screen      0  "Screen0" 0 0
	InputDevice    "Mouse0" "CorePointer"
	InputDevice    "Keyboard0" "CoreKeyboard"
EndSection

Section "ServerFlags"
	Option "AutoAddDevices" "False"
EndSection

Section "Module"
	Load  "glx"
	Load  "extmod"
	Load  "dbe"
EndSection

Section "InputDevice"
	Identifier  "Keyboard0"
	Driver      "kbd"
	Option "XkbLayout" "fi"
EndSection

Section "InputDevice"
	Identifier  "Mouse0"
	Driver      "mouse"
	Option	    "Protocol" "auto"
	Option	    "Device" "/dev/input/mice"
	Option	    "ZAxisMapping" "4 5 6 7"
EndSection

Section "Monitor"
	Identifier   "Monitor0"
	VendorName   "Monitor Vendor"
	ModelName    "Monitor Model"
EndSection

Section "Device"
	Identifier  "Card0"
	Driver      "nvidia"
	VendorName  "nVidia Corporation"
	BoardName   "G80 [GeForce 8800 GTS]"
	BusID       "PCI:1:0:0"
EndSection

Section "Screen"
	Identifier "Screen0"
	Device     "Card0"
	Monitor    "Monitor0"
	SubSection "Display"
		Viewport   0 0
		Depth     16
		Modes "1680x1050" "1440x900"
	EndSubSection
	SubSection "Display"
		Viewport   0 0
		Depth     24
		Modes "1680x1050" "1440x900"
	EndSubSection
EndSection

Code:
This is a pre-release version of the X server from The X.Org Foundation.
It is not supported in any way.
Bugs may be filed in the bugzilla at http://bugs.freedesktop.org/.
Select the "xorg" product for bugs you find in this release.
Before reporting bugs in pre-release versions please check the
latest version in the X.Org Foundation git repository.
See http://wiki.x.org/wiki/GitPage for git access instructions.

X.Org X Server 1.6.3.901 (1.6.4 RC 1)
Release Date: 2009-8-25
X Protocol Version 11, Revision 0
Build Operating System: Linux 2.6.30-gentoo-r5 x86_64 
Current Operating System: Linux LaughingAwesome 2.6.30-gentoo-r5 #1 SMP Sat Oct 10 08:15:46 EEST 2009 x86_64
Build Date: 10 October 2009  07:31:53AM
 
	Before reporting problems, check http://wiki.x.org
	to make sure that you have the latest version.
Markers: (--) probed, (**) from config file, (==) default setting,
	(++) from command line, (!!) notice, (II) informational,
	(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(==) Log file: "/var/log/Xorg.0.log", Time: Tue Oct 13 21:12:10 2009
(==) Using config file: "/etc/X11/xorg.conf"
(==) ServerLayout "X.org Configured"
(**) |-->Screen "Screen0" (0)
(**) |   |-->Monitor "Monitor0"
(**) |   |-->Device "Card0"
(**) |-->Input Device "Mouse0"
(**) |-->Input Device "Keyboard0"
(**) Option "AutoAddDevices" "False"
(**) Not automatically adding devices
(==) Automatically enabling devices
(WW) The directory "/usr/share/fonts/misc/" does not exist.
	Entry deleted from font path.
(WW) The directory "/usr/share/fonts/TTF/" does not exist.
	Entry deleted from font path.
(WW) The directory "/usr/share/fonts/OTF" does not exist.
	Entry deleted from font path.
(WW) The directory "/usr/share/fonts/Type1/" does not exist.
	Entry deleted from font path.
(WW) The directory "/usr/share/fonts/100dpi/" does not exist.
	Entry deleted from font path.
(WW) The directory "/usr/share/fonts/75dpi/" does not exist.
	Entry deleted from font path.
(==) FontPath set to:
	
(==) ModulePath set to "/usr/lib64/xorg/modules"
(II) Loader magic: 0xd20
(II) Module ABI versions:
	X.Org ANSI C Emulation: 0.4
	X.Org Video Driver: 5.0
	X.Org XInput driver : 4.0
	X.Org Server Extension : 2.0
(II) Loader running on linux
(--) using VT number 7

(--) PCI:*(0:1:0:0) 10de:0193:19f1:0416 nVidia Corporation G80 [GeForce 8800 GTS] rev 162, Mem @ 0xec000000/16777216, 0xd0000000/268435456, 0xea000000/33554432, I/O @ 0x0000df00/128, BIOS @ 0x????????/131072
(WW) Open ACPI failed (/var/run/acpid.socket) (No such file or directory)
(II) No APM support in BIOS or kernel
(II) System resource ranges:
	[0] -1	0	0xffffffff - 0xffffffff (0x1) MX[B]
	[1] -1	0	0x000f0000 - 0x000fffff (0x10000) MX[B]
	[2] -1	0	0x000c0000 - 0x000effff (0x30000) MX[B]
	[3] -1	0	0x00000000 - 0x0009ffff (0xa0000) MX[B]
	[4] -1	0	0x0000ffff - 0x0000ffff (0x1) IX[B]
	[5] -1	0	0x00000000 - 0x00000000 (0x1) IX[B]
(II) "extmod" will be loaded. This was enabled by default and also specified in the config file.
(II) "dbe" will be loaded. This was enabled by default and also specified in the config file.
(II) "glx" will be loaded. This was enabled by default and also specified in the config file.
(II) "record" will be loaded by default.
(II) "dri" will be loaded by default.
(II) "dri2" will be loaded by default.
(II) LoadModule: "glx"
(II) Loading /usr/lib64/xorg/modules/extensions//libglx.so
(II) Module glx: vendor="NVIDIA Corporation"
	compiled for 4.0.2, module version = 1.0.0
	Module class: X.Org Server Extension
(II) NVIDIA GLX Module  180.60  Mon May 11 15:53:29 PDT 2009
(II) Loading extension GLX
(II) LoadModule: "extmod"
(II) Loading /usr/lib64/xorg/modules/extensions//libextmod.so
(II) Module extmod: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.0.0
	Module class: X.Org Server Extension
	ABI class: X.Org Server Extension, version 2.0
(II) Loading extension MIT-SCREEN-SAVER
(II) Loading extension XFree86-VidModeExtension
(II) Loading extension XFree86-DGA
(II) Loading extension DPMS
(II) Loading extension XVideo
(II) Loading extension XVideo-MotionCompensation
(II) Loading extension X-Resource
(II) LoadModule: "dbe"
(II) Loading /usr/lib64/xorg/modules/extensions//libdbe.so
(II) Module dbe: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.0.0
	Module class: X.Org Server Extension
	ABI class: X.Org Server Extension, version 2.0
(II) Loading extension DOUBLE-BUFFER
(II) LoadModule: "record"
(II) Loading /usr/lib64/xorg/modules/extensions//librecord.so
(II) Module record: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.13.0
	Module class: X.Org Server Extension
	ABI class: X.Org Server Extension, version 2.0
(II) Loading extension RECORD
(II) LoadModule: "dri"
(WW) Warning, couldn't open module dri
(II) UnloadModule: "dri"
(EE) Failed to load module "dri" (module does not exist, 0)
(II) LoadModule: "dri2"
(WW) Warning, couldn't open module dri2
(II) UnloadModule: "dri2"
(EE) Failed to load module "dri2" (module does not exist, 0)
(II) LoadModule: "nvidia"
(II) Loading /usr/lib64/xorg/modules/drivers//nvidia_drv.so
(II) Module nvidia: vendor="NVIDIA Corporation"
	compiled for 4.0.2, module version = 1.0.0
	Module class: X.Org Video Driver
(II) LoadModule: "mouse"
(II) Loading /usr/lib64/xorg/modules/input//mouse_drv.so
(II) Module mouse: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.4.0
	Module class: X.Org XInput Driver
	ABI class: X.Org XInput driver, version 4.0
(II) LoadModule: "kbd"
(II) Loading /usr/lib64/xorg/modules/input//kbd_drv.so
(II) Module kbd: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.3.2
	Module class: X.Org XInput Driver
	ABI class: X.Org XInput driver, version 4.0
(II) NVIDIA dlloader X Driver  180.60  Mon May 11 15:33:16 PDT 2009
(II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
(II) Primary Device is: PCI 01@00:00:0
(II) Loading sub module "fb"
(II) LoadModule: "fb"
(II) Loading /usr/lib64/xorg/modules//libfb.so
(II) Module fb: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.0.0
	ABI class: X.Org ANSI C Emulation, version 0.4
(II) Loading sub module "wfb"
(II) LoadModule: "wfb"
(II) Loading /usr/lib64/xorg/modules//libwfb.so
(II) Module wfb: vendor="X.Org Foundation"
	compiled for 1.6.3.901, module version = 1.0.0
	ABI class: X.Org ANSI C Emulation, version 0.4
(II) Loading sub module "ramdac"
(II) LoadModule: "ramdac"
(II) Module "ramdac" already built-in
(II) resource ranges after probing:
	[0] -1	0	0xffffffff - 0xffffffff (0x1) MX[B]
	[1] -1	0	0x000f0000 - 0x000fffff (0x10000) MX[B]
	[2] -1	0	0x000c0000 - 0x000effff (0x30000) MX[B]
	[3] -1	0	0x00000000 - 0x0009ffff (0xa0000) MX[B]
	[4] -1	0	0x0000ffff - 0x0000ffff (0x1) IX[B]
	[5] -1	0	0x00000000 - 0x00000000 (0x1) IX[B]
(==) NVIDIA(0): Depth 24, (==) framebuffer bpp 32
(==) NVIDIA(0): RGB weight 888
(==) NVIDIA(0): Default visual is TrueColor
(==) NVIDIA(0): Using gamma correction (1.0, 1.0, 1.0)
(**) NVIDIA(0): Enabling RENDER acceleration
(II) NVIDIA(0): Support for GLX with the Damage and Composite X extensions is
(II) NVIDIA(0):     enabled.
(WW) NVIDIA(GPU-0): Unable to read EDID for display device CRT-1
(II) NVIDIA(0): NVIDIA GPU GeForce 8800 GTS (G80) at PCI:1:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 327680 kBytes
(--) NVIDIA(0): VideoBIOS: 60.80.0d.00.04
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 8800 GTS at PCI:1:0:0:
(--) NVIDIA(0):     CRT-1
(--) NVIDIA(0): CRT-1: 400.0 MHz maximum pixel clock
(II) NVIDIA(0): Assigned Display Device: CRT-1
(WW) NVIDIA(0): No valid modes for "1680x1050"; removing.
(WW) NVIDIA(0): No valid modes for "1440x900"; removing.
(WW) NVIDIA(0): 
(WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
(WW) NVIDIA(0):     "nvidia-auto-select".
(WW) NVIDIA(0): 
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0):     "nvidia-auto-select"
(II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
(WW) NVIDIA(0): Unable to get display device CRT-1's EDID; cannot compute DPI
(WW) NVIDIA(0):     from CRT-1's EDID.
(==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
(==) NVIDIA(0): Enabling 32-bit ARGB GLX visuals.
(--) Depth 24 pixmap format is 32 bpp
(II) do I need RAC?  No, I don't.
(II) resource ranges after preInit:
	[0] -1	0	0xffffffff - 0xffffffff (0x1) MX[B]
	[1] -1	0	0x000f0000 - 0x000fffff (0x10000) MX[B]
	[2] -1	0	0x000c0000 - 0x000effff (0x30000) MX[B]
	[3] -1	0	0x00000000 - 0x0009ffff (0xa0000) MX[B]
	[4] -1	0	0x0000ffff - 0x0000ffff (0x1) IX[B]
	[5] -1	0	0x00000000 - 0x00000000 (0x1) IX[B]
(II) NVIDIA(0): Initialized GPU GART.
(II) NVIDIA(0): ACPI: failed to connect to the ACPI event daemon; the daemon
(II) NVIDIA(0):     may not be running or the "AcpidSocketPath" X
(II) NVIDIA(0):     configuration option may not be set correctly.  When the
(II) NVIDIA(0):     ACPI event daemon is available, the NVIDIA X driver will
(II) NVIDIA(0):     try to use it to receive ACPI event notifications.  For
(II) NVIDIA(0):     details, please see the "ConnectToAcpid" and
(II) NVIDIA(0):     "AcpidSocketPath" X configuration options in Appendix B: X
(II) NVIDIA(0):     Config Options in the README.
(II) NVIDIA(0): Setting mode "nvidia-auto-select"
(II) Loading extension NV-GLX
(II) NVIDIA(0): NVIDIA 3D Acceleration Architecture Initialized
(==) NVIDIA(0): Disabling shared memory pixmaps
(II) NVIDIA(0): Using the NVIDIA 2D acceleration architecture
(==) NVIDIA(0): Backing store disabled
(==) NVIDIA(0): Silken mouse enabled
(II) NVIDIA(0): DPMS enabled
(II) Loading extension NV-CONTROL
(II) Loading extension XINERAMA
(==) RandR enabled
(II) Initializing built-in extension Generic Event Extension
(II) Initializing built-in extension SHAPE
(II) Initializing built-in extension MIT-SHM
(II) Initializing built-in extension XInputExtension
(II) Initializing built-in extension XTEST
(II) Initializing built-in extension BIG-REQUESTS
(II) Initializing built-in extension SYNC
(II) Initializing built-in extension XKEYBOARD
(II) Initializing built-in extension XC-MISC
(II) Initializing built-in extension XINERAMA
(II) Initializing built-in extension XFIXES
(II) Initializing built-in extension RENDER
(II) Initializing built-in extension RANDR
(II) Initializing built-in extension COMPOSITE
(II) Initializing built-in extension DAMAGE
(II) Initializing extension GLX
(**) Option "Protocol" "auto"
(**) Option "Device" "/dev/input/mice"
(II) Mouse0: Setting mouse protocol to "ExplorerPS/2"
(**) Mouse0: Device: "/dev/input/mice"
(**) Mouse0: Protocol: "auto"
(**) Option "CorePointer"
(**) Mouse0: always reports core events
(**) Option "Device" "/dev/input/mice"
(==) Mouse0: Emulate3Buttons, Emulate3Timeout: 50
(**) Option "ZAxisMapping" "4 5 6 7"
(**) Mouse0: ZAxisMapping: buttons 4, 5, 6 and 7
(**) Mouse0: Buttons: 11
(**) Mouse0: Sensitivity: 1
(II) XINPUT: Adding extended input device "Mouse0" (type: MOUSE)
(**) Mouse0: (accel) keeping acceleration scheme 1
(**) Mouse0: (accel) filter chain progression: 2.00
(**) Mouse0: (accel) filter stage 0: 20.00 ms
(**) Mouse0: (accel) set acceleration profile 0
(II) Mouse0: Setting mouse protocol to "ExplorerPS/2"
(II) Mouse0: ps2EnableDataReporting: succeeded
(**) Option "CoreKeyboard"
(**) Keyboard0: always reports core events
(**) Option "Protocol" "standard"
(**) Keyboard0: Protocol: standard
(**) Option "AutoRepeat" "500 30"
(**) Option "XkbRules" "xorg"
(**) Keyboard0: XkbRules: "xorg"
(**) Option "XkbModel" "pc105"
(**) Keyboard0: XkbModel: "pc105"
(**) Option "XkbLayout" "fi"
(**) Keyboard0: XkbLayout: "fi"
(**) Option "CustomKeycodes" "off"
(**) Keyboard0: CustomKeycodes disabled
(II) XINPUT: Adding extended input device "Keyboard0" (type: KEYBOARD)
(II) config/hal: Adding input device NOVATEK USB Keyboard
(EE) config/hal: NewInputDeviceRequest failed (8)
(II) config/hal: Adding input device NOVATEK USB Keyboard
(EE) config/hal: NewInputDeviceRequest failed (8)
(II) config/hal: Adding input device Logitech USB-PS/2 Optical Mouse
(EE) config/hal: NewInputDeviceRequest failed (8)
 
Old 10-13-2009, 06:06 PM   #2
scheme
Additional question:

My primary concern is that I left something out of the kernel that needs to be there. How could I check that?

The monitor isn't a CRT but an Acer AL2216W (21" LCD), so obviously it isn't configured right either, and that seems to be at least part of the problem.
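(On the kernel question above: a quick way to audit the config is to grep it for the options the binary nvidia driver cares about. A minimal sketch, run here against a hypothetical config excerpt so it is safe to execute; on the real box, point it at /usr/src/linux/.config or `zcat /proc/config.gz` instead. The two checks shown are the usual suspects, not a complete list.)

```shell
# Hypothetical excerpt of a kernel .config so this is self-contained;
# substitute /usr/src/linux/.config on a real Gentoo box.
CONFIG=/tmp/demo-kernel-config
cat > "$CONFIG" <<'EOF'
CONFIG_MTRR=y
# CONFIG_FB_NVIDIA is not set
# CONFIG_FB_RIVA is not set
EOF

# MTRR should be enabled, and the in-kernel nvidiafb/rivafb framebuffer
# drivers should NOT be set -- they are known to conflict with nvidia.ko.
grep -E '^CONFIG_MTRR=' "$CONFIG"
grep -E '^# CONFIG_FB_(NVIDIA|RIVA)' "$CONFIG"
```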
 
Old 10-13-2009, 06:32 PM   #3
manwithaplan
I take it that you have run nvidia-xconfig? This is obviously your problem:
Code:
(WW) NVIDIA(GPU-0): Unable to read EDID for display device CRT-1
(II) NVIDIA(0): NVIDIA GPU GeForce 8800 GTS (G80) at PCI:1:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 327680 kBytes
(--) NVIDIA(0): VideoBIOS: 60.80.0d.00.04
(II) NVIDIA(0): Detected PCI Express Link width: 16X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 8800 GTS at PCI:1:0:0:
(--) NVIDIA(0):     CRT-1
(--) NVIDIA(0): CRT-1: 400.0 MHz maximum pixel clock
(II) NVIDIA(0): Assigned Display Device: CRT-1
(WW) NVIDIA(0): No valid modes for "1680x1050"; removing.
(WW) NVIDIA(0): No valid modes for "1440x900"; removing.
(WW) NVIDIA(0): 
(WW) NVIDIA(0): Unable to validate any modes; falling back to the default mode
(WW) NVIDIA(0):     "nvidia-auto-select".
(WW) NVIDIA(0): 
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0):     "nvidia-auto-select"
(II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
(WW) NVIDIA(0): Unable to get display device CRT-1's EDID; cannot compute DPI
(WW) NVIDIA(0):     from CRT-1's EDID.
(==) NVIDIA(0): DPI set to (75, 75); computed from built-in default


Try this in your xorg.conf:
Code:
Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "metamodes" ""1680x1050"_60 +0+0; nvidia-auto-select +0+0"
    Option         "NoLogo" "True"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
Then adjust later with nvidia-settings or a manual edit.

Last edited by manwithaplan; 10-13-2009 at 06:34 PM.
 
Old 10-13-2009, 07:09 PM   #4
scheme
I tried it (after removing the quotes from around the resolution; it wouldn't load with them). It fell back to nvidia-auto-select, and the Xorg.0.log output was basically the same. I checked nvidia-settings and I can't go above 1024x768. I don't really know how to adjust it to make it work. I added the resolution modes myself after trying both Xorg -configure and nvidia-xconfig generated files, which either wouldn't work or gave the same result as now.
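(A debugging aid for the mode rejection above: the nvidia driver can log its mode validation in detail, which shows *why* each mode is thrown away. A sketch, assuming the `ModeDebug` option; it is documented in nvidia's README for drivers of roughly this generation, so verify against the 180.60 README. Starting the server with `-logverbose 6` gives similar detail.)
Code:
Section "Device"
	Identifier "Card0"
	Driver     "nvidia"
	# Writes per-mode validation results to Xorg.0.log
	Option     "ModeDebug" "True"
EndSection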
 
Old 10-13-2009, 07:18 PM   #5
manwithaplan
Are you using a VGA or DVI cable? Not that it should make a difference... How about your kernel? Did you use genkernel, or a custom one?

It's strange that it's detecting the LCD panel as a CRT. Did you use the sample I gave you? Did you set nvidia in your make.conf? I always hated the way Gentoo handled the kernel and nvidia drivers through portage. In fact, I switched to Arch today because of some portage instability issues that were annoying.

I ended up unmasking xorg-server 1.6.4 and used the nvidia drivers from their website... then used zen-sources for a kernel. It was much more stable.

EDIT: I see that you are using xorg 1.6.4.

One more thing: are you using a KVM switch?

Last edited by manwithaplan; 10-13-2009 at 07:21 PM.
 
Old 10-13-2009, 07:32 PM   #6
scheme
I'm using a VGA cable with a DVI adapter on the end (sucky monitor, good video card).

I used the exact sample you gave me first, but it didn't load and the log said "1680x1050" is not a valid keyword in this section, so I dropped the quotes and it ran.

VIDEO_CARDS="nvidia" was added to make.conf before compiling anything.

I ran genkernel --menuconfig, hence my small concern. I removed some obviously useless things like wireless support, but you never know if I accidentally fell on the spacebar in the wrong place.

I've had similar problems with Ubuntu before; I'm thinking it's just the monitor not sending the right information, maybe partly because of the adapter.
 
Old 10-13-2009, 07:42 PM   #7
manwithaplan
I actually have a KVM on a DVI/VGA adapter... and my screen always blanked. I don't have a widescreen, but it's an LCD. I then just used my DVI cable and it made a big difference, at the cost of switching cables for my KVM.

I would check nvidia's forums for a similar problem with xorg... or try upgrading to the newest drivers. I use custom kernels; they're much faster, and tuned for your hardware.

Try just this line, with quotes:

Quote:
Option "metamodes" ""1680x1050"_60 +0+0"
 
Old 10-13-2009, 08:09 PM   #8
scheme
The code snippet gave the same output, with quotes (not a valid mode) and without (fell back to nvidia-auto-select).

I tried forcing the mode with xrandr but the output was
Code:
Configure crtc 0 failed
What I know is the EDID information isn't coming through properly, but I can't see how that would stop me from setting modes through xrandr.

Also, my driver version is 180.60, the highest one in portage. The nvidia site has a higher version, but I don't have a good history with those; they have usually ended up breaking a lot of things.
 
Old 10-13-2009, 08:19 PM   #9
manwithaplan
From what I gather, you can get an X display, correct? Just at 1024x768...

And your xrandr command to add the mode was..?
Code:
xrandr --addmode VGA-0 1680x1050
BTW... you can unmask the latest ~ (testing) nvidia drivers.
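(On Gentoo that unmask is one keyword line plus a re-emerge. A sketch using a scratch path so it is safe to run as-is; on the real system the file is /etc/portage/package.keywords, followed by `emerge -av x11-drivers/nvidia-drivers`.)

```shell
# Demo against a scratch file; on a real box append (>>) to
# /etc/portage/package.keywords and then run `emerge -av nvidia-drivers`.
KEYWORDS=/tmp/demo-package.keywords
echo 'x11-drivers/nvidia-drivers ~amd64' > "$KEYWORDS"
cat "$KEYWORDS"
```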

EDIT: under Monitor, set the vertical & horizontal sync rates

Quote:
e.g. Monitor

HorizSync 28.0 - 33.0
VertRefresh 43.0 - 72.0
Maybe set a modeline... I'll post a link later, or you can search for a modeline calculator

Quote:
e.g. Monitor

Modeline "1680x1050" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
Option "PreferredMode" "1680x1050"
Use this modeline generator:
http://xtiming.sourceforge.net/cgi-bin/xtiming.pl

This is definitely your problem:

Quote:
(WW) NVIDIA(GPU-0): Unable to read EDID for display device CRT-1

Last edited by manwithaplan; 10-13-2009 at 08:38 PM.
 
Old 10-13-2009, 08:51 PM   #10
scheme
I ran:
Code:
$ cvt 1680 1050
$ xrandr --newmode "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync # Pasted output from cvt
$ xrandr --addmode default 1680x1050_60.00
$ xrandr --output default --mode 1680x1050_60.00
xrandr: Configure crtc 0 failed
Setting the Monitor hsync and vrefresh made the resolution go smaller (at least something happened!), while the Modeline made no difference.

I unmasked and installed nvidia-drivers-185.18.31. Nothing changed.

Last edited by scheme; 10-13-2009 at 09:48 PM.
 
Old 10-14-2009, 08:07 AM   #11
manwithaplan
I came across these Xorg server settings that might come in handy...
Code:
Option "ModeValidation" "NoEdidModes"
Option "UseEdid"        "false"
I think ignoring the EDID will allow you to have the resolution you want.

If I remember right, there is also an EDID option in the kernel under Device Drivers; I can't remember exactly where... And I don't know if it will help with the EDID not passing correct info to the nvidia driver. Worth a check.
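(One caveat with ignoring the EDID: the driver is then left with no sync ranges at all, so the Monitor section needs explicit ones wide enough for 1680x1050. The 28-33 kHz HorizSync suggested earlier tops out around 640x480, which would explain the resolution shrinking. A sketch; the ranges below are typical of a 1680x1050 panel and must be verified against the AL2216W manual.)
Code:
Section "Monitor"
	Identifier   "Monitor0"
	# Typical for a 1680x1050 LCD -- check the monitor's manual
	HorizSync    30.0 - 82.0
	VertRefresh  56.0 - 76.0
EndSection

Section "Device"
	Identifier "Card0"
	Driver     "nvidia"
	Option     "UseEdid" "false"
	Option     "ModeValidation" "NoEdidModes"
EndSection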
 
Old 10-14-2009, 01:14 PM   #12
scheme
At startup X decides to ignore the EDID. When it comes to the modes part, it can't validate any and falls back to nvidia-auto-select, after which it tries to find the EDID, can't get it, and sets everything back to default. In KDE I tried my xrandr moves on it, without success. I checked /usr/src/linux/.config and couldn't find anything obviously missing; all the EDID and screen detection stuff seemed to be there (I'm a noob at kernel configuration though; here's an attachment).

I also tried an old CRT screen with a VGA cable and the same problem occurred, so it's not the monitor.
Attached Files
File Type: txt kernel-config.txt (64.4 KB, 12 views)
 
Old 10-14-2009, 01:50 PM   #13
manwithaplan
The kernel config looks OK... though I would check nvidia's forums for additional help, and see if this is a known issue; if not, file a bug report.
 
Old 10-14-2009, 02:37 PM   #14
scheme
I tried the EDID settings on the Studio64 installation (with the nv, not nvidia, driver). It finds the EDID from the right port, but it's flawed: the maximum screen size is 1280x1280, and when I try to set up a new mode it just says it's too high. It gives me one resolution with the right aspect ratio as a default; the rest are bogus.

A quick glance at the nvidia forums didn't give anything relevant.
 
Old 10-14-2009, 02:51 PM   #15
manwithaplan
See if this link is of any help: http://forums.nvidia.com/index.php?showtopic=73027 . It mentions an alternative solution for acquiring the EDID.

And this option seemed possible:
Code:
Option "UseDisplayDevice" "CRT"
under Section "Device"

As an alternative... try the newest beta driver from nvidia's site. The issue has something to do with either the driver or xorg.
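(For reference, the alternative EDID approach usually means dumping the EDID once, e.g. with `get-edid` from the read-edid package, or from a setup where it is readable, such as the Studio64/nv install mentioned earlier, and feeding the file back to the driver. A sketch, assuming the nvidia `CustomEDID` option, which is documented in the driver README of roughly this era; the file path is hypothetical.)
Code:
Section "Device"
	Identifier "Card0"
	Driver     "nvidia"
	# Point this at your dumped EDID file
	Option     "CustomEDID" "CRT-1:/etc/X11/edid.bin"
EndSection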
 
  

