On Tue, 2010-01-12 at 10:27 -0500, Jonathan Kamens wrote:
> My old 32-bit system, on which I was running bleeding-edge Rawhide,
> died when folks working in my house fried my motherboard by plugging a
> sheetrock saw into my UPS (brilliant!). I've replaced my old system
> with a home-built system with an Asus P5Q SE2 motherboard and a
> GeForce 8400 GS graphics controller. To allow things to stabilize for
> a while before going back to bleeding-edge, I'm now running F12 with
> updates-testing rather than Rawhide. Also, I've switched to 64-bit.
>
> My monitor, a ViewSonic VX2235wm LCD, has both VGA and DVI inputs.
> With my old system, I was using VGA, since my graphics controller
> didn't have DVI out. The GeForce, however, has DVI out, so I'd like to
> start using it. The problem is that when I try to switch from VGA to
> DVI, the monitor claims that it's getting no digital signal. This
> could be due to any number of different problems, e.g.:
>
> · Broken monitor
>
> · Broken graphics controller
>
> · Bad / incompatible DVI cable
>
> · Incompatibility between the kind of DVI signal the controller is
>   generating and the kind of signal the monitor accepts (I hear rumors
>   that this kind of thing happens, but I know very little about the
>   various video out formats and therefore have no insight into it or
>   how to diagnose it)

The DVI cable / connector standard is actually capable of transmitting
analog signals as well as digital (DVI-A, rather than DVI-D). A
connector which can do both is referred to as DVI-I. You could,
theoretically, wind up with a situation where you're trying to connect
a video card that can only do DVI-A to a monitor that only does DVI-D,
or something. It's really rather unlikely, though. Wikipedia's page is
reasonable:

http://en.wikipedia.org/wiki/Digital_Visual_Interface

> · Known bug / non-support for DVI output in kernel and/or X driver
>
> · Regression in kernel and/or X driver
>
> · Me doing something stupid
>
> I'm hoping there's somebody here who can help me narrow down the
> possibilities so that I know whether I should file a bug about this,
> replace some hardware, just live with it, or what.

It's very likely a driver issue, to be honest. I would usually diagnose
this just by swapping bits out - try a different cable, a different
monitor, and a different computer (and possibly a different operating
system). Of course, that gets much trickier if you don't have spares of
all of the above available. :)

For a start, you could send us your /var/log/Xorg.0.log files - one
from a successful X start with an old-style VGA cable, and one from a
failed attempt with a DVI connection - so we can compare. (There's a
rough log-skimming sketch further down if you want to take a first pass
yourself.)

If we can't diagnose it very easily, you may want to just give up and
not worry about it. Theoretically a digital signal should give you a
better picture, particularly on an LCD, than an analog one (which
involves _two_ unnecessary conversion stages, one D-A and then one
A-D). In practice, however, it's bloody hard to tell the difference in
most situations. I have two monitors on my home setup, one connected by
an old VGA cable and one by DVI, and I really don't see a difference.

> · When I connect the monitor with just DVI and reboot the system, the
>   BIOS and GRUB screens (well, I'm sure about the BIOS screens but
>   less certain about the GRUB screen) render just fine through DVI,
>   but then the screen goes blank.

This makes it even *more* likely to be a driver issue. In fact, I'd say
it's 99.99% certain at that point.
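On the log-comparison front, here's the rough sketch I mentioned above.
It's just a convenience, nothing official: the script name
skim_xorg_log.py and the file names vga.log / dvi.log are placeholders
for saved copies of /var/log/Xorg.0.log from the two boots, and the
keyword list is simply my guess at which lines matter (output probing,
EDID, connect/disconnect, mode setup).

#!/usr/bin/env python
# Rough sketch: skim one or more Xorg logs for the lines that usually
# matter when an output stays blank (output probing, EDID, mode lines).
# The keyword list is just a guess, not anything definitive.
import re
import sys

INTERESTING = re.compile(r'DVI|VGA|EDID|connected|disconnected|Output|Modeline',
                         re.IGNORECASE)

def skim(path):
    # Print a header per file, then only the lines that look relevant.
    print('==== %s ====' % path)
    log = open(path)
    for line in log:
        if INTERESTING.search(line):
            print(line.rstrip())
    log.close()

if __name__ == '__main__':
    # e.g.: python skim_xorg_log.py vga.log dvi.log
    # (vga.log / dvi.log being saved copies of /var/log/Xorg.0.log from
    #  the VGA boot and the DVI boot respectively)
    for path in sys.argv[1:] or ['/var/log/Xorg.0.log']:
        skim(path)

Run it over both saved logs and eyeball the two chunks of output side
by side; the point where the DVI run stops mentioning the output is
usually the interesting bit.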
> · Judging from my Xorg.0.log
>   (http://stuff.mit.edu./~jik/misc/Xorg.0.log.txt), the X server
>   notices when the DVI cable is inserted or removed.

Oh hey, I should've spotted that! I'll take a look.
-- 
Adam Williamson
Fedora QA Community Monkey
IRC: adamw | Fedora Talk: adamwill AT fedoraproject DOT org
http://www.happyassassin.net