My old 32-bit system, on which I was running bleeding-edge
Rawhide, died when folks working in my house fried my motherboard by plugging a
sheetrock saw into my UPS (brilliant!). I've replaced my old system with
a home-built system with an Asus P5Q SE2 motherboard and a GeForce 8400 GS
graphics controller. To allow things to stabilize for a while before
going back to bleeding-edge, I'm now running F12 with updates-testing rather
than Rawhide. Also, I've switched to 64-bit.

My monitor, a ViewSonic VX2235wm LCD, has both VGA and DVI
inputs. With my old system, I was using VGA, since my graphics controller
didn't have DVI out. The GeForce, however, has DVI out, so I'd like to
start using it. The problem is that when I try to switch from VGA to DVI,
the monitor claims that it's getting no digital signal. This could be due
to any number of different problems, e.g.:

- Broken monitor
- Broken graphics controller
- Bad / incompatible DVI cable
- Incompatibility between the kind of DVI signal the controller is
  generating and the kind of signal the monitor accepts (I hear rumors
  that this kind of thing happens, but I know very little about the
  various video output formats and therefore have no insight into how
  to diagnose it)
- Known bug / missing support for DVI output in the kernel and/or X
  driver
- Regression in the kernel and/or X driver
- Me doing something stupid

I'm hoping there's somebody here who can help me narrow down the
possibilities so that I know whether I should file a bug about this,
replace some hardware, just live with it, or what.
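For reference, the basic checking I've been doing with xrandr
(mentioned below) looks roughly like this; DVI-0 is a placeholder, and
I'd substitute whatever output name xrandr actually reports:

    # See which outputs the X server thinks are connected, and
    # which modes it detected for each:
    xrandr --query

    # Try to force the DVI output on at its preferred mode:
    xrandr --output DVI-0 --auto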
Here's some additional information that is probably relevant (a few
more command-line sketches of things I've tried, or could try next,
follow the list):

- I've got no xorg.conf file (i.e., I'm using whatever settings the X
  server determines automatically).
- My Smolt profile is at
  http://www.smolts.org/client/show/pub_f04ce08e-bef7-4c17-8d9d-43905badf700.
- When I connect the monitor with just DVI and reboot the system, the
  BIOS and GRUB screens (well, I'm sure about the BIOS screens but
  less certain about the GRUB screen) render just fine through DVI,
  but then the screen goes blank.
- When the DVI cable is not plugged in, xrandr says that it's not
  plugged in. When it is plugged in, xrandr acknowledges that it has
  been plugged in.
- I'm not certain, but I believe that when I unplug the DVI cable
  after it has been plugged in, xrandr doesn't notice that it has been
  unplugged.
- Judging from my Xorg.0.log
  (http://stuff.mit.edu./~jik/misc/Xorg.0.log.txt), the X server
  notices when the DVI cable is inserted or removed.
- When I open System > Preferences > Display with both VGA and DVI
  connected, it acknowledges that there are two monitors connected.
  If I then tell it to mirror my screens, apply the changes, and
  either unplug the VGA cable or use the monitor menu to switch to
  DVI input, the monitor goes blank. Ditto if I tell it that I want
  to extend my display to both screens.
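The command-line equivalent of the mirror / extend settings in that
dialog would be something like this (again, DVI-0 and VGA-0 stand in
for whatever names xrandr reports):

    # Mirror the VGA output onto the DVI output:
    xrandr --output DVI-0 --same-as VGA-0 --auto

    # Or extend the desktop to the right instead of mirroring:
    xrandr --output DVI-0 --right-of VGA-0 --auto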
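Since the BIOS and GRUB screens render fine over DVI but things blank
out once the kernel takes over, I suppose the kernel and X logs are
the next place to look; something like this, where the drm / nouveau
patterns are guesses about which driver F12 loads for this card:

    # Kernel modesetting / DRM messages about the outputs:
    dmesg | grep -i -e drm -e nouveau

    # Watch the X log for connect / disconnect events while
    # plugging and unplugging the DVI cable:
    tail -f /var/log/Xorg.0.log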
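And since I have no xorg.conf at all, one experiment I could try is
pinning the panel's native mode to the DVI output by hand. From the
xorg.conf(5) man page, I gather the RandR 1.2 way looks something like
this (the "Monitor-DVI-0" option name depends on the actual output
name, and 1680x1050 is just this panel's native mode):

    Section "Monitor"
        Identifier "DVIPanel"
        Option     "PreferredMode" "1680x1050"
    EndSection

    Section "Device"
        Identifier "Videocard0"
        # Bind the Monitor section above to the DVI output:
        Option     "Monitor-DVI-0" "DVIPanel"
    EndSection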
I think that's everything I've got. So, does anybody have any
insights or suggestions?

Thanks,

Jik