Re: DVI output under XFree86 (nVidia GeForce)

On Fri, Oct 11, 2002 at 10:11:16AM -0700, Stefan Llewellyn Smith wrote:
> I had terrible trouble two years ago trying to get an LCD monitor to run 
> with XFree86. This was with an nVidia Corporation GeForce 256 DDR card 
> and was not cured by getting the nVidia drivers.

For quite a long while I have had around a machine with an NEC LCD1830
monitor and a GeForce card.  It is a GeForce2 MX, PCI id 10de:0110
(rev b2), but the standard XFree86 driver I am using with it claims to
support the GeForce DDR as well.
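
For reference, a minimal XF86Config-4 Device section for the stock
open-source "nv" driver might look roughly like the sketch below; the
Identifier and BusID values are purely illustrative, not taken from my
actual configuration, and the BusID line can usually be omitted on a
single-card system:

    Section "Device"
        Identifier "GeForce2 MX"   # illustrative name, referenced from the Screen section
        Driver     "nv"            # open-source XFree86 driver for nVidia cards
        BusID      "PCI:1:0:0"     # adjust (or drop) to match the card's actual PCI location
    EndSection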

It happens infrequently (on the order of once every two months, say)
that the whole thing goes into a "funk" mode and the screen fills up
with weird colour patterns.  It seems that it is the card itself which
starts to misbehave, as just restarting X may not be enough to clear
that.  I am not really sure.  OTOH things are pushed a bit.  Usually
at least two independent, and quite different-looking, monitor displays
are active, although only one of them is in use at any given time.

With this monitor I have never seen the point of messing with the
binary-only drivers from Nvidia.

  Michal



_______________________________________________
xfree86-list mailing list
xfree86-list@redhat.com
https://listman.redhat.com/mailman/listinfo/xfree86-list
IRC: #xfree86 on irc.redhat.com
