Re: RGB/PAL over VGA at variable frame rate

Wow, when I buy an AMD card I'll be sure to look up your code. :)

Currently I'm still using a Pentium 4 2.4 GHz machine with an NVIDIA 440 MX AGP card. The only way to get it to work properly was with the older NVIDIA driver 71.86.0; apparently the newer drivers force PAL (and every other TV standard) to run at 60 Hz instead of the 50 Hz my broadcast uses. So I had to "downgrade" the driver to get the proper output.

These options in my xorg.conf disable the driver's automatic mode settings:

Section "Monitor"
    .
    .
    ModeLine "720x576PAL"   27.50   720 744 800 880 576 582 588 625 -hsync -vsync
    ModeLine "720x576@50i"  14.0625 720 760 800 900 576 582 588 625 -hsync -vsync interlace
    .
EndSection
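
As a sanity check of those timings (my own arithmetic, not something the driver reports): a modeline's refresh rate is the pixel clock divided by the total pixels per frame, clock / (htotal * vtotal). Both modes land exactly on the 50 Hz the broadcast needs:

    "720x576PAL":   27,500,000 / (880 * 625) = 50.0 frames/s (progressive)
    "720x576@50i":  14,062,500 / (900 * 625) = 25.0 frames/s = 50 fields/s (interlaced)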

Section "Screen"
    .
    .
    # tell the NVIDIA driver to ignore EDID frequency limits and DPI,
    # and to leave EDID-advertised modes out of mode validation
    Option         "UseEDIDFreqs" "FALSE"
    Option         "UseEDIDDpi" "FALSE"
    Option         "ModeValidation" "NoEdidModes"
    SubSection "Display"
         Modes       "720x576PAL"
    EndSubSection
    .
EndSection

xvidtune reports this on DISPLAY=:0.1:
 "720x576"      27.50    720  744  800  880    576  582  588  625 -hsync -vsync

CPU load is 10% with xineliboutput set to use XvMC. My CPU fan even turns off; it only kicks in when I watch an XviD/DivX type movie.

Theunis

2008/7/22 Thomas Hilber <vdr@xxxxxx>:
Hi list,

over the last few days I've had some interesting experiences with VGA cards
that I now want to share with you.

goal
----

develop a budget-card-based VDR with PAL/RGB output and FF-like (full featured card) output quality

problem
-------

as we all know, current VGA graphics output quality suffers from certain
limitations. Graphics cards so far operate at a fixed frame rate that is not
properly synchronized with the stream. Thus fields or even frames often do
not appear at the output at the right time: some are doubled, others are
lost, ultimately leading to more or less jerky playback.

To a certain degree you can work around this with software deinterlacing, at
the cost of worse picture quality when playing interlaced material. It also
considerably increases CPU load.

Outputting true RGB PAL at a variable frame rate, and thus staying fully
synchronized with the stream, appeared to be the privilege of so called full
featured cards (expensive cards running proprietary firmware).

I've always been bothered by that, and finally started to develop a few
patches with the goal of overcoming these VGA graphics limitations.

solution
--------

graphics cards are basically not designed for variable frame rates. Once you
have set up their timing, they provide no means (such as registers) to
synchronize the frame rate with an external timer. But that's exactly what's
needed for the signal output to stay in sync with the frame rate provided by
xine-lib or other software decoders.

To extend or reduce the overall time between vertical retraces I first
dynamically added/removed a few scanlines in the modeline, but with bad
results: the picture visibly jumped on the TV set.

After some further experimenting I finally found a way to fine-tune the
frame rate of my elderly Radeon card, this time without any bad side effects
on the screen: just trimming the length of a few scanlines during the
vertical retrace period does the trick.
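
To make the mechanism concrete, here is a minimal sketch of the idea (not
the actual patch: the RADEON_READ/RADEON_WRITE macros, the drm_radeon_private_t
type and the RADEON_CRTC_H_TOTAL_DISP register exist in the legacy Radeon DRM
driver, but the field mask and the trim logic here are simplified assumptions):

/*
 * Stretch or shrink the current frame by a few pixel clocks.
 * RADEON_CRTC_H_TOTAL_DISP holds the horizontal total in its low bits
 * (programmed in 8-pixel units on Radeon). Enlarging the h total only
 * while we are inside the vertical retrace makes those few invisible
 * scanlines slightly longer, so the frame period changes without
 * disturbing the visible picture geometry.
 */
#define RADEON_CRTC_H_TOTAL_DISP 0x0200  /* CRTC timing register */
#define H_TOTAL_MASK             0x03ff  /* low bits: h total (assumed layout) */

static void trim_frame_length(drm_radeon_private_t *dev_priv, int trim)
{
        u32 reg    = RADEON_READ(RADEON_CRTC_H_TOTAL_DISP);
        u32 htotal = reg & H_TOTAL_MASK;

        /* called from the vertical retrace: lengthen/shorten scanlines */
        RADEON_WRITE(RADEON_CRTC_H_TOTAL_DISP,
                     (reg & ~H_TOTAL_MASK) | ((htotal + trim) & H_TOTAL_MASK));

        /* the original value must be restored before active video resumes */
}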

Then I tried to implement the new functionality with only minimal changes to
my current VDR development system. The Radeon DRM driver is perfectly suited
for that; I just had to add a few lines of code there.

I finally ended up with a small patch against the Radeon DRM driver and an
even smaller one against xine-lib. The latter could also live directly in the
Xserver. Please see the attachments for code samples.

When xine-lib calls PutImage() it checks whether to increase or decrease the
Xserver's frame rate. This way, after a short adaptation phase, xine-lib can
place its PutImage() calls right in the middle between two adjacent vertical
blanking intervals. This provides maximum immunity against jitter. And even
better: no more frames/fields are lost to frequency drift between the stream
and the graphics card.
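
The control loop could look roughly like this (again just a sketch of the
idea; the helper functions and the dead-band constant are hypothetical, not
taken from the patch):

#include <stdint.h>

enum trim { TRIM_NONE, TRIM_STRETCH /* slow down */, TRIM_SHRINK /* speed up */ };

/* hypothetical helpers, both returning microseconds */
extern int64_t usec_since_last_vblank(void);
extern int64_t frame_period_usec(void);

/* decide on each PutImage() how to nudge the output frame rate so the
 * call drifts toward the middle between two vertical retraces; raising
 * the rate pulls the vblanks earlier, which grows the measured phase */
static enum trim choose_trim(void)
{
        int64_t phase  = usec_since_last_vblank();
        int64_t period = frame_period_usec();
        int64_t target = period / 2;    /* middle of the frame period */
        int64_t window = period / 8;    /* dead band against hunting  */

        if (phase < target - window)
                return TRIM_SHRINK;     /* phase too small: speed output up */
        if (phase > target + window)
                return TRIM_STRETCH;    /* phase too large: slow output down */
        return TRIM_NONE;
}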

Because we now do without any deinterlacing, we are also rid of all its
disadvantages:

When driving a device with native interlaced input (e.g. a traditional TV
set or a modern TFT with good RGB support) there are no deinterlacing
artifacts anymore.

And since software decoders are now relieved of CPU-intensive deinterlacing,
we can build cheap budget-card-based VDRs with slow CPUs.

Please find attached two small patches showing the basic idea, plus a
description of my test environment. The project is far from complete, but
even at this early stage of development it shows promising results.

It should give you a rough idea of how to recycle your old hardware into a
smoothly running budget VDR with high-quality RGB video output.

some suggestions what to do next:
- detection of initial field parity
- faster initial frame rate synchronisation after starting replay
- remove some hard-coded constants (special dependencies on my system's timing)

Some more information about the project is available here:
http://www.vdr-portal.de/board/thread.php?threadid=78480

Currently it's all based on Radeons, but I'll try to port it to other types
of VGA cards as well. There will be some updates in the near future. Stay
tuned.

-Thomas


_______________________________________________
vdr mailing list
vdr@xxxxxxxxxxx
http://www.linuxtv.org/cgi-bin/mailman/listinfo/vdr

