On 16/01/11 01:16, VDR User wrote:
> On Sat, Jan 15, 2011 at 2:36 PM, Tony Houghton <h@xxxxxxxxxxx> wrote:
>> I wonder whether it might be possible to use a more economical card
>> which is only powerful enough to decode 1080i without deinterlacing it,
>> and take advantage of the abundant CPU power most people have nowadays
>> to perform software deinterlacing. It may not be possible to have
>> something as sophisticated as NVidia's temporal + spatial, but some of
>> the existing software filters should scale up to HD without overloading
>> the CPU, seeing as it wouldn't be doing the decoding too.
> Well, you can get a GT220 for around $40 USD which does full-rate
> temporal-spatial 1080i, and it lets you pair it with an old, slow CPU
> that's dirt cheap if you don't already have one collecting dust in your
> basement. Not sure how much more economical you can get, aside from free.
I also/mainly mean more economical in terms of power consumption and ease
of installation and cooling. Most cheap GT220s have fans (most likely cheap
and noisy ones), so I wouldn't want one of them in my HTPC. A fanless one
might overheat packed in closely with my DVB cards. But many motherboards
already have integrated NVidia chipsets with HDMI (including audio) and
basic VDPAU functionality. Mine is an 8200, and I know there's also been a
lot of interest in Ion systems for HTPCs, so I think finding some way of
getting these systems to display 1080i nicely would be a good move.
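
For anyone curious what their integrated chipset's VDPAU driver actually
claims to support, here's a rough sketch (untested, written from the
vdpau/vdpau.h headers) that queries whether the temporal and
temporal-spatial deinterlacers are reported by the video mixer. The
vdpauinfo utility prints the same information; this just shows the minimal
code path. Build with something like: gcc vdpau_deint.c -lvdpau -lX11

#include <stdio.h>
#include <X11/Xlib.h>
#include <vdpau/vdpau.h>
#include <vdpau/vdpau_x11.h>

int main(void)
{
    /* A VDPAU device is created against an X display/screen. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "Cannot open X display\n");
        return 1;
    }

    VdpDevice dev;
    VdpGetProcAddress *get_proc;
    if (vdp_device_create_x11(dpy, DefaultScreen(dpy), &dev, &get_proc)
            != VDP_STATUS_OK) {
        fprintf(stderr, "vdp_device_create_x11 failed\n");
        return 1;
    }

    /* All other entry points are fetched through get_proc_address. */
    VdpVideoMixerQueryFeatureSupport *query = NULL;
    get_proc(dev, VDP_FUNC_ID_VIDEO_MIXER_QUERY_FEATURE_SUPPORT,
             (void **)&query);

    const struct { VdpVideoMixerFeature feat; const char *name; } feats[] = {
        { VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL,         "temporal" },
        { VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL_SPATIAL, "temporal-spatial" },
    };

    for (unsigned i = 0; i < sizeof feats / sizeof feats[0]; i++) {
        VdpBool ok = VDP_FALSE;
        query(dev, feats[i].feat, &ok);
        printf("deinterlace %-16s: %s\n", feats[i].name, ok ? "yes" : "no");
    }

    XCloseDisplay(dpy);
    return 0;
}

Note that "reported as supported" isn't the same as "fast enough for
full-rate 1080i", which is exactly the question with the 8200/Ion class
chipsets.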