Where did Nvidia hide my texture units?

> On Thu, Nov 06, 2008 at 07:02:53PM +0100, Reimar Döffinger wrote:
> > On Thu, Nov 06, 2008 at 05:22:49PM +0100, János Szabó wrote:
> > > Wikipedia tells me that I should have 48 texture units (dream) on my Geforce
> > > 8800GTS, unfortunately I am not able to use more than 4.
> >
> > Actually, it does work just fine, so you can ignore the warning.
> > For some reason the function always returns 4 for NVidia, no idea why...
> > One of the very few cases where ATI actually works right...
> > Actually, the idea behind it is explained here:
> > http://developer.nvidia.com/object/General_FAQ.html#t6
> > No idea if/when I will find the motivation to implement that...
>
> Actually, the way the code was organized, I could just replace one
> GL_MAX_TEXTURE_UNITS with GL_MAX_TEXTURE_IMAGE_UNITS, so it should
> probably work right now.
> It might behave a bit oddly on old hardware, but that code will not
> work there anyway.
>
> Greetings,
> Reimar Döffinger
>

Works for me, great!
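
In case anyone else trips over the same warning, here is a minimal sketch
(illustrative only, not MPlayer's actual vo_gl code; the helper name is made
up) of the two queries involved. With a GL context current,
GL_MAX_TEXTURE_UNITS only reports the fixed-function limit, which NVidia caps
at 4, while GL_MAX_TEXTURE_IMAGE_UNITS reports how many texture image units
fragment programs/shaders can actually sample from:

#include <stdio.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_MAX_TEXTURE_IMAGE_UNITS may be missing from an old gl.h */

/* Illustrative helper, not from the MPlayer source.
 * Call with a valid GL context current. */
static void print_texture_unit_limits(void)
{
    GLint fixed_units = 0, image_units = 0;

    /* Fixed-function multitexture limit: NVidia returns 4 here
     * regardless of how capable the hardware is. */
    glGetIntegerv(GL_MAX_TEXTURE_UNITS, &fixed_units);

    /* Texture image units usable from fragment shaders/programs:
     * this is the number that matters for code needing more than 4 textures. */
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &image_units);

    printf("fixed-function texture units: %d\n", (int)fixed_units);
    printf("fragment texture image units: %d\n", (int)image_units);
}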

By the way, would it be possible to use the pipeline for e.g. adding
high-quality noise at output resolution (as of now the noise is applied at
source resolution, AFAIK), or even for some limited denoising?
(Sorry for repeating myself, but something went awry with my previous post.)
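
To make the noise idea a bit more concrete, here is a rough, untested sketch
(the uniform names are made up and nothing here is from the MPlayer source;
it assumes a small 64x64 noise texture with GL_REPEAT wrapping): a fragment
shader could sample the noise by gl_FragCoord, i.e. per output pixel,
independent of the source resolution:

/* Rough sketch only -- "video_tex", "noise_tex" and "noise_strength"
 * are invented names for illustration. */
static const char *noise_frag_src =
    "uniform sampler2D video_tex;   /* the (scaled) video frame */\n"
    "uniform sampler2D noise_tex;   /* small tiled noise texture */\n"
    "uniform float noise_strength;\n"
    "void main(void) {\n"
    "    vec4 col = texture2D(video_tex, gl_TexCoord[0].xy);\n"
    "    /* gl_FragCoord is in window coordinates, so the noise varies\n"
    "       per output pixel, not per source pixel */\n"
    "    float n = texture2D(noise_tex, gl_FragCoord.xy / 64.0).r - 0.5;\n"
    "    gl_FragColor = col + vec4(noise_strength * n);\n"
    "}\n";

Whether something like that could be hooked into the existing shader setup I
don't know, but the per-output-pixel sampling is the point of the question.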

