[Bug 40931] r600g: interpret integer texture types as ints regresses VDPAU/XVMC decode.

https://bugs.freedesktop.org/show_bug.cgi?id=40931

--- Comment #5 from Marek Olšák <maraeo@xxxxxxxxx> 2011-09-16 11:16:13 PDT ---
(In reply to comment #4)
> Hi Dave & Andy,
> 
> I think we have a disagreement here about what SCALED types should be.
> 
> As I understood it, SCALED types should be represented as integers in memory,
> but when loaded into a shader converted to floats in the range 0..2^n (as
> opposed to normalized types), and that's how I used them in g3dvl.

That's right if you are talking about USCALED.
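
To illustrate the distinction, here is a minimal sketch in C (the helper
names are hypothetical, not actual Gallium API): an n-bit USCALED texel is
converted to a float carrying the raw integer value, whereas the matching
NORM format divides by 2^n - 1 so the result lands in 0..1.

    #include <stdint.h>

    /* R8_USCALED: the raw integer value, carried as a float (0 .. 255). */
    static float fetch_r8_uscaled(uint8_t texel)
    {
        return (float)texel;
    }

    /* R8_UNORM: the integer normalized into the 0 .. 1 range. */
    static float fetch_r8_unorm(uint8_t texel)
    {
        return (float)texel / 255.0f;
    }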

-- 
Configure bugmail: https://bugs.freedesktop.org/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are the assignee for the bug.
_______________________________________________
dri-devel mailing list
dri-devel@xxxxxxxxxxxxxxxxxxxxx
http://lists.freedesktop.org/mailman/listinfo/dri-devel


