Re: How to test whether a buffer is in linear format

On Fri, 5 Aug 2022 12:32:01 +0000
"Hoosier, Matt" <Matt.Hoosier@xxxxxxxxxx> wrote:

> Suppose that I want to map a GPU buffer to the CPU and do image
> analysis on it. I know all the usual cautions about this being a
> poor performance option, etc. But suppose for the moment that the
> use-case requires it.
> 
> What's the right set of preconditions to conclude that the buffer
> is in vanilla linear representation? In other words: no
> compression, tiling, or any other proprietary GPU tricks that
> would prevent accessing the pixel data in the same way you would
> for a dumb buffer.
> 

Hi Matt,

whoever produced the buffer must *explicitly* tell you that the
buffer is using the DRM format modifier DRM_FORMAT_MOD_LINEAR.
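
For example, in Wayland's zwp_linux_dmabuf_v1 protocol the modifier is
passed explicitly as two 32-bit halves along with each plane. Roughly
(just a sketch, the function name here is made up), the consumer side
reassembles and checks it like this:

#include <stdint.h>
#include <drm_fourcc.h>	/* libdrm header, defines DRM_FORMAT_MOD_LINEAR */

/* Sketch: reassemble an explicitly advertised modifier and test for linear. */
static int
is_declared_linear(uint32_t modifier_hi, uint32_t modifier_lo)
{
	uint64_t modifier = ((uint64_t)modifier_hi << 32) | modifier_lo;

	return modifier == DRM_FORMAT_MOD_LINEAR;
}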

> I think that requiring the modifiers to be 0x0 would suffice. But
> is that overkill? Maybe there are situations when some modifiers
> are set, but they don't affect the interpretation of the pixel
> data.

It is not overkill, it is strictly necessary. It is not sufficient on
its own, though: you also need the stride and offset of each plane, in
addition to width, height and pixel format. All of those together
should be enough. Note that DRM_FORMAT_MOD_LINEAR must be explicit; if
you lack a modifier, you cannot assume the buffer is linear.
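
As a rough sketch of what the consumer could do once it has all of that
information (the struct is hypothetical, assume a single-plane 32-bit
format like DRM_FORMAT_XRGB8888, a dma-buf fd, and abbreviated error
handling):

#include <stdint.h>
#include <stddef.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/ioctl.h>
#include <linux/dma-buf.h>
#include <drm_fourcc.h>

/* Hypothetical description the producer hands over with the dma-buf fd. */
struct buffer_info {
	int dmabuf_fd;
	uint32_t width, height;
	uint32_t fourcc;	/* e.g. DRM_FORMAT_XRGB8888 */
	uint64_t modifier;	/* must have been given explicitly */
	uint32_t offset;	/* plane 0, in bytes */
	uint32_t stride;	/* plane 0, in bytes */
	size_t size;		/* total size to mmap */
};

static int
read_pixel(const struct buffer_info *b, uint32_t x, uint32_t y, uint32_t *out)
{
	struct dma_buf_sync sync;
	const uint8_t *base;
	void *map;

	if (b->modifier != DRM_FORMAT_MOD_LINEAR)
		return -1;	/* not declared linear: bytes are not rows of pixels */

	map = mmap(NULL, b->size, PROT_READ, MAP_SHARED, b->dmabuf_fd, 0);
	if (map == MAP_FAILED)
		return -1;

	/* Bracket CPU access so caches are coherent with the GPU. */
	sync.flags = DMA_BUF_SYNC_START | DMA_BUF_SYNC_READ;
	ioctl(b->dmabuf_fd, DMA_BUF_IOCTL_SYNC, &sync);

	base = (const uint8_t *)map + b->offset;
	memcpy(out, base + (size_t)y * b->stride + (size_t)x * 4, sizeof(*out));

	sync.flags = DMA_BUF_SYNC_END | DMA_BUF_SYNC_READ;
	ioctl(b->dmabuf_fd, DMA_BUF_IOCTL_SYNC, &sync);

	munmap(map, b->size);
	return 0;
}

Whether mmap() on the dma-buf fd works at all depends on the exporter,
so even with a linear modifier this is not guaranteed to succeed.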

No modifier can ever be ignored. If there is no modifier, or it is
invalid, then you must use some originating-driver-specific means
to figure out what the "real modifier" is.
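
Boiled down (again only a sketch, the names are made up), the only
value you can act on directly is DRM_FORMAT_MOD_LINEAR; anything
missing, invalid, or vendor-specific means you either query the
originating driver or give up:

#include <stdbool.h>
#include <stdint.h>
#include <drm_fourcc.h>

enum cpu_access {
	CPU_ACCESS_LINEAR,	/* safe to mmap and walk as rows of pixels */
	CPU_ACCESS_UNKNOWN,	/* driver-specific query needed, or give up */
};

static enum cpu_access
classify_modifier(bool have_modifier, uint64_t modifier)
{
	if (!have_modifier || modifier == DRM_FORMAT_MOD_INVALID)
		return CPU_ACCESS_UNKNOWN;	/* never assume linear */
	if (modifier == DRM_FORMAT_MOD_LINEAR)
		return CPU_ACCESS_LINEAR;

	return CPU_ACCESS_UNKNOWN;	/* tiled, compressed, or vendor layout */
}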


Thanks,
pq


