[Bug 206351] RX 5600 XT Not Correctly Recognized, Max Memory Frequency Below Where it Should Be

https://bugzilla.kernel.org/show_bug.cgi?id=206351

--- Comment #5 from Matt McDonald (gardotd426@xxxxxxxxx) ---
I'm not referring to the 1750MHz boost clock. I'm referring to the 14Gbps
memory clock on the same page, which corresponds to 1750MHz (1750MHz * 8
transfers per cycle for GDDR6 = 14Gbps, or 14GT/s, the stated memory speed
of the card). That's also how it's reported in Windows. As explained here:

https://forums.tomshardware.com/threads/effective-memory-clock-speed-confusions.3518637/

and here:
https://www.techpowerup.com/forums/threads/how-to-calculate-gddr6-speed-from-gpu-z.250747/

Literally everything I can find says to calculate the GDDR6 effective rate
as double data rate (x2) and quad pumped (x4), so 1750 * 2 * 4 = 14000MHz,
or 14Gbps. The specs for the card itself list its memory speed as 14Gbps,
which fits everything I've seen. Windows reports the memory clock (not the
boost clock; they're listed separately) as 1750MHz, which also lines up.
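
The multiplier math described above can be sketched in a few lines of Python (the helper name and structure are mine, purely to illustrate the arithmetic, not anything from a real tool or driver):

```python
# Sketch of the GDDR6 rate calculation described above (function name
# is hypothetical, not from any actual utility).
def gddr6_effective_gbps(base_clock_mhz: float) -> float:
    """GDDR6 is double data rate (x2) on a quad-pumped clock (x4),
    i.e. 8 transfers per base clock cycle."""
    transfers_per_cycle = 2 * 4
    # MHz * transfers/cycle = MT/s; divide by 1000 for GT/s (Gbps per pin)
    return base_clock_mhz * transfers_per_cycle / 1000

print(gddr6_effective_gbps(1750))  # 1750MHz base clock -> 14.0 Gbps
```
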




Am I missing something? If I am, I apologize, but literally everything I
can find says otherwise. If I am missing something, how does 14Gbps (the
official memory speed of the card) end up being reported as 900MHz?

-- 
You are receiving this mail because:
You are watching the assignee of the bug.
_______________________________________________
dri-devel mailing list
dri-devel@xxxxxxxxxxxxxxxxxxxxx
https://lists.freedesktop.org/mailman/listinfo/dri-devel
