Voltage control on Southern Islands GPU using radeon driver


 



Hi, I have a HAINAN GPU below:

lspci -nn
0a:00.0 Display controller [0380]: Advanced Micro Devices, Inc. [AMD/ATI] Sun LE [Radeon HD 8550M / R5 M230] [1002:666f]

I run linux 5.13.12 on Arch on a Lenovo B40-70 laptop.

I'm trying to understand how voltage control works and how I can modify the voltage to overvolt or undervolt my GPU. The reason is that programs which put the GPU under high load (glmark2) crash when I use dpm=1 with either the radeon or the amdgpu driver, which seems to happen when the GPU reaches power level 4 (sclk 900 MHz), while a lighter program like glxgears runs and switches between power levels 0, 1, and 2 without issue under both drivers. I suspect my laptop's hardware is faulty, but I would like to take this opportunity to try working around it from the driver's side so that the GPU can run anyway, however limited.

So far, I have managed to increase the performance of my GPU by hard-coding the sclk to 630 MHz in all performance_levels in radeon_pm.c, which surprised me, as overriding the clock via sysfs had not been possible for me before.

I've managed to tweak both sclk and mclk (or so I believe), but I still cannot tweak the voltage (vddc): if I raise the sclk to 650 MHz, the lockup happens again, and changing the pl->vddc field does not seem to do anything. After tracing with printk, I found that on my system:

pi->voltage_control = radeon_atom_is_voltage_gpio(rdev, SET_VOLTAGE_TYPE_ASIC_VDDC,
                                                  VOLTAGE_OBJ_GPIO_LUT);

this returns false, while:

si_pi->voltage_control_svi2 = radeon_atom_is_voltage_gpio(rdev, SET_VOLTAGE_TYPE_ASIC_VDDC,
                                                          VOLTAGE_OBJ_SVID2);

This returns true, so I believe my system sets the voltage through SVI2. Having no experience with SVI2, I read up on it and found that it is a serial interface to the voltage regulator that clocks in 8 bits of information over data/clock pins and converts them to a voltage between OFF and roughly 0.5 V to 1.5 V, offering fine control based on a lookup table.

My question is as follows:
Is it possible for me to modify my system so that I can manually adjust the voltage to my GPU?

Thank you very much in advance. This is the first time I've dealt with kernel drivers, so any guidance on the matter helps a lot.

- Evans



