The HSM clock needs to run at 101% of the HDMI controller's pixel clock,
but it is shared between the two HDMI controllers. If the two controllers
run at different resolutions and the lower resolution is on the second
controller (in enable order), the first controller ends up with a smaller
clock rate than it needs.

Since we don't really need an exact frequency there, we can simply set
the minimum rate we expect instead.

Signed-off-by: Maxime Ripard <maxime@xxxxxxxxxx>
---
 drivers/gpu/drm/vc4/vc4_hdmi.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/drivers/gpu/drm/vc4/vc4_hdmi.c b/drivers/gpu/drm/vc4/vc4_hdmi.c
index 9f30fab744f2..d99188c90ff9 100644
--- a/drivers/gpu/drm/vc4/vc4_hdmi.c
+++ b/drivers/gpu/drm/vc4/vc4_hdmi.c
@@ -462,7 +462,7 @@ static void vc4_hdmi_encoder_enable(struct drm_encoder *encoder)
	 * pixel clock, but HSM ends up being the limiting factor.
	 */
	hsm_rate = max_t(unsigned long, 120000000, (pixel_rate / 100) * 101);
-	ret = clk_set_rate(vc4_hdmi->hsm_clock, hsm_rate);
+	ret = clk_set_min_rate(vc4_hdmi->hsm_clock, hsm_rate);
	if (ret) {
		DRM_ERROR("Failed to set HSM clock rate: %d\n", ret);
		return;
-- 
git-series 0.9.1
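
For illustration only, here is a minimal user-space sketch of the reasoning
behind the change: with exact-rate requests on a shared clock, the last
caller wins, while minimum-rate requests let the clock settle at the highest
floor any consumer asked for. The struct, the model_* helpers and the example
pixel clocks (594 MHz for 4k@60, 148.5 MHz for 1080p@60) are assumptions made
up for this sketch; it is not the kernel clock framework, only the hsm_rate
arithmetic mirrors the driver.

/*
 * Toy model of a clock shared by two HDMI controllers.  Not kernel
 * code: model_set_rate() mimics an exact-rate request (last caller
 * wins), model_set_min_rate() mimics a per-consumer floor where the
 * clock runs at the highest requested minimum.
 */
#include <stdio.h>

struct shared_clk {
	unsigned long min_rate[2];	/* one floor per HDMI controller */
	unsigned long rate;
};

/* Exact-rate model: whoever calls last sets the rate for everyone. */
static void model_set_rate(struct shared_clk *clk, unsigned long rate)
{
	clk->rate = rate;
}

/* Min-rate model: the shared clock runs at the highest floor. */
static void model_set_min_rate(struct shared_clk *clk, int id,
			       unsigned long rate)
{
	clk->min_rate[id] = rate;
	clk->rate = clk->min_rate[0] > clk->min_rate[1] ?
		    clk->min_rate[0] : clk->min_rate[1];
}

int main(void)
{
	/* Example: 4k@60 on HDMI0, 1080p@60 enabled afterwards on HDMI1. */
	unsigned long pixel0 = 594000000, pixel1 = 148500000;
	unsigned long hsm0 = (pixel0 / 100) * 101;	/* 101% of pixel clock */
	unsigned long hsm1 = (pixel1 / 100) * 101;
	struct shared_clk clk = { { 0, 0 }, 0 };

	model_set_rate(&clk, hsm0);
	model_set_rate(&clk, hsm1);
	printf("exact rate:  HSM %lu Hz, but HDMI0 needs %lu Hz\n",
	       clk.rate, hsm0);	/* HDMI0 is now underclocked */

	model_set_min_rate(&clk, 0, hsm0);
	model_set_min_rate(&clk, 1, hsm1);
	printf("min rate:    HSM %lu Hz, HDMI0 needs %lu Hz\n",
	       clk.rate, hsm0);	/* both floors satisfied */

	return 0;
}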