HS400 tuning and presets

I noticed some odd things with the MMC/SDHCI code while debugging an
HS400 tuning issue.

1) Is it expected that we never enable SDHCI_CTRL_PRESET_VAL_ENABLE
for an eMMC running at HS200 or HS400?
The flow for enabling HS400 is: Legacy -> HS200 -> Perform Tuning ->
HS -> HS400.
Looking at [sdhci_set_ios](https://source.chromium.org/chromiumos/chromiumos/codesearch/+/master:src/third_party/kernel/v5.4/drivers/mmc/host/sdhci.c;l=2019),
that function appears to be responsible for enabling presets:

    if (!(host->quirks2 & SDHCI_QUIRK2_PRESET_VALUE_BROKEN) &&
                    ((ios->timing == MMC_TIMING_UHS_SDR12) ||
                     (ios->timing == MMC_TIMING_UHS_SDR25) ||
                     (ios->timing == MMC_TIMING_UHS_SDR50) ||
                     (ios->timing == MMC_TIMING_UHS_SDR104) ||
                     (ios->timing == MMC_TIMING_UHS_DDR50) ||
                     (ios->timing == MMC_TIMING_MMC_DDR52))) {
            u16 preset;

            sdhci_enable_preset_value(host, true);
            preset = sdhci_get_preset_value(host);
            ios->drv_type = (preset & SDHCI_PRESET_DRV_MASK)
                    >> SDHCI_PRESET_DRV_SHIFT;
    }

MMC_TIMING_MMC_HS200 and MMC_TIMING_MMC_HS400 are missing from the
conditions, so we never enable presets for those modes. This means
that by default we use drive strength B for both the card and the
controller, since only two controllers provide a
select_drive_strength override.

    int mmc_select_drive_strength(struct mmc_card *card, unsigned int max_dtr,
                                  int card_drv_type, int *drv_type)
    {
            struct mmc_host *host = card->host;
            int host_drv_type = SD_DRIVER_TYPE_B;

            *drv_type = 0;

            if (!host->ops->select_drive_strength)
                    return 0;
            ...
    }

Here is a trace log showing HS400 initialization: https://0paste.com/79874

2) When performing HS400 re-tuning we end up enabling presets.
The re-tuning sequence is: HS400 -> DDR52 -> HS -> HS200 -> Perform
Tuning -> HS -> HS400.

So when we transition to DDR52, the code above enables presets. You
can see this happening in this trace log: https://0paste.com/79875.
Look at timestamp 1926.173800.

This means that the card and the controller can end up with
mismatched drive strengths. This can be seen at timestamp 1926.175813:
the HS200 preset on my controller is configured for drive strength A,
but the card would be set up as B (if the controller didn't override
select_drive_strength).

Should we be enabling presets for HS200/HS400 (and potentially
MMC_HS), or should we remove MMC_DDR52 from the condition above?

It looks like 0dafa60eb2506 ("mmc: sdhci: also get preset value and
driver type for MMC_DDR52") was the CL that added MMC_DDR52 to the
condition.
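
If the answer is to extend the check rather than drop MMC_DDR52, I
imagine the change would be a minimal addition to the v5.4 excerpt
above (purely illustrative, untested):

    if (!(host->quirks2 & SDHCI_QUIRK2_PRESET_VALUE_BROKEN) &&
                    ((ios->timing == MMC_TIMING_UHS_SDR12) ||
                     (ios->timing == MMC_TIMING_UHS_SDR25) ||
                     (ios->timing == MMC_TIMING_UHS_SDR50) ||
                     (ios->timing == MMC_TIMING_UHS_SDR104) ||
                     (ios->timing == MMC_TIMING_UHS_DDR50) ||
                     (ios->timing == MMC_TIMING_MMC_DDR52) ||
                     /* new: also cover the eMMC HS200/HS400 timings */
                     (ios->timing == MMC_TIMING_MMC_HS200) ||
                     (ios->timing == MMC_TIMING_MMC_HS400))) {
            u16 preset;

            sdhci_enable_preset_value(host, true);
            preset = sdhci_get_preset_value(host);
            ios->drv_type = (preset & SDHCI_PRESET_DRV_MASK)
                    >> SDHCI_PRESET_DRV_SHIFT;
    }

As far as I can tell, sdhci_get_preset_value() already maps HS200 onto
the SDR104 preset register and HS400 onto the (non-standard) HS400
one, so the read side should cope.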

3) How do we ensure that the card drive strength and the controller
drive strength stay in sync when presets are enabled?
Right now `mmc_select_driver_type` is only called from
`mmc_select_hs400es` and `mmc_select_hs200`, and it doesn't currently
take the timing into account when making a decision. Only two devices
currently provide the `select_drive_strength` override, so we end up
setting the card to drive strength B for most controllers.

Should we modify mmc_select_drive_strength to take the target timing
so it can return the correct card drive strength? We could then add an
sdhci_select_drive_strength that queries the preset registers (when
presets are enabled) and returns the target drive strength.
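
As a strawman, a rough sketch of what such an
sdhci_select_drive_strength could look like is below; the extra timing
parameter and the exact hookup into mmc_select_drive_strength are
assumptions on my part, not existing code:

    /*
     * Sketch only: derive the card drive strength from the controller's
     * preset register for the timing we are switching to, so the card
     * and the controller stay in sync whenever presets are enabled.
     */
    static int sdhci_select_drive_strength(struct mmc_card *card,
                                           unsigned int max_dtr, int host_drv,
                                           int card_drv, int *drv_type,
                                           unsigned char timing)
    {
            struct sdhci_host *host = mmc_priv(card->host);
            u16 preset;

            *drv_type = 0;

            /* Presets disabled: fall back to today's behaviour. */
            if (!(host->flags & SDHCI_PV_ENABLED))
                    return 0;

            /*
             * sdhci_get_preset_value() currently keys off host->timing, so
             * this assumes it (or a variant of it) can be driven by the
             * timing we are about to switch to.
             */
            preset = sdhci_get_preset_value(host);
            *drv_type = (preset & SDHCI_PRESET_DRV_MASK)
                    >> SDHCI_PRESET_DRV_SHIFT;

            /* Only pick a strength the card actually advertises. */
            if (!(card_drv & (1 << *drv_type)))
                    *drv_type = 0;

            return *drv_type;
    }

sdhci could then hook this up through
host->mmc_host_ops.select_drive_strength (the same place sdhci-pci
plugs in its Intel-specific version), so the preset-selected
controller strength is also what gets programmed into the card.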

4) Should we be calling `mmc_select_driver_type` from
`mmc_hs400_to_hs200` and `mmc_hs200_to_hs400` during the transitions?
Or do we not care?
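
If the answer to (4) is yes, I would expect it to be as small as the
following (again just a sketch, assuming mmc_select_driver_type is
safe to re-run at that point in the transition):

    int mmc_hs200_to_hs400(struct mmc_card *card)
    {
            /*
             * Sketch: re-evaluate the driver strength for the timing we
             * are about to enter so the card and the controller agree
             * before the switch.
             */
            mmc_select_driver_type(card);

            return mmc_select_hs400(card);
    }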

Sorry for the long email...
Once I get some guidance, I can send some patches.

Thanks,
Raul


