> On Jul 13, 2020, at 4:17 PM, Krish Sadhukhan <krish.sadhukhan@xxxxxxxxxx> wrote:
>
>
> On 7/13/20 4:11 PM, Nadav Amit wrote:
>>> On Jul 13, 2020, at 4:06 PM, Krish Sadhukhan <krish.sadhukhan@xxxxxxxxxx> wrote:
>>>
>>>
>>> On 7/12/20 9:39 PM, Nadav Amit wrote:
>>>> The low CR3 bits are reserved but not MBZ according to the APM. The
>>>> tests should therefore not check that they cause failed VM-entry. Tests
>>>> on bare-metal show they do not.
>>>>
>>>> Signed-off-by: Nadav Amit <namit@xxxxxxxxxx>
>>>> ---
>>>>  x86/svm.h       |  4 +---
>>>>  x86/svm_tests.c | 26 +-------------------------
>>>>  2 files changed, 2 insertions(+), 28 deletions(-)
>>>>
>>>> diff --git a/x86/svm.h b/x86/svm.h
>>>> index f8e7429..15e0f18 100644
>>>> --- a/x86/svm.h
>>>> +++ b/x86/svm.h
>>>> @@ -325,9 +325,7 @@ struct __attribute__ ((__packed__)) vmcb {
>>>>  #define SVM_CR0_SELECTIVE_MASK (X86_CR0_TS | X86_CR0_MP)
>>>>  #define SVM_CR0_RESERVED_MASK 0xffffffff00000000U
>>>> -#define SVM_CR3_LEGACY_RESERVED_MASK 0xfe7U
>>>> -#define SVM_CR3_LEGACY_PAE_RESERVED_MASK 0x7U
>>>> -#define SVM_CR3_LONG_RESERVED_MASK 0xfff0000000000fe7U
>>>> +#define SVM_CR3_LONG_RESERVED_MASK 0xfff0000000000000U
>>>>  #define SVM_CR4_LEGACY_RESERVED_MASK 0xff88f000U
>>>>  #define SVM_CR4_RESERVED_MASK 0xffffffffff88f000U
>>>>  #define SVM_DR6_RESERVED_MASK 0xffffffffffff1ff0U
>>>> diff --git a/x86/svm_tests.c b/x86/svm_tests.c
>>>> index 3b0d019..1908c7c 100644
>>>> --- a/x86/svm_tests.c
>>>> +++ b/x86/svm_tests.c
>>>> @@ -2007,38 +2007,14 @@ static void test_cr3(void)
>>>>  {
>>>>      /*
>>>>       * CR3 MBZ bits based on different modes:
>>>> -     *   [2:0]                  - legacy PAE
>>>> -     *   [2:0], [11:5]          - legacy non-PAE
>>>> -     *   [2:0], [11:5], [63:52] - long mode
>>>> +     *   [63:52] - long mode
>>>>       */
>>>>      u64 cr3_saved = vmcb->save.cr3;
>>>> -    u64 cr4_saved = vmcb->save.cr4;
>>>> -    u64 cr4 = cr4_saved;
>>>> -    u64 efer_saved = vmcb->save.efer;
>>>> -    u64 efer = efer_saved;
>>>> -    efer &= ~EFER_LME;
>>>> -    vmcb->save.efer = efer;
>>>> -    cr4 |= X86_CR4_PAE;
>>>> -    vmcb->save.cr4 = cr4;
>>>> -    SVM_TEST_CR_RESERVED_BITS(0, 2, 1, 3, cr3_saved,
>>>> -                              SVM_CR3_LEGACY_PAE_RESERVED_MASK);
>>>> -
>>>> -    cr4 = cr4_saved & ~X86_CR4_PAE;
>>>> -    vmcb->save.cr4 = cr4;
>>>> -    SVM_TEST_CR_RESERVED_BITS(0, 11, 1, 3, cr3_saved,
>>>> -                              SVM_CR3_LEGACY_RESERVED_MASK);
>>>> -
>>>> -    cr4 |= X86_CR4_PAE;
>>>> -    vmcb->save.cr4 = cr4;
>>>> -    efer |= EFER_LME;
>>>> -    vmcb->save.efer = efer;
>>>>      SVM_TEST_CR_RESERVED_BITS(0, 63, 1, 3, cr3_saved,
>>>>                                SVM_CR3_LONG_RESERVED_MASK);
>>>> -    vmcb->save.cr4 = cr4_saved;
>>>>      vmcb->save.cr3 = cr3_saved;
>>>> -    vmcb->save.efer = efer_saved;
>>>>  }
>>>>
>>>>  static void test_cr4(void)
>>> APM says,
>>>
>>> "Reserved Bits. Reserved fields should be cleared to 0 by software when writing CR3."
>>>
>>> If the processor allows these bits to be left non-zero, "should be cleared to 0" means it's not mandatory then. I am wondering what this "should be" actually means :-) !
>> I really tested it, so I guess we “should” not argue about it. ;-)
> No, I am not arguing over your test results. :-)
>> Anyhow, according to APM Figure 5-16 (“Control Register 3 (CR3)-Long Mode”),
>> bits 52:63 are “reserved, MBZ” and others are just marked as “Reserved”. So
>> it seems they are not the same.
> I am just saying that the APM language "should be cleared to 0" is misleading if the processor doesn't enforce it.

Just to ensure I am clear - I am not blaming you in any way. I also found the
phrasing confusing.

Having said that, if you (or anyone else) reintroduces “positive” tests, in
which the VM's CR3 is modified to ensure that VM-entry succeeds when the
reserved non-MBZ bits are set, please ensure that the tests fail gracefully.
The non-long-mode CR3 tests crashed because the VM page-tables were
incompatible with the paging mode.
In other words, instead of setting a VMMCALL instruction in the VM to trap
immediately after entry, consider clearing the present bits in the higher
levels of the NPT, or injecting an exception that would trigger an exit
during event vectoring, or something like that.

P.S.: If it wasn’t clear, I am not going to fix KVM itself, for obvious
reasons.