Depends on what information you need - if you just need a binary valid/not-valid answer, prune the set first and then verify; if you want a more fine-grained data set, don't. Write some code - forking and running openssl verify once per certificate will be insanely slow, so don't do that. I doubt you really have a billion unique certificates, so avoid testing duplicates. Also don't forget that you really need certificate chains, so I hope you captured the intermediate certificates too!
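Roughly something like this with the C API (a minimal sketch only, assuming OpenSSL 1.1.0 or later; roots.pem and leaf.pem are placeholder file names and most error handling is omitted):

#include <stdio.h>
#include <openssl/pem.h>
#include <openssl/x509.h>
#include <openssl/x509_vfy.h>

int main(void)
{
    /* Load the trusted roots once and reuse the store for every leaf. */
    X509_STORE *store = X509_STORE_new();
    if (X509_STORE_load_locations(store, "roots.pem", NULL) != 1)
        return 1;

    /* The certificate under test (placeholder file name). */
    FILE *fp = fopen("leaf.pem", "r");
    if (fp == NULL)
        return 1;
    X509 *leaf = PEM_read_X509(fp, NULL, NULL, NULL);
    fclose(fp);

    /* Untrusted intermediates captured alongside the leaf. */
    STACK_OF(X509) *untrusted = sk_X509_new_null();
    /* ... sk_X509_push(untrusted, intermediate) for each one ... */

    /* A fresh, cheap context per verification; the store is shared. */
    X509_STORE_CTX *ctx = X509_STORE_CTX_new();
    X509_STORE_CTX_init(ctx, store, leaf, untrusted);
    int ok = X509_verify_cert(ctx);
    if (ok != 1)
        printf("invalid: %s\n",
               X509_verify_cert_error_string(X509_STORE_CTX_get_error(ctx)));

    X509_STORE_CTX_free(ctx);
    sk_X509_free(untrusted);
    X509_free(leaf);
    X509_STORE_free(store);
    return ok == 1 ? 0 : 1;
}

In a real run you'd presumably parse certificates from memory (d2i_X509 over your dataset) rather than from files, and keep one store per worker so the roots are only loaded once.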
Cheers
Rich.
On 30 March 2017 at 18:44, ebe ebe <cipetpet5@xxxxxxxxxx> wrote:
Hello,
I am a CS graduate student doing a measurement study of the SSL ecosystem. I have approximately 1 billion SSL certificates, and I would like to run openssl verify on each certificate to sift out the invalid ones. My main concern, as you might guess, is whether this verification is feasible given the size of my dataset. An alternative idea is to replicate the verification steps of openssl myself. More specifically, I am working on a Hadoop infrastructure, and I can perform some of the verification steps without running into scalability issues (e.g. whether the certificate is within its notBefore/notAfter validity window, and the subject key / authority key identifier checks). However, with this approach I feel that verifying the signature would be the big challenge. Any ideas on how I can tackle these problems?
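For concreteness, the per-certificate checks I mean look roughly like this with the OpenSSL C API (a minimal sketch, assuming OpenSSL 1.1.0 or later; note that the signature check already needs the issuer's certificate in hand):

#include <openssl/evp.h>
#include <openssl/x509.h>

/* Validity window: X509_cmp_current_time() compares an ASN1_TIME
 * against the current time (negative = earlier, positive = later). */
int within_validity(X509 *cert)
{
    return X509_cmp_current_time(X509_get0_notBefore(cert)) < 0 &&
           X509_cmp_current_time(X509_get0_notAfter(cert)) > 0;
}

/* Signature: verify the certificate's signature with the issuer's
 * public key, so the issuer (intermediate) cert must be available. */
int signature_ok(X509 *cert, X509 *issuer)
{
    EVP_PKEY *key = X509_get0_pubkey(issuer);
    return key != NULL && X509_verify(cert, key) == 1;
}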
Regards,
Ceyhun
--
openssl-users mailing list
To unsubscribe: https://mta.openssl.org/mailman/listinfo/openssl-users