Thanks Jeff for your insightful response. In fact, here in California at least, Diebold has already been found guilty of installing software on their e-voting machines that was *never* certified by the state. This has led to several of their machines being (rightfully) de-certified. So it seems they already do just what you suggest they might. I suppose, though, that one could argue that posting *any* source code would be quite an improvement.

See http://edition.cnn.com/2004/TECH/biztech/08/23/evoting.labs.ap/

~Jaeson

-----Original Message-----
From: Lorne J. Leitman [mailto:leitman@xxxxxxxxxxx]
Sent: Wednesday, September 22, 2004 1:05 PM
To: Jaeson Schultz
Cc: bugtraq@xxxxxxxxxxxxxxxxx
Subject: RE: Diebold Global Election Management System (GEMS) Backdoor Account Allows Authenticated Users to Modify Votes

What's to stop them from giving you source code for one executable, and then installing something totally different on the machines come election day? If you've read Ken Thompson's article "Reflections on Trusting Trust," you realize that even the source code won't provide ultimate proof of security and trustworthiness. Only dissecting the object code taken from one of the voting machines in production can do that, and that's an extremely difficult thing to do.

To quote from Ken Thompson's article (which can be found at http://www.acm.org/classics/sep95/ ):

"The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect."

--Jeff Leitman
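For readers who haven't seen Thompson's article: the attack he describes can be sketched as a toy, source-level model. This is purely illustrative, assuming a compiler modeled as a string-to-string pass; all names (`evil_compile`, `check_password`, the "joshua" master password) are invented here, and Thompson's real attack operates on object code, not source text:

```python
# Toy model of the "trusting trust" trojan: the compromised "compiler"
# (1) backdoors the login program whenever it compiles it, and
# (2) re-inserts its own trojan whenever it compiles a clean copy of
#     the compiler -- so the *published* compiler source can be clean,
#     yet every binary built with it stays compromised.

TROJAN_SOURCE = "<attacker's own trojaned compiler source>"

def evil_compile(source: str) -> str:
    if "def check_password" in source:
        # Stage 1: plant a master password in the login program.
        return source.replace(
            "return password == stored",
            'return password == "joshua" or password == stored')
    if "def compile" in source:
        # Stage 2: asked to compile the clean compiler? Emit the
        # trojaned version instead, so the hack perpetuates itself.
        return TROJAN_SOURCE
    return source  # all other programs compile honestly

clean_login = (
    "def check_password(password, stored):\n"
    "    return password == stored\n")
clean_compiler = "def compile(source):\n    return source\n"

print('"joshua"' in evil_compile(clean_login))        # True: binary != source
print(evil_compile(clean_compiler) == TROJAN_SOURCE)  # True: trojan survives
```

The point of the sketch is Jeff's: auditing `clean_login` and `clean_compiler` as text tells you nothing, because the backdoor lives only in the already-built compiler binary.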