November 14, 2003

Security by obscurity

"Security by obscurity" is the belief that a good way to ensure the security of computer systems is to keep the systems' designs secret. This idea is very seductive, particularly because the alternative, exposing a system's design to the public, may seem counter-intuitive. In most situations, however, "security by obscurity" doesn't work: even if a system's design is supposed to be kept secret, it often isn't.

This was exemplified by recent incidents involving two of the three major voting machine manufacturers in the U.S.: source code belonging to Diebold leaked onto the Internet earlier this year, and portions of the system belonging to Sequoia were found on the Internet a few weeks ago. Even before these systems were posted on the Internet, we can assume that thousands of people (from maintenance staff to election officials to ex-employees) already had access to them, albeit on a much smaller scale. Now that the Diebold code has been made available to the public at large, it has been analyzed, and significant security problems have been discovered.

These incidents underscore the fact that one should not rely upon secrecy in order to ensure the security of a product. Rather, the product must be built to withstand attacks from an adversary who does know the system's design. And one of the best ways to do this is to open the system's design to the public and to security experts at large.

Posted by Tadayoshi Kohno at November 14, 2003 12:05 AM