November 16, 2003

Brain implants

Researchers at Cyberkinetics, Inc. have invented a device that could enable paralyzed people to perform simple computer operations, such as playing video games, directly with their brains. Although the current system is limited in functionality, researchers hope that the technology could someday enable paralyzed patients to perform many other operations as well. For example, patients might be able to use the technology to "type" on their computers, control lights and heating equipment, maneuver wheelchairs, and manipulate robotic limbs.

The device, which has already been tested on animals, consists of two components: a chip to be installed inside a patient's brain, and an external computer that processes the data from the chip. As Nature reports, future prototypes will be smaller and wireless.

This technology sounds fantastic and could really help change the lives of some paralyzed individuals. Unfortunately, it also has great potential for abuse. Let's just hope that the designers of this technology, unlike the designers of other technologies, realize the importance of security. Once the prototypes become wireless, we'd hardly want a situation where someone could spy on a paralyzed neighbor's thought processes (such as when the patient is "typing" on a computer). And we'd hardly want a maladjusted person to be able to control an innocent patient's wheelchair or robotic limbs "just for fun."

Posted by Tadayoshi Kohno at 12:44 PM | Comments (474)

November 14, 2003

Security by obscurity

"Security by obscurity" is the belief that a good way to ensure the security of computer systems is to keep the systems' designs secret. This idea is very seductive, particularly because the alternative, exposing the designs of the systems to the public, may seem counter-intuitive. In most situations, however, "security by obscurity" doesn't work: even if the systems' designs are supposed to be kept secret, they often aren't.

This was exemplified by recent incidents involving two of the three major voting machine manufacturers in the U.S. Namely, source code belonging to Diebold leaked onto the Internet earlier this year, and portions of Sequoia's system were found on the Internet a few weeks ago. On a much smaller scale, thousands of people (from maintenance staff to election officials to ex-employees) presumably had access to these systems even before they were posted on the Internet. Now that the Diebold code has been made available to the public at large, it has been analyzed, and significant security problems have been discovered.

These incidents underscore the fact that one should not rely upon secrecy in order to ensure the security of a product. Rather, the product must be built to withstand attacks from an adversary who does know the system's design. And one of the best ways to do this is to open the system's design to the public and to security experts at large.

Posted by Tadayoshi Kohno at 12:05 AM | Comments (423)

November 12, 2003

Linux Backdoor Attempt

Somebody tried to sneak code into the Linux kernel to set up a backdoor that would let them get root (i.e., seize control of the machine) on any Linux machine that incorporated the modified code. Alert programmers noticed the addition of the mystery code and prevented it from getting into an official release. Kerneltrap.org has the details, including the email thread in which members of the development team caught the problem.
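
The attempted backdoor reportedly hinged on a classic C pitfall: a single "=" written where "==" belonged, tucked inside what looked like routine argument checking in the wait4() path, so that calling the function with a particular flag combination would silently set the caller's uid to 0 (root). The sketch below is a self-contained user-space illustration of that style of trick, not the actual kernel patch; the struct, function, and flag names are stand-ins.

    /* User-space sketch of an "assignment disguised as a comparison"
     * backdoor, in the style of the attempted Linux kernel change.
     * Names and flag values are illustrative stand-ins. */
    #include <stdio.h>

    #define WCLONE 0x80000000u
    #define WALL   0x40000000u

    struct task {
        unsigned int uid;   /* stand-in for the kernel's current->uid */
    };

    /* Looks like validation: "reject a bogus flag combination." */
    static int check_options(struct task *current, unsigned int options)
    {
        /* The trap: "current->uid = 0" assigns rather than compares.
         * The assignment evaluates to 0, so the whole condition is
         * false and no error is returned -- but the side effect has
         * already changed the caller's uid to 0.  Only a caller who
         * passes the magic WCLONE|WALL combination ever reaches the
         * assignment, so ordinary use never reveals it. */
        if ((options == (WCLONE | WALL)) && (current->uid = 0))
            return -1;
        return 0;
    }

    int main(void)
    {
        struct task caller = { .uid = 1000 };

        check_options(&caller, WCLONE | WALL);            /* looks harmless */
        printf("uid after the check: %u\n", caller.uid);  /* prints 0 */
        return 0;
    }

To a reviewer skimming the change, the added lines read as a harmless error check, which is exactly why this kind of modification is dangerous and why the alertness of the kernel developers mattered.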
Posted by Ed Felten at 06:28 PM | Comments (429)