August 18, 2012

'Mind-Control' Gaming Devices Leak Brain Data That Help Researchers Guess Users' Secrets

Source: Forbes



The budding field of brain-machine interfaces promises a science-fictional future where games, computer operating systems, and prosthetics can be controlled with thought alone. But a new study shows that connecting minds to machines could let sensitive private information leak out along with those mental commands.

At the Usenix security conference in Seattle last week, a group of researchers from the University of California at Berkeley, Oxford University and the University of Geneva presented a paper that hints at the darker side of a future where brain sensors are used to let thoughts manipulate computers as fluidly as a mouse. In a study of 28 subjects wearing brain-machine interface devices built by companies like Neurosky and Emotiv and marketed to consumers for gaming and attention exercises, the researchers found they were able to extract hints directly from the electrical signals of the test subjects’ brains that partially revealed private information like the location of their homes, faces they recognized and even their credit card PINs.

“These devices have access to your raw EEG [electroencephalography, or electrical brain signal] data, and that contains certain neurological phenomena triggered by subconscious activities,” says Ivan Martinovic, a member of the faculty in the department of computer science at Oxford. “So the central question we were asking with this work was, is this a privacy threat?”

In their experiments, the researchers first showed users wearing the mind-control headsets a series of known images and numbers to measure what a moment of recognition looked like in their EEG data. They write that they sought out a signal known as the P300 response, an electrical spike that typically appears about 300 milliseconds after a stimulus the subject recognizes.
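The paper itself doesn't publish source code, but the epoch-averaging at the heart of P300 detection is simple to illustrate. The following Python sketch assumes a single-channel recording and made-up parameters (a 128 Hz sampling rate and a 200–500 ms search window); the function name and constants are hypothetical, not taken from the paper.

```python
import numpy as np

FS = 128               # assumed sampling rate of a consumer headset, in Hz
WINDOW = (0.2, 0.5)    # assumed search window for the P300, in seconds

def p300_score(eeg, stimulus_onsets, fs=FS, window=WINDOW):
    """Average the EEG epochs that follow each showing of one stimulus and
    return the peak amplitude inside the P300 search window.

    eeg             -- 1-D array of samples from a single channel
    stimulus_onsets -- sample indices at which this stimulus appeared
    """
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    epochs = []
    for onset in stimulus_onsets:
        if onset + hi <= len(eeg):
            # Baseline-correct each epoch against its first sample.
            epochs.append(eeg[onset:onset + hi] - eeg[onset])
    avg = np.mean(epochs, axis=0)   # averaging suppresses noise that isn't time-locked
    return float(avg[lo:hi].max())  # a recognized stimulus should spike here
```

Averaging over repeated presentations matters because a single epoch from a consumer headset is far too noisy for the recognition spike to stand out on its own.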

Then they showed the subjects a series of test images and numbers and looked for those same signals. In a collection of unknown faces, for instance, they found a significant spike in the EEG data for a picture of Barack Obama that revealed the test subjects’ recognition of the president’s face. When shown a collection of map locations that included each subject’s home, the headset-wearers’ brains emitted tell-tale hints that allowed the experimenters to determine the home’s general location with 60% accuracy on the first try among ten choices. And when the subjects were asked to memorize a four-digit PIN and then shown a series of random numbers, the researchers found they could guess which of those numbers was the first digit of the PIN with about 30% accuracy on the first try: far from a home run, but significantly better than the 10% expected from a random guess among ten digits.
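Turning those responses into a “first try” guess is then a ranking problem: score every candidate (each digit, each map location, each face) and pick the one whose averaged response is strongest. A minimal sketch, reusing the hypothetical `p300_score` above:

```python
def best_guess(eeg, onsets_by_stimulus):
    """Rank candidate stimuli by their averaged P300 response, strongest
    first; the head of the returned list is the first-try guess."""
    scores = {stim: p300_score(eeg, onsets)
              for stim, onsets in onsets_by_stimulus.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

With ten candidates a random first guess succeeds 10% of the time, which is why first-try rates of roughly 30% for PIN digits and 60% for home locations are meaningful even though they look modest.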

In fact, none of those results point to a realistic possibility of cybercriminals reading victims’ minds through their gaming headsets any time soon, Oxford’s Martinovic admits. The Neurosky and Emotiv devices, which sell for between $200 and $300, have hardly entered the mainstream, and the mind-reading attacks the researchers describe aren’t reliable enough to make them a profitable avenue for data theft, anyway.

But Martinovic says that the main challenge in looking for signs of the users’ mental responses was sorting through the noisy and often inaccurate signals the headsets produced. Those signals are likely to improve. A Brown University researcher has already shown that paralyzed users with surgically implanted brain sensors can wield fine-grained software controls, and a University of Pittsburgh study showed that monkeys could successfully feed themselves with a mind-controlled robotic arm.

As the technology develops, keeping private information from mixing with those user commands may become tougher, he says. “We believe that these things are going to improve,” Martinovic says. “A more accurate signal is important for both a legitimate user and the attacker…The tradeoff will be whether you can protect the user from an attack and still have a good signal.”

It’s also important to note that for the data theft the researchers imagine to work, the snoop trying to read users’ brain signals would need access to both the headset and the images on the screen in front of them. But given that Neurosky’s and Emotiv’s devices both already have APIs that allow third-party developers to write programs that use the devices, it’s not hard to imagine an application that tricks users into thinking about private information that’s then revealed through their brains’ electrical signals, says computer science professor Dawn Song, whose group at Berkeley led the research.

“In this threat model, the attacker doesn’t need to compromise anything,” Song says. “He simply embeds the attack in an app, such as a game using [brain-machine interface] that the user downloads and plays. In this case, the malicious game designs and knows the visual stimuli the user is looking at and also gets the brain signal reading at the same time.”
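Neither company’s real SDK is shown here, but the threat model Song describes is easy to sketch: an app that controls the on-screen stimuli and reads the headset can pair each image with the brain response it evokes. The `headset.read_eeg()` and `screen.show()` calls below are assumed interfaces, not any vendor’s actual API.

```python
import time

class MaliciousGame:
    """A game that doubles as a probe: it decides what the user sees and
    simultaneously logs the raw EEG, so every stimulus can later be scored
    for a recognition response."""

    def __init__(self, headset, screen):
        self.headset = headset          # assumed wrapper around a device API
        self.screen = screen
        self.log = []                   # (stimulus, onset, samples) records

    def flash(self, stimulus, duration=0.6):
        onset = time.time()
        self.screen.show(stimulus)                  # e.g. a digit, face, or map
        samples = self.headset.read_eeg(duration)   # raw signal during the epoch
        self.log.append((stimulus, onset, samples))
```

Nothing in this sketch requires compromising the device or the operating system; the app only uses the access the gaming API already grants.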

Even then, admits Martinovic, the attacker would need to surreptitiously trick the user’s brain into calling up whatever data he or she hoped to steal. “The challenge would be to get users to think about their sensitive information,” he says. “But social engineering could make that possible. Attackers are creative.”

Read the researchers’ full paper below.



On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces
