I loved Opera’s April Fools last week on Face Gestures. This was a particularly funny take on Mouse Gestures, which allow you to navigate within a browser using simple customised mouse movements. Face Gestures… well, see for yourself below; the write-up is also on Opera Labs.
One pundit picked up on it and suggested that perhaps this wasn’t so far out of the realms of reality, as there could be an accessibility hook for people who are unable to use either a mouse or a keyboard. Little did they know that they’d hit on something: technology that reads facial movements to manipulate a cursor on screen already exists, in the form of Brainfingers and others.
More recently Adam Wilson – a University of Wisconsin-Madison biomedical engineering doctoral student – tweeted using a Brain-Twitter interface, a communication interface driven by brain activity in response to changes in an object on screen. This is a revolutionary tool for people who have ALS, brain-stem stroke or high spinal cord injury, like my friend Sam. Because writing text this way takes longer than the norm, Twitter’s short messages make it a perfect medium for mass communication under these restrictions.
The interface consists, essentially, of a keyboard displayed on a computer screen. “The way this works is that all the letters come up, and each one of them flashes individually,” says Williams. “And what your brain does is, if you’re looking at the ‘R’ on the screen and all the other letters are flashing, nothing happens. But when the ‘R’ flashes, your brain says, ‘Hey, wait a minute. Something’s different about what I was just paying attention to.’ And you see a momentary change in brain activity.”
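The flash-and-detect loop described above can be sketched as a toy simulation. To be clear, this is purely illustrative Python and not the actual University of Wisconsin system: the `flash_response` detector and its noise model are invented stand-ins for real EEG measurement, but the selection logic (flash every letter repeatedly, accumulate the response, pick the letter with the strongest signal) follows the idea in the quote.

```python
import random

LETTERS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def flash_response(flashed, target, noise=0.3):
    """Simulated brain response to one flash: a strong 'something's
    different' signal when the attended letter flashes, noise otherwise."""
    signal = 1.0 if flashed == target else 0.0
    return signal + random.gauss(0, noise)

def select_letter(target, rounds=10, noise=0.3):
    """Flash every letter `rounds` times, accumulate the responses,
    and choose the letter with the strongest total response."""
    scores = {c: 0.0 for c in LETTERS}
    for _ in range(rounds):
        for c in LETTERS:
            scores[c] += flash_response(c, target, noise)
    return max(scores, key=scores.get)

def spell(word):
    """Select one letter at a time to build up a message."""
    return "".join(select_letter(c) for c in word)
```

Averaging over repeated flashes is what makes this workable despite noisy measurements, and it is also why composing even a short tweet this way is slow.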
The following video (unfortunately there is no commentary) shows how the user, wearing the cap linked to the screen, can select text.
As with Star Trek, James Bond or Minority Report, fiction can be a great innovator of technology, so it was interesting to also stumble across a video from the TED Ideas Worth Spreading conference about developing a sixth sense: a wearable interactive device to help you find out more about the environment around you, be it in a supermarket, sitting on a train reading the paper, or simply meeting people at a conference.
The part I love is being able to see what a person is all about, their tags and social networks, when out and about at conferences. This is something that we spoke about at the W3C workshop on the Future of Social Networking in January.
So maybe there is something in it after all…
Update, 22 June 2009: BBC Click reported on OpenVibe, a software system that allows people suffering from locked-in syndrome to communicate using just their brainwaves. This is an incredible piece of software that, while time-consuming to use, could change the lives of people who are absolutely fine except for being completely paralysed, even to the extent that they can’t move their face muscles. Thanks to Steve Lee for tweeting about this.