Emotiv Press Event
Posted February 29 9:46 PM by Drew Sikora
Emotiv EPOC Overview

The company's CEO, Nam Do, was the main presenter at the event and introduced the gathered reporters to the upcoming release of the EPOC neuroheadset. Set to ship for Christmas at the end of the year, the headset is priced at a "very affordable" $299 USD. Whether that price is fair is debatable, but once the full scope of the product was explained, I could certainly justify spending the money on one at release. In addition to the headset itself, the product will include game content (the initial offering being developed in conjunction with Demiurge Studios) that fully utilizes the headset's features, as well as access to Emortal, a "next generation online portal that unleashes the full potential of the headset." The headset connects wirelessly, though they didn't specify the exact wireless technology used, and it's powered by a rechargeable battery.

Here are answers to a couple of important questions I know you all have. For pictures, check out the Emotiv Press Event Gallery.

How Does It Work?

As Director of Design Lori Washbon explained, the headset is built with several nodes spaced around the head that detect the electric fields emitted by your brain. These EEG patterns are then processed and recognized. Unlike regular EEG scans, which require the electrodes to be glued directly to your head, Emotiv has a "patented material" that lets the nodes simply rest against your skull, no adhesive required. As the neurons in your brain send messages back and forth, the electrical signals are picked up and analyzed. Like other forms of bodily interface (such as gestures), you must first train the headset to recognize what kind of thought pattern equates to a certain action. This obviously requires a certain degree of consistency on the part of the wearer, hence the innate difficulty and challenge of learning to use the headset. Note that the signals picked up by the headset include moods and feelings as well, along with the commands your brain sends your muscles to perform certain actions.
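To make the training idea concrete: recording several samples of the same thought and later matching new signals against them can be sketched as a toy nearest-centroid classifier. Everything here is illustrative; Emotiv's actual signal processing is proprietary and certainly far more sophisticated.

```python
import math

class ThoughtTrainer:
    """Toy nearest-centroid classifier: average the feature vectors
    recorded while the user holds one thought, then classify a new
    vector by whichever trained centroid it lands closest to."""

    def __init__(self):
        self.centroids = {}  # label -> averaged feature vector

    def train(self, label, samples):
        # Average the samples recorded during a training session.
        n, dims = len(samples), len(samples[0])
        self.centroids[label] = [
            sum(s[i] for s in samples) / n for i in range(dims)
        ]

    def classify(self, features):
        # Return the trained label whose centroid is nearest.
        def dist(centroid):
            return math.sqrt(sum((a - b) ** 2
                                 for a, b in zip(centroid, features)))
        return min(self.centroids, key=lambda lbl: dist(self.centroids[lbl]))

trainer = ThoughtTrainer()
trainer.train("push", [[1.0, 0.1], [0.9, 0.2], [1.1, 0.0]])
trainer.train("lift", [[0.1, 1.0], [0.2, 0.9], [0.0, 1.1]])
print(trainer.classify([0.95, 0.1]))  # → push
```

The wearer-consistency requirement falls out naturally: if your "push" samples are scattered all over feature space, the centroid is meaningless and classification fails.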

I should also note that Ms. Washbon carried out her part of the presentation in a way that eerily reminded me of a CyberDyne product pitch. I'm not trying to imply anything here, but should we all wind up thoughtless mind-controlled zombies in the future...

What Does It Do?

With the assistance of three main Suites, the EPOC headset can deliver a wide range of possibilities:

The Affectiv Suite - Monitors players’ emotional states in real-time, providing an extra dimension in game interaction by allowing the game to respond to players' emotions. The Affectiv suite can be used to monitor players’ state of mind and allow developers to tailor difficulty to suit each situation.

The Cognitiv Suite - Reads and interprets players' conscious thoughts and intent and can differentiate between multiple conscious thought commands. The Cognitiv suite reads the player's thoughts and intent, such as lifting an object, and sends commands through the API to levitate the object in the virtual world. The full list of commands available for the demo application was: lift, drop, push, zoom, left, right, rotate CW (object), rotate CCW (object), rotate left (player), rotate right (player), rotate backward, rotate forward and disappear (object). New commands can be created through the product SDK.

The Expressiv Suite - Uses signals measured by the neuroheadset to interpret players' facial expressions in real-time. The Expressiv suite provides a natural enhancement to game interaction by enabling game characters to mirror the reactions and expressions of the player in real time, including complex non-verbal expressions. This was demonstrated with a robot avatar on the screen that very closely mimicked the person wearing the headset, all the way down to the eye blinks and eyebrow movements. Head movement was mimicked accurately as well.
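From a developer's perspective, the three suites boil down to three streams of events a game loop can react to. Here's a minimal sketch of what that might look like; the event names and fields are my own invention, not the real Emotiv SDK's API.

```python
# Hypothetical event handler for the three suites: emotional state
# (Affectiv), conscious commands (Cognitiv), and facial expressions
# (Expressiv). All names are assumptions for illustration.

def handle_event(game, event):
    kind, data = event
    if kind == "affectiv":
        # Emotional state: ease off the difficulty when frustration spikes.
        if data["frustration"] > 0.8:
            game["difficulty"] = max(1, game["difficulty"] - 1)
    elif kind == "cognitiv":
        # A trained conscious command such as "lift" or "push".
        game["actions"].append(data["command"])
    elif kind == "expressiv":
        # Mirror the player's face onto their avatar.
        game["avatar"] = data["expression"]
    return game

game = {"difficulty": 5, "actions": [], "avatar": "neutral"}
game = handle_event(game, ("affectiv", {"frustration": 0.9}))
game = handle_event(game, ("cognitiv", {"command": "lift"}))
game = handle_event(game, ("expressiv", {"expression": "smile"}))
print(game)  # difficulty drops to 4, "lift" queued, avatar smiles
```

The Affectiv branch is the one the suite descriptions above emphasize for developers: difficulty tailored per situation based on the player's state of mind.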

Why Should We Care?

As game developers, the implications are simply staggering, as briefly touched on above. In addition, based on the responses Emotiv received from their display of the headset at the 2007 GDC, they have decided to make their free downloadable SDK an open standard, which gives anyone the ability to integrate it seamlessly into their game. Barring that, the headset can still be used with other software and older game titles that have no intrinsic support for the device, via EmoKey (you can see Marketing had a lot of fun with this product). EmoKey is essentially a keyboard emulator that maps your thoughts, via the headset, to the keys that control actions within a game. So you're playing Deus Ex and you think "run," which is then translated to the keyboard command for "run" in the game. Since the headset can pick up multiple thoughts, thinking of accelerating in a racing game and turning at the same time is possible as well. Awesome? Yes. Although it would take a lot of time, you could theoretically (depending on the extent to which you can map multiple keys to one command) program the headset to detect you thinking of a word, and then carry out the keypresses to type it on the screen. Imagine typing up a report without lifting a finger.
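The EmoKey concept, stripped to its essentials, is just a lookup table from detected thought events to keystrokes. A minimal sketch, assuming hypothetical thought names and a stub in place of real OS-level key injection:

```python
# Toy EmoKey-style mapper: translate detected thoughts into keystrokes
# for a game with no native headset support. The mapping table and the
# send_key stub are illustrative; the real EmoKey is a GUI tool.

KEY_MAP = {
    "run": "shift",       # think "run" -> hold the sprint modifier
    "accelerate": "up",
    "turn_left": "left",
}

def send_key(key, pressed_log):
    # Stand-in for injecting an actual key event into the OS.
    pressed_log.append(key)

def map_thoughts(thoughts, pressed_log):
    # Multiple simultaneous thoughts map to multiple held keys,
    # e.g. accelerating and turning in a racing game at once.
    for thought in thoughts:
        key = KEY_MAP.get(thought)
        if key is not None:
            send_key(key, pressed_log)

log = []
map_thoughts(["accelerate", "turn_left"], log)
print(log)  # → ['up', 'left']
```

Chaining such mappings is what would make the speculative "type a word by thinking it" trick possible, one keypress macro per detected thought.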

Does It Really Work?

This is probably the question that bothers most people who've heard about this product. Neural interfaces haven't received a whole lot of attention from the mainstream press and are still, to many people, the stuff of science fiction. So for a lot of people this product comes as a surprise, something that shouldn't exist yet. But for years now researchers have had monkeys controlling cursors on computer screens, and more recently humans have been doing much the same in labs across the world. I'll admit, even having read articles in Discover and Scientific American for years, that it didn't seem very plausible to me, and I went into the event looking for excuses to cry hoax.

I was granted one when the headset failed to operate fully during the Cognitiv demo, and Nam Do pleaded for people to turn off all cellular devices as they were "interfering with the headset." They also had a few moments during the demonstration that I couldn't help but see as planned tension-builders. The person wearing the headset would be trying to perform an action and would fail two or three times and just as they were ready to move on - Bam! It worked. I would have been more convinced had they not gotten it to work in those instances.

However, despite the fact that some part of me wanted to believe that the Wizard of Oz was behind the stage and it was all a big hoax, the more I saw the more convinced I became. The demonstration of the Expressiv Suite went quite well, as I mentioned earlier, with the robot avatar mimicking the head and facial movements and expressions of the wearer. Later in the week I ran into Albert Reed, the Studio Director and Co-Founder of Demiurge, who practically begged me to go and try it for myself. "It works! It really works!" he said, and you could just see the excitement in his eyes as he recalled the work Demiurge has been doing on the game-integration front for the headset. Not only that, but Brian Crecente over at Kotaku did get a chance to try it out, and vouched for its capabilities. I never did get an opportunity to try the headset myself.

You have to admit that it's hard to beat back skepticism over this device, but as far as I'm concerned I'm sold - though I'm still going to wait for it to hit retail and the actual reviews to come streaming in. I remain hopeful, however.
 Back to GDC 2008