Get ready to dump the keyboard: Experts claim mind-controlled computers are just a decade away

  • Expert says we are moving towards 'computing at the speed of thought'
  • Open-source projects will allow people to assemble their own neuroheadsets
  • This technology will enable us to capture brain activity noninvasively
  • In 10 to 15 years, this hardware will recognize nouns we think about
  • Computers will be able to create text documents by reading our thoughts

The first computers cost millions of dollars and were locked inside rooms equipped with special electrical circuits and air conditioning. 

The only people who could use them had been trained to write programs in that specific computer's language. 

Today, gesture-based interactions using multitouch pads and touchscreens, and the exploration of virtual 3D spaces, allow us to interact with digital devices in ways very similar to how we interact with physical objects.


Multitouch pads and touchscreens recognize movements of fingers on a surface, while devices such as the Wii and Kinect recognize movements of arms and legs. Frances Van Scoy says this is bringing us closer to 'computing at the speed of thought'

HOW COULD IT HAPPEN? 

A professor at West Virginia University believes her research is helping to move us toward what might be called 'computing at the speed of thought.'

Frances Van Scoy says low-cost open-source projects such as OpenBCI allow people to assemble their own neuroheadsets that capture brain activity noninvasively.

Ten to 15 years from now, hardware/software systems using those sorts of neuroheadsets could assist Van Scoy by recognizing the nouns she's thought about in the past few minutes.

If it replayed the topics of her recent thoughts, she could retrace her steps and remember what thought triggered the most recent one. 


This newly immersive world is not only open for more people to experience; it also allows almost anyone to exercise their own creativity and innovative tendencies. 

No longer are these capabilities dependent on being a math whiz or a coding expert: Mozilla's 'A-Frame' is making the task of building complex virtual reality models much easier for programmers. 

And Google's 'Tilt Brush' software allows people to build and edit 3D worlds without any programming skills at all.

My own research aims to develop the next phase of human-computer interaction. 

We are monitoring people's brain activity in real time and recognizing specific thoughts (of 'tree' versus 'dog' or of a particular pizza topping). 

It will be yet another step in the historical progression that has brought technology to the masses – and will widen its use even more in the coming years.

From those early computers dependent on machine-specific programming languages, the first major improvement allowing more people to use computers was the development of the Fortran programming language.

It expanded the range of programmers to scientists and engineers who were comfortable with mathematical expressions. This was the era of punch cards, when programs were written by punching holes in cardstock, and output had no graphics – only keyboard characters.

By the late 1960s mechanical plotters let programmers draw simple pictures by telling a computer to raise or lower a pen, and move it a certain distance horizontally or vertically on a piece of paper. 

The commands and graphics were simple, but even drawing a basic curve required understanding trigonometry, to specify the very small intervals of horizontal and vertical lines that would look like a curve once finished.
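To make that trigonometry concrete, here is a minimal Python sketch of how such a plotter program might approximate a circle. The PEN_DOWN/PEN_UP/MOVE command names are invented for illustration, but chaining many tiny straight moves is exactly how those early curves were drawn.

```python
import math

def plot_circle(radius, steps):
    """Approximate a circle the way an early mechanical plotter did:
    trigonometry gives the tiny horizontal and vertical offsets that,
    chained together, look like a curve once finished."""
    commands = ["PEN_DOWN"]
    x, y = radius, 0.0
    for i in range(1, steps + 1):
        angle = 2 * math.pi * i / steps
        nx, ny = radius * math.cos(angle), radius * math.sin(angle)
        # Each step is a short straight move, not a true arc.
        commands.append(f"MOVE {nx - x:+.3f} {ny - y:+.3f}")
        x, y = nx, ny
    commands.append("PEN_UP")
    return commands
```

With enough steps, the straight segments become indistinguishable from a smooth circle on paper.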

MIND CONTROLLED DRONES ARE WEAPONS OF THE FUTURE 

A team of researchers has developed technology that lets a human control multiple drones using their brain waves, and the group is now working on squadrons of drones that could perform complex operations.

Researchers at the Human-Oriented Robotics and Control (HORC) lab at Arizona State University have been working with the US army for the last two years.

The system uses a single controller who watches the drones while a computer reads their thoughts.

The controller wears a skull cap fitted with 128 electrodes wired to a computer. The device records electrical brain activity. If the controller moves a hand or thinks of something, certain areas light up.

These thoughts are then communicated to the robots using Bluetooth. 

The 1980s introduced what has become the familiar windows, icons and mouse interface. 

That gave nonprogrammers a much easier time creating images – so much so that many comic strip authors and artists stopped drawing in ink and began working with computer tablets. 

Animated films went digital, as programmers developed sophisticated proprietary tools for use by animators.

Low-cost open-source projects will let people assemble their own neuroheadsets (pictured is a neuroheadset that connects brain activity to the car's engine) that capture brain activity noninvasively

Simpler tools became commercially available for consumers. In the early 1990s the OpenGL library allowed programmers to build 2D and 3D digital models and add color, movement and interaction to these models.

In recent years, 3D displays have become much smaller and cheaper than the multi-million-dollar CAVE and similar immersive systems of the 1990s. 

They needed space 30 feet wide, 30 feet long and 20 feet high to fit their rear-projection systems. 

RACING GAME USES BRAINWAVES TO POWER CARS 

One firm has developed what it considers to be the next level in gaming - a headset that lets you control on-screen and physical objects using just your mind.

The game was developed in partnership with the Institute of Electrical and Electronics Engineers (IEEE) and Australian-based Emotiv.

A ‘driver’ is wired up to Emotiv’s electroencephalography (EEG) headset and the device is trained to read their unique brain patterns.

The first step involves training the headset to learn the wearer’s ‘neutral’ state. This involves ‘clearing their brain’.

Ten to 15 years from now, hardware/software systems using neuroheadsets could assist Van Scoy by recognizing the nouns she’s thought about in the past few minutes. Pictured is one that lets you control on-screen and physical objects using just your mind

They are then asked to think of a repetitive task that will be associated with driving the car.

This doesn't need to be a driving-related thought; it can be any thought that the wearer can continuously think and repeat.

This is known as the ‘push’ state and for MailOnline’s test this involved thinking about playing Greensleeves on a piano and imagining the finger positions as they move through the chords.

Once the headset is trained the game begins. The wheels of an on-screen car begin to spin to signal that the brain patterns are being recognised.

The wearer is then asked to think about their repetitive task, at which point the car begins to move.

During the demonstration, these brain waves moved a car the size of a shoebox around a track; each race involves two players, each wired up to a headset.

The Emotiv headsets are embedded with sensors that record electrical activity along the wearer’s scalp, forehead and above the right ear.

These sensors measure and monitor brain waves and these patterns are converted to commands using a brain-computer interface.
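As a rough illustration of that conversion step, here is a toy Python sketch of how a brain-computer interface might turn a window of EEG samples into a game command. Real systems classify trained spatial and spectral patterns per wearer; the mean-power threshold and the command names here are invented stand-ins for the trained ‘push’ detector.

```python
def eeg_to_command(samples, threshold=50.0):
    """Toy BCI step: map a window of EEG samples (microvolts) to a
    command. A simple mean-power threshold stands in for the
    per-wearer classifier a real headset would train."""
    power = sum(s * s for s in samples) / len(samples)
    return "PUSH" if power >= threshold else "NEUTRAL"
```

In the racing game above, a stream of "PUSH" outputs would keep the car moving, while "NEUTRAL" would let it coast to a stop.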

Now smartphone holders can provide a personal 3D display for less than US$100.

User interfaces have similarly become more powerful. 

Multitouch pads and touchscreens recognize movements of multiple fingers on a surface, while devices such as the Wii and Kinect recognize movements of arms and legs. 

A company called Fove has been working to develop a VR headset that will track users' eyes, and which will, among other capabilities, let people make eye contact with virtual characters.

A company called Fove (pictured) has been working to develop a VR headset that will track users' eyes, and which will, among other capabilities, let people make eye contact with virtual characters

My own research is helping to move us toward what might be called 'computing at the speed of thought.' 

Low-cost open-source projects such as OpenBCI allow people to assemble their own neuroheadsets that capture brain activity noninvasively.

Ten to 15 years from now, hardware/software systems using those sorts of neuroheadsets could assist me by recognizing the nouns I've thought about in the past few minutes. 

If it replayed the topics of my recent thoughts, I could retrace my steps and remember what thought triggered my most recent thought.

JAPANESE MACHINE COULD ALLOW TELEPATHIC TALK

A 'mind-reading' device that can decipher words from brainwaves without them being spoken has been developed by Japanese scientists, raising the prospect of 'telepathic' communication.

Researchers have found the electrical activity in the brain is the same when words are spoken and when they are left unsaid.

By looking for the distinct wave forms produced before speaking, the team was able to identify words such as 'goo', 'scissors' and 'par' when spoken in Japanese.
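One simple way to picture that decoding step is template matching: compare the measured pre-speech waveform against a stored example for each word and pick the best match. The sketch below is a toy Python illustration with made-up waveforms; the actual study used far richer signal features and classifiers.

```python
def correlate(a, b):
    """Pearson correlation between two equal-length, non-constant
    waveforms (assumed here for simplicity)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def identify_word(waveform, templates):
    """Pick the word whose stored pre-speech waveform best matches
    the measured one."""
    return max(templates, key=lambda w: correlate(waveform, templates[w]))
```

A real decoder must also cope with noise, timing jitter and variation between repetitions, which is why the reported systems need training data for each speaker.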

Researchers from Japan used technology that measures the electrical activity of the brain to decipher brainwaves that occur before someone speaks (stock picture). They found distinct brainwaves were formed before syllables were spoken

With more sophistication, perhaps a writer could wear an inexpensive neuroheadset and imagine characters, an environment and their interactions. 

The computer could deliver the first draft of a short story, either as a text file or even as a video file showing the scenes and dialogue generated in the writer's mind.

Once human thought can communicate directly with computers, a new world will open before us. 

One day, I would like to play games in a virtual world that incorporates social dynamics as in the experimental games 'Prom Week' and 'Façade' and in the commercial game 'Blood & Laurels.'

The computer could deliver the first draft of a short story, either as a text file or even as a video file showing the scenes and dialogue generated in the writer's mind. Once human thought can communicate directly with computers, a new world will open before us

This type of experience would not be limited to game play. 

Software platforms such as an enhanced Versu could enable me to write those kinds of games, developing characters in the same virtual environments they'll inhabit.

Years ago, I envisioned an easily modifiable application that allows me to have stacks of virtual papers hovering around me that I can easily grab and rifle through to find a reference I need for a project. 

I would love that. I would also really enjoy playing 'Quidditch' with other people while we all experience the sensation of flying via head-mounted displays and control our brooms by tilting and twisting our bodies.

Frances Van Scoy, Associate Professor of Computer Science and Electrical Engineering, West Virginia University

This article was originally published on The Conversation. Read the original article.
