Introducing the Neo Revolutionary Thought User Interface (TUI)



We are now living in the hyper-neo-future, the future that is now, the future in which your VR 360 device will interact with you through your thoughts. That's right, folks: you will now be able to communicate with your device through the electrical signals that fire inside your brain whenever you think!



Planet Earth initiates commands and sustains and controls life on it through electrical signals! That's right, folks: God Earth evolves life and the planetary environment through various thought-based electrical signals.






This is our planet thinking, and these are the electrical signals going inside its brain!

Brain-Computer Interface

What it is: In a brain-computer interface, a computer is controlled purely by thought (or, more accurately, brain waves). There are a few different approaches being pursued, including direct brain implants, full helmets, and headbands that capture and interpret brain waves.

Army Mind-Control Projects

According to an article in Time from September 2008, the American Army is actively pursuing "thought helmets" that could someday enable secure mind-to-mind communication between soldiers. The goal, according to the article, is a setup in which entire military systems could be controlled by thought alone.
While this kind of technology is still far off, the fact that the military has awarded a $4 million contract to a team of scientists from the University of California at Irvine, Carnegie Mellon University, and the University of Maryland means that we might be seeing prototypes of these systems within the next decade.

The Matrixesque Brain Interface: MEMS-Based Robotic Probe

Researchers at Caltech are working on a MEMS-based robotic probe that can implant electrodes into your brain to interface with particular neurons. While it sounds very The Matrix-y, the idea is that it could allow for advanced control of prosthetic limbs or similar body-control applications.
The software part of the device is complete, though the micro-mechanical part (the part that actually goes into your brain) is still under development.

The Army's Totally Serious Mind-Control Project

Army scientists want to cram this array of brain-wave reading sensors into a helmet.
 
Soldiers barking orders at each other is so 20th Century. That's why the U.S. Army has just awarded a $4 million contract to begin developing "thought helmets" that would harness silent brain waves for secure communication among troops. Ultimately, the Army hopes the project will "lead to direct mental control of military systems by thought alone."
If this sounds insane, it would have been as recently as a few years ago. But improvements in computing power and a better understanding of how the brain works have scientists busy hunting for the distinctive neural fingerprints that flash through a brain when a person is talking to himself. The Army's initial goal is to capture those brain waves with incredibly sophisticated software that then translates the waves into audible radio messages for other troops in the field. "It'd be radio without a microphone," says Dr. Elmar Schmeisser, the Army neuroscientist overseeing the program. "Because soldiers are already trained to talk in clean, clear and formulaic ways, it would be a very small step to have them think that way."
B-movie buffs may recall that Clint Eastwood used similar "brain-computer interface" technology in 1982's Firefox, named for the Soviet fighter plane whose weapons were controlled by the pilot's thoughts. (Clint was sent to steal the plane, natch.) Yet it's not as far-fetched as you might think: video gamers are eagerly awaiting a crude commercial version of brain wave technology — a $299 headset from San Francisco-based Emotiv Systems — in summer 2009.
The Army doesn't move quite as fast as gamers, though. The military's vastly more sophisticated system may be a decade or two away from reality, let alone implementation. The five-year contract it awarded last month to a coalition of scientists from the University of California at Irvine, Carnegie Mellon University, and the University of Maryland seeks to "decode the activity in brain networks" so that a soldier could radio commands to one or many comrades by thinking of the message he wanted to relay and who should get it. Initially, the recipients would most likely hear transmissions rendered by a robotic voice via earphones. But scientists eventually hope to deliver a version in which commands are rendered in the speaker's voice and indicate the speaker's distance and direction from the listener.
"Having a soldier gain the ability to communicate without any overt movement would be invaluable both in the battlefield as well as in combat casualty care," the Army said in last year's contract solicitation. "It would provide a revolutionary technology for silent communication and orientation that is inherently immune to external environmental sound and light."
The key challenge will be to develop software able to pinpoint the speech-related brain waves picked up by the 128-sensor array that ultimately will be buried inside a helmet. Those sensors detect the minute electrical charges generated by nerve pathways in the brain when thinking occurs. The sensors will generate an electroencephalogram — a confusing pile of squiggles on a computer screen — that scientists will study to find those vital to communicating. "We think we can train a computer to understand those squiggles to the point that they can read off the commands that your brain is issuing to your mouth and lips," Schmeisser says. Unfortunately, it's not a matter of finding the single right squiggle. "There's no golden neuron that's talking," he says.
Dr. Mike D'Zmura of UC-Irvine, the lead scientist on the project, says his task is akin to finding the right strands on a plate full of pasta. "You need to pick out the relevant pieces of spaghetti," he says, "and sometimes they have to be torn apart and re-attached to others." But with ever-increasing computing power the task can be done in real time, he says. Users also will have to be trained to think loudly. "How do we get a person to think something to themselves in a way that leaves a very strong signal in EEGs that we can read off against the background noise?" D'Zmura asks. Finally, because every person's EEG is different, persons using "thought helmets" will have to be trained so that computers intercepting their unspoken commands recognize each user's unique mental pattern.
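For a rough sense of what "training a computer to understand those squiggles" involves, here is a minimal decoding sketch: band-pass filter the multi-channel EEG, turn each recorded epoch into band-power features, and fit a classifier calibrated on that one user's data. Everything in it — the sampling rate, the frequency bands, the LDA classifier, the helper names — is an illustrative assumption, not the project's actual software.

```python
# Toy per-user EEG decoding pipeline (illustrative assumptions throughout).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256          # sampling rate in Hz (assumed)
N_SENSORS = 128   # sensor count mentioned in the article

def bandpass(eeg, low, high, fs=FS):
    """Zero-phase band-pass filter applied along each channel (rows = channels)."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=1)

def band_power_features(eeg):
    """Log band power per channel in a few classic EEG bands (assumed choice)."""
    bands = [(4, 8), (8, 13), (13, 30)]  # theta, alpha, beta
    feats = [np.log(np.mean(bandpass(eeg, lo, hi) ** 2, axis=1)) for lo, hi in bands]
    return np.concatenate(feats)

def train_user_model(epochs, labels):
    """Calibrate on one user's recordings of silently 'spoken' commands."""
    X = np.stack([band_power_features(e) for e in epochs])
    return LinearDiscriminantAnalysis().fit(X, labels)

def decode(model, epoch):
    """Map a new EEG epoch to the most likely command label."""
    return model.predict([band_power_features(epoch)])[0]
```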
Both scientists pre-emptively deny expected charges that they're literally messing with soldiers' minds. "A lot of people interpret wires coming out of the head as some sort of mind reading," D'Zmura sighs. "But there's no way you can get there from here," Schmeisser insists. "Not only do you have to be willing, but since your brain is unique, you have to train the system to read your mind — so it's impossible to do it against someone's will and without their active and sustained cooperation."
And don't overlook potential civilian benefits. "How often have you been annoyed by people screaming into their cell phones?" Schmeisser asks. "What if instead of their Bluetooth earpiece it was a Bluetooth headpiece and their mouth is shut and there's blessed silence all around you?" Sounds like one of those rare slices of the U.S. military budget even pacifists might support.

THOUGHTS INTO MOTION: AMAZING BRAIN-CONTROLLED DEVICES THAT ARE ALREADY HERE

 
Why toil with the pressing of buttons or tilting of joysticks when controlling something can be as simple as thinking about it? This kind of technology may sound like the stuff of science fiction, but in reality, it’s actually been around for decades. The development of electroencephalography (EEG) technology can be traced back to the 1920s, but it wasn’t until recently that we figured out a way to use neurofeedback to control electronic devices. Nowadays, we can use brain-computer interfaces (BCIs) to control everything from prosthetic limbs to robotic arms, cars, and even things as simple as your computer’s cursor.
In the past couple of years, BCI technology has expanded in leaps and bounds. Not only are sensor technologies becoming more advanced, but companies like Emotiv and NeuroSky are working to make BCI headsets more affordable and available to consumers. Software development kits are available for most major EEG headsets, which means developers everywhere can tinker with the technology and help expand its uses. Here’s a look at some of the latest developments in the world of mind control, including many you can buy today.
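To make the "developers everywhere can tinker" point concrete, most headset apps boil down to the same loop: connect, poll a derived metric, and react when it crosses a threshold. The sketch below shows only that shape; `headset_sdk` and its `connect`/`read` calls are hypothetical stand-ins, not Emotiv's or NeuroSky's real APIs.

```python
import time
import headset_sdk  # hypothetical wrapper module, not a real vendor SDK

def on_attention(level):
    """React to the 0.0-1.0 attention metric most consumer headsets expose."""
    if level > 0.7:
        print("High focus: trigger action")
    elif level < 0.3:
        print("Low focus: stay idle")

def main():
    device = headset_sdk.connect()     # assumed: opens the headset link
    try:
        while True:
            sample = device.read()     # assumed: returns the latest metrics dict
            on_attention(sample["attention"])
            time.sleep(0.1)            # ~10 Hz polling loop
    finally:
        device.close()

if __name__ == "__main__":
    main()
```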

MindSet (2007)

To give you a sense of chronology, we’ll take a step back and start with some of the earliest consumer applications of BCI. About five years ago, NeuroSky created the MindSet, the first affordable EEG headset. Up until that point, dry sensor technology wasn’t easily accessible or easy to use. The MindSet shipped with an early version of NeuroBoy, a game where you use thoughts to trigger telekinetic powers to manipulate objects and accomplish tasks. NeuroSky has since released a newer EEG headset called the MindWave, which, despite looking newer, is essentially the same technology without the headphones.
 

MindFlex (2009)

Back in 2009, NeuroSky partnered with Mattel to make MindFlex — a game where players are tasked with moving a ball through an obstacle course using nothing more than their thoughts. The game was a huge commercial success, and surely helped put NeuroSky on the map. Today the company’s chips are used in a number of different EEG headsets, and it has a rapidly growing app store filled with games from developers who have taken advantage of the company’s free SDK.
 

Emotiv EPOC (2011)

About a year ago, Emotiv released its own take on the EEG headset. Using an array of 14 different sensors and two gyroscopes, it can pick up four different mental states, 13 conscious thoughts, a range of different facial expressions, and head movement in any direction. Like NeuroSky, Emotiv also offers a software development kit — the only difference is that this one isn’t free. Although we haven’t used it firsthand, its list of features suggests that it’s the most advanced EEG headset available to consumers.

Necomimi headset (2011)

This one is a bit on the strange side. Made by the Japanese company Neurowear, this headset uses a dry electrode to sense your attentiveness, which is then expressed by the headset’s ears. If you’re feeling scatterbrained and out of it, the ears will droop. When you’re on point, they’ll perk up, and when you’re happy or excited, they’ll wiggle to show your mood. We don’t expect these to become the next fashion craze, but Necomimi’s idea of expressing emotion through accessories is one that could definitely become more common in the future.

Toyota PXP Bicycle

Back in 2011, Toyota embarked on a project with Parlee Cycles (dubbed the Prius X Parlee) that aimed to blend classic design principles with modern technology in a bicycle. The end result was a beautiful bike with some seriously cool features. It has a dock on the handlebars that you can plug an iPhone into, allowing you to track your speed and get navigational information. But even better than that, the bike allows you to shift gears with nothing more than a thought. Just pop on the helmet that’s retrofitted with an Emotiv EPOC headset, and you’re ready to roll.
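The interesting engineering wrinkle in a setup like this is keeping stray thoughts from grinding your gears: a mental "shift" detection is noisy, so you want a confidence threshold and a hold-off period between shifts. Here is a hedged sketch of that debouncing logic; the command names, thresholds, and gear count are assumptions, not the actual Prius X Parlee firmware.

```python
import time

SHIFT_THRESHOLD = 0.8   # assumed confidence required before acting
HOLDOFF_SECONDS = 1.5   # assumed minimum gap between shifts

class GearShifter:
    """Debounce noisy 'shift up'/'shift down' detections into discrete gear changes."""

    def __init__(self, gears=11):
        self.gear = 1
        self.gears = gears
        self.last_shift = 0.0

    def on_detection(self, command, confidence):
        now = time.monotonic()
        if confidence < SHIFT_THRESHOLD or now - self.last_shift < HOLDOFF_SECONDS:
            return self.gear            # ignore weak or too-frequent signals
        if command == "shift_up" and self.gear < self.gears:
            self.gear += 1
        elif command == "shift_down" and self.gear > 1:
            self.gear -= 1
        self.last_shift = now
        return self.gear
```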

BrainDriver

Sure, Google has proven that cars can drive with minds of their own, but that’s a whole different ball game than controlling a car with your own mind. German company BrainDriver has created a program using Emotiv’s SDK that allows a driver to pilot a car without actually touching the steering wheel or pedals. Instead, the EEG sensor is programmed to pick up on conscious thoughts like “left” or “forward” and wirelessly beam them to a custom-built automated control system that pushes the pedals and turns the steering wheel. Although it’s not legally roadworthy just yet, it’s a good proof of concept that has helped establish BCI as a legitimate interface worthy of further exploration.
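Conceptually, the pipeline is: classify a discrete mental command, then translate it into smooth actuator setpoints so the car doesn't lurch every time a detection flickers. Below is a toy sketch of that translation layer only; the command names, slew rate, and value ranges are assumptions, not BrainDriver's actual code.

```python
class ThoughtToControls:
    """Turn discrete mental commands into smoothed steering/throttle setpoints."""

    def __init__(self):
        self.steering = 0.0   # -1.0 (full left) .. 1.0 (full right)
        self.throttle = 0.0   #  0.0 .. 1.0

    def update(self, command, dt):
        # Each detected command sets a target; the actuators slew toward it gradually.
        targets = {
            "left":    (-1.0, self.throttle),
            "right":   (1.0, self.throttle),
            "forward": (0.0, 0.4),
            "stop":    (0.0, 0.0),
        }
        target_steer, target_throttle = targets.get(command, (self.steering, self.throttle))
        step = 0.5 * dt  # assumed maximum change per second
        self.steering += max(-step, min(step, target_steer - self.steering))
        self.throttle += max(-step, min(step, target_throttle - self.throttle))
        return self.steering, self.throttle
```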

Board of Imagination

Before they got the idea to control a skateboard with their minds, the tinkerers over at Chaotic Moon Labs created what they called the “Board of Awesomeness” — a motorized skateboard that was retrofitted with an Xbox Kinect and controlled with gestures. But apparently waving their hands around in order to make it go was too 2011 for them, so they swapped motion control for BCI to create the “Board of Imagination.” Check out the video where the pilot — who goes by the name Whurley — explains how easy the board is to use. All you have to do is imagine a point somewhere ahead of you, imagine yourself being there, and the board moves forward.

SWARM Extreme: Brainwave-controlled AR.Drone Parrot

With bikes, cars, and skateboards all covered, it seems it was only a matter of time before someone took brainwave control to the skies. A couple of students at Northeastern University's College of Computer and Information Science have done just that, building a program that allows them to fly the popular AR.Drone Parrot via BCI. With nothing more than their brainwaves, these guys can remotely control the flight path of one or multiple quadcopter drones. If this catches on, perhaps the days of the boxy two-joystick controller with a long antenna will soon be behind us.
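The "one brain, many drones" part is mostly a fan-out problem: whatever command the headset decodes gets serialized and sent to every drone's control endpoint. A minimal sketch of that fan-out is below; the JSON payload, port, and addresses are placeholders, not the AR.Drone's real command protocol, which the students' software would speak underneath.

```python
import json
import socket

# Assumed drone control endpoints; real AR.Drones speak their own protocol.
DRONES = [("192.168.1.10", 5000), ("192.168.1.11", 5000)]

def broadcast_command(command):
    """Send the same high-level command (e.g. 'takeoff', 'hover') to every drone."""
    payload = json.dumps({"cmd": command}).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        for address in DRONES:
            sock.sendto(payload, address)
    finally:
        sock.close()

# Usage: broadcast_command(decoded_label)  # decoded_label comes from the BCI classifier
```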

Software

In addition to the various hardware that can be controlled by thought, more and more mind-controlled software and games keep popping up every day. The two big players in the BCI game are definitely Emotiv and NeuroSky, and both have digital storefronts where you can purchase games and software applications made by various developers and studios. There are a bunch of them out there, but here are a few of the more interesting ones.

UpCake

You’ve gotta love game developers who follow through on silly ideas. The goal of UpCake is to maneuver a flying cupcake upward as high as you can. Concentration is key as you’ll have to overcome various hindering forces as the levels get harder. It’s designed to work with Android, so you can literally filter your brainwaves through Ice Cream Sandwich to levitate a cupcake. What a world we live in.
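The core mechanic is easy to picture as a tiny physics loop: each frame, the player's attention level becomes upward thrust fighting gravity. The numbers and names below are made up for illustration and are not UpCake's actual code.

```python
GRAVITY = 9.8      # downward pull
MAX_THRUST = 15.0  # thrust at full concentration (assumed)

def step(height, velocity, attention, dt=1 / 60):
    """Advance the cupcake one frame; attention is a 0.0-1.0 value from the headset."""
    acceleration = attention * MAX_THRUST - GRAVITY
    velocity += acceleration * dt
    height = max(0.0, height + velocity * dt)
    if height == 0.0:
        velocity = 0.0   # resting on the ground
    return height, velocity
```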

28 Spoons Later

Whoever thought this one up is a genius. The premise is that a zombie is trying to eat your brain, but he’s a civilized zombie, so he can only do so with a spoon. To avoid having your brains eaten, you’ll need to use your Matrix-like brain powers to bend the spoon and render it useless to Mr. Zombie. I’m guessing the key to success is taking that little bald-headed kid’s advice: “Do not try to bend the spoon; that’s impossible. Only try to realize the truth — there is no spoon. Then you will see that it is not the spoon that bends, it is only yourself.” What an awesome blend of pop culture references and next-gen technology.

SubConch

SubConch is a mind-controlled synthesizer. How awesome is that? The software allows you to control various musical properties (volume, pitch, LFO speed, LFO depth, modulation frequency, wave shape, and reverb) by linking them to certain cognitive thoughts, emotions, or even facial expressions. For example, you could set the pitch to change when you feel frustration, or increase the reverb by simply thinking “reverb up.” Imagine in the future when this kind of tech becomes more polished, and allows you to directly translate the music in your head into a live performance that others can hear.
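A toy version of the mapping SubConch describes might look like this: EEG-derived metrics drive synth parameters, with one metric bending pitch and another opening the volume. The metric names, ranges, and the offline rendering are all assumptions; real-time audio output and the full parameter list are left out.

```python
import numpy as np

SAMPLE_RATE = 44100

def render_tone(excitement, frustration, seconds=0.5):
    """Render a short sine-tone buffer shaped by two 0.0-1.0 mental metrics."""
    freq = 220.0 * (1.0 + excitement)   # pitch rises from 220 Hz toward 440 Hz
    amp = 0.2 + 0.6 * frustration       # louder as frustration rises
    t = np.arange(int(SAMPLE_RATE * seconds)) / SAMPLE_RATE
    return (amp * np.sin(2 * np.pi * freq * t)).astype(np.float32)
```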
These may be some impressive devices, but they really represent just the tip of the iceberg. If you’re looking for more cool things you can control with your brain, we suggest checking out what Emotiv and NeuroSky have available in their stores, or visiting NeuroGadget, a site dedicated to covering the latest products and developments in BCI.
 
 
Our brain generates electrical signals with every thought, and distinct thoughts produce measurably different brainwave patterns. These unique patterns can be mapped to specific commands, so that merely thinking the thought can carry out the assigned command.
With the EPOC neuroheadset created by Tan Le, co-founder and president of Emotiv Lifesciences, users don a futuristic headset that detects the brainwaves generated by their thoughts.
As you can see from this demo video, the command executed by thought is pretty primitive (i.e. pulling the cube towards the user), and yet the detection seems to be facing some difficulties. It looks like this UI may take a while to be adequately developed.
In any case, envision a (distant) future where you could operate computer systems with thoughts alone. From the concept of a ‘smart home’ where you could turn the lights on or off without having to step out of bed in the morning, to the idea of immersing yourself in the ultimate gaming experience, one that responds to your mood (via brainwaves), the potential for such an awesome UI is practically limitless.
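Once a headset can reliably decode even a small vocabulary of thoughts, the 'smart home' part is mostly a dispatch table. The sketch below shows that idea only; the decoded labels and the home-automation actions are hypothetical placeholders.

```python
def lights_on():
    print("Lights: on")        # placeholder for a real home-automation call

def lights_off():
    print("Lights: off")

def start_coffee():
    print("Coffee maker: brewing")

COMMANDS = {
    "lights_on": lights_on,
    "lights_off": lights_off,
    "coffee": start_coffee,
}

def handle_thought(label):
    """Run whatever action the decoded thought maps to; ignore unknown labels."""
    action = COMMANDS.get(label)
    if action:
        action()

# Usage: handle_thought("lights_on")  # the label would come from the headset's decoder
```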




