Mind Reading Tech

The Next Frontier in Intuitive User Interfaces

With advancements in human-computer interfaces, communication with machines is more intuitive than ever. These natural user interfaces, however, rely on a person's ability to control voluntary movements. What about people who are immobilized or situationally impaired and can't type, tap, or speak?

Enter mind reading technology. Today, scientists and engineers are pushing the boundaries of human-computer interfaces with a new wave of "brain-reading" technologies aimed at enabling people to control machines with their thoughts.

Why Is This Happening?

Innovation inspired
Mind reading technology, for a long while, has been the stuff of science fiction. However, with recent developments in neuroscience, artificial intelligence, machine learning, and engineering, brain-to-machine communication is now a near-reality.

The convergence of new tech
Mind reading technology integrates sensor-embedded hardware with intelligent software into devices that decode electrical brain signals into commands for computing technologies. Leading tech companies like Facebook and Microsoft are already experimenting with non-invasive brain reading technologies, collecting neural data from users so their teams can design user interfaces that are more adaptive and easier to use.
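To make the "decode brain signals into commands" idea concrete, here is a minimal sketch in Python. Everything in it is an illustrative assumption rather than a real BMI pipeline: actual systems use multi-channel EEG, trained classifiers, and far more robust signal processing, while this toy maps the relative power of two frequency bands in a single synthetic signal to a hypothetical UI command.

```python
import math

def band_power(signal, fs, lo, hi):
    """Estimate power in the [lo, hi] Hz band via a naive DFT."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def decode_command(signal, fs=128):
    """Map relative alpha-band (8-12 Hz) vs beta-band (13-30 Hz) power
    to a toy UI command. The rule itself is an assumption for illustration:
    a relaxed, alpha-dominant signal means "select", otherwise "scroll"."""
    alpha = band_power(signal, fs, 8, 12)
    beta = band_power(signal, fs, 13, 30)
    return "select" if alpha > beta else "scroll"

# Synthetic one-second "recording": a dominant 10 Hz (alpha) oscillation.
fs = 128
alpha_wave = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(decode_command(alpha_wave, fs))  # -> select
```

In a real product, the hand-written band-power rule would be replaced by a classifier trained on labeled neural data, which is precisely where the AI and machine learning advances mentioned above come in.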

Cross-domain experimentation
Though the technology still has a long way to go before it can accurately decode the brain activity of mass audiences or reliably approximate user intent, innovation is already happening in both academia and industry through the combination of non-invasive, brainwave-reading technologies (e.g., EEG or EMG) and artificial intelligence.

What's on the Horizon?

Brain-machine interfaces
Startups are already working to bring brain-machine interfaces (BMIs) to consumers by enabling developers to build next-generation interactions on their brain-reading platforms. CTRL-Labs offers "neural control" kits that translate neuromuscular signals into instructions for computing devices. Similarly, Neurable, maker of the world's first brain-controlled game, "Awakening," has released an SDK for game developers.

"Mind-reading" wearables
A number of startups have already figured out how to embed EEG sensors into consumer-friendly wearables that provide users with biofeedback to help them train or de-stress. Notably, researchers at MIT have developed AlterEgo, a non-invasive wearable that enables computers to receive prompts you think but don't say aloud. The promise of non-invasive wearables is key to convincing users they have control over this new tech.

AlterEgo: Interfacing with devices through silent speech

Real-time brain decoding
Researchers at Carnegie Mellon have found a way to combine brain imaging technology with machine-learning algorithms to decode "complex thoughts." Nissan, meanwhile, is experimenting with brain decoding in its "Brain-to-Vehicle" (B2V) system to help predict driver actions and decode drivers' intentions.

Smashing Recommendations

  1.

    Start learning more about the brain

    It is going to take a while before brain-machine interfaces are available to a mass audience. In the meantime, digital and innovation teams should develop an understanding of how the brain works, especially the visual cortex, neuromuscular signals, and motor neurons. These systems and signals are the key building blocks of emerging mind reading tech.

  2.

    Examine the need for more intuitive user interfaces

    Brain-machine interfaces have the potential to make your product or service more accessible to all users and provide them with greater safety, comfort, and ease of use. The important question is, what critical need or pain point would a brain-machine interface address for your users? How will you ensure your solution creates value and is not just a novelty?

  3.

    Assure users they have complete control

    Transparency and a clear articulation of the value your mind-controlled device offers will be critical in helping users adapt to a new frontier in human-computer interaction. Furthermore, as government regulation takes shape, the tech community needs to play an active role in governing the technology and protecting the end user.
