Can AI read my mind? Honestly, I’d be way more worried if another person could do that using AI! Jokes aside, though, the idea of AI decoding our thoughts doesn’t seem so far-fetched anymore. I recently came across some fascinating developments that blend neuroscience and AI in ways that are both inspiring and deeply human.
Here’s the story: Researchers in Sydney have created an AI-powered system that translates brain signals into words using a wearable cap embedded with sensors that read electrical activity from the brain. This isn’t your typical sci-fi tale—it’s a real, experimental device.
Here’s how it works: The cap picks up the brain’s electrical signals, sending them to a monitoring unit where a deep learning AI decoder processes and converts these signals into written words. Then, a large language model steps in to refine the text and correct any mistakes, before displaying the final output on a screen. While the technology is still in its early stages and currently trained on a limited set of words and phrases, it’s already showing promising results.
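To make the two-stage idea concrete, here is a minimal toy sketch of that kind of pipeline. Everything in it is illustrative: the vocabulary, the nearest-neighbor "decoder," and the string-matching "language model" pass are simple stand-ins I invented for this example, not the actual UNSW system, which uses a trained deep-learning decoder and a real LLM.

```python
import numpy as np

# Toy stand-in vocabulary: each word is paired with a made-up
# "EEG feature" vector. A real system learns this mapping from data.
VOCAB = {
    "hello": np.array([1.0, 0.0, 0.0]),
    "water": np.array([0.0, 1.0, 0.0]),
    "help":  np.array([0.0, 0.0, 1.0]),
}

def decode_signal(features):
    """Stage 1 (decoder stand-in): map an EEG feature vector
    to the nearest known vocabulary word."""
    return min(VOCAB, key=lambda w: np.linalg.norm(features - VOCAB[w]))

def lm_correct(word, lexicon=VOCAB):
    """Stage 2 (LLM stand-in): snap a near-miss string to the
    closest word in the lexicon by character overlap."""
    if word in lexicon:
        return word
    return max(lexicon, key=lambda w: sum(a == b for a, b in zip(w, word)))

# Simulated noisy "brain signal" features, closest to the "hello" vector.
noisy = np.array([0.9, 0.2, 0.1])
print(lm_correct(decode_signal(noisy)))  # prints "hello"
```

The shape is the point, not the math: one stage turns raw signals into a best-guess word, and a second stage cleans up that guess using knowledge of language.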
The AI correctly identified the target word about 75% of the time, with researchers aiming for 90% accuracy—a huge leap for non-invasive brain-wave decoding.
This technology belongs to a larger family known as brain-computer interfaces (BCIs). The concept isn’t entirely new, but the range of approaches and their applications have been growing rapidly. BCIs essentially pick up signals that reflect your intention—like moving your hand—and translate those intentions into commands that computers can understand.
Most famously, Elon Musk’s Neuralink is pushing the envelope with a tiny chip implanted directly into the brain through surgery. The chip has enabled a few individuals to control devices—whether it’s moving a cursor or playing video games—with their thoughts alone. There are even clinical trials underway for “telepathy” products that aim to let people control their phones or computers just by thinking, with expansion into Canada, the UK, and the UAE already approved.
What’s particularly remarkable about Neuralink is that it has achieved full cursor control by thought alone, without relying on eye-tracking or external sensors. Watching the demonstration of the first user moving a MacBook Pro cursor with pure mental commands is nothing short of mind-blowing.
At the same time, other BCIs are following different paths. US-based Paradromics is developing a device called Connexus, which involves a microelectrode array implanted under the skull to detect neural activity with very high precision. This system is designed to help patients with severe neurological disorders regain speech and movement.
Compared to these invasive solutions, the system at the University of New South Wales (UNSW) in Sydney stands out because it is completely non-invasive. Instead of surgeries or implants, it uses a wearable EEG cap to read brain waves and an external AI unit to translate thoughts into text—making it accessible and less risky.
Though the accuracy of this non-invasive approach is not perfect yet, this technology promises to be a game changer, especially for people recovering from strokes or facing paralysis and speech difficulties.
It’s inspiring to see how medical needs—like restoring lost motor or speech functions—are driving these technologies forward. Once those critical needs are met, the possibilities explode from there—imagine silent thought-based commands for augmented reality or effortless communication without speaking.
The most exciting aspect? The simplicity and ease of use of the non-invasive system make it the most immediately compelling for broad adoption and real-world impact.
These developments remind me that the future of AI isn’t just about machines getting smarter—it’s about connecting in more human ways than ever before. The bridge from brain to computer might just redefine how we communicate, live, and heal.