AI Developed in Australia Decodes Brainwaves into Words
(MENAFN) Australian scientists have unveiled a groundbreaking artificial intelligence system capable of translating brainwaves into readable words and phrases—paving the way for major advances in both healthcare and human-computer interaction.
As reported by Australia's national broadcaster, the innovative model was created by Daniel Leong, PhD student Charles Zhou, and Professor Chin-Teng Lin at the GrapheneX-Human-centric Artificial Intelligence Center, part of the University of Technology Sydney.
Using electroencephalogram (EEG) data captured through a non-invasive wearable cap, the AI leverages deep learning techniques to translate neural signals into recognizable language.
While currently limited to a defined vocabulary and sentence structure, the system marks a significant leap in brain-computer interface research.
Professor Lin emphasized that this early-stage version was deliberately trained on a narrow set of phrases to enhance the accuracy of word recognition. To further improve performance, the research team is expanding their dataset by inviting additional participants to read texts while wearing the EEG device.
Their long-term ambition extends beyond decoding internal monologue. The team envisions future applications that enable direct communication between individuals through neural signals alone.
Mohit Shivdasani, a bioelectronics expert at the University of New South Wales, highlighted the technology's transformative potential: "Scientists have long been searching for patterns hidden within biological signals," he said. "But now, AI is capable of detecting brainwave patterns that were previously unrecognizable."
Shivdasani also noted the benefit of AI integration in brain-implanted devices, explaining that such systems could be tailored to an individual's cognitive habits: "AI, especially when integrated into implantable devices, could rapidly adapt to an individual's unique brainwave patterns and how they perform specific tasks," he added.
So far, the model has achieved approximately 75% accuracy in translating thoughts into text—a milestone the team hopes to push to 90% with further development.
