This Startup Wants to Put Its Brain-Computer Interface in the Apple Vision Pro
Summary
Cognixion, a California startup, is launching a clinical trial that pairs its non-invasive brain–computer interface (BCI) with a modified Apple Vision Pro headset to help people with paralysis and speech disorders communicate. The system uses a custom EEG headband (six sensors) that reads visual-fixation signals from the back of the head, a neural computing pack worn at the hip, and personalised generative-AI software trained on an individual’s speech history. Earlier tests with Cognixion’s Axon-R headset showed promising conversation speeds for people with ALS. The new trial will test the Vision Pro integration with up to 10 US participants and aims to scale toward a pivotal FDA trial if results support efficacy.
Key points
- Cognixion is running a non-invasive BCI trial integrated with the Apple Vision Pro to enable thought-driven communication for people with severe motor and speech impairments.
- The company replaces Apple’s headband with its own six-sensor EEG band and processes signals with a hip-worn neural pack to decode visual-fixation attention.
- Personalised AI models are built for each user from their speech history, writing and other data, making output faster and more natural-sounding.
- Previous Axon-R studies with ALS patients showed near-normal conversational rates when paired with Cognixion’s AI software.
- Apple’s new accessibility protocol has made Vision Pro integration possible; other BCI firms (e.g., Synchron) are also connecting to Vision Pro.
- Non-invasive BCIs contend with weaker, noisier signals than implants, so AI copilots are critical to bridging the performance gap.
- Regulatory clearance will require a larger pivotal trial (~30 patients) demonstrating real-world quality-of-life benefits to the FDA.
Content summary
Cognixion aims to democratise access to BCI-driven communication by avoiding surgical implants and leveraging an off-the-shelf mixed-reality platform. Participants in the trial will use a Vision Pro headset modified with Cognixion’s EEG headband; brain signals linked to where a user is visually fixating are decoded and turned into selections or AI-suggested phrases. The underlying software trains on each user’s unique language patterns, so outputs match their voice and style. The company argues that using Vision Pro — with its app ecosystem and accessibility protocol — could deliver an easier path to broad adoption than a bespoke, medically implanted device.
However, researchers note a key limitation: non-invasive EEG yields low-amplitude, noisy signals compared with implants, making fast, reliable decoding difficult. Cognixion believes generative-AI and personalised models can compensate, improving usability enough for real-world communication. If early results hold up, the firm will pursue a larger trial to support medical-device approval and wider clinical use.
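To make the decoding step concrete: Cognixion has not published its algorithm, but a common non-invasive approach to reading "where the user is visually fixating" from occipital EEG is frequency tagging (SSVEP-style), where each on-screen target flickers at a distinct rate and the decoder looks for the strongest matching frequency in the signal. The sketch below is purely illustrative; the sampling rate, target frequencies, and single-channel setup are assumptions, not details from the trial.

```python
import numpy as np

# Hypothetical parameters -- not from Cognixion's system.
FS = 256                               # sampling rate in Hz (assumed)
TARGET_HZ = [8.0, 10.0, 12.0, 15.0]    # flicker frequency per on-screen target

def decode_fixation(eeg: np.ndarray) -> float:
    """Return the target flicker frequency the user is most likely fixating.

    eeg: 1-D occipital-channel signal sampled at FS.
    """
    # Power spectrum of the window via real FFT.
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)

    # Score each candidate target by spectral power in a narrow band
    # around its flicker frequency; pick the strongest.
    def band_power(f: float, width: float = 0.5) -> float:
        mask = (freqs >= f - width) & (freqs <= f + width)
        return float(spectrum[mask].sum())

    return max(TARGET_HZ, key=band_power)

# Synthetic demo: a noisy 10 Hz oscillation, as if the user were
# fixating the 10 Hz target for a 2-second window.
rng = np.random.default_rng(0)
t = np.arange(0, 2.0, 1.0 / FS)
signal = np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 0.8, t.size)
print(decode_fixation(signal))  # → 10.0
```

In a real system this per-window decision would feed the selection layer the article describes, with the personalised generative-AI model turning sparse selections into full phrases; the hard part, as the researchers quoted above note, is that real scalp EEG is far noisier than this toy signal.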
Context and relevance
This story sits at the intersection of accessibility tech, consumer AR/VR platforms, and the fast-moving BCI field. Apple’s decision to open a BCI-friendly accessibility protocol for Vision Pro created an opportunity for startups to integrate neural input without Apple having to build its own neural hardware. For people with paralysis and speech motor disorders, a non-invasive, Vision-Pro-based solution could reach users far faster and with lower risk than implanted chips.
For clinicians, regulators and technologists, the trial is important because it tests whether AI can meaningfully boost the practical performance of non-invasive BCIs. If successful, it would shift expectations about which BCI approaches are viable at scale and accelerate integration of assistive neural tech into mainstream consumer devices — while also raising questions about safety, privacy and regulatory oversight as consumer and medical worlds converge.
Why should I read this?
Short answer: because this isn’t sci‑fi any more — it’s a real push to get thought-driven communication into mainstream AR hardware. If you care about accessibility, assistive tech or the future of consumer devices handling sensitive neural data, this piece saves you the legwork: key trial details, technical trade-offs (non‑invasive vs invasive), and why Apple’s Vision Pro matters as a platform. Big implications for patients, regulators and anyone tracking how AI is being used to patch noisy biological signals into usable interfaces.