According to The Wall Street Journal, Apple reportedly plans to allow users to control their iPhones and other Apple devices with their brain signals by the end of 2025. If true, this would be a significant step forward in integrating brain-computer interface (BCI) technology into mainstream consumer electronics.
The initiative results from a partnership with Synchron, a neurotechnology startup known for developing the Stentrode, an implantable BCI device. The Stentrode implant is designed to help individuals with severe motor impairments, such as those caused by amyotrophic lateral sclerosis (ALS), control their digital devices using neural activity alone.
What Is the Stentrode and How Does It Work?
Unlike more invasive implants, such as Neuralink's, the Stentrode is inserted through the jugular vein and positioned within a blood vessel near the brain's motor cortex, eliminating the need for open-brain surgery. Its 16 electrodes detect motor-related brain signals, which the implant translates into digital commands. This enables users to interact with Apple devices, including the Apple Vision Pro, using only their thoughts.
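To make the signal-to-command idea concrete, here is a minimal sketch of how motor-intent detection could map onto a single digital "switch" input. This is purely illustrative: the electrode count comes from the article, but the threshold, the decoding logic, and every name here are hypothetical assumptions, not Synchron's actual pipeline.

```python
# Hypothetical sketch: motor-related neural activity above a calibrated
# threshold is treated as one digital command ("select"); everything else
# is treated as no input. Values and logic are illustrative only.

NUM_ELECTRODES = 16          # the Stentrode has 16 electrodes
ACTIVATION_THRESHOLD = 0.6   # hypothetical calibrated activation level

def decode_switch_event(frame: list) -> str:
    """Map one frame of per-electrode signal power to a switch command."""
    if len(frame) != NUM_ELECTRODES:
        raise ValueError("expected one reading per electrode")
    mean_power = sum(frame) / NUM_ELECTRODES
    # A sustained motor-intent signal crosses the threshold -> "select"
    return "select" if mean_power >= ACTIVATION_THRESHOLD else "none"

# Example frames: resting vs. attempted movement
rest = [0.1] * NUM_ELECTRODES
intent = [0.8] * NUM_ELECTRODES
print(decode_switch_event(rest))    # -> none
print(decode_switch_event(intent))  # -> select
```

In practice the decoding is far more sophisticated, but conceptually a BCI that emits discrete events like this can slot into the same accessibility machinery as any other adaptive switch.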
Since 2019, Synchron has implanted the Stentrode in 10 patients under an FDA investigational device exemption. One of these test participants, a Pennsylvania-based ALS patient who has lost the use of his arms and hands, was reportedly able to navigate the menus on the Apple Vision Pro and experience the Swiss Alps in VR through thought alone. Although the control was slower than traditional input methods, it represents a critical accessibility breakthrough for users with limited mobility.
Apple’s Accessibility Framework Gets a Major Upgrade
Apple plans to establish a dedicated industry standard for brain-computer interfaces in collaboration with Synchron, much as it did in 2014, when it introduced a "Made for iPhone" hearing aid protocol as a Bluetooth standard.
The company reportedly plans to integrate BCI support directly into its Switch Control accessibility framework, which already supports a variety of adaptive input methods, such as joysticks, switches, and sip-and-puff systems.
What This Means for the Future of Apple Devices
If Apple rolls out native BCI support in late 2025 or early 2026, it would not only be a win for accessibility but could also be a stepping stone to a new input paradigm. Imagine controlling your iPhone or iPad with your thoughts alone, without ever touching the screen.
While initial use cases will focus on users with motor impairments, the implications for broader adoption are significant. This also showcases Apple’s growing interest in non-traditional input methods, from eye-tracking and spatial gestures to direct neural control.
Do you see this as a breakthrough in accessibility or a step toward something bigger in tech? Drop your thoughts in the comments below.