Seems like every distant vision of the future has man jacking into his gear via some crazy headgear or a plug on the back of the neck. We just take it for granted that yeah, that's the fastest way to get to the brain: through the stem or straight into the cortex. Well, think again, because the Florida Institute for Human and Machine Cognition's 30-year neural interface project is yielding fruit -- the kind you can taste. Their BrainPort machine / sensory interface uses 144 microelectrodes to transmit information through sensitive nerve fibers in your tongue, enabling devices to supplement your own sensory perception. The system, which is getting shown off to Navy and Marine Corps divers next month, will supposedly have sonar integration for sub-aqueous orientation, but has already apparently given some landlubber blind people the ability to catch balls, "notice" others walking in front of them, and find doors. With IR, radar, sonar, and other forms of detection, the researchers believe this device will render night vision -- even our own eyes -- obsolete sooner rather than later.
How BrainPort Works
A blind woman sits in a chair holding a video camera focused on a scientist sitting in front of her. She has a device in her mouth, touching her tongue, and there are wires running from that device to the video camera. The woman has been blind since birth and doesn't really know what a rubber ball looks like, but the scientist is holding one. And when he suddenly rolls it in her direction, she puts out a hand to stop it. The blind woman saw the ball. Through her tongue.
Well, not exactly through her tongue, but the device in her mouth sent visual input through her tongue in much the same way that seeing individuals receive visual input through the eyes. In both cases, the initial sensory input mechanism -- the tongue or the eyes -- sends the visual data to the brain, where that data is processed and interpreted to form images. What we're talking about here is electrotactile stimulation for sensory augmentation or substitution, an area of study that involves using encoded electric current to represent sensory information -- information that a person cannot receive through the traditional channel -- and applying that current to the skin, which sends the information to the brain. The brain then learns to interpret that sensory information as if it were being sent through the traditional channel for such data. In the 1960s and '70s, this process was the subject of ground-breaking research in sensory substitution at the Smith-Kettlewell Institute led by Paul Bach-y-Rita, MD, Professor of Orthopedics and Rehabilitation and Biomedical Engineering at the University of Wisconsin, Madison. Now it's the basis for Wicab's BrainPort technology (Dr. Bach-y-Rita is also Chief Scientist and Chairman of the Board of Wicab).
Electricity isn't the only type of stimulation used in high-tech sensory substitution devices. There are devices that use "vibrotactile" stimulation, among other means, to send information to the brain through an alternate sensory channel. In a vibrotactile stimulation device, encoded sensory signals are applied to the skin by one or more vibrating pins. Tactaid, an auditory substitution device, uses this type of technology.
Most of us are familiar with the augmentation or substitution of one sense for another. Eyeglasses are a typical example of sensory augmentation. Braille is a typical example of sensory substitution -- in this case, you're using one sense, touch, to take in information normally intended for another sense, vision. Electrotactile stimulation is a higher-tech method of achieving somewhat similar (although more surprising) results, and it's based on the idea that the brain can interpret sensory information even if it's not provided via the "natural" channel. Dr. Bach-y-Rita puts it this way:
... we do not see with the eyes; the optical image does not go beyond the retina where it is turned into spatio-temporal nerve patterns of [impulses] along the optic nerve fibers. The brain then recreates the images from analysis of the impulse patterns.
The multiple channels that carry sensory information to the brain, from the eyes, ears and skin, for instance, are set up in a similar manner to perform similar activities. All sensory information sent to the brain is carried by nerve fibers in the form of patterns of impulses, and the impulses end up in the different sensory centers of the brain for interpretation. To substitute one sensory input channel for another, you need to correctly encode the nerve signals for the sensory event and send them to the brain through the alternate channel. The brain appears to be flexible when it comes to interpreting sensory input. You can train it to read input from, say, the tactile channel, as visual or balance information, and to act on it accordingly. In JS Online's "Device may be new pathway to the brain," University of Wisconsin biomedical engineer and BrainPort technology co-inventor Mitch Tyler states, "It's a great mystery as to how that process takes place, but the brain can do it if you give it the right information."
Concepts of Electrotactile Stimulation
The concepts at work behind electrotactile stimulation for sensory substitution are complex, and the mechanics of implementation are no less so. The idea is to communicate non-tactile information via electrical stimulation of the sense of touch. In practice, this typically means that an array of electrodes receiving input from a non-tactile information source (a camera, for instance) applies small, controlled, painless currents (some subjects report it feeling something like soda bubbles) to the skin at precise locations according to an encoded pattern. The encoding of the electrical pattern essentially attempts to mimic the input that would normally be received by the non-functioning sense. So patterns of light picked up by a camera to form an image, replacing the perception of the eyes, are converted into electrical pulses that represent those patterns of light. When the encoded pulses are applied to the skin, the skin is actually receiving image data. According to Dr. Kurt Kaczmarek, BrainPort technology co-inventor and Senior Scientist with the University of Wisconsin Department of Orthopedics and Rehabilitation Medicine, what happens next is that "the electric field thus generated in subcutaneous tissue directly excites the afferent nerve fibers responsible for normal, mechanical touch sensations." Those nerve fibers forward their image-encoded touch signals to the tactile-sensory area of the cerebral cortex, the parietal lobe.
Under normal circumstances, the parietal lobe receives touch information, the temporal lobe receives auditory information, the occipital lobe receives vision information and the cerebellum receives balance information. (The frontal lobe is responsible for all sorts of higher brain functions, and the brain stem connects the brain to the spinal cord.)
Within this system, arrays of electrodes can be used to communicate non-touch information through pathways to the brain normally used for touch-related impulses. It's a fairly popular area of study right now, and researchers are looking at endless ways to utilize the apparent willingness of the brain to adapt to cross-sensory input. Scientists are studying how to use electrotactile stimulation to provide sensory information to the vision impaired, the hearing impaired, the balance impaired and those who have lost the sense of touch in certain skin areas due to nerve damage. One particularly fascinating aspect of the research focuses on how to quantify certain sensory information in terms of electrical parameters -- in other words, how to convey "tactile red" using the characteristics of electricity.
This is a field of scientific study that has been around for nearly a century, but it has picked up steam in the last few decades. The miniaturization of electronics and increasingly powerful computers have made this type of system a marketable reality instead of just a really impressive laboratory demonstration. Enter BrainPort, a device that uses electrotactile stimulation to transmit non-tactile sensory information to the brain. BrainPort uses the tongue as a substitute sensory channel.
Scientists have been studying electrotactile presentation of visual information since the early 1900s, at least. These research setups typically used a camera to set current levels for a matrix of electrodes that spatially corresponded to the camera's light sensors. The person touching the matrix could visually perceive the shape and spatial orientation of the object on which the camera was focused. BrainPort builds on this technology and is arguably more streamlined, controlled and sensitive than the systems that came before it.
For one thing, BrainPort uses the tongue instead of the fingertips, abdomen or back used by other systems. The tongue is more sensitive than other skin areas -- the nerve fibers are closer to the surface, there are more of them and there is no stratum corneum (an outer layer of dead skin cells) to act as an insulator. It requires less voltage to stimulate nerve fibers in the tongue -- 5 to 15 volts compared to 40 to 500 volts for areas like the fingertips or abdomen. Also, saliva contains electrolytes, free ions that act as electrical conductors, so it helps maintain the flow of current between the electrode and the skin tissue. And the area of the cerebral cortex that interprets touch data from the tongue is larger than the areas serving other body parts, so the tongue is a natural choice for conveying tactile-based data to the brain.
Wicab is currently seeking FDA approval for a balance-correction BrainPort application. A person whose vestibular system, the overall balance mechanism that begins in the inner ears, is damaged has little or no sense of balance -- in severe cases, he may have to grip the wall to make it down a hallway, or be unable to walk at all. Some inner-ear disorders include bilateral vestibular disorders (BVD), acoustic neuroma and Meniere's disease, and the sense of balance can also be affected by common conditions like migraines and strokes. The BrainPort balance device can help people with balance problems to retrain their brains to interpret balance information coming from their tongue instead of their inner ear.
An accelerometer is a device that measures, among other things, tilt with respect to the pull of gravity. The accelerometer on the underside of the 10-by-10 electrode array transmits data about head position to the CPU through the communication circuitry. When the head tilts right, the CPU receives the "right" data and sends a signal telling the electrode array to provide current to the right side of the wearer's tongue. When the head tilts left, the device buzzes the left side of the tongue. When the head is level, BrainPort sends a pulse to the middle of the tongue. After multiple sessions with the device, the subject's brain starts to pick up on the signals as indicating head position -- balance information that normally comes from the inner ear -- instead of just tactile information.
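The tilt-to-tongue logic described above is simple enough to sketch directly. This is an illustrative model only; the threshold value and function names are assumptions, not details of the actual BrainPort firmware.

```python
# Illustrative sketch of the balance-device logic: head tilt measured by
# the accelerometer selects which region of the electrode array fires.
# The dead-zone threshold is an assumed value for illustration.

TILT_THRESHOLD_DEG = 2.0  # assumed dead zone around "level"

def stimulation_zone(roll_deg):
    """Map head roll in degrees (positive = tilt right) to a tongue zone."""
    if roll_deg > TILT_THRESHOLD_DEG:
        return "right"
    if roll_deg < -TILT_THRESHOLD_DEG:
        return "left"
    return "center"

# Simulated accelerometer readings over a session
for roll in (8.0, -5.5, 0.4):
    print(roll, "->", stimulation_zone(roll))
```

Over repeated sessions, the article notes, the brain learns to treat this right/left/center tactile signal as balance information rather than mere touch.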
Wicab conducted a clinical trial with the balance device in 2005 with 28 subjects suffering from bilateral vestibular disorders (BVD). After training on BrainPort, all of the subjects regained their sense of balance for a period of time, sometimes up to six hours after each 20-minute BrainPort session. They could control their body movements and walk steadily in a variety of environments with a normal gait and with fine-motor control. They experienced muscle relaxation, emotional calm, improved vision and depth perception and normalized sleep patterns.
The BrainPort Vision Device
Test results for the BrainPort vision device are no less encouraging, although Wicab has not yet performed formal clinical trials with the setup. According to the University of Washington Department of Ophthalmology, 100 million people in the United States alone suffer from visual impairment. This impairment might be age-related, stemming from cataracts, glaucoma or macular degeneration; caused by diseases like trachoma, diabetes or HIV; or the result of eye trauma from an accident. BrainPort could provide vision-impaired people with limited forms of sight.
To produce tactile vision, BrainPort uses a camera to capture visual data. The optical information -- light that would normally hit the retina -- that the camera picks up is in digital form, and it uses radio signals to send the ones and zeroes to the CPU for encoding. Each set of pixels in the camera's light sensor corresponds to an electrode in the array. The CPU runs a program that turns the camera's electrical information into a spatially encoded signal. The encoded signal represents differences in pixel data as differences in pulse characteristics such as frequency, amplitude and duration. Multidimensional image information takes the form of variances in pulse current or voltage, pulse duration, intervals between pulses and the number of pulses in a burst, among other parameters. According to U.S. Patent 6,430,450, licensed to Wicab for the BrainPort application:
To the extent that a trained user may simultaneously distinguish between multiple of these characteristics of amplitude, width and frequency, the pulses may convey multidimensional information in much the same way that the eye perceives color from the independent stimulation of different color receptors.
The electrode array receives the resulting signal via the stimulation circuitry and applies it to the tongue. The brain eventually learns to interpret and use the information coming from the tongue as if it were coming from the eyes.
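The spatial correspondence described above -- each electrode standing in for a set of camera pixels -- amounts to downsampling the camera frame to the resolution of the array. A minimal sketch, assuming simple block averaging (the actual resampling method is not specified in the source):

```python
# Sketch of the camera-to-electrode mapping: each electrode receives the
# average intensity of its corresponding block of camera pixels. Sizes
# are illustrative (a 100 x 100 frame mapped onto a 10 x 10 array).

def downsample(frame, grid):
    """Average blocks of a square frame down to a grid x grid array."""
    n = len(frame)
    block = n // grid
    out = []
    for r in range(grid):
        row = []
        for c in range(grid):
            total = sum(frame[r * block + i][c * block + j]
                        for i in range(block) for j in range(block))
            row.append(total / (block * block))
        out.append(row)
    return out

frame = [[1.0] * 100 for _ in range(100)]  # uniformly bright scene
array = downsample(frame, 10)              # 10 x 10 electrode intensities
```

Each entry of the resulting 10 x 10 array would then be encoded as pulse parameters and delivered to the matching electrode on the tongue.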
After training in laboratory tests, blind subjects were able to perceive visual traits like looming, depth, perspective, size and shape. The subjects could still feel the pulses on their tongue, but they could also perceive images generated from those pulses by their brain. The subjects perceived the objects as "out there" in front of them, separate from their own bodies. They could perceive and identify letters of the alphabet. In one case, when blind mountain climber Erik Weihenmayer was testing out the device, he was able to locate his wife in a forest. One of the most common questions at this point is, "Are they really seeing?" That all depends on how you define vision. If seeing means you can identify the letter "T" somewhere outside yourself, sense when that "T" is getting larger, smaller, changing orientation or moving farther away from your own body, then they're really seeing. One study that conducted PET brain scans of congenitally blind people while they were using the BrainPort vision device found that after several sessions with BrainPort, the vision centers of the subjects' brains lit up when visual information was sent to the brain through the tongue. If "seeing" means there's activity in the vision center of the cerebral cortex, then the blind subjects are really seeing.
The BrainPort test results are somewhat astonishing and lead many to wonder about the scope of applications for the technology. In the next section, we'll see which BrainPort applications Wicab is currently focusing on in clinical trials, what other applications it foresees for the technology and how close it is to commercially launching a consumer-friendly version of the device.
Current and Potential Applications
While the full spectrum of BrainPort applications has yet to be realized, the device has the potential to lessen an array of sensory limitations and to alleviate the symptoms of a variety of disorders. Just a few of the current or foreseeable medical applications include:
* providing elements of sight for the visually impaired
* providing sensory-motor training for stroke patients
* providing tactile information for a part of the body with nerve damage
* alleviating balance problems, posture-stability problems and muscle rigidity in people with balance disorders and Parkinson's disease
* enhancing the integration and interpretation of sensory information in autistic people
Beyond medical applications, Wicab has been exploring potential military uses with a grant from the Defense Advanced Research Projects Agency (DARPA). The company is looking into underwater applications that could provide the Navy SEALs with navigation information and orientation signals in dark, murky water (this type of setup could ultimately find a major commercial market with recreational SCUBA divers). The BrainPort electrodes would receive input from a sonar device to provide not only directional cues but also a visual sense of obstacles and terrain. Military-navigation applications could extend to soldiers in the field when radio communication is dangerous or impossible or when their eyes, ears and hands are needed to manage other things -- things that might blow up. BrainPort may also provide expanded information for military pilots, such as a pulse on the tongue to indicate approaching aircraft or to indicate that they must take immediate action. With training, that pulse on their tongue could elicit a faster reaction time than a visual cue from a light on the dashboard, since the visual cue must be processed by the retina before it's forwarded to the brain for interpretation.
Other potential BrainPort applications include robotic surgery. The surgeon would wear electrotactile gloves to receive tactile input from robotic probes inside someone's chest cavity. In this way, the surgeon could feel what he's doing as he controls the robotic equipment. Race car drivers might use a version of BrainPort to train their brains for faster reaction times, and gamers might use electrotactile feedback gloves or controllers to feel what they're doing in a video game. A gaming BrainPort could also use a tactile-vision process to let gamers perceive additional information that isn't displayed on the screen.
Wicab is currently conducting a second round of clinical trials as it works its way through the FDA approval process for the balance device. The company estimates a commercial release in late 2006, with a roughly estimated selling price of $10,000 per unit.
Already more streamlined than any previous setup using electrotactile stimulation for sensory substitution, BrainPort is envisioned becoming even smaller and less obtrusive in the future. In the case of the balance device, all of the electronics in the handheld part of the system might fit into a discreet mouthpiece. A dental-retainer-like unit would house a battery, the electrode array and all of the microelectronics necessary for signal encoding and transmitting. In the case of the BrainPort vision device, the electronics might be completely embedded in a pair of glasses along with a tiny camera and radio transmitter, and the mouthpiece would house a radio receiver to receive encoded signals from the glasses. It's not exactly a system on a chip, but give it 20 years -- we might be seeing a camera the size of a grain of rice embedded in people's foreheads by then.
About two million optic nerve fibers are required to transmit visual signals from the retina -- the portion of the eye where light information is translated into nerve impulses -- to the brain's primary visual cortex. With BrainPort, the device being developed by neuroscientists at Middleton, Wisc.-based Wicab, Inc., visual data are collected through a small digital video camera about 1.5 centimeters in diameter that sits in the center of a pair of sunglasses worn by the user.
Bypassing the eyes, the data are transmitted to a handheld base unit, which is a little larger than a cell phone. This unit houses such features as zoom control, light settings and shock intensity levels as well as a central processing unit (CPU), which converts the digital signal into electrical pulses—replacing the function of the retina.
The BrainPort device seems to work well in practice: patients quickly learn how to find doorways and elevator buttons and even read letters and numbers. At the table, users can easily pick out cups and forks -- though presumably you'd take the device out to eat.
Who could benefit from the BrainPort vision device?
The current investigational prototype works best for individuals who are blind and have no better than light perception. Since we do not stimulate the eye or optic nerve, our technology has the potential to work across a wide range of visual impairments. We are actively developing device modifications to address the needs for those with low vision such as macular degeneration.
What is the resolution of the display?
The images below demonstrate how information from the video camera is represented on the tongue. Today's prototypes have 400 to 600 points of information on a roughly 3 cm x 3 cm tongue display, presented at approximately 30 frames per second, yielding an information-rich image stream. Our research suggests that the tongue is capable of resolving much higher resolution information, and we are currently working to develop the optimal tongue display hardware and software.
[Image series: a static original image shown alongside 10 x 10 (100-point), 25 x 25 and 60 x 60 resampled versions, illustrating how camera information is represented on the tongue display.]
How long does it take to learn?
Our current research studies involve 2 to 10 hours of participation. Within minutes of introduction, users may understand where in space stimulation arises (up, down, left and right) and the direction of movement. Within an hour of practice, users can generally identify and reach for nearby objects, and point to and estimate the distance of objects out of reach. With additional training, subjects can identify letters and numbers and can recognize landmark information when using the device in a mobile scenario.