Breakthroughs in understanding the function of the brain have opened up a host of possibilities for expanding humanity’s perception of the world around us – and investable commercial opportunities are following, the Fiduciary Investors Symposium at Stanford heard.
Humanity is on the cusp of being able to choose how it interacts with the physical world, raising the possibility that science can create new senses for humans that will radically change how we perceive our cosmos.
Neuroscientist David Eagleman, adjunct professor in the Psychiatry/Public Mental Health and Population Sciences department at Stanford University, says every living organism has what’s known as an “umwelt” – the way in which it perceives the world around it.
Eagleman told the Top1000funds.com Fiduciary Investors Symposium at Stanford on Tuesday that our interaction with the physical world is constrained by our biology. Our eyes, for example, convert photons into electrical signals, and the biology of the eye means it is receptive to only a small portion of the light spectrum. Eagleman says the colors we can perceive are only about one ten-trillionth of the available spectrum – which includes microwaves, radio waves, x-rays, and gamma rays.
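As a back-of-envelope check (not a calculation presented at the symposium), that figure is roughly reproducible if the “available spectrum” is assumed to extend up to very-high-energy gamma rays – an assumption about where to bound the spectrum, since the talk did not spell one out:

```python
# Rough sanity check of the "one ten-trillionth" figure.
# Assumption (not from the article): the "available spectrum" spans
# from radio waves up to very-high-energy gamma rays (~4e27 Hz).

VISIBLE_LO_HZ = 4.0e14   # red edge of visible light, ~750 nm
VISIBLE_HI_HZ = 7.9e14   # violet edge of visible light, ~380 nm
SPECTRUM_HI_HZ = 4.0e27  # assumed upper bound for observed gamma rays

visible_bandwidth = VISIBLE_HI_HZ - VISIBLE_LO_HZ
fraction = visible_bandwidth / SPECTRUM_HI_HZ

print(f"Visible fraction of the spectrum: {fraction:.1e}")
# ~1e-13, i.e. on the order of one ten-trillionth
```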
Eagleman’s insight is that the human brain is fundamentally a receptor of electrical signals, and that it doesn’t matter where the brain gets those signals from; it will always be able to process them into something that enables us to interact with the physical world. But even though we think we see or hear the world around us, “the whole secret is your brain is not directly hearing or seeing [anything]”, Eagleman says.
“Your brain is locked in silence and darkness inside the vault of your skull and all your brain ever experiences are electrochemical signals running around in the dark,” he says.
“And it turns out the brain is really good at this, at figuring out patterns and extracting information this way, and eventually building your entire subjective cosmos out of that.
“The key point I want to make is that your brain doesn’t know and doesn’t care where the data come from, because whether that’s photons getting captured in these spheres in your skull, or air compression waves getting picked up on your vibrating eardrum, or pressure or temperature on your fingertips, it all gets converted to spikes – just these little electrical signals that are running around. And it all looks the same in the brain.”
Eagleman says the brain is an extremely good “general purpose computing device” – even though it is the most complex thing we have so far found in the universe, and remains well beyond our capability to fully understand.
“Whatever information comes in, [the brain] just figures out what it’s going to do with it,” he says.
Eagleman says once the brain is understood in this way, it’s then possible to regard organs such as eyes or ears or the tongue as peripheral devices that convert interactions with the physical world into data inputs for the brain. And it’s a short leap from there to realise that other peripherals can be created that exploit the brain’s ability to make sense of the signals it receives.
These insights have already been commercialised, and there is more to come. For example, Eagleman created a jacket for hearing-impaired people which converts sounds in their environment into vibrations they feel through their skin. It took a surprisingly short period of time – measured in mere hours – for an individual’s brain to begin interpreting those vibrations the way other people interpret signals received through their ears. Smaller devices, such as wristbands, have since been created that do similar things. And Eagleman says the cost of producing a device that helps a deaf person “hear” is a fraction of that of alternatives such as cochlear implants.
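One plausible way to build such an encoding – offered here as a minimal illustrative sketch, not Eagleman’s or Neosensory’s actual algorithm – is to split incoming audio into frequency bands and drive one vibration motor per band:

```python
import numpy as np

def audio_frame_to_motor_levels(frame, sample_rate=16000, n_motors=8):
    """Map one short audio frame to per-motor vibration intensities.

    A minimal sketch of frequency-to-touch sensory substitution: split
    the spectrum into n_motors log-spaced bands and drive each motor
    with the energy in its band. Real devices use richer encodings.
    """
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)

    # Log-spaced band edges from 100 Hz up to the Nyquist frequency,
    # roughly echoing how the cochlea spaces frequencies.
    edges = np.logspace(np.log10(100), np.log10(sample_rate / 2),
                        n_motors + 1)

    levels = np.empty(n_motors)
    for i in range(n_motors):
        band = (freqs >= edges[i]) & (freqs < edges[i + 1])
        levels[i] = spectrum[band].sum()

    # Compress and normalise to 0..1 drive levels for the motors.
    levels = np.log1p(levels)
    return levels / levels.max() if levels.max() > 0 else levels

# Example: a 440 Hz tone mostly activates one low-frequency motor.
t = np.arange(1024) / 16000
print(audio_frame_to_motor_levels(np.sin(2 * np.pi * 440 * t)))
```

A pure tone then shows up as strong drive on a single motor, while speech produces a shifting pattern across all of them – exactly the kind of structure a brain can learn to decode.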
A similar approach has been taken for people with sight impairments, and to help people who have had, for example, a leg amputation to walk again more quickly by providing sensory feedback from an artificial leg.
“So those are some of the clinical things we’re doing, but what I’m interested in is how can we use a technology like this to add a completely new kind of sense to expand the human umwelt?
“For example, could we feed somebody real-time data from the internet and have them actually come to understand and have that become a direct perceptual experience for them?”
Eagleman says he has conducted experiments in which a subject feels a real-time feed of data from the internet for five seconds, and then two buttons appear on a screen.
“He has to make a choice…and he gets feedback a couple of seconds later, either frowny face or smiley face”, Eagleman says.
“And what he doesn’t know is that we’re feeding in real-time data from the stock market, and he’s making buy and sell decisions. We’re seeing if he can develop a direct perception of the economic movements of this planet.”
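In outline, each trial of that experiment is a simple loop. The function names below are hypothetical placeholders standing in for the live market feed, the wearable, and the response buttons, not a real experiment API:

```python
import random
import time

def run_trial(get_market_tick, send_to_wearable, get_button_press,
              price_after):
    """One trial of the market-perception experiment, as described.

    Sketch of the protocol from the talk: stream five seconds of live
    market data to the wearable, ask for a choice, then give smiley or
    frowny feedback based on where the price actually went.
    """
    # 1. Five seconds of real-time data, felt as vibration patterns.
    start_price = None
    t0 = time.time()
    while time.time() - t0 < 5.0:
        tick = get_market_tick()       # latest price from a live feed
        start_price = start_price if start_price is not None else tick
        send_to_wearable(tick)         # encode the tick as vibrations
        time.sleep(0.05)               # poll at ~20 Hz

    # 2. Two buttons appear; the subject chooses without knowing
    #    that "left" means buy and "right" means sell.
    choice = get_button_press()

    # 3. Feedback arrives a couple of seconds later.
    time.sleep(2.0)
    went_up = price_after() > start_price
    correct = (choice == "left") == went_up
    print("smiley face" if correct else "frowny face")
    return correct

# Example run with stand-in stubs instead of live feeds and hardware:
run_trial(lambda: 100 + random.random(),
          lambda tick: None,
          lambda: random.choice(["left", "right"]),
          lambda: 100 + random.random())
```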
Eagleman has already commercialised some of his research, establishing a company called Neosensory to produce a wristband used to address hearing loss and tinnitus.
“Neosensory spun out of my lab a while ago,” he says.
“We’re on wrists all over the world now. It’s been really exciting.”