Sunday SciKu | A Cloudy Forecast

Photo by Ales Krivec


We tend to think of our senses as gathering data, as if we're video cameras that are always running, because that's how it feels to perceive. But that's not how perception works. Continuous recording is fine for electronics, but it's too inefficient for biology.

What's really always running while we're conscious is a model that we project onto the world—a hypothesis about the way things are. We only use our senses to test and adjust that model as necessary. Emphasis on necessary, because the model is really a map of the tools and obstacles the world has laid before us. A cup isn't a cup; it's a handled drinking thing. A low branch is something to duck.

And we're fortunate it evolved this way, because this process is how consciousness was able to bootstrap itself into existence. It created an evolutionary pressure for increasing brain power, which allowed for more accurate and elaborate models, which we were eventually able to push into the future, imagining external realities that don't yet exist and perspectives we can't access. If sensory perception weren't this kind of interactive process, we'd have remained about as self-aware as a smartphone, programmatically reacting to stimuli.

There are serious downsides, though, in the modern world. Because so much of our experience is based on our expectations, we’re loaded with an array of cognitive biases that bury the truth about everything, and which are often exploited to our detriment. It’s almost impossible to overestimate our own disconnection from reality.

But many researchers have demonstrated it in the lab, and that's what happened at Dresden Technical University, inspiring this week's sciku. Researchers put people in fMRI scanners and watched their subcortical auditory pathways as patterns of sounds were repeated and broken. What they showed, essentially, is that once a sound pattern is established, the early auditory pathway mostly stops responding to it until the pattern breaks. We hear the expectation, not the sound itself.
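If you like to see ideas as code, here's a minimal toy sketch of the general principle—not the Dresden team's actual model or analysis—in which a listener builds an expectation from a repeating pattern and only "notices" a tone when it violates that expectation:

```python
# Toy illustration of predictive processing: a listener that only registers
# a sound when it breaks the pattern it has learned to expect.
# A hedged sketch of the general idea, not the study's actual method.

def predict(history, period=4):
    """Naively predict the next tone by assuming the recent pattern repeats."""
    if len(history) < period:
        return None  # no expectation yet
    return history[-period]  # expect the tone heard one full cycle ago

def listen(tones, period=4):
    history = []
    for t, tone in enumerate(tones):
        expected = predict(history, period)
        if expected is None:
            print(f"t={t}: tone {tone} -> still building a model")
        elif tone == expected:
            print(f"t={t}: tone {tone} -> matches expectation, barely registered")
        else:
            print(f"t={t}: tone {tone} -> PREDICTION ERROR (expected {expected})")
        history.append(tone)

# A repeating four-tone pattern that breaks near the end.
listen([1, 2, 3, 4, 1, 2, 3, 4, 1, 2, 7, 4])
```

Once the cycle is learned, every tone is "heard" as the expectation; only the deviant near the end produces a signal worth passing along.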

This theory of sensory cognition explains a lot of things—why it's so difficult to proofread our own work, for example. We see what we expect to see, not what's there. It also likely explains something about neurological disorders like dyslexia, which has been correlated with disruption of the auditory pathway—it manifests so strangely, with words seeming to crawl around the page, because of a mismatch between auditory and visual expectation. The experience of dyslexia may be what reading feels like when the predictive-text module in the sound-sensing part of the brain is switched off and we can't anticipate the next word the way we usually do. No matter what we think we're doing, most of perception is expectation.

Anyway, interesting stuff.

 

waiting on
the local weather report
clouds tomorrow

 
