“Our brains are very well designed,” says Dr. Douglas Tweed, physiology professor at U of T and senior author on a paper in the March 6 issue of Nature. “The brain takes in raw data from its surroundings through sensors and interprets it, rejecting interpretations that it considers unlikely. The brain gauges the probabilities of things in real life and uses these estimates to guide our perceptions. But sometimes we can be fooled by bizarre things.
“This shouldn’t be seen as a flaw in the system, however,” Tweed argues. “This is the way the brain works. Sensors are always flawed; they simply do not provide enough information for us to reconstruct our world. The brain must use prior knowledge to interpret our surroundings and we found that it seems to do this optimally.”
This research project, led by U of T post-doctoral fellow Matthias Niemeier, builds on a theory presented in the 1800s by Hermann von Helmholtz, a German physiologist. Helmholtz, who stated that perception is a matter of unconscious inference, suggested that all of our senses are imperfect and that the signals they send to the brain are flawed. From this flawed data, the brain is forced to guess – based on its sensor readings – what is happening in the environment. With small or unexpected changes it often guesses wrong, which is why people can be fooled by optical illusions or sleight of hand, Tweed explains. He and Niemeier conducted the research with Professor Douglas Crawford of York University.
The researchers tested whether the brain’s perception processes are working optimally given the flawed data it receives. They programmed a computer-simulated brain to make optimal use of sensor data and prior knowledge, giving it realistic vision and quick eye movements (also known as saccades). The researchers then measured how well it perceived events in its simulated world.
The team compared these findings with those of human subjects. Each subject’s head was immobilized, and a device shaped like a contact lens was inserted into one eye to measure its motion and relay the information to a computer. Using a large screen, the researchers conducted two experiments that tested participants’ perceptions of distance and degree of change, using a white dot that “jumped” on the screen. They found that small jumps were invisible to participants; larger ones were seen, but individuals underestimated how far the dot jumped.
“What we found was that, in simple situations, the simulated computer brain perceived things the same way and made the same kinds of errors that human brains do, even when we programmed the computer to function optimally,” says Niemeier.
“When small changes occurred during a saccade, these changes were either ignored or downplayed by both the computer and the test subjects. So we concluded that the optimal solution when it comes to perceiving the outside environment is to ignore some changes.”
The brain knows what can and cannot realistically occur based on probability and prior knowledge, say the researchers. Eyes have quite a narrow field of high-resolution vision, something in the area of two degrees, Niemeier explains. To obtain a complete picture, a person’s eyes are constantly making quick movements. These saccades – about 100,000 a day – scan our surroundings and take about 30 milliseconds each. The brain knows that an event is unlikely to happen in 30 milliseconds and either ignores or downplays small changes (also known as saccadic suppression of displacement). Only when a change is large enough does the brain notice, he says.
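The logic the researchers describe can be illustrated with a toy Bayesian estimate. The sketch below is not the paper’s actual model; the prior and sensor noise values are illustrative assumptions. It combines a prior belief that the world stays put during a saccade with a noisy retinal measurement, and the resulting estimate shrinks the measured jump toward zero – so small jumps are effectively invisible and large ones are underestimated, matching the behaviour reported for both the simulated and human brains.

```python
def perceived_jump(measured_jump_deg, prior_sd=0.3, sensor_sd=1.0):
    """Toy posterior-mean estimate of a dot's jump during a saccade.

    Prior: the world is stable, so the true jump ~ N(0, prior_sd**2).
    Likelihood: the retinal measurement ~ N(true_jump, sensor_sd**2).
    The posterior mean is the measurement shrunk toward zero.
    prior_sd and sensor_sd are made-up values for illustration only.
    """
    weight = prior_sd**2 / (prior_sd**2 + sensor_sd**2)
    return weight * measured_jump_deg

# Small jumps are shrunk almost to nothing (effectively unseen);
# large jumps register, but their size is underestimated.
for jump in (0.2, 1.0, 4.0):
    print(f"measured {jump} deg -> perceived {perceived_jump(jump):.3f} deg")
```

With these assumed noise levels, a 0.2-degree jump is reduced to a few hundredths of a degree, while a 4-degree jump is still clearly nonzero but well below its true size – the same qualitative pattern of “ignore or downplay” described above.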
“The brain makes the best possible use of the flawed data it gets from sensors like the eyes or ears, piecing together bits of information until a final picture is obtained, much like the process involved in solving a jigsaw puzzle,” adds Tweed. “The sensors react even to unlikely or unexpected events but the brain disregards some of these signals to form one coherent picture. The brain is always compromising.”
The research was supported by the Canadian Institutes of Health Research and the Canada Research Chairs program.