The world we experience as “out there” is actually a reconstruction of a tiny part of reality by our brains. Our interpretation of sensations is heavily shaped by thinking processes such as attention, expectation and memory. What we perceive at any given moment is also determined by our physical abilities, energy levels, feelings, social identities and much more.
Thus our brains construct simulations of our surroundings by combining incoming sensory data with existing, unconsciously stored memories, beliefs and concepts.
Let us just peep into our brains. The brain is locked inside the bony vault of our skull, trying to figure out what is out there in the world, apart from controlling our internal bodily functions. There is no light inside the skull, nor is there any sound. Our eyes, ears and other sensory organs just deliver streams of electrical signals to the brain. These signals do not come with labels attached, like “I am from a cat” or “I am from a coffee cup”. They are just electrical signals, which do not themselves have any shape, colour or sound. In order, therefore, to figure out what is out there in the world, the brain has to combine these ambiguous sensory signals with some prior “expectations” or “predictions” about the world.

According to Anil Seth, professor of Cognitive and Computational Neuroscience at the University of Sussex, “perception, instead of just being a reflection of what’s actually there in the world, is always this active process of interpretation. We assume that we see with our eyes but, in fact, we see with our brains. Our eyes are of course necessary, but what we actually end up perceiving is much more a product of how our brain interprets all this information from the eyes than the eyes being this window into an objective external reality.”
Our brain draws on a lifetime of past experiences, both things that have happened to us personally and things we have learned from friends, teachers, books, videos and other sources. In the blink of an eye, our brain reconstructs bits and pieces of past experiences as our neurons pass electrochemical information back and forth in an ever-shifting, complex neural network. Our brain assembles these bits to infer the meaning of the sense data we are receiving and to decide what to do about it.
Fortunately, while it is true that the inputs from our body and its sense organs are like simulations, we can still be fairly confident that, most of the time, they are faithful representations of the actual things out there. Evolution, over hundreds of millions of years, has shaped and reshaped those representations to track the external world pretty closely. Organisms that could not accurately and reliably perceive and respond to their environment did not survive and became extinct. We humans have descended from animals that had relatively accurate perceptions of external reality.
To appreciate that perception is not reality, let us take the example of colour. While we all think of colour as an important attribute of an object, colour, as Newton observed, is not a property of the object itself. When electromagnetic radiation hits an object, some of it bounces off and is captured by our eyes.
Our eyes are equipped to detect only a limited set of wavelengths, from about 400 nm to about 700 nm, and pass them on to the brain. Colour, therefore, is an interpretation of wavelengths by our brains. Incidentally, the band of wavelengths we can see is less than one trillionth of the available spectrum around us.
We also lack biological receptors to pick up other parts of the spectrum, including X-rays, microwaves, radio waves, gamma rays and cell-phone transmissions, even though all of these are flowing through us. The slice of outside reality that we can sense is thus limited by our biology. Every creature on earth perceives, as objective reality, only as much as its biology permits. Colour, then, is a clever trick that evolution has developed to help our brains keep track of surfaces under changing lighting conditions.
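To get a feel for the “one trillionth” figure above, here is a rough back-of-the-envelope sketch in Python. The upper wavelength cutoff for the “available spectrum” is an assumption, since the electromagnetic spectrum has no sharp edges, so the result is only an order-of-magnitude illustration.

```python
# Rough check of how narrow the visible band is.
# The spectrum's upper cutoff is an arbitrary assumption for illustration.
visible_band = 700e-9 - 400e-9   # ~300 nm of visible wavelengths, in metres
assumed_span = 1e6               # assume radio waves out to ~1,000 km
fraction = visible_band / assumed_span
print(f"{fraction:.1e}")         # ~3.0e-13, a fraction of one trillionth
```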
Of all our senses, the visual system collects up to 80% of the sensory data we receive from the environment. To make sense of this deluge of optical information, the visual inputs are converted into electrochemical signals by the retina’s roughly 130 million light-sensitive cells, the rods and cones, and sent to the brain to be processed by a complex network of nerve cells.
Light and colour have other effects on us as well. Our retinas also contain ganglion cells that respond to light by sending signals mainly to a central brain region called the hypothalamus, which plays no part in forming visual images.
The hypothalamus is a key part of our brain, responsible for the secretion of a number of hormones that control many aspects of the body’s self-regulation, including temperature, sleep, hunger and circadian rhythms. This means there is a clearly established physiological mechanism through which colour and light can affect mood, heart rate, alertness and impulsivity, to name but a few.
Similarly, our acoustic sense can register and process only a very narrow band of frequencies, from about 16 Hz to 20 kHz. The infrasonic and ultrasonic bands are simply not perceivable by humans, despite being essential for other species such as elephants and bats respectively.
Interestingly, to make sense of complex environments permeated by light and sound, brain waves constantly adapt, compensating for the drastically different processing speeds of sound and vision. It is common knowledge that sound and light travel at very different speeds. If the brain did not account for this difference, it would be much harder for us to tell where sounds came from and how they relate to what we see. Visual and sound signals created at the same time reach the brain at different times and are processed by neural circuits at different speeds. Yet they are still presented to us as happening synchronously.
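To see the size of the mismatch the brain must paper over, here is a small illustrative calculation (the speeds are standard textbook values; the distances are arbitrary examples):

```python
# How much later does sound from an event arrive than its light?
SPEED_OF_SOUND = 343.0   # m/s in air at about 20 degrees C
SPEED_OF_LIGHT = 3.0e8   # m/s

for distance in (1, 10, 30, 100):  # metres
    lag = distance / SPEED_OF_SOUND - distance / SPEED_OF_LIGHT
    print(f"{distance:>4} m: sound trails light by {lag * 1000:.1f} ms")
# At 30 m the gap is already ~87 ms, yet we usually perceive the sight
# and the sound of a nearby event as simultaneous.
```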
Our human biology limits us from sensing what other animals can. Unlike bees, we do not see ultraviolet light, and we cannot sense magnetic fields as turtles, worms and wolves can. We are also deaf to the high- and low-pitched sounds that other animals can hear, and we have a relatively weak sense of smell.
While there is clearly an enormous amount of data in the external world, evolution has equipped us to process only the limited subset needed for survival. The goal of our senses and brain is to make one and only one decision, based on an unambiguous interpretation of the received data, in order to execute an appropriate action. This single interpretation lets us act fast, without deliberating over alternatives, and has enabled us to survive as a species. To interpret the available data so decisively, we need a mental model of the external world that is clear and free of ambiguity.
Thus our brain has a problem to solve, one that philosophers call a ‘reverse inference’ problem. Faced with ambiguous data, our brain must somehow guess the causes of that data in order to decide on a plan of action that keeps us safe and alive.
Fortunately, our brain has another source of information that can help with this very challenging task: our memory. The brain can draw on a lifetime of past experiences, some of which will be similar to the present moment, to guess the meaning of the sense data.
The process of combining prior knowledge with uncertain evidence is known as Bayesian integration. MIT neuroscientists have discovered distinctive brain signals that encode these prior beliefs, and have found how the brain uses those signals to make judicious decisions in the face of uncertainty. This whirlwind of mental construction happens in the blink of an eye, completely outside our awareness. The intimate interplay between sensory inputs and our neural networks enables us to recognize familiar objects or take appropriate actions within a few milliseconds.
Our Bayesian brain predicts the sensory data arising from within the body, which is called ‘interoception’, and the data arriving from the outside world, which is called ‘exteroception’. It matches these predictions against the incoming signals to produce the perception we become conscious of. If there is a discrepancy between prediction and signal, the internal model, which was built from past experiences, is updated.
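As a concrete illustration, here is a minimal sketch of Bayesian integration with Gaussian beliefs. This is the textbook precision-weighted update, not a model of the brain’s actual circuitry, and all the numbers are invented.

```python
# Minimal sketch of Bayesian integration: combine a prior belief with a
# noisy observation, weighting each by its precision (1 / variance).

def bayes_update(prior_mean, prior_var, obs_mean, obs_var):
    prior_precision = 1.0 / prior_var
    obs_precision = 1.0 / obs_var
    post_var = 1.0 / (prior_precision + obs_precision)
    post_mean = post_var * (prior_precision * prior_mean +
                            obs_precision * obs_mean)
    return post_mean, post_var

# Prior: past experience says an object is ~2.0 m away (variance 0.5).
# Evidence: a noisy sensory cue suggests 3.0 m (variance 0.25).
mean, var = bayes_update(2.0, 0.5, 3.0, 0.25)
print(f"updated estimate: {mean:.2f} m (variance {var:.2f})")
# ~2.67 m: the posterior lands between prior and data, pulled toward
# the more reliable (lower-variance) source, here the observation.
```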

While we might feel as if we are simply reacting to events that happen around us, in actuality, our brain constantly and invisibly guesses what to do next and what we will experience next, based on memories that are similar to the present moment. The key word here is ‘similar’. The brain doesn’t need an exact match. We have no trouble climbing a new, unfamiliar staircase because we have climbed staircases in the past. So similarity is enough for our brain to help us survive and thrive in the world.
The problem with predictive coding, however, is that this quick assessment by the brain, based on past experiences, can sometimes be faulty, as illustrated by Alex Korb, a postdoctoral researcher in neuroscience at UCLA, in the following narrative.
“I am driving down a sunny, tree-lined street in Santa Monica. As I make a left turn I notice a blind man standing on the corner with his seeing-eye dog. He wears dark sunglasses and carries a cane.
As I turn past him I see that what I thought was a cane is actually a pooper-scooper! It amazes me that a blind man is capable of cleaning up after his dog. I guess in absence of vision the brain develops a greater sensitivity to localizing smells. I chastise myself for assuming that blind people are more disabled than they actually are. Then I notice the dog is on a regular leash rather than a sturdier seeing-eye dog leash, and I can’t understand how that could possibly provide enough tactile guidance to the blind man. I figure he’s been blind a while and has the hang of it. As I drive away I glance in the rear-view mirror and see the blind man turn his head both ways before crossing the street. Finally, it dawns on me that the man is not actually blind, he is just a normally-sighted guy wearing sunglasses, carrying a pooper-scooper and taking his dog for a walk”.
According to the theory of predictive coding, our brain constantly attempts to model the probability of its own future states, with the goal of minimizing uncertainty. At the macro level, anticipation is the key to predicting events as they unfold, allowing us to interact with the external world efficiently. At the micro level, anticipation keeps our motor and sensory functions primed to execute the expected actions.
In this framework, the brain works continuously to minimize the surprise and uncertainty we would otherwise face.
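A toy version of this error-driven updating, in the spirit of predictive coding (the learning rate and values are invented purely for illustration):

```python
# Toy predictive-coding loop: the internal estimate is repeatedly nudged
# by the prediction error ("surprise") until it matches the sensory input.
estimate = 0.0        # internal model's current guess
sensory_input = 5.0   # what the senses actually report
learning_rate = 0.3

for step in range(10):
    error = sensory_input - estimate    # prediction error
    estimate += learning_rate * error   # update the model to shrink it
    print(f"step {step}: estimate = {estimate:.2f}, error = {error:.2f}")
# The error shrinks toward zero: surprise is progressively minimized.
```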
Let us look at an example of how our brain extrapolates. It takes time for information from our eyes to reach our brain, and further time to analyse the received electrical signals and come to a conclusion about the data. Only then can we perceive the meaning of the input. Because of this processing delay, the information available to our conscious perception is always slightly outdated.
Consider catching a ball. It takes several dozen milliseconds for information from the eye to reach the brain, and about 120 milliseconds to take any action on the basis of this information. As the ball keeps moving all the while, our perception of its current position always lags behind. In many sports, balls travel at speeds well above 100 km per hour, which means the ball can move more than three metres during this lag, before we consciously perceive it. Clearly, if we reacted based on the perceived position of the ball, we would never be able to catch or hit it; it would already have passed us by. We manage to hit the ball only because our intelligent brain extrapolates the moving object’s position forward along its perceived trajectory. In cricket, the bowler tries to deceive the batsman by making the ball move in a direction different from the one the batsman’s brain has projected.
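The arithmetic behind these figures is easy to check:

```python
# How far does a ball travel during the brain's processing lag?
speed_kmh = 100.0                     # ball speed quoted above
delay_s = 0.120                       # ~120 ms from seeing to acting
speed_ms = speed_kmh * 1000 / 3600    # ~27.8 metres per second
lag_distance = speed_ms * delay_s     # distance covered during the lag
print(f"{lag_distance:.2f} m")        # ~3.33 m: 'more than three metres'
```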
Similarly, our brain, drawing on previous experience, has learned that a single sensory cue, such as a loud bang, can have many different causes: a slammed door, a bursting balloon, a hand clap, a gunshot. Our brain searches our memory of past experiences for the closest match to this sound, fully taking into account the context with its accompanying sights, smells and other sensations.
Here is another example of predictive processing by our brain. Let us see how our brain can so effortlessly read jumbled and garbled words.
“It deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.”
As our brain deciphered each word in the example above, it also predicted which words would logically come next to form a coherent sentence. Dr Lars Muckli, a neurophysiologist at the University of Glasgow, says, “We are continuously anticipating what we will see, hear or feel next.”
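For the curious, the passage above is easy to reproduce: shuffle only the interior letters of each word, keeping the first and last in place. A small Python sketch (punctuation handling omitted for simplicity):

```python
import random

def jumble(word):
    """Shuffle a word's interior letters, keeping first and last fixed."""
    if len(word) <= 3:
        return word  # nothing to shuffle
    interior = list(word[1:-1])
    random.shuffle(interior)
    return word[0] + "".join(interior) + word[-1]

sentence = "the human mind does not read every letter by itself"
print(" ".join(jumble(w) for w in sentence.split()))
# e.g. "the hamun mnid deos not raed eervy lteter by isetlf"
```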
Here is another example of how context decides our perception.
Look at the same shape in the middle appearing in two different contexts. In the context below, with a letter on either side, we perceive the shape as the letter “B”.

[Figure: the ambiguous middle shape flanked by letters]

On the other hand, in the following context, with numbers on either side, the same shape is perceived as the number “13”.

[Figure: the same shape flanked by numbers]
Our perception is therefore driven by our cognitive expectations based on the context.
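The logic of this context effect can be caricatured in a few lines of code. This toy resolver is only an analogy for how context steers interpretation, not a model of perception:

```python
def read_ambiguous_glyph(left, right):
    """Interpret an ambiguous middle glyph from its neighbours."""
    if left.isdigit() and right.isdigit():
        return "13"   # numeric context: read it as a number
    if left.isalpha() and right.isalpha():
        return "B"    # alphabetic context: read it as a letter
    return "?"        # mixed context stays ambiguous

print(read_ambiguous_glyph("A", "C"))     # -> B
print(read_ambiguous_glyph("12", "14"))   # -> 13
```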
There is also a downside. We underestimate the capacity of our brains to create their own convincing realities. Psychologists use the term “cognitive distortions” to describe irrational, inflated thoughts or beliefs that distort a person’s perception of reality, usually in a negative way.
We underestimate how powerfully realistic some dissociative experiences, hallucinations and other well-recognized mental/neural misperceptions can seem. And yet we find our own subjective perceptions so persuasive that we are more willing to doubt the laws of physics than to doubt our own minds. We can’t help assuming that perception equals reality.
Psychiatrists therefore have the unenviable task of persuading people to be sceptical of their own beliefs, to critically examine the evidence for their assumptions, and not to automatically believe their own thoughts and perceptions.
For all the advancements the world has seen in every field of science, including neuroscience, the mechanics of perception and thinking still elude comprehensive understanding.
Take these examples of what scientists are still trying to figure out. When we lie on our sides, the brain appears to dial down its reliance on information about the external world and instead to rely more on internal perceptions generated by touch. Blindfolding degrades our representation of the external world, allowing our internal, body-centred perception to dominate. In our inner ear we carry a little bit of the ocean that came with us when we evolved from the sea; we use it to assess gravity, so we can tell which way is up. Issues with this balance system can cause disorders such as vertigo.
Indian philosophy, as early as the period of the Rig Veda, introduced the concept of Maya to depict the world as unreal and illusory. The changing world we see around us can be compared to the moving images on a movie screen: there cannot be a movie without the screen. Brahman, our true self, is, like the screen, what enables us to sense the world as reality.
Vijnanavada Buddhism likewise regards the world as unreal, a mere projection of the mind.
Some References:
https://www.frontiersin.org/articles/10.3389/fnhum.2014.00566/full
https://news.mit.edu/2019/how-expectation-influences-perception-0715
https://www.frontiersin.org/articles/10.3389/fpain.2020.574370/full
https://journals.sagepub.com/doi/full/10.26599/BSA.2019.9050023
https://www.quantamagazine.org/to-be-energy-efficient-brains-predict-their-perceptions-20211115/