The ancients knew the moon looks bigger near the horizon but no theory convincingly explains the illusion. Now a new idea aims to settle the debate once and for all
One of the classic optical illusions involves the moon, which appears larger near the horizon than overhead. This illusion has been known and discussed for centuries, and yet its explanation is still hotly contested.
Today, the debate is set to reignite thanks to the work of Joseph Antonides and Toshiro Kubota at Susquehanna University in Pennsylvania. These guys have a new theory that the illusion occurs because of a contradiction between the way the brain compares distance cues from its perceptual model of the world and cues from binocular vision.
That the illusion exists is uncontested. One easily accessible proof is photographic: pictures show that the moon's angular size remains constant as it crosses the sky, even though it looks larger to the naked eye near the horizon. So the question of why it appears larger there has been studied by many people from a variety of disciplines.
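The geometry backs up the photographs. A quick back-of-the-envelope sketch (the figures below are standard approximate values, not from the paper) shows the moon's true angular size is about half a degree wherever it sits, and if anything it is slightly *smaller* at the horizon, because an observer there is roughly one Earth radius further from it than an observer with the moon overhead:

```python
import math

# Approximate figures: lunar diameter ~3474 km, mean centre-to-centre
# distance ~384,400 km, Earth radius ~6371 km.
MOON_DIAMETER_KM = 3474.0
MOON_DISTANCE_KM = 384_400.0
EARTH_RADIUS_KM = 6371.0

def angular_size_deg(distance_km: float) -> float:
    """Angular diameter in degrees of the moon seen from distance_km."""
    return math.degrees(2 * math.atan(MOON_DIAMETER_KM / 2 / distance_km))

# Overhead, the observer is roughly one Earth radius closer to the moon
# than when the moon sits on the horizon.
overhead = angular_size_deg(MOON_DISTANCE_KM - EARTH_RADIUS_KM)
horizon = angular_size_deg(MOON_DISTANCE_KM)

print(f"overhead: {overhead:.3f} deg, horizon: {horizon:.3f} deg")
```

Both values come out near 0.52 degrees, with the overhead figure fractionally larger — the opposite of what we perceive, which is the puzzle.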
Perhaps the best-known explanation is the Size-Contrast theory. This states that the perceived angular size of the moon is proportional to the perceived angular size of objects around it.
Near the horizon, the moon is close to objects of a size that we know, such as trees, buildings and so on. And since it is comparable in size to these familiar objects, it appears larger.
This is related to the famous Ebbinghaus illusion, in which the apparent size of a circle depends on the size of the circles nearby.
Antonides and Kubota say there are two problems with this theory. The first is that it does not explain the degree of expansion. Some observers report the moon appearing twice as large near the horizon and yet in experiments with the Ebbinghaus illusion, observers typically report an increase of only about 10 per cent.
The second is that it does not explain why the effect disappears in photographs and videos. By contrast the Ebbinghaus illusion is easy to reproduce.
The new theory is based on the idea that the brain judges distance in two different ways. The first is binocular vision: when the images from the two eyes are essentially identical, the object must be far away.
The second is our built-in model of the world in which we perceive the sky to be a certain finite distance away and the Sun, moon and stars to be in front of it (rather than appearing through a hole, for example).
This results in a contradiction. Our perceptual model of the world suggests the moon is closer than the sky, while our binocular vision suggests it is not.
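To see why binocular vision effectively places the moon at infinity, here is a rough sketch (the interpupillary distance and target distances below are assumed illustrative values, not from the paper). The convergence angle between the two eyes' lines of sight shrinks with distance, and for the moon it is vanishingly small — the two eyes receive the same image:

```python
import math

# Assumed average interpupillary distance: 6.3 cm.
IPD_M = 0.063

def vergence_deg(distance_m: float) -> float:
    """Convergence angle (degrees) between the eyes fixating a target."""
    return math.degrees(2 * math.atan(IPD_M / 2 / distance_m))

for label, d in [("book at 0.5 m", 0.5),
                 ("building at 100 m", 100.0),
                 ("moon at 384,400 km", 3.844e8)]:
    print(f"{label}: {vergence_deg(d):.9f} deg")
```

A book at arm's length produces a vergence angle of several degrees; the moon produces one on the order of a hundred-millionth of a degree. As far as stereo vision is concerned, the moon and the sky behind it are both "infinitely" far away, which is the binocular half of the contradiction.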
Antonides and Kubota’s theory is that the illusion is the result of the way the brain handles this contradiction. “We hypothesize that the brain resolves this contradiction by distorting the visual projections of the moon resulting in an increase in angular size,” they say.
They point out that the distortion depends crucially on the perceived distance to the sky. This is heavily influenced by distance cues on the ground, which make the sky, and therefore also the moon, look closer. Conversely, when these cues are absent, when the moon is high in the sky, both the moon and the sky seem further away.
That’s an interesting idea that should stimulate debate. Antonides and Kubota say they want to explore it further by experimenting with the illusion. For example, they want to measure how the apparent expansion of the moon changes with different distance cues: from an open field, a valley, a mountain, an inner-city landscape and so on.
It might also be interesting to see how (and even if) the illusion arises in people who lack binocular vision.
Then there is the question of why the illusion reportedly disappears when the world is viewed upside down, ie by standing on your head. Having not tried this, I cannot vouch for its veracity. But come the next full moon, I fully intend to test it. So don’t be surprised to see some observers of the next full moon acting rather strangely.