As described in the Oxford University News, a paper in Current Biology by Andrew Glennerster and colleagues shows that humans ignore the evidence of their own eyes to maintain a fictional stable world.
The Virtual Reality Research Group in Oxford used the latest in virtual reality technology to create a room where they could manipulate size and distance freely. They made the room grow in size as people walked through it, but subjects failed to notice when the scene around them quadrupled in size. As a consequence, they made gross errors when asked to estimate the size of objects in that room. (...) ‘These results imply that observers are more willing to adjust their estimate of the separation between the eyes or the distance walked than to accept that the scene around them has changed in size,’ says Dr Glennerster. ‘More broadly, these findings mark a significant shift in the debate about the way in which the brain forms a stable representation of the world. They form part of a bigger question troubling neuroscience – how is information from different times and places linked together in the brain in a coherent way?’
Why do I blog this? This is an interesting example of the curious connections between cognitive systems and space perception.