Mary’s room is a philosophical thought experiment used to question a physical or materialist explanation of consciousness and mind. The argument has various forms, but it essentially boils down to this: Mary is a scientist who cannot see colours but sets out to study everything physical there is to know about colour. So she learns about light, quantum mechanics, molecular biology, photoreceptors, neuroscience, psychology, art, and so forth. Then one day her colour vision is restored. The question is whether or not she has learned something new. If you answer yes, then there cannot possibly be a physical or material explanation of consciousness, since she had already learned everything she could about the physical properties of colour vision. The thought experiment is meant to highlight that there seems to be something special and nonphysical about qualia.
I’m certain that nearly everyone has at one point wondered whether what they call red looks the same to someone else. How many times have you had a conversation where someone asks, “How do I know that my red is not your blue?” The philosopher David Chalmers uses Mary’s room as one of his arguments against a purely physical explanation of consciousness. I believe he is the person who coined the term “the hard problem of consciousness” for the issue of how to understand the subjective awareness in consciousness. His arguments are quite compelling, but I haven’t quite jumped off the materialist bandwagon. My response to Mary’s room is that if Mary truly discovered everything there is to know about consciousness, then she will not learn anything new when her colour vision is restored. However, I might differ from other materialists in that I believe she would have to simulate colour vision in her brain to qualify as knowing everything physical, because there is a profound difference between knowing what an algorithm is and actually executing it. The reason, which drives much of my recent philosophical inquiry, is the Halting Problem.
The precise statement of the Halting Problem is that there cannot exist an algorithm or computer program that can tell whether every computation will halt. The word “every” is important, because there are ways to tell whether some programs will halt. Obviously, if we wrote the program
print("hello world")
it would print “hello world” then stop. And the program
while 3 > 2: continue
would run forever. However, if you had enough “conditional” and “goto” statements, it would be hard to figure out if the program would halt and in fact, you can’t come up with a surefire system to decide if all programs will halt.
The more general implication of the Halting Problem is that you cannot, in general, predict what a computer program will do without running it. Hence, there is a qualitative difference between knowing an algorithm and executing it. There is a difference between a recipe for a cake and the actual cake. You might be able to prove certain properties of cakes using physics and chemistry, but if you wanted to fully understand and appreciate a cake, then you would need to know everything about the brain and cake perception, as well as all the contingencies of the kitchen, such as humidity, temperature, altitude, the number of mold spores in the air, water quality, and so forth. In other words, a recipe alone may be insufficient to understand a cake.
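A concrete illustration of this gap between knowing and running is the Collatz iteration: halve even numbers, triple-and-add-one odd numbers. The rule is trivial to state, yet no one has a way to predict, for an arbitrary starting number, how many steps it takes to reach 1 (or even to prove that it always does) short of actually running it. A minimal sketch in Python (the function name is mine, for illustration):

```python
def collatz_steps(n):
    """Count iterations of the Collatz rule until n reaches 1."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

# Knowing the rule tells you almost nothing about the trajectory:
# 27 wanders for 111 steps (climbing past 9000) before settling at 1.
print(collatz_steps(27))  # 111
```

You know the algorithm completely, yet the only known way to learn what it does on a given input is to execute it.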
Hence, even if there existed an algorithm for understanding consciousness and colour, Mary would still have to run the “experience colour” program on the specified hardware, i.e. her brain. So suppose she couldn’t experience colour because she was missing cones in her retina. Then, to fully understand colour, she could implant electrodes in her visual cortex to trigger the sequence of stimuli that causes an experience of colour. Now, some (most?) would argue that this is cheating, and that I have actually capitulated to the argument that you couldn’t learn everything about colour by doing physical experiments and theorizing, so consciousness cannot be entirely physical. However, my response is that, because of the Halting Problem, physical knowledge entails running the program on the exact hardware it is to be implemented on.