This past Sunday, in a review of Justin Fox's book "The Myth of the Rational Market" (which he liked very much), economist Paul Krugman lamented that despite the current financial crisis and previous ones, like the collapse of the hedge fund Long-Term Capital Management, people still believe in efficient markets as strongly as ever. The efficient market hypothesis, the basis of most of modern finance, holds that the price of a security always reflects all available information, so you can never beat the market and artificial bubbles should never occur. Krugman wonders what it will take to change people's minds.
I want to show here that no amount of evidence may ever change their minds, even though they remain perfectly rational in the Bayesian sense. The argument applies equally to other controversial topics. I think it is generally believed in intellectual circles that the reason there is so much disagreement on these issues is that the other side is stupid, deluded, or irrational. I want to point out that believing in something completely wrong, even in the face of overwhelming evidence, can arise in perfectly rational beings. That is not to say that faulty reasoning does not exist or cannot be dangerous. It just explains why two perfectly reasonable and intelligent people can disagree so alarmingly.
Consider the very simple case of a hypothesis H, like "the market is not efficient", and some data D, like a financial crisis. Then from Bayes' rule, the probability that the hypothesis is true given the data is $P(H|D) = P(D|H)P(H)/P(D)$, where $P(D|H)$ is the likelihood function (the probability of obtaining the data given that the hypothesis is true), $P(H)$ is the prior probability that the hypothesis is true, and $P(D)$ is the probability of obtaining the data. Thus the odds that the hypothesis is true versus false are

$$\frac{P(H|D)}{P(\neg H|D)} = \frac{P(D|H)}{P(D|\neg H)}\,\frac{P(H)}{P(\neg H)},$$

i.e. the posterior odds equal the likelihood ratio times the prior odds.
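To make the two forms of Bayes' rule concrete, here is a small numerical check. All the probabilities are illustrative assumptions of mine, not figures from the argument itself:

```python
# Numerical check that Bayes' rule and its odds form agree (illustrative numbers).
P_H = 0.5               # prior probability that markets are not efficient
P_D_given_H = 0.10      # assumed probability of a crisis if markets are not efficient
P_D_given_notH = 0.001  # assumed probability of a crisis if markets are efficient

# Total probability of the data, then Bayes' rule for the posterior.
P_D = P_D_given_H * P_H + P_D_given_notH * (1 - P_H)
P_H_given_D = P_D_given_H * P_H / P_D

# Odds form: posterior odds = likelihood ratio * prior odds.
posterior_odds = (P_D_given_H / P_D_given_notH) * (P_H / (1 - P_H))

# The two forms describe the same posterior belief.
assert abs(P_H_given_D / (1 - P_H_given_D) - posterior_odds) < 1e-9
print(P_H_given_D)  # ~0.99: a crisis shifts even prior odds strongly toward inefficiency
```

The odds form is handy precisely because $P(D)$ cancels out, which is what the next paragraph exploits.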
So you see that even if two people have identical likelihood functions (i.e. reasoning ability) and the same data, they can still come to completely different conclusions depending on their priors. For example, suppose two people agree that a crisis is 100 times more likely if markets are inefficient than if they are efficient. Then whatever the prior odds against an efficient market were before, they are 100 times greater after the crisis. Someone who believed it was even odds that efficient markets are false now believes the odds are 100 to 1 that they are false. But someone who originally put the odds that efficient markets are false at one in a million now puts them at one in ten thousand. Hence, given enough events, they will eventually change their mind. However, suppose there is a person who believed that the probability that efficient markets are false is zero. Then they are completely unaffected by the data, and no amount of data can ever convince them. If a hypothesis has zero prior support, it can never be validated no matter what the data.
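The updating arithmetic above can be sketched in a few lines. The function name `update_odds` and the event counts are illustrative choices, not anything from the argument itself:

```python
# Sketch of Bayesian odds updating: posterior odds = likelihood_ratio^n * prior odds.
# A zero prior is a fixed point: no number of crises ever moves it.

def update_odds(prior_odds, likelihood_ratio, n_events=1):
    """Posterior odds after n independent events sharing the same likelihood ratio."""
    return prior_odds * likelihood_ratio ** n_events

LR = 100.0  # crisis is assumed 100x more likely under inefficient markets

print(update_odds(1.0, LR))      # even prior odds -> 100 to 1 against efficiency
print(update_odds(1e-6, LR))     # one-in-a-million prior -> roughly one in ten thousand
print(update_odds(0.0, LR, 50))  # zero prior -> 0.0, even after 50 crises
```

Multiplying odds by the likelihood ratio once per event is just repeated application of the odds form of Bayes' rule, which is why the zero-prior case is immune to any amount of data.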
You could argue that a zero prior is faulty to start with, but that is not a failure of reasoning. In fact, it is easy to see how a prior could end up at zero. Suppose you are a naive student and you take a class from a Fama or a Miller who implicitly assumes that the market is efficient. You could easily come away believing it is a law of nature like gravity. There could even be an amplification effect: the professor may harbor some doubt about the idea but, for pedagogical or other reasons, never brings it up in class. Then the next generation is even more certain, and eventually the idea becomes dogma.
So why are there efficient market doubters at all? I think there are probably neural mechanisms that set a minimum doubt level in each person. Some people have complete certainty in their beliefs while others doubt everything. I believe this doubt level is innate and could be related to genes governing certain ion channels in the brain. So some students will never completely believe in the efficient market. Given that the doubt level seems to be broadly distributed in the population, there must be advantages to maintaining that diversity. A community of pure believers is dangerous (e.g. Jonestown), but one of pure doubters may starve to death because it can never decide what to do. A balance of the two may be necessary to get things done while still having a reality check. It also means that some wrong ideas will only disappear when the zero-doubt holders take them to the grave.