Much has been written of the role of fake news in the US presidential election. While we will never know how much it actually contributed to the outcome, as I will show below, it could certainly affect people’s beliefs. Psychology experiments have found that humans often follow Bayesian inference – the probability we assign to an event or action is updated according to Bayes’ rule. For example, suppose $P(T)$ is the probability we assign to whether climate change is real; $P(F) = 1 - P(T)$ is our probability that climate change is false. In the Bayesian interpretation of probability, $P(T)$ would represent our level of belief in climate change. Given new data $D$ (e.g. news), we will update our beliefs according to

$$P(T \mid D) = \frac{P(D \mid T)\,P(T)}{P(D)}.$$
What this means is that our posterior probability, or belief that climate change is true given the new data, $P(T \mid D)$, is equal to the probability that the new data came from our internal model of a world with climate change (i.e. our likelihood, $P(D \mid T)$), multiplied by our prior probability that climate change is real, $P(T)$, divided by the probability of obtaining such data in all possible worlds, $P(D)$. According to the rules of probability, the latter is given by $P(D) = P(D \mid T)\,P(T) + P(D \mid F)\,P(F)$, which is the sum of the probability the data came from a world with climate change and that from one without.
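To make the update rule concrete, here is a minimal Python sketch of the binary case. The function name and the numbers in the example are my own, purely illustrative choices:

```python
def bayes_update(prior_true, lik_true, lik_false):
    """Update a binary belief with Bayes' rule.

    prior_true : prior probability the hypothesis (e.g. climate change) is true
    lik_true   : probability of the observed data if the hypothesis is true
    lik_false  : probability of the observed data if it is false
    """
    # P(D) = P(D|T) P(T) + P(D|F) P(F), with P(F) = 1 - P(T)
    evidence = lik_true * prior_true + lik_false * (1.0 - prior_true)
    # P(T|D) = P(D|T) P(T) / P(D)
    return lik_true * prior_true / evidence

# Data twice as likely under the "true" model pulls a 50/50 prior upward,
# to about 2/3:
print(bayes_update(0.5, 0.2, 0.1))
```

Swapping the two likelihoods pulls the belief down by the same amount, which is the symmetry the rest of the argument relies on.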
This update rule can reveal what will happen in the presence of new data, including fake news. The first thing to notice is that if the prior $P(T)$ is zero, then there is no update. In this binary case, this means that if we believe that climate change is absolutely false or absolutely true, then no data will change our mind. In the case of multiple outcomes, any outcome with zero prior (i.e. no support) will never change. So if we hold such absolute priors, fake news has no impact because no news has any impact. If we have nonzero priors for both true and false, then data that is more likely under our true model increases our posterior for true, and vice versa. Our posteriors will tend in the direction of the data, and thus fake news could have a real impact.
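The zero-prior case is easy to check numerically. In this sketch (the likelihood values are made up for illustration) the data overwhelmingly favours "true", yet a prior of exactly zero cannot move:

```python
# Bayes' rule with a prior of exactly zero: the likelihoods become irrelevant.
prior_true = 0.0
lik_true, lik_false = 0.99, 0.01  # data strongly favours the "true" model

evidence = lik_true * prior_true + lik_false * (1.0 - prior_true)
posterior = lik_true * prior_true / evidence

print(posterior)  # 0.0 -- the belief stays pinned at zero
```

The numerator contains the prior as a factor, so once it is zero, every posterior down the chain of updates is zero as well.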
For example, suppose we have an internal model where we expect the mean annual temperature to be 10 degrees Celsius with a standard deviation of 3 degrees if there is no climate change, and a mean of 13 degrees (with the same spread) if there is. Thus if the reported data is mostly centered around 13 degrees then our belief in climate change will increase, and if it is mostly centered around 10 degrees then it will decrease. However, if we get data that is spread uniformly over a wide range then both models could be equally likely and we would get no update. Mathematically, if $P(D \mid T) = P(D \mid F)$ then $P(T \mid D) = P(T)$: from the Bayesian update rule, the posterior will be identical to the prior. In a world with lots of misleading data, there is no update. Thus, obfuscation and sowing confusion are very good strategies for preventing updates of priors. You don’t need to refute data; just provide fake examples and bury the real data in a sea of noise.
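The temperature example can be simulated directly. This sketch uses the two Gaussian models from the text (means 10 and 13, standard deviation 3) and feeds in a few reported temperatures; the data sequences themselves are my own illustrative inputs:

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian probability density, used as the likelihood of a reading."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_update(prior_true, lik_true, lik_false):
    """One step of the binary Bayesian update."""
    evidence = lik_true * prior_true + lik_false * (1.0 - prior_true)
    return lik_true * prior_true / evidence

# The two internal models of mean annual temperature (deg C) from the text.
MU_CC, MU_NO_CC, SIGMA = 13.0, 10.0, 3.0

def posterior_after(readings, prior=0.5):
    """Sequentially update a 50/50 prior with a series of temperature readings."""
    for x in readings:
        prior = bayes_update(prior,
                             normal_pdf(x, MU_CC, SIGMA),
                             normal_pdf(x, MU_NO_CC, SIGMA))
    return prior

print(posterior_after([13.0, 12.5, 13.5, 14.0]))  # data near 13: belief rises above 0.5
print(posterior_after([10.0, 9.5, 10.5, 11.0]))   # data near 10: belief falls below 0.5
print(posterior_after([11.5, 11.5, 11.5, 11.5]))  # equidistant data: belief stays at 0.5
```

The third case is the obfuscation scenario in miniature: readings exactly halfway between the two means are equally likely under both models, so however many of them arrive, the posterior never moves off the prior.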