Response to Oxford paper on covid-19

Here is my response to the paper from Oxford (Lourenco et al.) arguing that novel coronavirus infection may already be widespread in the UK and Italy. The result is based on fitting a disease-spreading model, called an SIR model, to the cumulative number of deaths. SIR models usually consist of ordinary differential equations (ODEs) for the fraction of people in a given population who are susceptible to the infectious agent (S), the fraction infected (I), and the fraction recovered (R). There is one other state in the model, which is the fraction who die from the disease (D). The SIR model considers transitions between these states. In the case of ODEs, the states are treated as continuous quantities, which is not a bad approximation for a large population, and each equation in the model describes the rate of change of a state (hence differential equation). There are parameters in the model governing the rate of different interactions in each equation; for example, there is a parameter for the rate of decrease in S (and corresponding increase in I) whenever an S interacts with an I, and then there is a rate of loss of an I, which transitions into either R or D. The Oxford group models D somewhat differently. Instead of a transition from I into D, they consider that a fraction of (1-S) will die with some delay between time of infection and death.
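For concreteness, here is a minimal sketch of the model as described above: the standard SIR equations integrated with a simple Euler scheme, with cumulative deaths computed as a fraction of (1-S) after a fixed delay, in the spirit of the Oxford approach. The parameter values (beta, gamma, theta, the delay) are purely illustrative choices of mine, not the estimates from the paper.

```python
# Minimal SIR sketch with deaths handled as described above: a fraction
# theta of those no longer susceptible die, with a fixed delay.
# All parameter values are illustrative, not fitted values.

def simulate(beta=0.5, gamma=0.2, theta=0.01, delay_days=17,
             i0=1e-6, days=100, dt=0.1):
    steps = int(days / dt)
    S, I = 1.0 - i0, i0
    s_history = [S]                       # record S(t) to apply the death delay
    deaths = [0.0]                        # cumulative death fraction
    for n in range(1, steps + 1):
        dS = -beta * S * I                # susceptibles lost on contact with infected
        dI = beta * S * I - gamma * I     # new infections minus removals
        S += dS * dt
        I += dI * dt
        s_history.append(S)
        lag = n - int(delay_days / dt)    # index of S one delay in the past
        past_S = s_history[lag] if lag >= 0 else 1.0 - i0
        deaths.append(theta * (1.0 - past_S))
    return S, I, deaths

S_final, I_final, deaths = simulate()
```

With beta = 0.5 and gamma = 0.2 (so R0 = 2.5), the epidemic burns through most of the susceptible population within the simulated window; the point of the sketch is only the structure of the model, not any particular numbers.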

They estimate the model parameters by fitting the model to the cumulative number of deaths. They did this instead of fitting directly to I because that is unreliable, as many people who have Covid-19 have not been tested. They also only fit to the first 15 days from the first recorded death, since they want to model what happened before social distancing was implemented. They find that the model is consistent with a scenario where the probability that an infection becomes severe enough to be flagged is low, and thus the disease is much more widespread than expected. I redid the analysis without assuming that the parameters need to have particular values (called priors in Bayesian inference and machine learning) and showed that a wide range of parameters will fit the data. This is because the model is under-constrained by death data alone, so even unrealistic parameters can work. To be fair, the authors only proposed that this is a possibility and thus that the population should be tested for antibodies to the coronavirus (SARS-CoV-2) to see if indeed there may already be herd immunity in place. However, the press has run with the result, and that is why I think it is important to examine the result more closely.

The relevant Covid-19 fatality rate

Much has been written in the past few days about whether the case fatality rate (CFR) for Covid-19 is actually much lower than the original estimate of about 3 to 4%. Globally, the CFR is highly variable, ranging from half a percent in Germany to nearly 10% in Italy. The difference could be due to underlying differences in the populations or to the extent of testing. South Korea, which has done very wide-scale testing, has a CFR of around 1.5%. However, whether the CFR is high or low is not the important parameter. The number we must determine is the population fatality rate: even if most of the people who become infected with SARS-CoV-2 have mild or even no symptoms, so that the CFR is low, if most people are susceptible and the entire world population gets the virus, then even a tenth of a percent of 7 billion is still a very large number.
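The arithmetic behind that last sentence is trivial but worth writing down:

```python
# Population fatality arithmetic: even a low fatality rate applied to the
# whole world population yields a huge death toll.
world_population = 7_000_000_000

deaths_at_tenth_percent = int(world_population * 0.001)  # 0.1% -> 7 million
deaths_at_one_percent = int(world_population * 0.01)     # 1%   -> 70 million
```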

What we don’t know yet is how much of the population is susceptible. Data from the cruise ship Diamond Princess showed that about 20% of the passengers and crew became infected, but there were still some social distancing measures in place after the first case was detected, so this does not necessarily imply that 80% of the world population is innately immune. A recent paper from Oxford argues that about half of the UK population may already have been infected and is no longer susceptible. However, I redid their analysis and find that widespread infection, although possible, is not very likely (details to follow — famous last words), but this can and should be verified by testing for antibodies in the population. The bottom line is that we need to test, test and test, both for the virus and for antibodies, before we will know how bad this will be.
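For reference, the standard SIR-model herd-immunity threshold is 1 - 1/R0, the fraction that must be immune before the epidemic declines on its own. The R0 values below are illustrative, not estimates from the Oxford paper:

```python
# Standard SIR herd-immunity threshold: the epidemic declines once the
# immune fraction exceeds 1 - 1/R0. R0 values here are illustrative only.

def herd_immunity_threshold(r0):
    return 1.0 - 1.0 / r0

thresholds = {r0: herd_immunity_threshold(r0) for r0 in (2.0, 2.5, 3.0)}
# e.g. at R0 = 2, half the population must be immune
```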

How many Covid-19 cases are too many?

The US death rate is approximately 900 per 100,000 people per year. Thus, for a medium-sized city of a million there are on average about 25 deaths per day. Not all of these deaths will be preceded by hospital care of course, but that gives an idea of the scale of the case load of the health care system. The doubling time for the number of cases of Covid-19 is about 5 days. At this moment, the US has over 25 thousand cases, with 193 cases in Maryland, where I live, and over 11 thousand in New York. If the growth rate is unabated then in 5 days there will be almost 400 cases in Maryland and over 50 thousand in the US. The case fatality rate for Covid-19 is still not fully known, but let’s suppose it is 1% and let’s say 5% of those infected need hospital care. This means that 5 days from now there will be an extra 20 patients in Maryland and 2500 patients in the US. New York will have an extra thousand patients. Many of these patients will need ventilators and most hospitals only have a few. It is easy to see that it will not take too long until every ventilator in the state and the US will be in use. Also, with the shortage of protective gear, some of the hospital staff will contract the virus and add to the problem. As conditions in hospitals deteriorate, the virus will spread to non-Covid-19 patients. This is where northern Italy is now and the US is about 10 days behind them. This is the scenario that has been presented to the policy makers and they have chosen to take what may seem like extreme social distancing measures now. We may not be able to stop this virus but if we can slow the doubling time, which is related to how many people are infected by a person with the virus, then we can give the health care system a chance to catch up.
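The back-of-envelope projection in this paragraph is easy to reproduce. The doubling time and the 5% hospitalization rate below are the same rough guesses used above, not measured values:

```python
# Back-of-envelope projection: cases double every ~5 days and roughly 5%
# of cases need hospital care. All rates are rough guesses, as in the text.

def project(cases_now, days_ahead=5.0, doubling_days=5.0, hosp_rate=0.05):
    cases = cases_now * 2 ** (days_ahead / doubling_days)
    return cases, cases * hosp_rate

md_cases, md_patients = project(193)      # Maryland: ~386 cases, ~19 patients
us_cases, us_patients = project(25_000)   # US: 50,000 cases, 2,500 patients
```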

The failure of supply-side social policy

The US is in the midst of two social crises. The first is an opioid epidemic that is decimating parts of rural and now urban America, and the second is a surge in the number of migrants crossing the southern US border, primarily from Central America. In any system that involves flow, either physical (e.g. electricity) or social (e.g. money), the amount of flow (i.e. flux) depends on both the supply (e.g. power station/federal reserve) and the demand (e.g. air conditioner/disposable income). So if you want to reduce opioid consumption or illegal immigration you can either shut down the supply or reduce the demand.

During the twentieth century there was a debate over the causes of booms and busts in the economy. I am greatly simplifying the debate, but on one side were the demand-side Keynesians, who believed that the business cycle is mostly a result of fluctuating demand. If people suddenly decide to stop spending, then businesses lose customers, which leads them to lay off workers, who then have less money to spend in other businesses, reducing demand further and so forth, leading to a recession. On the other side were the supply-siders, who believed that the problem of economic downturns was inadequate supply, which would be solved by cutting taxes and reducing business regulations. The Great Recession of 2008 provided a partial test of both theories, as the US applied a demand-side fix in the form of a stimulus while Europe went for “expansionary austerity” and cut government spending, which slashes demand. The US has now experienced over a decade of steady growth, while Europe went into a double-dip recession before climbing out after the policy changed. That is not to say that demand-side policies always work. The 1970s were plagued by stagflation, with high unemployment and high inflation, for which the Keynesians had no fix. Former Fed Chairman Paul Volcker famously raised interest rates in 1979 to reduce the money supply. It triggered a short recession, which was followed by nearly three decades of low-inflation economic growth.

In terms of social policy, the US has really only tried supply-side solutions. The drug war put a lot of petty dealers and drug users in jail but did little to halt the use of drugs. It seems to me that if we really want to solve, or at least alleviate, the opioid and drug crisis, we need to slash demand. Opioids are pain killers and are physically addictive. Addicted users who try to stop will experience withdrawal, which is extremely painful. Those who do succeed will no longer be physically addicted, although they can always relapse if they use again. The current US opioid epidemic started with a change in the philosophy of pain management by the medical establishment, with a concurrent development of new, supposedly less addictive opioid pills. So doctors, encouraged by the pharmaceutical industry, began prescribing opioids for all manner of ailments. Most doctors were well intentioned, but a handful participated in outright criminal activity and became de facto drug dealers. In any case, this led to the initial phase of the opioid epidemic. When awareness of over-prescription started to enter public consciousness, there was pressure to reduce the supply. Addicts then turned to illicit opioids like heroin, which started phase 2 of the epidemic. However, as this supply was targeted by drug enforcement, a new, highly potent and cheaper synthetic opioid, fentanyl, emerged. It was easy to produce in makeshift labs anywhere and also provided a safer business model for drug dealers. However, fentanyl is so potent that it has led to a surge in overdose deaths. Instead of targeting supply we need to reduce demand. First we need to understand why people take opioids in the first place. While some drugs are taken for the experience or entertainment, opioids are mostly being used to alleviate pain and suffering. It is probably no coincidence that the places most ravaged by opioids are also those that are struggling most economically.
If we want to get a handle on the opioid crisis we need to improve these areas economically. People probably also take drugs for some form of escape. This is where I think video games and virtual reality may be helpful. We can debate the merits of playing Fortnite 16 hours a day, but it is surely better than taking cocaine. I think we should seriously consider video games as a treatment for drug addiction. We could and should also develop games for this purpose.

Extra border security has not stemmed illegal immigration. What does slow immigration is a downturn in the US economy, which quenches demand for low-skilled labour, or an improvement in the conditions of the originating countries, which reduces the desire to leave in the first place. The current US migrant crisis is mostly due to the abhorrent and dangerous conditions in Guatemala and Honduras. For Europe, it is problems in Africa and the Middle East. In both cases, putting up more barriers or treating the migrants inhospitably is not really doing much. It just makes the journey more perilous, which is bad for the migrants and a moral and public relations nightmare for host countries. Perhaps we could try to stem demand by at least making it safer in the originating countries. The US could provide more aid to Latin America, including stationing American troops if necessary to curb gang activity and restore civil order. This would at least help diminish the number seeking asylum. Reducing economic migration is much harder, since we really don’t know how to do economic development very well, but more investment in source countries could help. While globalization and free trade may have hurt the US worker and contributed to the opioid epidemic by decimating manufacturing in the US, they have also brought a lot of people out of abject poverty. The growth miracles in China and the rest of Asia would not have been possible without international trade and investment. Thus the two crises are not independent. More free trade could help to reduce illegal immigration, but it could also lead to worsening economic conditions in some regions, spurring more opioid use. There are no magic bullets, but we at least need to change the strategy.

Duality and computation in the MCU

I took my kindergartener to see Avengers: Endgame recently. My son was a little disappointed, complaining that the film had too much talking and not enough fighting. To me, the immense popularity of the Marvel Cinematic Universe series, and of so-called science fiction/fantasy in general, is an indicator of how people think they like science but really want magic. Popular science-fictiony franchises like the MCU and Star Wars are couched in scientism but are often at odds with actual science as practiced today. Arthur C Clarke famously stated in his third law that “Any sufficiently advanced technology is indistinguishable from magic,” a sentiment captured in these films.

Science fiction should extrapolate from current scientific knowledge to the possible. Otherwise, it should just be called fiction. There have been a handful of films that try to do this, like 2001: A Space Odyssey or, more recently, Interstellar and The Martian. I think there is a market for these types of films but they are certainly not as popular as the fantasy films. To be fair, neither Marvel nor Star Wars (both now owned by Disney) markets itself as science fiction as I defined it. They are intended to be mythologies à la Joseph Campbell’s Hero’s Journey. However, they do have a scientific aesthetic, with worlds dominated by advanced technology.

Although I find the MCU films not overly compelling, they do bring up two interesting propositions. The first is dualism. The superhero character Ant-Man has a suit that allows him to change size and even shrink to sub-atomic scales, called the quantum realm in the films. (I won’t bother to discuss whether energy is conserved in these near-instantaneous size changes, an issue that affects the Hulk as well.) The film was advised by physicist Spiros Michalakis and is rife with physics terminology and concepts like quantum entanglement. One crucial concept it completely glosses over is how Ant-Man maintains his identity as a person, much less his shape, when he is smaller than an atom. Even if one were to argue that one’s consciousness could be transferred to some set of quantum states at the sub-atomic scale, it would be overwhelmed by quantum fluctuations. The only self-consistent premise of Ant-Man is that the essence, or soul if you wish, of a person is not material. The MCU takes a definite stand for dualism on the mind-body problem, a sentiment with which I presume the public mostly agrees.

The second is that magic has immense computational power. In the penultimate Avengers movie, the villain Thanos snaps his fingers while in possession of the complete set of infinity stones and eliminates half of all living things. (Setting aside the issue that Thanos clearly does not understand the concept of exponential growth: if you are concerned about overpopulation, it is pointless to shrink the population and do nothing else, because it will just return to its original size in a short time.) What I’d like to know is who or what does the computation to carry out the command. There are at least two hard computational problems that must be solved. The first is to identify all lifeforms. This is clearly no easy task, as to this day we have no precise definition of life. Do viruses get culled by the snap? Does the population of silicon-based lifeforms of Star Trek get halved, or is it only biochemical life? What algorithm does the snap use to find all the life forms? Living things on earth range in size from single cells (or viruses if you count them) all the way to 35-metre behemoths, which are composed of over 10^{23} atoms. How do the stones know what scales lifeforms span in the MCU? Do photosynthetic lifeforms get spared since they don’t use many resources? What about fungi? Is the MCU actually a simulated universe where there is a continually updated census of all life? How accurate is the algorithm? Was it perfect? Did it aim for high specificity (i.e. reduce false positives, so you only kill lifeforms and not non-lifeforms) or high sensitivity (i.e. reduce false negatives and thus don’t miss any lifeforms)? I think it probably favours sensitivity over specificity: who cares if a bunch of ammonia molecules accidentally get killed.
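For anyone unfamiliar with the jargon, here is the sensitivity/specificity distinction on a toy confusion matrix for the hypothetical find-all-lifeforms classifier (the counts are invented for illustration):

```python
# Sensitivity vs specificity for an imagined "find all lifeforms" classifier.
# All counts are made up for illustration.
true_positives = 90     # lifeforms correctly flagged
false_negatives = 10    # lifeforms missed
true_negatives = 950    # non-life correctly ignored
false_positives = 50    # non-life wrongly flagged (stray ammonia molecules)

sensitivity = true_positives / (true_positives + false_negatives)   # 0.90
specificity = true_negatives / (true_negatives + false_positives)   # 0.95
```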
The find-all-life problem is made much easier by proposition 1, because if all life were material then the only way to detect lifeforms would be to look for multiscale correlations between atoms (or find organic molecules if you only care about biochemical life). If each lifeform has a soul then you can simply search for “soulfulness”. The lifeforms were not erased instantly but only after a brief delay. What was happening over this delay? Is magic propagation limited by the speed of light or some other constraint? Or did the computation take time? In Endgame, the Hulk restores all the lifeforms Thanos erased, and Tony Stark then snaps away Thanos and all of his allies. Where were the lifeforms after they were erased? In Heaven? In a soul repository somewhere? Is this one of the Nine Realms of the MCU? How do the stones know who is a Thanos ally? The second computation is to decide which half to extinguish. The movie seems to imply that the choice was random, so where did the randomness come from? Do the infinity stones generate random numbers? Do they rely on quantum fluctuations? Finally, in a world with magic, why is there also science? Why does the universe follow the laws of physics sometimes and magic other times? Is magic a finite resource, as in Larry Niven’s The Magic Goes Away? So many questions, so few answers.