Missing the trend

I have been fortunate to have been born at a time when I had the opportunity to witness the birth of several of the major innovations that shape our world today.  I have also managed to miss out on capitalizing on every single one of them. You might make a lot of money betting against what I think.

I was a postdoctoral fellow in Boulder, Colorado in 1993 when my very tech-savvy advisor John Cary introduced me and his research group to Mosaic, the first widely used web browser, shortly after it was released. The web was the wild west in those days, with just a smattering of primitive personal sites authored by early adopters. The business world had not discovered the internet yet. It was an unexplored world and people were still figuring out how to use it. I started to make a list of useful sites but, unlike Jerry Yang and David Filo, who immediately thought of doing the same thing and forming a company, it did not remotely occur to me that this activity could be monetized. Even though I struggled to find a job in 1994, was fairly adept at programming, watched the rise of Yahoo! and the rest of the internet startups, and had friends at Stanford and in Silicon Valley, it still did not occur to me that perhaps I could join in too.

Just months before impending unemployment, I managed to talk my way into being the first postdoc of Jim Collins, who had just started as a non-tenure-track research assistant professor at Boston University. Midway through my time with Jim, we had a meeting with Charles Cantor, who was a professor at BU then, about creating engineered organisms that could eat oil. Jim subsequently recruited graduate student Tim Gardner, now CEO of Riffyn, to work on this idea. I thought we should create a genetic Hopfield network and I showed Tim how to use XPP to simulate the various models we came up with. However, my idea seemed too complicated to implement biologically, so when I went to Switzerland to visit Wulfram Gerstner at the end of 1997, Tim and Jim, freed from my meddling influence, were able to create the genetic toggle switch and the field of synthetic biology was born.

I first learned about Bitcoin in 2009 and had even thought about mining some. However, I then heard an interview with one of the early developers, Gavin Andresen, and he failed to understand that because the supply of Bitcoin is finite, prices denominated in it would necessarily deflate over time. I was flabbergasted that he didn’t comprehend the basics of economics and was convinced that Bitcoin would eventually fail. Still, I could have mined thousands of Bitcoins on a laptop back then, which would be worth tens of millions today. I do think blockchains are an important innovation and my former post-bac fellow Wally Xie is even the CEO of the blockchain startup QChain. Although I do not know where cryptocurrencies and blockchains will be in a decade, I do know that I most likely won’t have a role.

I was in Pittsburgh during the late nineties/early 2000s in one of the few places where neural networks/deep learning, still called connectionism, was king. Geoff Hinton had already left Carnegie Mellon for London by the time I arrived at Pitt but he was still revered in Pittsburgh, and I met him in London when I visited UCL. I actually thought the field had great promise and even tried to lobby our math department to hire someone in machine learning, for which I was summarily dismissed and mocked. I recruited Michael Buice to work on the path integral formulation for neural networks because I wanted to write down a neural network model that carried both rate and correlation information so I could implement a correlation-based learning rule. Michael even proposed that we work on an algorithm to play Go but obviously I demurred. Although I missed out on the current wave of AI hype, and probably wouldn’t have made an impact anyway, this is the one area where I may get a second chance in the future.

The wealth threshold

The explanation for growing wealth inequality proposed by Thomas Piketty in his iconic book Capital in the Twenty-First Century is that the rate of return on capital exceeds the growth rate of the economy as a whole. Thus, the wealth of owners of capital (i.e. investors) will increase faster than everyone else’s. However, even if the rates were equal, any difference in initial conditions or savings rate would also be amplified exponentially. This can be seen in a simple model. Suppose w is the total amount of money you have, I is your annual income, E is your annual expense rate, and r is the annual rate of growth of investments or interest rate. The rate of change in your wealth is given by the simple formula

\frac{dw}{dt} = I(t) - E(t)+ r w,

where we have assumed that the interest rate is constant, although it can easily be made time dependent. This is a first-order linear differential equation, which can be solved to yield

w = w_0 e^{r t} + \int_{0}^t (I(s)-E(s)) e^{r(t-s)} ds,

where w_0 is your initial wealth at time 0. If we further assume that income and expenses are constant then we have w = w_0 e^{r t} + (I-E)(e^{rt}-1)/r, which can be rewritten as w = (w_0 + (I-E)/r) e^{rt} - (I-E)/r. Over time, any difference in initial wealth will diverge exponentially and there is a sharp threshold for wealth accumulation: if w_0 + (I-E)/r is positive your wealth compounds upward, and if it is negative it spirals downward. Thus the difference between building versus not building wealth could amount to a few hundred dollars in positive cash flow per month. This exponential amplification means that small changes in income or expenses that would be unnoticeable to a wealthy person could make an immense difference for someone near the bottom. Just saving a thousand dollars per year, less than a hundred per month, would give one almost a hundred and fifty thousand dollars after forty years at a growth rate of five to six percent.
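
To make the threshold concrete, here is a minimal numerical sketch of the model above; the 5.5% growth rate and the specific dollar amounts are assumed illustrative values, not part of the original argument.

```python
# Minimal sketch of the wealth model dw/dt = I - E + r*w using its
# closed-form solution w(t) = w0*e^{rt} + (I-E)*(e^{rt} - 1)/r.
# The 5.5% growth rate and dollar amounts are assumed illustrative values.
import math

def wealth(t, w0, net_flow, r):
    """Wealth after t years with initial wealth w0, constant annual
    net cash flow (income minus expenses), and growth rate r."""
    return w0 * math.exp(r * t) + net_flow * (math.exp(r * t) - 1) / r

r = 0.055  # assumed annual growth rate

# Saving $1000 per year starting from nothing.
print(f"$1000/yr saved for 40 years: ${wealth(40, 0, 1000, r):,.0f}")

# The threshold: the sign of w0 + (I-E)/r decides whether wealth
# compounds upward or spirals downward (negative values mean mounting debt).
for monthly in (-200, 200):
    w = wealth(40, 10_000, 12 * monthly, r)
    print(f"{monthly:+} $/month net flow, $10,000 start: {w:,.0f} after 40 years")
```

With these assumed numbers, the thousand-dollar-a-year saver ends up with roughly $146,000, while flipping the sign of a $200 monthly cash flow turns steady accumulation into runaway debt.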

Equifax vs Cassini

The tired trope from free-market exponents is that private enterprise is agile, efficient, and competent, while government is plodding, incompetent, and wasteful. The argument is that because companies must survive in a competitive environment they are always striving to improve and gain an edge against their competitors. Yet history and recent events seem to indicate otherwise. The best strategy in capitalism seems to be to gain monopoly power and extract rent. While Equifax was busy covering up its malfeasance instead of trying to fix things for everyone it harmed, Cassini ended a brilliantly successful mission to explore Saturn. The contrast couldn’t have been greater if it had been staged. The so-called incompetent government has given us the moon landings, the internet, and two Voyager spacecraft that have lasted 40 years and are now heading out of the Solar System into interstellar space. There is no better-run organization than JPL. Each day at NIH, a government facility, I get to interact with effective and competent people who are trying to do good in the world. I think it’s time to update the “government is the problem” meme.

Science and the vampire/zombie apocalypse

It seems like every time I turn on the TV, which only occurs when I’m in the exercise room, there is a show that involves either zombies or vampires. From my small sampling, it seems that the more recent incarnations try to invoke scientific explanations for these conditions that involve a viral or parasitic etiology. Popular entertainment reflects societal anxieties; disease and pandemics are to the twenty-first century what nuclear war was to the late twentieth. Unfortunately, the addition of science to the zombie or vampire mythology makes for a much less compelling story.

A necessary requirement of good fiction is that it be self-consistent. The rules that govern the world the characters inhabit need to apply uniformly. Bram Stoker’s Dracula was a great story because there were simple rules that governed vampires – they die from exposure to sunlight, stakes to the heart, and silver bullets. They are repelled by garlic and Christian symbols. Most importantly, their thirst for blood was a lifestyle choice, like consuming fine wine, rather than a nutritional requirement. Vampires lived in a world of magic and so their world did not need to obey the laws of physics.

Once you try to make vampirism or zombism a scientifically plausible disease in our world, you run into a host of troubles. Vampires and zombies need to obey the laws of thermodynamics, which means they need energy to function. This implies that the easiest way to kill one of these creatures is to starve it to death. Given how energetically active vampires are and how little caloric content blood has by volume, since it is mostly water, vampires would need to drink a lot of blood to sustain themselves. All you would need to do is quarantine all humans in secure locations for a few days and every vampire should either starve to death or fall into a dormant state. Vampirism is self-limiting because there would not be enough human hosts to sustain a large population. This is why only small animals can subsist entirely on blood (e.g. vampire bats weigh about 40 grams and can drink half their weight in blood). Once you make vampires biological, it makes no sense why they can only drink blood. What exactly is in blood that they can’t get from eating flesh? Even if they don’t have a digestive system that can handle solid food, they could always put meat into a Vitamix and make a smoothie. Zombies eat all parts of humans so they would need to feed less often than vampires and thus be harder to starve. However, zombies are usually non-intelligent and thus easier to avoid and sequester. It seems like any zombie epidemic could be controlled at a very early stage. Additionally, why is it that zombies don’t eat each other? Why do they only like to eat humans? Why aren’t they hanging around farms eating livestock and poultry?
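
To put a rough number on the energy argument, here is a back-of-envelope sketch; the caloric density of blood and the daily energy requirement are assumed ballpark figures, not from the post.

```python
# Back-of-envelope estimate of a vampire's blood budget (illustrative only).
KCAL_PER_LITER_BLOOD = 800   # assumed: blood is mostly water, so its energy density is low
DAILY_KCAL_NEEDED = 2000     # assumed: roughly an active adult human's daily requirement
HUMAN_BLOOD_VOLUME_L = 5     # typical adult blood volume in liters

liters_per_day = DAILY_KCAL_NEEDED / KCAL_PER_LITER_BLOOD
victims_per_day = liters_per_day / (HUMAN_BLOOD_VOLUME_L / 2)  # draining half a victim's blood

print(f"Blood needed: {liters_per_day:.1f} L/day, "
      f"i.e. about {victims_per_day:.1f} victims drained of half their blood every day")
```

Under these assumptions a vampire would have to drain roughly half a person’s blood volume every single day just to break even, which makes the quarantine strategy all the more plausible.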

Vampires and sometimes zombies also have super-strength without having to bulk up. This means their muscles must produce far more force per unit mass. How is this possible? Muscles are pretty similar at the cellular level. Chimpanzees are stronger than humans by weight because they have more fast-twitch than slow-twitch muscle fibers. There is thus always a trade-off between strength and endurance. In a physically plausible world, humans should always find an edge in combating zombies or vampires. The only way to make a vampire or zombie story viable is to endow them with nonphysical properties. My guess is that we have hit peak vampire/zombie; the next wave of horror shows will feature a more plausible threat – evil AI.

The end of (video) reality

I highly recommend listening to this Radiolab podcast. It tells of new software that can create completely fabricated audio and video clips. This will take fake news to an entirely different level. It also means that citizen journalists with smartphones, police body cams, security cameras, etc. will all become obsolete. No recording can be trusted. On the other hand, we had no recording technology of any kind for almost all of human history so we will have to go back to simply trusting (or not trusting) what people say.

The robot human equilibrium

There has been some pushback in the media against the notion that we will “soon” be replaced by robots, e.g. see here. But absence of evidence is not evidence of absence. Just because there seem to be very few machine-induced job losses today doesn’t mean it won’t happen tomorrow or in ten years. In fact, when it does happen it will probably happen suddenly, as have many recent technological changes. The obvious examples are the internet and smartphones but there are many others. We forget that the transition from vinyl records to CDs was extremely fast; then iPods and YouTube killed CDs. Video rentals became ubiquitous from nothing in just a few years and died just as fast when Netflix came along, which was itself completely replaced a few years later by streaming video. It took Amazon a little longer to become dominant but the retail model that had existed for centuries has been completely upended in a decade. The same could happen with AI and robots. Unless you believe that human thought is not computable, there is in principle nothing a human can do that a machine can’t. It could take time to set up the necessary social institutions and infrastructure for an AI takeover but once they are established the transition could be abrupt.

Even so, that doesn’t mean all or even most humans will be replaced. The irony of AI, known as Moravec’s Paradox (e.g. here), is that things that are hard for humans to do, like playing chess or reading X-rays, are easy for machines to do and vice versa. Although drivers and warehouse workers are destined to be the first to be replaced, the next set of jobs to go will likely be those of highly paid professionals like stockbrokers, accountants, doctors, and lawyers. But as the ranks of the employed start to shrink, the economy will also shrink and wages will go down (even if the displaced do eventually move on to other jobs, it will take time). At some point, particularly for jobs that are easy for humans but harder for machines, humans could be cheaper than machines. So while we could train a machine to be a house cleaner, it may be more cost-effective to simply hire a person to change sheets and dust shelves. The premium on a university education will drop. The ability to sit still for long periods of time and acquire arcane specialized knowledge will simply not be that useful anymore. Centers for higher learning will become retreats for the small set of scholarly-minded people who simply enjoy it.

As the economy shrinks, land prices in some areas should drop too, and thus people could still eke out a living. Some or perhaps many people will opt out of, or be pushed out of, the mainstream economy altogether and retire to quasi-pre-industrial lives. I wrote about this in quasi-utopian terms in my AlphaGo post but a dystopian version is equally probable. In the dystopia, the gap between the rich and poor could make today look like an egalitarian paradise. However, unlike the usual dystopian nightmare, such as the Hunger Games, where the rich exploit the poor, the rich will simply ignore the poor. But it is not clear what the elite will do with all that wealth. Will they wall themselves off from the rest of society and then what – engage in endless genetic enhancements or immerse themselves in a virtual reality world? I think I’d rather raise pigs and make candles out of lard.