Snowbird summary

The Snowbird meeting finishes today. I think it has been highly successful, with over 800 participants. Last night, I was on the “forward-looking panel” moderated by Alan Champneys, and one of the questions asked was what defines nonlinear dynamics and this meeting. I gave a rather flip answer about how we are now in the age of machine learning and statistics and this meeting is everything in applied math that is not that. Of course that is not true, given that data assimilation was a major part of this meeting and Sara Solla gave a brilliant talk on applying the generalized linear model to neural data to estimate functional connectivity in the underlying cortical circuit.

Given some time to reflect on that question, I think the common theme of Snowbird is the concept of taking a complicated system and reducing it to something simpler that can be analyzed. What we do is create nontrivial models that can be attacked mathematically. This is distinctly different from other branches of applied math like optimization and numerical methods. However, one difference between previous meetings and now is that before, the main tools for analyzing these reduced systems were methods of dynamical systems such as geometric singular perturbation theory (e.g. see here) and bifurcation theory. Today, a much wider range of methods is being utilized.
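
For concreteness, the textbook prototype of this kind of reduction (my illustration here, not an example from the meeting) is a fast-slow system

$$\epsilon \frac{dx}{dt} = f(x,y), \qquad \frac{dy}{dt} = g(x,y), \qquad 0 < \epsilon \ll 1.$$

In the singular limit $\epsilon \to 0$, the fast variable $x$ is slaved to the critical manifold $f(x,y) = 0$, and the dynamics collapse onto the lower-dimensional slow flow $\dot{y} = g(x(y), y)$. Geometric singular perturbation theory tells you when that reduced picture survives for small but nonzero $\epsilon$.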

Another question posed was whether there was too much biology at this meeting. I said yes, because I thought there were too many parallel sessions. Although I said it partly tongue in cheek, I think there are both good and bad things about biology being overly represented. It is good that biology is doing well and attracting lots of people, but it would be a bad thing if the meeting becomes so large that it devolves into multiple concurrent meetings where people only go to the talks that are directly related to what they already know. In a meeting with fewer parallel sessions one has more of a chance to learn something new and see something unexpected. I really have no idea what, if anything, should be done about this.

Finally, a question about how data will be relevant to our field was posed from the audience. My answer was that the big trend right now is massive data mining, but that it has overpromised and will eventually fail to deliver; dynamical systems methods will eventually be required to help reduce and analyze the data. However, I do want to add that data will play a bigger and bigger role in dynamical systems research. In the past, we mostly strove to qualitatively match experiments, but the data have now improved to the point that we can try to match them quantitatively. This will require using statistics. Readers of this blog will know that I have been an advocate of using Bayesian methods. I really believe that the marriage of dynamical systems and statistics will have great impact. Statistics is about fitting models to data, but the models used are rather simple and generally not motivated by mechanisms. Our field is about producing models based on the underlying mechanisms. It will be a perfect fit.
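
As a toy illustration of what I mean, here is a minimal sketch of that marriage: synthetic data and a logistic growth model chosen purely for simplicity, with a mechanistic model fit to noisy data by a Bayesian random-walk Metropolis sampler rather than by eye.

```python
# A minimal sketch (synthetic data, toy model) of fitting a mechanistic
# dynamical model to data with a Bayesian sampler.
import numpy as np

rng = np.random.default_rng(42)

def logistic(t, r, K, x0=0.1):
    """Closed-form solution of the logistic ODE dx/dt = r x (1 - x/K)."""
    return K * x0 * np.exp(r * t) / (K + x0 * (np.exp(r * t) - 1.0))

# Synthetic "experimental" data: true parameters r=0.8, K=2.0 plus noise.
t = np.linspace(0, 10, 30)
data = logistic(t, 0.8, 2.0) + rng.normal(scale=0.05, size=t.size)

def log_posterior(theta, sigma=0.05):
    r, K = theta
    if r <= 0 or K <= 0:      # flat prior restricted to positive parameters
        return -np.inf
    resid = data - logistic(t, r, K)
    return -0.5 * np.sum((resid / sigma) ** 2)   # Gaussian log-likelihood

# Random-walk Metropolis sampling of the posterior over (r, K).
theta = np.array([0.5, 1.0])
logp = log_posterior(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(scale=0.02, size=2)
    logp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_new - logp:
        theta, logp = proposal, logp_new
    samples.append(theta)

samples = np.array(samples[5000:])   # discard burn-in
print("posterior mean r = %.2f, K = %.2f" % tuple(samples.mean(axis=0)))
```

The payoff is that instead of a single best-fit curve you get a posterior distribution over the mechanistic parameters, which is exactly the kind of probabilistic statement about a dynamical model that statistics alone, with its generic models, does not give you.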

Snowbird meeting

I’m currently at the biennial SIAM Dynamical Systems Meeting in Snowbird, Utah. If a massive avalanche were to roll down the mountain and bury the hotel at the bottom, much of the world’s applied dynamical systems research would cease to exist. The meeting has been growing steadily for the past thirty years and has now maxed out the capacity of Snowbird. The meeting will eventually have to either move to a new venue or restrict the number of speakers. My inclination is to move, but I don’t think that is the most popular sentiment. Thus far, I have found the invited talks to be very interesting. Climate change seems to be the big theme this year; Chris Jones and Raymond Pierrehumbert both gave talks on that topic. I chaired the session by noted endocrinologist and neuroscientist Stafford Lightman, who gave a very well received talk on the dynamics of hormone secretion. Chiara Daraio gave a very impressive talk on manipulating sound propagation with chains of ball bearings. She’s basically creating the acoustic equivalent of nonlinear optics and electronics. My talk this afternoon is on finite size effects in spiking neural networks. It is similar but not identical to the one I gave in New Orleans in January (see here). The slides are here.

Coffee and your health

A recently published paper in the Journal of the National Cancer Institute reports that heavy coffee consumption can reduce the risk of prostate cancer. This story made the rounds in the popular press, as would be expected. The paper is based on a longitudinal study of 47,911 health professionals. What they found was that men who consumed six or more cups of coffee per day lowered their risk of developing prostate cancer by 18%. While this sounds impressive, one must weigh it against the fact that the probability of getting prostate cancer was around 10% over the 20 years of the study. So six or more cups of coffee per day lowered the absolute risk from about 10% to 8%. A reduction, yes, but probably not enough to start drinking massive amounts of coffee for this purpose alone. Stated another way, the rate dropped from 529 cancers per 100,000 person-years to 425. Now, the study also found that the reduction in risk for lethal prostate cancer was around fifty percent. However, the baseline risk of lethal prostate cancer is much lower, so there the rate drops only from 79 cancers per 100,000 person-years to 34.
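
To make the relative-versus-absolute distinction concrete, here is a quick back-of-the-envelope calculation using the rates quoted above.

```python
# Contrast relative and absolute risk reduction using the rates from the post.

def relative_reduction(baseline, treated):
    """Fractional drop in risk relative to the baseline rate."""
    return (baseline - treated) / baseline

# Overall prostate cancer, per 100,000 person-years
baseline, heavy_coffee = 529.0, 425.0
print(f"overall: {relative_reduction(baseline, heavy_coffee):.0%} relative drop, "
      f"{baseline - heavy_coffee:.0f} fewer cancers per 100,000 person-years")

# Lethal prostate cancer, per 100,000 person-years
baseline_lethal, coffee_lethal = 79.0, 34.0
print(f"lethal:  {relative_reduction(baseline_lethal, coffee_lethal):.0%} relative drop, "
      f"{baseline_lethal - coffee_lethal:.0f} fewer cancers per 100,000 person-years")
```

The lethal-cancer number shows why headlines mislead: a roughly 57% relative reduction sounds dramatic, but in absolute terms it is 45 fewer cancers per 100,000 person-years.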

Now, the problem with these epidemiological studies is that there are so many confounders; although the authors were extremely careful in trying to account for them, they are still dealing with very uncertain data. Previous studies on the effects of coffee showed no effect on prostate cancer risk. There is also the problem of multiple comparisons: I’m sure the authors looked at risks for all sorts of diseases, and this one turned out to be statistically significant. As I have posted before (see here and here), many if not most high-profile clinical results turn out to be wrong, and for good systematic reasons. Now, it is biologically plausible that coffee could have some effect in reducing cancer. It contains lots of bioactive molecules and antioxidants, and we should test these directly. My take is that until there is a solid biophysical explanation for a clinical effect, the jury is always out.
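
To see how multiple comparisons can manufacture a significant result out of nothing, here is a minimal simulation (hypothetical numbers, not the actual study design): test fifty outcomes in data where no true effect exists, and a few will come out significant at p < 0.05 purely by chance.

```python
# A minimal simulation of the multiple-comparisons problem: run many
# hypothesis tests on data with no true effect and count how many come
# out "statistically significant" anyway.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_outcomes, n_subjects = 50, 200

false_positives = 0
for _ in range(n_outcomes):
    # Both groups drawn from the same distribution: the null is true by construction.
    group_a = rng.normal(size=n_subjects)
    group_b = rng.normal(size=n_subjects)
    _, p = stats.ttest_ind(group_a, group_b)
    if p < 0.05:
        false_positives += 1

print(f"{false_positives} of {n_outcomes} null comparisons were 'significant'")
# Expect about 0.05 * 50 = 2-3 spurious hits purely by chance.
```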

Hiding behind complexity

This week on Econtalk, Russ Roberts and John Papola discuss their most recent economics rap video, a fictional debate between F.A. Hayek and J.M. Keynes entitled “Fight of the Century”. You can find the video here. An earlier one, also featuring Hayek and Keynes, was called “Fear the Boom and the Bust”. Both videos are entertaining and educational. If you don’t know anything about Hayek, Keynes, or macroeconomics, you can learn quite a bit just from these videos. Although Roberts and Papola are strong proponents of Hayek, they try their best to represent Keynes fairly. I will not address the economic arguments directly but rather comment on the philosophy of Hayek that motivates his ideas.

Hayek is a hero of libertarian-leaning economists such as Roberts. His main thesis is that the economy is far too complex to ever be understood by economists, so any attempt at economic manipulation by the government, such as the stimulus, is misguided at best and dangerous at worst. He claims that it is always better to just let the free market play out. An Econtalk episode on Hayek can be found here, and these ideas are summarized in his Nobel address found here. In some sense, Hayek was ahead of his time in recognizing the importance of complex systems as a field unto itself. On the other hand, he takes a very defeatist attitude towards it. I have argued before (e.g. see here) that a purely reductionist approach is futile for understanding complex phenomena like biology or economics. However, that doesn’t imply that some form of systematization or quantification is impossible. For example, consider water flowing in a pipe. If the velocity of the water is low, the water flows smoothly. However, when the velocity is high enough, the flow becomes turbulent. We can even calculate when the instability transition will occur. Although it is completely futile to predict the trajectories of individual water molecules in a turbulent flow, there are statistical invariants that are well behaved. Hayek claims that even a statistical theory of economics is impossible because the economy is composed of heterogeneous players, so there is no natural way to average. However, he makes such claims without any proof. It may be true that there are no statistical invariants in economics, but that is a question that can at least be studied. Hayek doesn’t even believe that economic measures like the unemployment rate are of any use, because that knowledge cannot be acted upon in any useful way.
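
For the pipe example, the transition can be estimated from the Reynolds number. Here is a minimal sketch; the water properties and the usual transition threshold of Re ≈ 2300 are standard textbook values.

```python
# The Reynolds number Re = rho * v * D / mu predicts when smooth (laminar)
# pipe flow gives way to turbulence; the transition begins near Re ~ 2300.
# Fluid properties below are textbook values for water at room temperature.

def reynolds(velocity, diameter, density=1000.0, viscosity=1.0e-3):
    """Reynolds number for flow of speed `velocity` (m/s) in a pipe of
    `diameter` (m); density in kg/m^3, dynamic viscosity in Pa*s."""
    return density * velocity * diameter / viscosity

diameter = 0.02  # a 2 cm pipe
for v in (0.05, 0.1, 0.2, 0.5):
    re = reynolds(v, diameter)
    regime = "laminar" if re < 2300 else "turbulent (or transitional)"
    print(f"v = {v:4.2f} m/s -> Re = {re:6.0f}  [{regime}]")
```

That a one-line formula can locate the breakdown of predictability in a system as complicated as turbulent flow is exactly the point: complexity does not preclude quantification.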

My approach to complex systems is based on two observations. The first is that we can only have some quantitative control of a system if it is smooth enough that small perturbations generally lead to small changes in the system. We can handle instances where small changes lead to big changes (e.g. bifurcations or critical points, where the qualitative behavior of the system changes drastically, like a phase transition between liquid and gas) if they are not too close to each other. Hence, I only try to model something that behaves relatively nicely. (I’ve argued before (e.g. see here) that physics could be described as the science of model-able things.) The second observation is that most functions, no matter how badly behaved, can be made smooth if you integrate over them enough times. If a system is very complex, I look for integrated or averaged quantities that behave better. For example, while the molecules in a gas buzz around in a haphazard, unpredictable way, the temperature of the gas is well defined and can be described quantitatively. Although human metabolism is highly complex, it still obeys conservation of energy at a global level, and I can use that fact to make quantitative predictions about the response of body weight to changes in food intake. So my take on the stimulus is that it is plausible that increasing spending can increase the velocity of money in the economy and kick us out of a recession. While I doubt that we will ever be able to predict exactly how well a stimulus will work, I think we can at least make some probabilistic predictions about the effect size.
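
Here is a minimal sketch of the second observation: repeatedly averaging (a discrete stand-in for integration) even a maximally rough signal, pure noise, makes it progressively smoother.

```python
# Repeated averaging smooths a badly behaved signal: the typical size of
# its increments (a crude roughness measure) shrinks with each pass.
import numpy as np

rng = np.random.default_rng(1)
signal = rng.normal(size=1000)    # pure noise: as rough as it gets
window = np.ones(25) / 25         # 25-point moving average

smoothed = signal.copy()
print(f"raw signal: roughness = {np.std(np.diff(smoothed)):.4f}")
for n_passes in range(1, 4):
    smoothed = np.convolve(smoothed, window, mode="same")
    roughness = np.std(np.diff(smoothed))
    print(f"after {n_passes} averaging pass(es): roughness = {roughness:.4f}")
```

Temperature and body weight are, in this sense, just well chosen averages: quantities that integrate over the microscopic chaos and are therefore smooth enough to model.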

Physicists in government

My old friend Ted Hsu (PhD in physics from Princeton) was elected Member of Parliament for Kingston and the Islands in this past Monday’s Canadian federal election. Ted represents a small but hopefully growing number of physicists and other scientists entering public life. He has had experience in both academia and the private sector and is well positioned to make a big impact in government. Congratulations Ted!