## New paper on childhood growth and obesity

August 1, 2013

Kevin D Hall, Nancy F Butte, Boyd A Swinburn, Carson C Chow. Dynamics of childhood growth and obesity: development and validation of a quantitative mathematical model. Lancet Diabetes & Endocrinology (2013).

You can read the press release here.

In order to curb childhood obesity, we need a good measure of how much food kids should eat. Although people like Claire Wang have proposed plausible quantitative models in the past, Kevin Hall and I have insisted that this is a hard problem because we don’t fully understand childhood growth. Unlike adults, who are more or less in steady state, growing children are a moving target. After a few fits and starts we finally came up with a satisfactory model that modifies our two-compartment adult body composition model to incorporate growth. That previous model partitioned excess energy intake into fat and lean compartments according to the Forbes rule, which basically says that the ratio of added fat to added lean is proportional to how much fat you already have, so the more fat you have, the more of any excess Calories go to fat. The odd consequence of that model is that the steady-state body weight is not unique but falls on a one-dimensional curve. Thus there is a whole continuum of possible body weights for a fixed diet and lifestyle. I actually don’t believe this and have a modification to fix it, but that is a future story.
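A minimal sketch of this adult model can be written in a few lines. All parameter values below are illustrative stand-ins, not the fitted values of the published model:

```python
# Two-compartment (fat F, lean L) adult model with a Forbes-style partition
# rule: the lean share of any energy imbalance is p = C/(C+F). Parameters
# are illustrative, not the published fitted values.
RHO_F, RHO_L = 9400.0, 1800.0    # energy densities of fat/lean tissue (kcal/kg)
C = 10.4                         # Forbes constant (kg), hypothetical value
E0, A_L, A_F = 500.0, 22.0, 3.2  # illustrative expenditure coefficients

def expenditure(F, L):
    """Daily energy expenditure (kcal) as a linear function of composition."""
    return E0 + A_L * L + A_F * F

def simulate(F, L, intake, days=20_000):
    """Euler-integrate fat F and lean L (kg) under a fixed daily intake."""
    for _ in range(days):
        imbalance = intake - expenditure(F, L)   # kcal/day
        p = C / (C + F)                          # lean share of the imbalance
        F += (1 - p) * imbalance / RHO_F         # the fatter you are, the more
        L += p * imbalance / RHO_L               # of a surplus goes to fat
    return F, L

# Two starting compositions on the same fixed diet: both settle where
# intake equals expenditure, but at different points on that line.
Fa, La = simulate(10.0, 50.0, intake=2500.0)
Fb, Lb = simulate(40.0, 50.0, intake=2500.0)
print(round(Fa, 1), round(Fb, 1))   # distinct steady-state fat masses
```

The two runs end at different body compositions despite identical diets, which is the one-dimensional continuum of steady states described above.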

What puzzled me about childhood growth was how we know how much more to eat as we grow. After some thought, I realized that we could simply eat enough to maintain our fraction of body fat at some level, perhaps using leptin as a signal, and then tap off the energy stored in fat when we need to grow. So just as we know how much gasoline (petrol) to add by simply filling the tank when it’s empty, we simply eat to keep our fat reserves at some level. In terms of the model, this is a symmetry-breaking term that transfers energy from the fat compartment to the lean compartment. In my original model, I made this term a constant, had food intake increase to maintain the fat-to-lean ratio, and showed using singular perturbation theory that this would yield growth qualitatively similar to the real thing. The model then sat languishing until Kevin had the brilliant idea to make the growth term time dependent and fit it to actual data that Nancy Butte and Boyd Swinburn had taken. We could then fit the model to normal weight and obese kids to quantify how much more obese kids eat, which is more than previously believed. Another nice thing is that when the child stops growing, the model automatically reduces to the adult model!
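Here is a toy Euler sketch of that scheme. Everything in it is illustrative: the parameters, the linearly decaying form of the growth term g(t), and the feedback rule, which for simplicity and numerical stability regulates the absolute fat reserve (the "tank level") rather than the fat fraction that the actual model regulates, and the actual paper fits g(t) to data rather than assuming a form:

```python
# Growth modification sketch: a time-dependent term g(t) transfers energy
# from the fat compartment to the lean compartment, while intake is
# regulated to keep the fat reserve at a set point ("refill the tank").
# All parameter values and the form of g(t) are illustrative.
RHO_F, RHO_L = 9400.0, 1800.0     # tissue energy densities (kcal/kg)
C = 10.4                          # Forbes constant (kg), hypothetical value
E0, A_L, A_F = 300.0, 22.0, 3.2   # illustrative expenditure coefficients
FAT_SET = 5.0                     # regulated fat reserve (kg)
GAIN = 100.0                      # intake feedback (kcal/day per kg deficit)

def grow(F=5.0, L=15.0, years=25):
    """Euler-integrate fat F and lean L (kg) from childhood to adulthood."""
    for day in range(years * 365):
        g = 8.0 * max(0.0, 1.0 - day / (18 * 365.0))  # growth drive, kcal/day
        expend = E0 + A_L * L + A_F * F
        intake = expend + GAIN * (FAT_SET - F)        # eat to refill the tank
        surplus = intake - expend
        p = C / (C + F)             # Forbes partition of the energy surplus
        F += ((1 - p) * surplus - g) / RHO_F          # growth taps fat energy
        L += (p * surplus + g) / RHO_L
    return F, L

F, L = grow()
# Growth stops at age 18 and the equations reduce to the adult model:
# fat sits near its set point while lean mass has grown several-fold.
print(round(F, 1), round(L, 1))
```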

## The myth of the single explanation

July 30, 2013

I think one of the things that leads us astray when we try to understand complex phenomena like evolution, disease, or the economy is the idea that they must have a single explanation. For example, two papers were recently published in high profile journals trying to explain mammalian monogamy. Although monogamy is quite common in birds, it occurs in only 5% of mammals. Here is Carl Zimmer’s summary. The study in Science, which surveyed 2545 mammal species, argued that monogamy arises when females are solitary and sparse. Males must then commit to one since dates are so hard to find. The study in PNAS examined 230 primate species, for which monogamy occurs at the higher rate of 27%, and used Bayesian inference to argue that monogamy arises to prevent male infanticide. It’s better to help out at home than to go around killing other men’s babies. Although both of these arguments are plausible, there need not be a single universal explanation. Each species could have arrived at monogamy through its own set of circumstances, involving these two explanations and others. However, while we should not be biased towards a single explanation, neither should we throw up our hands like Hayek and argue that no complex phenomenon can be understood. Some phenomena will have simpler explanations than others, but since Kolmogorov complexity is not computable, there is no algorithm that can tell you which is which. We will just have to struggle with each problem as it comes.

## Talk at GRC

July 24, 2013

I’m currently in Mt. Snow, Vermont to give a talk at the Gordon Research Conference on Computer Aided Drug Design. Yes, I know nothing about drug design. I am here because the organizer, Anthony Nicholls, asked me to give a pedagogical talk on Bayesian inference. My slides are here. I only arrived yesterday, but the few talks I’ve seen have been quite interesting. One interesting aspect of this conference is that many of the participants are from industry. The evening sessions are meant to be of more general interest; last night there were two talks about how to make science more reproducible. As I’ve posted before, many published results are simply wrong. The very enterprising Elizabeth Iorns has started something called the Reproducibility Initiative. I am not completely clear on how it works, but it is part of another entity she started called Science Exchange, which helps facilitate collaborations with a fee-for-service model. The Reproducibility Initiative piggybacks on Science Exchange by providing a service (for a fee) to validate any particular result; papers that pass get a stamp of approval. It is expected that pharma would be interested in using this service so they can inexpensively check whether possible drug targets actually hold up. Many drugs fail at Phase 3 of clinical trials because they turn out to be ineffective, and this may be because the target was wrong to start with.

On a final note, I flew to Albany and drove here. Unlike in the past, when I would have printed out a map, I simply assumed that I could use Google Maps on my smartphone to get here. However, Google Maps doesn’t really know where Mt. Snow is. It tried to take me up a dirt road to the back of the ski resort. Also, just after I turned up the road, the phone signal disappeared, so I was blind and had no paper backup. I doubted that this was the right way to go, so I turned back to the main highway in hopes of finding a signal or a gas station to ask for directions. A few miles down Route 9, I finally got a signal and also found a sign that pointed the way. Google Maps still tried to take me the wrong way. I should have followed what I always tell my daughter: always have a backup plan.

## New paper in Nature Reviews Genetics

July 22, 2013

A Coulon, CC Chow, RH Singer, DR Larson. Eukaryotic transcriptional dynamics: from single molecules to cell populations. Nature Reviews Genetics (2013).

Abstract | Transcriptional regulation is achieved through combinatorial interactions between regulatory elements in the human genome and a vast range of factors that modulate the recruitment and activity of RNA polymerase. Experimental approaches for studying transcription in vivo now extend from single-molecule techniques to genome-wide measurements. Parallel to these developments is the need for testable quantitative and predictive models for understanding gene regulation. These conceptual models must also provide insight into the dynamics of transcription and the variability that is observed at the single-cell level. In this Review, we discuss recent results on transcriptional regulation and also the models those results engender. We show how a non-equilibrium description informs our view of transcription by explicitly considering time- and energy-dependence at the molecular level.

## New paper on measuring gastric acid output

July 16, 2013

This paper started many years ago when Steve Wank, of the Digestive Diseases Branch of NIDDK, had the idea of using the new wireless pH-sensing SmartPill, which you swallow, to determine how much acid your stomach produces. There really was no noninvasive way to monitor how well medications would work for certain reflux diseases. To design a protocol for the experiment, he wanted a model of the pH dynamics after a buffer is added, for a given rate of gastric acid output. I came up with a simple mass-action model of acid buffering and made some graphs for him. We then tested the model out in a beaker. He had more faith in the model than I did, but it was somewhat useful to him in designing the experiment.
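The flavor of such a mass-action model can be sketched in a few lines. The species, rate constants, and thresholds below are all invented for illustration; the published model and protocol are more detailed. The idea is that an ingested buffer raises the pH, ongoing acid secretion chews through the buffer, and the time for the pH to fall back down reads out the secretion rate:

```python
import math

# Toy mass-action sketch: H+ is secreted at a constant (unknown) rate and
# neutralized by an ingested buffer B via H+ + B -> neutral. All units and
# rate constants are illustrative.
def time_to_reacidify(secretion=2.0,   # H+ secretion rate (mM/min)
                      buffer0=60.0,    # buffering capacity of the meal (mM)
                      k=0.1,           # neutralization rate (1/mM/min)
                      h0=31.6,         # initial [H+] (mM), i.e. pH ~1.5
                      dt=0.01, t_max=240.0):
    """Minutes until pH returns below 2.0 after the buffer is ingested."""
    h, b, t = h0, buffer0, 0.0
    neutralized = False
    while t < t_max:
        react = k * h * b                  # mass-action neutralization
        h += (secretion - react) * dt
        b -= react * dt
        t += dt
        ph = -math.log10(h / 1000.0)       # h is in mM
        if ph > 2.3:
            neutralized = True             # the buffer has taken effect
        if neutralized and ph < 2.0:
            return t                       # stomach has re-acidified
    return t_max

# Doubling the secretion rate roughly halves the re-acidification time,
# which is what makes the recovery time a usable readout of acid output.
print(time_to_reacidify(secretion=2.0) > time_to_reacidify(secretion=4.0))  # True
```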

Weinstein et al. A new method for determining gastric acid output using a wireless pH-sensing capsule. Aliment Pharmacol Ther 37: 1198 (2013).

Abstract:

BACKGROUND: Gastro-oesophageal reflux disease (GERD) and gastric acid hypersecretion respond well to suppression of gastric acid secretion. However, clinical management and research in diseases of acid secretion have been hindered by the lack of a non-invasive, accurate and reproducible tool to measure gastric acid output (GAO). Thus, symptoms or, in refractory cases, invasive testing may guide acid suppression therapy.

AIM: To present and validate a novel, non-invasive method of GAO analysis in healthy subjects using a wireless pH sensor, SmartPill (SP) (SmartPill Corporation, Buffalo, NY, USA).

METHODS: Twenty healthy subjects underwent conventional GAO studies with a nasogastric tube. Variables impacting liquid meal-stimulated GAO analysis were assessed by modelling and in vitro verification. Buffering capacity of Ensure Plus was empirically determined. SP GAO was calculated using the rate of acidification of the Ensure Plus meal. Gastric emptying scintigraphy and GAO studies with radiolabelled Ensure Plus and SP assessed emptying time, acidification rate and mixing. Twelve subjects had a second SP GAO study to assess reproducibility.

RESULTS: Meal-stimulated SP GAO analysis was dependent on acid secretion rate and meal-buffering capacity, but not on gastric emptying time. On repeated studies, SP GAO strongly correlated with conventional basal acid output (BAO) (r = 0.51, P = 0.02), maximal acid output (MAO) (r = 0.72, P = 0.0004) and peak acid output (PAO) (r = 0.60, P = 0.006). The SP sampled the stomach well during meal acidification.

CONCLUSIONS: SP GAO analysis is a non-invasive, accurate and reproducible method for the quantitative measurement of GAO in healthy subjects. SP GAO analysis could facilitate research and clinical management of GERD and other disorders of gastric acid secretion.

## Houghton opines on the unfairness of prizes

July 12, 2013

I recently wrote about Michael Houghton declining the prestigious Gairdner prize because it left out two critical contributors to the discovery of the hepatitis C virus. Houghton has now written an opinion piece in Nature Medicine arguing that prizes should relax the restriction to three awardees, an arbitrary number I’ve never understood. After all, one could argue that Freeman Dyson had a reasonable claim on the Nobel Prize awarded to Feynman, Schwinger, and Tomonaga for QED. I’ve quoted the entire piece below.

Nature Medicine: Earlier this year, I was greatly honored with the offer of a 2013 Canada Gairdner International Award for my contributions to the discovery of the hepatitis C virus (HCV). I was selected along with Harvey Alter, chief of clinical studies in the Department of Transfusion Medicine at the US National Institutes of Health’s Clinical Center in Bethesda, Maryland, and Daniel Bradley, a consultant at the US Centers for Disease Control and Prevention in Atlanta, both of whom had a vital role in the research that eventually led to the identification and characterization of the virus.

My colleagues accepted their awards. However, I declined my C$100,000 ($98,000) prize because it excluded two other key contributors who worked with me closely to successfully isolate the viral genome for the first time. I felt that given their crucial inputs, it would be wrong of me to keep accepting major prizes just ‘on their behalf’, a situation that has developed because major award foundations and committees around the world insist that prizes be limited to no more than three recipients per topic.

HCV was identified in 1989 in my laboratory at the Chiron Corporation, a California biotechnology firm since purchased by the Swiss drug company Novartis. The discovery was the result of seven years of research in which I worked closely, both intellectually and experimentally, with Qui-Lim Choo, a member of my own laboratory, and George Kuo, who had his own laboratory next door to mine at Chiron. We finally identified the virus using a technically risky DNA-expression screening technique through which we isolated a single small nucleic acid clone from among many millions of such clones from different recombinant libraries. This was achieved without the aid of the still-evolving PCR technology to amplify the minuscule amounts of viral nucleic acid present in blood. We ultimately proved that this clone derived from a positive-stranded viral RNA genome intimately associated with hepatitis, but one not linked to either the hepatitis A or B viruses. The finding represented the first time any virus had been identified without either prior visualization of the virus itself, characterization of its antigens or viral propagation in cell culture.

The high-titer infectious chimpanzee plasma used for our molecular analyses at Chiron was provided in 1985 by Bradley, an expert in chimpanzee transmission of HCV and in the virus’s basic properties and cellular responses, with whom I had an active collaboration since 1982. The proposed aim of the collaboration was for my laboratory to apply contemporary molecular cloning methodologies to a problem that had proven intractable since the mid-1970s, when Alter and his colleagues first demonstrated the existence of non-A, non-B hepatitis (NANBH), as it was then known. Alter’s team went on to define the high incidence and medical importance of NANBH, including the virus’s propensity to cause liver fibrosis, cirrhosis and cancer. They also identified high-titer infectious human plasma in 1980 and were instrumental in promoting the adoption of surrogate tests for NANBH by blood banks to reduce the incidence of post-transfusion infection.

With regrets to the Gairdner Foundation—a generous and altruistic organization—I felt compelled to decline the International Gairdner Award without the addition of Kuo and Choo to the trio of scientists offered the award. In 1992, all five of us received the Karl Landsteiner Memorial Award from the American Association of Blood Banks. But subsequent accolades given in honor of HCV’s discovery have omitted key members of the group: only Bradley and I received the 1993 Robert Koch Prize, and only Alter and I won the 2000 Albert Lasker Award for Clinical Medical Research—in both cases, despite my repeated requests that the other scientists involved in the discovery be recognized. With the exclusion once more of Kuo and Choo from this year’s Gairdner Award, I decided that I should not continue to accept major awards without them. In doing so, I became the first person since the Gairdner’s inception in 1959 to turn down the prize.

I hope that my decision helps bring attention to a fundamental problem with many scientific prizes today. Although some awards, such as the Landsteiner, are inclusionary and emphasize outstanding team accomplishments, the majority of the world’s prestigious scientific awards—including the Gairdner, Lasker and Shaw prizes, which all seem to be modeled on the Nobel Prize and indeed are sometimes known as the ‘baby Nobels’—are usually restricted to at most three individuals per discovery. Unsurprisingly, this limitation often leads to controversy, when one or more worthy recipients are omitted from the winners list.

Perhaps what may help this situation is for awards committees to solicit, and then be responsive to, input from potential recipients themselves prior to making their final decisions. Some of the recipients are best placed to know the full and often intricate history of the discovery and collaborative efforts, and such input should help committees better understand the size of the contributing team from which they can then choose recipients according to each award’s particular policy.

With this information in hand, award organizers should be willing to award more than three researchers. As knowledge and technology grows exponentially around the world and with an increasing need for multidisciplinary collaborations to address complex questions and problems, there is a case to be made for award committees adjusting to this changing paradigm. Moreover, it is inherently unfair to exclude individuals who played a key part in the discovery. Why should they and their families suffer such great disappointment after contributing such crucial input? Some award restructuring could also be inspirational to young scientists, encouraging them to be highly interactive and collaborative in the knowledge that when a novel, long-shot idea or approach actually translates to scientific success, all key parties will be acknowledged appropriately.

In this vein, I am happy to note that the inaugural Queen Elizabeth Prize for Engineering, a new £1 million ($1.6 million) prize from the UK government, was awarded at a formal ceremony last month to five individuals who helped create the internet and the World Wide Web, even though the original guidelines stipulated a maximum of three recipients. If the Queen of England—the very emblem of tradition—can cast protocol aside, clearly other institutions can too. I hope more awards committees will follow Her Majesty’s lead.

## The distortion of thresholds

July 5, 2013

The Obama administration has decided to delay by one year the implementation of the employer health care mandate for businesses with more than 50 employees. The fear was that companies with slightly more than 50 employees would simply lay off, or convert to part-time, the workers above the threshold to avoid the penalties. I think thresholds are generally a bad idea for economic and tax policy because they provide an incentive to game the system. They should be replaced by smooth scales. US federal income taxes have a small number of rigid brackets, in which income above a certain amount is taxed at a higher rate. This should be replaced by a smooth function, so the tax rate on an extra dollar earned is only slightly higher than on the previous dollar. The shape of the function can be debated, but a smooth one would certainly work better than the current discontinuous one. In terms of the employer mandate for providing health care, a smoothly phased-in penalty would avoid the incentive for companies to manipulate the number of employees they have.
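As a toy illustration of the difference, here is a stylized bracketed schedule next to a logistic ramp between the same bottom and top marginal rates. The rates and thresholds are invented for illustration, not actual US policy:

```python
import math

# Stylized comparison: discrete tax brackets (marginal rate jumps at a
# threshold) versus a smooth logistic ramp between the same two rates.
# All numbers are invented for illustration.
def marginal_rate_bracketed(income):
    """Bracketed schedule: the marginal rate jumps at thresholds."""
    if income < 40_000:
        return 0.12
    if income < 170_000:
        return 0.24
    return 0.35

def marginal_rate_smooth(income, lo=0.12, hi=0.35,
                         mid=100_000.0, width=40_000.0):
    """Logistic ramp: the marginal rate rises continuously with income."""
    return lo + (hi - lo) / (1.0 + math.exp(-(income - mid) / width))

def tax_smooth(income, n=10_000):
    """Total tax owed: integrate the smooth marginal rate from 0 to income."""
    step = income / n
    return sum(marginal_rate_smooth((i + 0.5) * step) * step for i in range(n))

# At a bracket threshold the old schedule's marginal rate jumps by 12
# percentage points; the smooth schedule barely changes, so an extra
# dollar earned is never taxed much more than the previous one.
jump = marginal_rate_bracketed(40_001) - marginal_rate_bracketed(39_999)
drift = marginal_rate_smooth(40_001) - marginal_rate_smooth(39_999)
print(jump, drift)
```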

## The problem with “just deserts”

June 29, 2013

The blogosphere is aflutter over Harvard economist and former chairman of the Council of Economic Advisers under Bush 43, Greg Mankiw’s recent article “Defending the One Percent”. Mankiw’s paper mostly argues against the classic utilitarian rationale for redistribution – a dollar is more useful to a poor person than to a rich one. However, near the end of the paper he proposes that an alternative basis for fair income distribution is the just deserts principle, under which everyone is compensated according to how much they contribute. Mankiw believes that the recent surge in income inequality is due to changes in technology that favour superstars, who create much more value for the economy than the rest. He then argues that the superstars are superstars because of heritable innate qualities like IQ, and not because the economy is rigged in their favour.

The problem with this idea is that genetic ability is a shared natural resource that came about through a long process of evolution, a process to which everyone who has ever lived has contributed. In many ways, we’re like a huge Monte Carlo simulation, randomly trying out lots of different gene variants to see what works best. Mankiw’s superstars are the Monte Carlo trials that happen to be successful in our current system. However, the world could change, and other qualities could become more important, just as physical strength was more important in the pre-information age. The ninety-nine percent are reservoirs of genetic variability that we all need to prosper. Some impoverished person alive today may possess the genetic variant to resist some future plague and save humanity. She is providing immense uncompensated economic value. The just deserts world is really nothing more than a random world: a world where you are handed a lottery ticket and you hope you win. That would be fine, but one shouldn’t couch it in terms of some deeper rationale. A world with a more equitable distribution is one where we compensate the less successful for their contribution to economic progress. However, that doesn’t mean we should have a world with completely equal income distribution. Unfortunately, the human mind needs incentives to try hard, so for maximal economic growth the lottery winners must always get at least a small bonus.

## What I do

June 21, 2013

For those interested, here is a four-page summary of my research activities that I wrote for my upcoming quadrennial review at NIH. It doesn’t include everything I’ve done in the past four years, just the main lines of research.

June 24, 2013:  Corrected a small typo in the summary.

## Body weight simulator iPhone app

June 19, 2013

The body weight simulator, originally a web-based Java application, is now also an iPhone app (see here in iTunes). The simulator is based on the human metabolism model developed by Kevin Hall, myself, and collaborators. The exact model is given in detail in our Lancet paper, which is listed here along with other related references. The app predicts the time course of your body weight given your baseline parameters and your new diet and/or new physical activity. It will also suggest a daily caloric intake to attain a new weight over a specified period of time, along with the diet required to maintain that weight. The model uses parameters calibrated to the average American, so your own mileage will vary. Also, I basically wrote the app in my spare time over the past year, so it is pretty primitive as far as apps go, but it does the job. Please try it out and give me feedback.

## Patent perspiration not inspiration

June 17, 2013

Little irks me more than the current state of US patent law. It stifles innovation and encourages patent trolls, the most famous being Nathan Myhrvold’s Intellectual Ventures. The main problem is that we are awarding patents for the wrong thing. Currently, patents are awarded for innovative ideas, which means you can try to patent fairly obvious ideas like a device that converts optical images into digital information, which amazingly enough is owned by one patent troll who is trying to extort money out of anyone who has ever used a scanner, including nonprofits (see here). This American Life had an episode devoted to this topic (see here). What we should do instead is grant patent protection for the effort and cost sunk into developing an idea into a product. Alex Tabarrok has been writing about this topic for a long time and has a nice paper giving the economic reasons why this would be better (see here for a reprint). You should only get patent protection in proportion to the costs you have incurred in developing the idea.

Ideas are cheap; turning them into successful businesses is the hard part. I, and probably everyone else, had the idea for Google Glass years if not decades ago. I had no idea how to make it work, nor did I put any effort into trying. I just thought it would be great to have a projection screen built into glasses. Actually, my full idea was that it would project an image onto the retina with the focus set at infinity so I wouldn’t have to strain my eyes to read it. I don’t think anyone should be able to patent such an idea. We should encourage lots of companies to come up with ways of implementing eyeglass projection systems and let them battle it out in the marketplace. In some sense, fashion should be our model. You can’t patent fashion, so designers must constantly innovate to keep ahead of the imitators. If anything, there is too much innovation in fashion. Given that we are now on the wrong side of the “Tabarrok Curve”, the argument that abolishing patents would stifle innovation is no longer valid. This is one issue that both liberals and libertarians should agree on. If we are to get any laws passed at all this congressional term, patent reform should be one of them.

## Genes can no longer be patented

June 13, 2013

The US Supreme Court ruled today that human genes cannot be patented. Here is the link to the New York Times article. The specific case regards Myriad Genetics, which held a patent that controlled the rights to all tests for the BRCA1 and BRCA2 genes implicated in breast cancer. The patent essentially blocked most research on the BRCA genes. The immediate effect will be that genetic testing will become cheaper and more widespread. People will argue that not allowing genes to be patented will discourage further innovation. I doubt it. Most discoveries, like genes, come from basic federally funded research. Any company can now develop a test for any newly discovered gene. Patent law has been broken for decades and this is just one small step to correcting it.

## The demise of Arbaclofen for Fragile X

June 7, 2013

Seaside Therapeutics has recently announced that it is withdrawing its Fragile X drug Arbaclofen (STX209) from further clinical trials (see here). The drug had already reached Phase 3 and showed promise in some patients, but probably not enough to secure funding to continue or to guarantee FDA approval. See the New York Times story for some personal accounts of the impact of this decision. The drug is a GABA-B agonist similar to Baclofen, which is used to treat muscle spasms. Fragile X syndrome, which has symptoms similar to autism, is caused by a mutation of the FMR1 gene that silences production of the FMRP protein. As with most proteins, it is not exactly clear what FMRP does, except that it may be involved in protein translation and affects synaptic plasticity in mouse models. One hypothesis for the cause of autism and Fragile X is an overabundance of synaptic excitation.

My paper with Shashaank Vattikuti explored the effects of such imbalances in a cortical circuit model and showed that the model could reproduce some psychophysical experiments (see here for a summary of the paper). It is thus plausible that a GABA-B agonist, which enhances inhibitory signalling, may alleviate some of the symptoms, and I believe that it does in some patients. However, such a blunt instrument would probably not work in all patients. One reason is that not all imbalances between excitation and inhibition are equal, i.e. too much excitation may not be the same as too little inhibition. A neural circuit in which very high excitation and inhibition balance each other could behave very differently from one in which low amounts of each balance each other. The high circuit would have high gain and be very responsive to perturbations, while the low circuit would have low gain. It is also not clear that simply increasing inhibition everywhere will result in a net increase in inhibition, because of the multiple feedback loops in the system: increasing inhibition between inhibitory neurons could decrease the net inhibition on excitatory neurons.
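Both points can be seen in a toy threshold-linear rate model of one excitatory and one inhibitory population, with purely illustrative weights (inhibition is given a faster time constant to keep the strongly coupled circuit stable):

```python
# Toy E-I rate model: (i) a circuit balancing strong excitation against
# strong inhibition has much higher gain than one balancing weak against
# weak, and (ii) adding inhibition between inhibitory neurons (w_ii > 0)
# can *increase* excitatory activity (disinhibition). Weights illustrative.
def steady_state(w_ee, w_ei, w_ie, w_ii=0.0, u=1.0,
                 tau_i=0.1, dt=0.001, steps=40_000):
    """Relax a threshold-linear E-I pair to steady state (Euler)."""
    E = I = 0.0
    for _ in range(steps):
        dE = -E + max(0.0, w_ee * E - w_ei * I + u)
        dI = (-I + max(0.0, w_ie * E - w_ii * I)) / tau_i   # fast inhibition
        E += dt * dE
        I += dt * dI
    return E

def gain(du=0.01, **w):
    """Change in steady-state E activity per unit change in drive u."""
    return (steady_state(u=1.0 + du, **w) - steady_state(u=1.0, **w)) / du

strong = gain(w_ee=3.0, w_ei=1.0, w_ie=2.2)   # strong, balanced E/I
weak = gain(w_ee=0.3, w_ei=1.0, w_ie=0.22)    # weak, balanced E/I
print(strong > 3 * weak)                      # True: much higher gain

# Inhibiting the inhibitory population disinhibits E:
print(steady_state(0.3, 1.0, 0.22, w_ii=1.0) >
      steady_state(0.3, 1.0, 0.22, w_ii=0.0))  # True
```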

The trials seem to show that about a third of the patients improved on Arbaclofen. They are probably the ones who have too little inhibition, so increasing inhibition helps. I think this case suggests that we may need a new model for FDA approval of drugs. Perhaps we should not insist that a drug treat a specific illness, but should also approve drugs that are shown to have some biological effect and to cause no harm. I believe there are many drugs that failed to obtain FDA approval but actually do work and could help some patients. Instead of waiting until we can figure out ahead of time which patients will benefit from a given drug, we could approve it for restricted use and try it on patients to see what happens. The danger, of course, is that it may be difficult to know whether a drug works, and desperate patients, and especially parents, will insist on using a drug even if the physician believes it has no effect. This could cause harm and increase the cost of medical care. One thing we could do is have the government or nonprofit companies take over failed but safe drugs and provide them at low cost under some regulation. Actually, I think we need to completely revamp how drugs are developed, but I will leave that to a future post.

## The failure of science museums (and some radio shows)

May 30, 2013

While I’m in a ranting mood, I’m also going to criticize my favourite childhood radio show, Quirks and Quarks on CBC. The problem I have with the show these days is that it basically covers only astronomy, dinosaurs, and animal behavior. Occasionally, it will also cover high energy physics or climate change. It pays scant attention to the rest of biology, physics, chemistry, computer science, or mathematics. The show does a very poor job of giving the public an idea of what most scientists really do and what constitutes a scientific breakthrough. I think it is more important now than ever that science shows try to educate the public on how the scientific method really works: how difficult it can be to come up with experiments to test hypotheses, and how long it takes to get from breakthroughs in the lab to applications. They should also better convey how impossible it is to predict what will become useful in the future, and how lots and lots of failure is a prerequisite for progress. I hope Quirks and Quarks becomes more serious, because it is migrating its way to the bottom of my podcast stack.

## Wiener on robots

May 21, 2013

An essay by Norbert Wiener, written in 1949 and intended for the New York Times, was recently uncovered. He pretty much had it right 64 years ago. Below is the rather serious last section. Earlier in the piece, we find out that programming was called “taping” at that time.

New York Times:

The Genie and the Bottle

These new machines have a great capacity for upsetting the present basis of industry, and of reducing the economic value of the routine factory employee to a point at which he is not worth hiring at any price. If we combine our machine-potentials of a factory with the valuation of human beings on which our present factory system is based, we are in for an industrial revolution of unmitigated cruelty.

We must be willing to deal in facts rather than in fashionable ideologies if we wish to get through this period unharmed. Not even the brightest picture of an age in which man is the master, and in which we all have an excess of mechanical services will make up for the pains of transition, if we are not both humane and intelligent.

Finally the machines will do what we ask them to do and not what we ought to ask them to do. In the discussion of the relation between man and powerful agencies controlled by man, the gnomic wisdom of the folk tales has a value far beyond the books of our sociologists.

There is general agreement among the sages of the peoples of the past ages, that if we are granted power commensurate with our will, we are more likely to use it wrongly than to use it rightly, more likely to use it stupidly than to use it intelligently. [W. W. Jacobs’s] terrible story of the “Monkey’s Paw” is a modern example of this — the father wishes for money and gets it as a compensation for the death of his son in a factory accident, then wishes for the return of his son. The son comes back as a ghost, and the father wishes him gone. This is the outcome of his three wishes.

Moreover, if we move in the direction of making machines which learn and whose behavior is modified by experience, we must face the fact that every degree of independence we give the machine is a degree of possible defiance of our wishes. The genie in the bottle will not willingly go back in the bottle, nor have we any reason to expect them to be well disposed to us.

In short, it is only a humanity which is capable of awe, which will also be capable of controlling the new potentials which we are opening for ourselves. We can be humble and live a good life with the aid of the machines, or we can be arrogant and die.

## Most of neuroscience is wrong

May 20, 2013

John Ioannidis has a recent paper in Nature Reviews Neuroscience arguing that many results in neuroscience are wrong. The argument follows his previous papers on why most published results are wrong (see here and here) but emphasizes the abundance of studies with small sample sizes in neuroscience. Small samples both reduce the chances of finding true positives and increase the chances of obtaining false positives. Underpowered studies are also susceptible to what is called the “winner’s curse”, in which the effect sizes of true positives are artificially amplified. My take is that any phenomenon with a small effect should be treated with caution even if it is real. If you really want to find what causes a given disease, then you probably want to find something that is associated with all cases, not just a small percentage of them.
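To see how the winner’s curse arises, here is a minimal simulation of my own (an illustration, not taken from the Ioannidis paper): a small true effect, a small sample, and a standard significance filter. All the numbers are arbitrary choices for the sake of the example.

```python
import numpy as np

# Illustrative setup: true effect of 0.2 standard deviations, n = 15 per study.
rng = np.random.default_rng(1)
true_effect, n, trials = 0.2, 15, 20000
crit = 1.96  # two-sided 5% threshold for a z-test with known unit variance

sig_effects = []
for _ in range(trials):
    sample = rng.normal(true_effect, 1.0, n)
    z = sample.mean() / (1.0 / np.sqrt(n))  # z statistic for the sample mean
    if abs(z) > crit:
        sig_effects.append(sample.mean())

# Power is low, and the studies that do reach significance report
# effect sizes several times larger than the true effect.
power = len(sig_effects) / trials
inflation = np.mean(sig_effects) / true_effect
print(power, inflation)
```

With these settings the power comes out around ten percent, and the average “significant” effect is roughly three times the true one: only the studies that got lucky with a large sample mean clear the threshold, so the published estimates are biased upward.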

## Bayesian model comparison Part 2

May 11, 2013

In a previous post, I summarized the Bayesian approach to model comparison, which requires the calculation of the Bayes factor between two models. Here I will show one computational approach that I use, called thermodynamic integration, which is borrowed from molecular dynamics. Recall that we need to compute the model likelihood function

$P(D|M)=\int P(D|M,\theta)P(\theta|M) d\theta$     (1)

for each model where $P(D|M,\theta)$ is just the parameter dependent likelihood function we used to find the posterior probabilities for the parameters of the model.

The integration over the parameters can be accomplished using the Markov Chain Monte Carlo, which I summarized previously here. We will start by defining the partition function

$Z(\beta) = \int P(D|M,\theta)^\beta P(\theta| M) d\theta$    (2)

where $\beta$ is an inverse temperature. The derivative of the log of the partition function gives

$\frac{d}{d\beta}\ln Z(\beta)=\frac{\int d\theta \ln[P(D |\theta,M)] P(D | \theta, M)^\beta P(\theta|M)}{\int d\theta \ P(D | \theta, M)^\beta P(\theta | M)}$    (3)

which is equal to the ensemble average of $\ln P(D|\theta,M)$. However, if we assume that the MCMC has reached stationarity then we can replace the ensemble average with a time average $\frac{1}{T}\sum_{i=1}^T \ln P(D|\theta, M)$.  Integrating (3) over $\beta$ from 0 to 1 gives

$\ln Z(1) = \ln Z(0) + \int_0^1 \langle \ln P(D|M,\theta)\rangle d\beta$

From (1) and (2), we see that  $Z(1)=P(D|M)$, which is what we want to compute  and $Z(0)=\int P(\theta|M) d\theta=1$.

Hence, to perform Bayesian model comparison, we simply run the MCMC for each model at different temperatures (i.e. use $P(D|M,\theta)^\beta$ as the likelihood in the standard MCMC) and then integrate the average log likelihoods over $\beta$ at the end to obtain $\ln Z(1)$. For a Gaussian likelihood function, changing the temperature is equivalent to changing the data “error”: the higher the temperature, the larger the presumed error. In practice, I usually run at seven to ten different values of $\beta$ and use a simple trapezoidal rule to integrate over $\beta$. I can even do parameter inference and model comparison in the same MCMC run.
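For concreteness, here is a minimal sketch of the whole procedure in Python for a toy model: a unit-variance Gaussian likelihood with unknown mean $\theta$ and a standard normal prior. The model, the Metropolis settings, and the choice of eight temperatures are all illustrative, not prescriptive.

```python
import numpy as np

# Toy model: data from a unit-variance Gaussian with unknown mean theta,
# standard normal prior on theta.
rng = np.random.default_rng(0)
data = rng.normal(1.0, 1.0, size=20)

def log_likelihood(theta):
    # ln P(D | M, theta)
    return -0.5 * np.sum((data - theta) ** 2) - 0.5 * len(data) * np.log(2 * np.pi)

def log_prior(theta):
    # ln P(theta | M)
    return -0.5 * theta ** 2 - 0.5 * np.log(2 * np.pi)

def mean_loglike_at(beta, n_steps=20000, step=0.5):
    """Metropolis chain targeting P(D|M,theta)^beta * P(theta|M);
    returns the time average of ln P(D|M,theta) at this temperature."""
    theta = 0.0
    ll, lp = log_likelihood(theta), log_prior(theta)
    trace = []
    for _ in range(n_steps):
        prop = theta + step * rng.normal()
        ll_p, lp_p = log_likelihood(prop), log_prior(prop)
        # tempered acceptance ratio: the likelihood is raised to the power beta
        if np.log(rng.random()) < beta * (ll_p - ll) + (lp_p - lp):
            theta, ll, lp = prop, ll_p, lp_p
        trace.append(ll)
    return np.mean(trace[n_steps // 2:])  # discard the first half as burn-in

# Run at several inverse temperatures, then integrate the average
# log likelihood over beta in [0, 1] with the trapezoidal rule.
betas = np.linspace(0.0, 1.0, 8)
averages = [mean_loglike_at(b) for b in betas]
log_evidence = sum(
    0.5 * (averages[i] + averages[i + 1]) * (betas[i + 1] - betas[i])
    for i in range(len(betas) - 1)
)  # estimate of ln Z(1) = ln P(D|M)
print(log_evidence)
```

For this conjugate Gaussian toy model the evidence has a closed form, so the estimate can be checked directly; the dominant error here is the trapezoidal discretization near $\beta=0$, which is why finer or unevenly spaced temperature ladders are often used in practice.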

Erratum, 2013-5-2013: I just fixed an error in the final formula.

## Discounting the obvious

April 24, 2013

The main events in the history of science have involved new ideas overthrowing conventional wisdom. The notion that the earth was the center of the universe was upended by Copernicus. Species were thought to be permanent and fixed until Darwin. Physics was thought to be completely understood at the end of the nineteenth century, and then relativity theory and quantum mechanics came along to mess everything up. Gödel overthrew the notion that mathematics was infallible. This story has been repeated so many times that people now seem to instinctively look for the counterintuitive answer to every problem; there are countless books on thinking outside of the box. However, I think that supplanting “linear” thinking with “nonlinear” thinking is not always a good idea, and sometimes it can have dire consequences.

A salient example is the current idea that fiscal austerity will lead to greater economic growth. GDP is defined as the sum of consumption, investment, and government spending, plus exports minus imports. If consumption or investment were to decline in an economic contraction, as in the Great Recession, then the simple linear idea would be that GDP and growth can be bolstered by increased government spending. This was the standard government response immediately after the financial crisis of 2008. However, starting in about 2010, when the recovery wasn’t deemed fast enough, policy makers, especially in Europe, adopted not the simple idea that the stimulus wasn’t big enough but rather the idea that government spending was crowding out private spending, so that a decrease in government spending would lead to a net increase in GDP and growth. This is very nonlinear thinking because it requires a decrease in GDP to induce an increase in GDP. Thus far the idea is not working, and austerity has led to lower GDP growth in every country that has tried it. The idea was reinforced by a famous, now infamous, paper by Reinhart and Rogoff, which claimed that when government debt reaches 90% of GDP, growth is severely curtailed. This result was taken as undisputed truth by governments and the press even though many economists questioned it. However, it turns out that the paper has major errors (including an Excel coding error); see here for a summary. This is a case where the nonlinear idea (along with conflating correlation with causation) is probably wrong and has inflicted immense hardship on a large number of people.
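The definition in the paragraph above is just the standard national-accounts identity, which can be written in the usual symbols ($Y$ for GDP, $C$ consumption, $I$ investment, $G$ government spending, $X$ exports, $M$ imports):

$Y = C + I + G + (X - M)$

Linear thinking reads this at face value: holding the other terms fixed, an increase in $G$ raises $Y$ one for one. The crowding-out argument instead claims that raising $G$ depresses $C$ and $I$ by more than the increase in $G$, so that cutting $G$ raises $Y$ on net.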

## New paper on fat

April 19, 2013

Sex-Associated Differences in Free Fatty Acid Flux of Obese Adolescents.

Section on Growth and Obesity (D.C.A.-W., A.H.A., S.J.R.M., G.I.U., M.T.-K., J.A.Y.), Program in Developmental Endocrinology and Genetics, Eunice Kennedy Shriver National Institute of Child Health and Human Development; Mathematical Cell Modeling Section (V.P., C.C.C.), Division of Extramural Activities (C.G.S.), Division of Nutrition Research Coordination (V.S.H.), and Laboratory of Endocrinology and Receptor Biology (A.E.S.), National Institute of Diabetes and Digestive and Kidney Diseases; and Nuclear Medicine Department (J.C.R.), Hatfield Clinical Research Center, National Institutes of Health, U.S. Department of Health and Human Services, Bethesda, Maryland 20892.

The Journal of Clinical Endocrinology and Metabolism (impact factor: 6.5). 02/2013; DOI: 10.1210/jc.2012-3817

ABSTRACT Context: In obesity, increases in free fatty acid (FFA) flux can predict development of insulin resistance. Adult women release more FFA relative to resting energy expenditure (REE) and have greater FFA clearance rates than men. In adolescents, it is unknown whether sex differences in FFA flux occur. Objective: Our objective was to determine the associations of sex, REE, and body composition with FFA kinetics in obese adolescents. Participants: Participants were from a convenience sample of 112 non-Hispanic white and black adolescents (31% male; age range, 12-18 years; body mass index SD score range, 1.6-3.1) studied before initiating obesity treatment. Main Outcome Measures: Glucose, insulin, and FFA were measured during insulin-modified frequently sampled iv glucose tolerance tests. Minimal models for glucose and FFA calculated insulin sensitivity index (SI) and FFA kinetics, including maximum (l0 + l2) and insulin-suppressed (l2) lipolysis rates, clearance rate constant (cf), and insulin concentration for 50% lipolysis suppression (ED50). Relationships of FFA measures to sex, REE, fat mass (FM), lean body mass (LBM) and visceral adipose tissue (VAT) were examined. Results: In models accounting for age, race, pubertal status, height, FM, and LBM, we found sex, pubertal status, age, and REE independently contributed to the prediction of l2 and l0 + l2 (P < .05). Sex and REE independently predicted ED50 (P < .05). Sex, FM/VAT, and LBM were independent predictors of cf. Girls had greater l2, l0 + l2 and ED50 (P < .05, adjusted for REE) and greater cf (P < .05, adjusted for FM or VAT) than boys. Conclusion: Independent of the effects of REE and FM, FFA kinetics differ significantly in obese adolescent girls and boys, suggesting greater FFA flux among girls.

## Slides for Hopkins talk

April 15, 2013

I gave the Bodian Seminar at the Zanvyl Krieger Mind/Brain Institute of Johns Hopkins today.  I talked about cortical dynamics in the presence of conflicting stimuli. My slides are here. A summary of part of my talk can be found here.  Other pertinent papers can be found here and here.