Evolution of overconfidence

A new paper on the evolution of overconfidence (arXiv:0909.4043v2) will appear shortly in Nature. (Hat tip to J.L. Delatre.) It is well known in psychology that people generally overvalue themselves, and it has always been a puzzle as to why.  This paper argues that under certain plausible conditions, it may have been evolutionarily advantageous to be overconfident.  One of the authors is James Fowler, who has garnered recent fame for claiming with Nicholas Christakis that medically noninfectious phenomena such as obesity and divorce are socially contagious.  I have always been skeptical of these social network results, and there has been some recent push back.  Statistician and blogger Andrew Gelman has a summary of the critiques here.  The problems with these papers fall in line with those of many other clinical papers that I have posted on before (e.g. see here and here).  The evolution of overconfidence paper does not rely on statistics but on a simple evolutionary model.

The model considers competition between two parties for a scarce resource.  Each party possesses some heritable attribute, and the one with the higher value of that attribute will win a contest and obtain the resource.  The model allows for three outcomes in any interaction: 1) winning a competition and obtaining the resource, with value W-C (where C is the cost of competing), 2) claiming the resource without a fight, with value W, and 3) losing a competition, with value -C.  The parties assess their own and their opponent’s attributes before deciding to compete.  If both parties had perfect information, participating in a contest would be unnecessary: both would realize who would win, and the stronger of the two would claim the prize.  However, because of errors and biases in assessing attributes, resources will be contested.  Overconfidence is represented as a positive bias in assessing oneself.  The authors chose a model simple enough to explicitly evaluate the outcomes of all possible situations and show that when the reward for winning is sufficiently large compared to the cost, overconfidence is evolutionarily stable.

Here I will present a simpler toy model of why the result is plausible. Let P be the probability that a given party will win a competition on average and let Q be the probability that they will engage in a competition. Hence, Q is a measure of overconfidence.  Using these values, we can then compute the expectation value of an interaction:

E = Q^2P (W-C) + Q(1-Q) W - Q^2(1-P) C

(i.e. the probability of competing and winning is Q^2P, the probability of claiming the resource without a fight is Q(1-Q), the probability of competing and losing is Q^2(1-P), and it costs nothing not to compete.)  The derivative of E with respect to Q is

E' = 2QP(W-C) + (1-2Q)W - 2Q(1-P)C = W - 2Q[(1-P)W + C]

Since (1-P)W + C > 0, E is a downward parabola in Q, maximized at Q* = W/(2[(1-P)W + C]).  Hence, the optimal level of engagement grows with the ratio of reward to cost; indeed, full engagement (Q = 1) is optimal whenever W > 2[(1-P)W + C], i.e. when the reward of winning sufficiently exceeds the cost of competing.  Of course this simple demonstration doesn’t prove that overconfidence is a stable strategy but it does affirm Woody Allen’s observation that “95% of life is just showing up.”
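
The toy model can be checked numerically.  Here is a minimal sketch in Python; the parameter values are mine for illustration, not the paper’s:

```python
def expected_payoff(Q, P, W, C):
    """Expected value of an interaction when each party claims the resource
    with probability Q: fight and win (Q^2*P), claim unopposed (Q*(1-Q)),
    fight and lose (Q^2*(1-P))."""
    return Q**2 * P * (W - C) + Q * (1 - Q) * W - Q**2 * (1 - P) * C

def optimal_Q(P, W, C):
    """E simplifies to Q*W - Q^2*[(1-P)*W + C], quadratic in Q, so the
    optimal claiming probability is Q* = W / (2*[(1-P)*W + C]), capped at 1."""
    return min(1.0, W / (2 * ((1 - P) * W + C)))

# Illustrative values (assumed): even odds P = 0.5, unit cost C = 1.
for W in [1.0, 4.0, 10.0]:
    print(W, optimal_Q(0.5, W, 1.0))  # optimal Q rises with the reward/cost ratio
```

With P = 0.5 the optimum reduces to Q* = W/(W + 2C), so the larger the reward relative to the cost, the closer to certain engagement the optimal strategy becomes.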

New paper on binocular rivalry

J Neurophysiol. 2011 Jul 20. [Epub ahead of print]

The role of mutual inhibition in binocular rivalry.

Seely J, Chow CC.

Binocular rivalry is a phenomenon that occurs when ambiguous images are presented
to each of the eyes. The observer generally perceives just one image at a time,
with perceptual switches occurring every few seconds. A natural assumption is
that this perceptual mutual exclusivity is achieved via mutual inhibition between
populations of neurons that encode for either percept. Theoretical models that
incorporate mutual inhibition have been largely successful at capturing
experimental features of rivalry, including Levelt's propositions, which
characterize perceptual dominance durations as a function of image contrasts.
However, basic mutual inhibition models do not fully comply with Levelt's fourth
proposition, which states that percepts alternate faster as the stimulus
contrasts to both eyes are increased simultaneously. This theory-experiment
discrepancy has been taken as evidence against the role of mutual inhibition for
binocular rivalry. Here, we show how various biophysically plausible
modifications to mutual inhibition models can resolve this problem.

PMID: 21775721  [PubMed - as supplied by publisher]

Paper can be downloaded here.

Review paper on steroid-mediated gene expression

Mol Cell Endocrinol. 2011 Jun 1. [Epub ahead of print]

The road less traveled: New views of steroid receptor action 
from the path of dose-response curves. 
Simons SS Jr, Chow CC.

Steroid Hormones Section, NIDDK/CEB, NIDDK, National Institutes of Health,
Bethesda, MD, United States.

Conventional studies of steroid hormone action proceed via quantitation of the
maximal activity for gene induction at saturating concentrations of agonist
steroid (i.e., A(max)). Less frequently analyzed parameters of receptor-mediated
gene expression are EC(50) and PAA. The EC(50) is the concentration of steroid
required for half-maximal agonist activity and is readily determined from the
dose-response curve. The PAA is the partial agonist activity of an antagonist
steroid, expressed as percent of A(max) under the same conditions. Recent results
demonstrate that new and otherwise inaccessible mechanistic information is
obtained when the EC(50) and/or PAA are examined in addition to the A(max).
Specifically, A(max), EC(50), and PAA can be independently regulated, which
suggests that novel pathways and factors may preferentially modify the EC(50)
and/or PAA with little effect on A(max). Other approaches indicate that the
activity of receptor-bound factors can be altered without changing the binding of
factors to receptor. Finally, a new theoretical model of steroid hormone action
not only permits a mechanistically based definition of factor activity but also
allows the positioning of when a factor acts, as opposed to binds, relative to a
kinetically defined step. These advances illustrate some of the benefits of
expanding the mechanistic studies of steroid hormone action to routinely include
EC(50) and PAA.

PMID: 21664235  [PubMed - as supplied by publisher]
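
For concreteness, here is a sketch of how A(max), EC(50), and PAA relate on a first-order Hill dose-response curve.  All numbers are invented for illustration and are not from the paper:

```python
def dose_response(S, Amax, EC50):
    """First-order Hill dose-response: activity at steroid concentration S."""
    return Amax * S / (EC50 + S)

# Invented parameters: agonist with Amax = 100 (arbitrary units), EC50 = 5 nM.
Amax, EC50 = 100.0, 5.0
print(dose_response(EC50, Amax, EC50))  # at S = EC50, activity is half-maximal: 50.0

# PAA: the saturating activity of an antagonist expressed as percent of the
# agonist's Amax under the same conditions; an antagonist whose curve
# plateaus at 12 units has PAA = 12%.
Amax_antag = 12.0
PAA = 100.0 * Amax_antag / Amax
print(PAA)  # 12.0
```

The point of the review is that these three numbers can move independently, so a cofactor can shift EC(50) or PAA while leaving A(max) untouched.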

New paper on insulin’s effect on lipolysis

J Clin Endocrinol Metab. 2011 May 18. [Epub ahead of print]

Higher Acute Insulin Response to Glucose May Determine Greater 
Free Fatty Acid Clearance in African-American Women.

Chow CC, Periwal V, Csako G, Ricks M, Courville AB, Miller BV 3rd, Vega GL,
Sumner AE.

Laboratory of Biological Modeling (C.C.C., V.P.), National Institute of Diabetes
and Digestive and Kidney Diseases, National Institutes of Health, Bethesda,
Maryland 20892; Departments of Laboratory Medicine (G.C.) and Nutrition (A.B.C.),
Clinical Center, National Institutes of Health, and Clinical Endocrinology Branch
(M.R., B.V.M., A.E.S.), National Institute of Diabetes and Digestive and Kidney
Diseases, National Institutes of Health, Bethesda, Maryland 20892; and Center for
Human Nutrition (G.L.V.), University of Texas Southwestern Medical Center at
Dallas, Dallas, Texas 75235.

Context: Obesity and diabetes are more common in African-Americans than whites.
Because free fatty acids (FFA) participate in the development of these
conditions, studying race differences in the regulation of FFA and glucose by
insulin is essential. Objective: The objective of the study was to determine
whether race differences exist in glucose and FFA response to insulin. Design:
This was a cross-sectional study. Setting: The study was conducted at a clinical
research center. Participants: Thirty-four premenopausal women (17
African-Americans, 17 whites) matched for age [36 ± 10 yr (mean ± sd)] and body
mass index (30.0 ± 6.7 kg/m(2)). Interventions: Insulin-modified frequently
sampled iv glucose tolerance tests were performed with data analyzed by separate
minimal models for glucose and FFA. Main Outcome Measures: Glucose measures were
insulin sensitivity index (S(I)) and acute insulin response to glucose (AIRg).
FFA measures were FFA clearance rate (c(f)). Results: Body mass index was similar
but fat mass was higher in African-Americans than whites (P < 0.01). Compared
with whites, African-Americans had lower S(I) [3.71 ± 1.55 vs. 5.23 ± 2.74
×10(-4) min(-1)/(microunits per milliliter), P = 0.05] and higher AIRg (642 ±
379 vs. 263 ± 206 mU/liter(-1) · min, P < 0.01). Adjusting for fat mass,
African-Americans had higher FFA clearance, c(f) (0.13 ± 0.06 vs. 0.08 ± 0.05
min(-1), P < 0.01). After adjusting for AIRg, the race difference in c(f) was no
longer present (P = 0.51). For all women, the relationship between c(f) and AIRg
was significant (r = 0.64, P < 0.01), but the relationship between c(f) and S(I)
was not (r = -0.07, P = 0.71). The same pattern persisted when the two groups
were studied separately. Conclusion: African-American women were more insulin
resistant than white women, yet they had greater FFA clearance. Acutely higher
insulin concentrations in African-American women accounted for higher FFA clearance.

PMID: 21593106  [PubMed - as supplied by publisher]

New paper on estimating food intake from body weight

 Am J Clin Nutr. 2011 Jul;94(1):66-74. Epub 2011 May 11.

Estimating changes in free-living energy intake and its confidence interval.

Hall KD, Chow CC.

Laboratory of Biological Modeling, National Institute of Diabetes and Digestive
and Kidney Diseases, Bethesda, MD.

Background: Free-living energy intake in humans is notoriously difficult to
measure but is required to properly assess outpatient weight-control
interventions. Objective: Our objective was to develop a simple methodology that 
uses longitudinal body weight measurements to estimate changes in energy intake
and its 95% CI in individual subjects. Design: We showed how an energy balance
equation with 2 parameters can be derived from any mathematical model of human
metabolism. We solved the energy balance equation for changes in free-living
energy intake as a function of body weight and its rate of change. We tested the 
predicted changes in energy intake by using weight-loss data from controlled
inpatient feeding studies as well as simulated free-living data from a group of
"virtual study subjects" that included realistic fluctuations in body water and
day-to-day variations in energy intake. Results: Our method accurately predicted 
individual energy intake changes with the use of weight-loss data from controlled
inpatient feeding experiments. By applying the method to our simulated
free-living virtual study subjects, we showed that daily weight measurements over
periods >28 d were required to obtain accurate estimates of energy intake change 
with a 95% CI of <300 kcal/d. These estimates were relatively insensitive to
initial body composition or physical activity level. Conclusions: Frequent
measurements of body weight over extended time periods are required to precisely 
estimate changes in energy intake in free-living individuals. Such measurements
are feasible, relatively inexpensive, and can be used to estimate diet adherence 
during clinical weight-management programs.

PMCID: PMC3127505 [Available on 2012/7/1]
PMID: 21562087  [PubMed - in process]
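
The flavor of the method can be sketched as follows.  Assume a linearized energy balance equation, rho*dW/dt + eps*deltaW = deltaEI, with two parameters: an energy density rho and a feedback coefficient eps.  The round values below are my assumptions for illustration, not the paper’s fitted parameters:

```python
def estimate_delta_EI(days, weights_kg, rho=7700.0, eps=22.0):
    """Estimate the average change in energy intake (kcal/d) from longitudinal
    body weights via the linearized balance rho*dW/dt + eps*deltaW = deltaEI.
    rho (kcal per kg of tissue) and eps (kcal/d per kg) are assumed values."""
    n = len(days)
    mt = sum(days) / n
    mw = sum(weights_kg) / n
    # least-squares slope of weight versus time gives dW/dt in kg/d
    slope = (sum((t - mt) * (w - mw) for t, w in zip(days, weights_kg))
             / sum((t - mt) ** 2 for t in days))
    delta_w = mw - weights_kg[0]  # average weight change from baseline
    return rho * slope + eps * delta_w

# Synthetic subject: steady loss of 0.05 kg/d from 90 kg over 28 days
days = list(range(29))
weights = [90.0 - 0.05 * d for d in days]
print(estimate_delta_EI(days, weights))  # about -400 kcal/d
```

With noisy real data, the slope estimate degrades, which is why the paper finds that daily weights over periods longer than about 28 days are needed to pin the change in intake down to within a few hundred kcal/d.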

Review paper on analyzing dose-response curves

I’ve been negligent about announcing new papers so I’ll do it all at once and then perhaps provide details later:

 Methods Enzymol. 2011;487:465-83.

Inferring mechanisms from dose-response curves.

Chow CC, Ong KM, Dougherty EJ, Simons SS Jr.

Laboratory of Biological Modeling, NIDDK/CEB, National Institutes of Health,
Bethesda, Maryland, USA.

The steady state dose-response curve of ligand-mediated gene induction usually
appears to precisely follow a first-order Hill equation (Hill coefficient equal
to 1). Additionally, various cofactors/reagents can affect both the potency and
the maximum activity of gene induction in a gene-specific manner. Recently, we
have developed a general theory for which an unspecified sequence of steps or
reactions yields a first-order Hill dose-response curve (FHDC) for plots of the
final product versus initial agonist concentration. The theory requires only that
individual reactions "dissociate" from the downstream reactions leading to the
final product, which implies that intermediate complexes are weakly bound or
exist only transiently. We show how the theory can be utilized to make
predictions of previously unidentified mechanisms and the site of action of
cofactors/reagents. The theory is general and can be applied to any biochemical
reaction that has a FHDC.

PMID: 21187235  [PubMed - indexed for MEDLINE]
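
One way to see how a sequence of “dissociating” steps can preserve a first-order Hill dose-response: the composition of two first-order Hill functions is again first-order.  A sketch (parameter values arbitrary; this illustrates the closure property, not the paper’s derivation):

```python
def hill1(S, A, K):
    """First-order Hill function A*S/(K+S) (Hill coefficient 1)."""
    return A * S / (K + S)

def composed_params(A1, K1, A2, K2):
    """hill1(hill1(S, A1, K1), A2, K2) simplifies algebraically to another
    first-order Hill function with Amax = A1*A2/(K2 + A1) and
    EC50 = K1*K2/(K2 + A1)."""
    return A1 * A2 / (K2 + A1), K1 * K2 / (K2 + A1)

A1, K1, A2, K2 = 3.0, 2.0, 5.0, 1.5  # arbitrary illustrative values
Ac, Kc = composed_params(A1, K1, A2, K2)
for S in [0.1, 1.0, 10.0, 100.0]:
    assert abs(hill1(hill1(S, A1, K1), A2, K2) - hill1(S, Ac, Kc)) < 1e-12
print(Ac, Kc)  # effective Amax and EC50 of the two-step cascade
```

Note how the downstream step (A2, K2) shifts both the effective Amax and EC50, consistent with cofactors modulating potency and maximal activity without breaking the FHDC shape.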

The history of beer and ALDH2

Smithsonian magazine has an interesting article on the history of beer this month.  See here. According to Patrick McGovern, an archaeologist at the University of Pennsylvania, the earliest known alcoholic beverage comes from China:

Smithsonian: When McGovern traveled to China and discovered the oldest known alcohol—a heady blend of wild grapes, hawthorn, rice and honey that is now the basis for Dogfish Head’s Chateau Jiahu—he was touched but not entirely surprised to learn of another “first” unearthed at Jiahu, an ancient Yellow River Valley settlement: delicate flutes, made from the bones of the red-crowned crane, that are the world’s earliest-known, still playable musical instruments.

The interesting part about alcohol production being first discovered in China is that about half of people of East Asian descent lack a working gene for processing alcohol, leading to what is nontechnically known as “Asian flush syndrome”.  When alcohol is consumed, it is first metabolized into acetaldehyde in the liver by the enzyme alcohol dehydrogenase.  The acetaldehyde is what makes you flush, gives you a headache, increases your heart rate, and generally induces hangover-like symptoms.  Another group of enzymes, the aldehyde dehydrogenases, converts the acetaldehyde to acetic acid, which can then be metabolized.  However, some Asians possess a variant of the aldehyde dehydrogenase gene ALDH2 that renders the enzyme inactive.  So they (including me) basically turn alcohol rapidly into the poison acetaldehyde, which then persists.

It is well known that this ALDH2*2 variant is protective against alcoholism.  In fact, disulfiram, the drug used to treat alcoholism, blocks the conversion of acetaldehyde to acetic acid, thereby mimicking Asian flush syndrome.  Thus it is interesting that the ALDH2*2 variant arose in the population that first discovered alcohol production.  Is it a mutation that protects against alcoholism?  On a side note, it has also been shown that those who possess the ALDH2*2 variant but continue to drink have a higher incidence of squamous cell esophageal cancer.  So maybe my inability to drink is a result of the excesses of my ancestors.

Judicial system versus Bayesian brain

I think the recent uproar over the acquittal of Casey Anthony clearly shows how our internal system of inference can be at odds with the American judicial system.  For those of you who don’t pay attention to the mainstream media, Casey Anthony was a young mother whose toddler was found dead.  What captivated the American public was that the toddler had been missing for a month before Anthony reported it.  She lied to her parents and the authorities about the whereabouts of the child and even appeared celebratory in public during the period of the child’s disappearance.  In the court of public opinion, Anthony was clearly guilty: a mother who shows no anxiety whatsoever over the disappearance of her child must surely be the culprit.

The American judicial system requires that if there is any reasonable doubt of guilt then a person must be acquitted.  The burden of proof is on the prosecution.  In this case, there were no witnesses and no physical evidence linking Anthony to the death of the child, or even establishing that the child was murdered.  Thus there was the remote possibility that Anthony was not responsible for the child’s death but simply took advantage of the situation, as macabre as that may be.  Even though the joint probability of these two unlikely events – a mother wishing to be free of her child and a child going missing – is exceedingly low, it is still nonzero, and thus the jury was forced to acquit.

Now if a single piece of evidence had linked Anthony to the crime, say a fingerprint or DNA sample, then she most likely would have been found guilty.  The interesting aspect of this is that there is also an equally low probability that someone could have planted that evidence to frame her.  Thus, reasonable doubt is not a global quantity according to the law.  It is not sufficient that the total probability that the accused is guilty be high; it must also be high in each of several categories, i.e. motive, opportunity, and direct physical evidence or witnesses.  Circumstantial evidence alone is insufficient to convict.  However, it appears that our brains do not work this way: we seem to take the global probability of guilt and go with that.
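
The global-versus-categorical distinction can be made concrete with a toy Bayesian calculation (all numbers invented): several pieces of circumstantial evidence, none individually conclusive, can combine into an overwhelming global posterior.

```python
def posterior(prior, likelihood_ratio):
    """Posterior probability of guilt via Bayes' rule on the odds:
    posterior odds = prior odds * P(evidence|guilty)/P(evidence|innocent)."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

# Invented likelihood ratios for three circumstantial categories
lrs = {"motive": 5.0, "lied to authorities": 10.0, "celebratory behavior": 20.0}
prior = 0.5  # agnostic starting point (an assumption)

for name, lr in lrs.items():
    print(name, posterior(prior, lr))  # each category alone leaves room for doubt

combined = 1.0
for lr in lrs.values():
    combined *= lr  # treats the pieces of evidence as independent
print("combined", posterior(prior, combined))  # ~0.999, the brain's global verdict
```

Each category by itself leaves reasonable doubt, but the brain’s aggregate posterior is close to certainty, which is roughly the conflict between jurors’ instructions and their intuitions.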

What’s in your sunscreen?

Here’s something to think about from Scientific American:

…And just what are the risks? According to the non-profit Environmental Working Group (EWG), there are two major types of sunscreens available in the U.S. “Chemical” sunscreens, the more common kind, penetrate the skin and may disrupt the body’s endocrine system, as their active ingredients (e.g., octylmethylcinnamate, oxybenzone, avobenzone, benzophone, mexoryl, PABA or PARSOL 1789) mimic the body’s natural hormones and as such can essentially confuse the body’s systems. Quite a risk to take, considering that the chemical varieties don’t even work for very long once applied.

Meanwhile, “mineral” sunscreens are considered somewhat safer, as their active ingredients are natural elements such as zinc or titanium. But “micronized” or “nano-scale” particles of these minerals can get below the skin surface and cause allergic reactions and other problems for some people. EWG recommends sticking with “mineral” sunscreens whenever possible but, more important, taking other precautions to avoid prolonged sun exposure altogether. “At EWG we use sunscreens, but we look for shade, wear protective clothing, and avoid the noontime sun before we smear on the cream,” the group reports.

As for spray varieties, EWG recommends avoiding them entirely: “These ingredients are not meant to be inhaled into the lungs.” With so little known about the effects of sunscreen chemicals on the body when rubbed into the skin, we may never know how much worse the effects may be when they are inhaled. But suffice it to say: When your neighbor at the beach is spraying down Junior, it’s in your best interest to turn away and cover your nose and mouth…

Crime and immigration

Noted sociologist Richard Florida has an opinion piece in the Financial Times and The Atlantic (see here) about how immigration may be responsible for the recent decline in violent crime in cities. Many explanations have been given for why crime has decreased since the nineteen nineties. The bestselling book Freakonomics suggested that the decline was because of legalized abortion, which meant fewer unwanted children who would go on to be criminals. Florida shows that there is a strong negative correlation between the presence of large immigrant communities and the crime rate. Again, like all epidemiological results, this correlation may or may not be significant, much less have causal value. That is not to say that it is not correct. Immigrant neighborhoods may have a greater sense of small-town community that discourages crime, but if the opposite correlation had been found, an equally plausible just-so story could have been concocted.

I think the crucial point about this result, along with all other explanations of complex phenomena, is that we are drawn towards single universal explanations. I am all the time, even though there may be no reason why a phenomenon should have an explanation expressible in a few hundred bits. The opposite view would be that a single explanation is implausible: something like crime involves millions upon millions of degrees of freedom, so why should it be compressible to a few hundred bits? However, if the phenomenon is consistent across many cities and regions, then a global explanation may be in order. The variance around the mean is also important: if the variability is low, then a universal explanation carries more weight. I think that we tend to either embrace reduced descriptions or reject them outright based on the ideas presented. However, in many cases, a careful examination of the data may at least tell us whether a reduced description is warranted.