More news stories on food waste

www.economist.com/sciencetechnology/displayStory.cfm?story_id=14960159

www.cbc.ca/consumer/story/2009/11/24/tech-environment-food-waste.html

www.cbc.ca/news/yourview/2009/11/food-waste-and-climate-change-will-you-change-your-habits.html

sciencenow.sciencemag.org/cgi/content/full/2009/1125/1

http://www.voanews.com/mp3/voa/english/ourw/ourw0305a.mp3

journalwatch.conservationmagazine.org/2009/11/26/america-clean-your-plate/

www.livescience.com/culture/091126-food-waste.html

en.greenplanet.net/food/research/1172-wasteful-eating-habits-sharpens-world-hunger.html

news.mongabay.com/2009/1125-hance_foodwaste.html

www.greenbang.com/rising-food-waste-also-wastes-oil-and-water-researchers-find_12708.html

www.digitaljournal.com/article/282792#tab=comments&sc=0&contribute=&local=

www.deakin.edu.au/deakin-speaking/node/77

www.tribalinsight.com/

blog.friendseat.com/forty-percent-of-food-supply-in-usa-wasted/

www.scoop.co.nz/stories/SC0911/S00054.htm

timesofindia.indiatimes.com/world/us/Chew-on-it-Americans-throw-away-40-of-food/articleshow/5277225.cms

www.foodproductiondaily.com/Processing/US-food-waste-impacts-climate-say-scientists?utm_source=RSS_text_news

http://www.theygaveusarepublic.com/diary/4285/pig-nation-americans-waste-40-percent-of-food-produced-here

www.independent.co.uk/life-style/health-and-families/health-news/clear-your-plate-and-spare-the-planet-1829914.html

roguepundit.typepad.com/roguepundit/2009/11/food-wastelines.html

trueslant.com/daviddisalvo/2009/11/27/study-finds-that-americans-throw-away-40-of-all-food/

healthystealthy.wordpress.com/2009/11/27/does-global-warming-make-my-ass-look-fat-americans-waste-1400-calories-per-person-enough-to-feed-another-whole-person-its-negatively-impacting-the-environment/

www.myfoxdc.com/dpp/news/dpgo-Study-US-Wastes-40-Percent-of-Its-Food-mb-200911291259516017848

www.greenlivingtips.com/blogs/456/Food-waste-epidemic.html

www.foodnavigator-usa.com/Science-Nutrition/US-food-waste-impacts-climate-say-scientists?utm_source=RSS_text_news

www.stuffedandstarved.org/drupal/node/528

news.mongabay.com/2009/1129-hance_foodwastetwo.html

New paper on food waste

Hall KD, Guo J, Dore M, Chow CC (2009) The Progressive Increase of Food Waste in America and Its Environmental Impact. PLoS ONE 4(11): e7940. doi:10.1371/journal.pone.0007940

This paper started out as a way to understand the obesity epidemic.  Kevin Hall and I developed a reduced model of how food intake is translated into body weight change [1].  We then decided to apply the model to the entire US population.  For the past thirty years, an ongoing study (NHANES) has been taking anthropometric measurements, such as body weight and height, from a representative sample of the US population. The UN Food and Agriculture Organization and the USDA have also kept track of how much food is available to the population. We thought it would be interesting to see whether the food available accounted for the increase in body weight over the past thirty years.

What we found was that the available food more than accounted for all of the weight gain.  In fact, our calculations showed that the gap between the food available and the intake predicted by our model has grown linearly over time.  This “energy gap” could be due to two things: 1) people were actually burning more energy than our model indicated because they were more physically active than expected (we assumed that physical activity stayed constant over the last thirty years), or 2) there has been a progressive increase in food waste.  Given that most people have argued that physical activity has gone down recently, which would make the energy gap even greater, we opted for conclusion 2).  Our estimate is also on the conservative side because we didn't account for the fact that children eat less than adults on average.
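To make the idea concrete, here is a minimal sketch of the energy-gap calculation. It is not the model or the data from the paper: the energy-balance equation is a simple linear caricature, and the parameter values, weights, and food-availability numbers below are made up purely for illustration.

```python
# Minimal sketch of the "energy gap" idea (illustrative only, not the paper's model):
# invert a simple linear energy-balance model to get the intake implied by the
# observed average body-weight trajectory, then compare it with per-capita
# food availability; the difference is the estimated waste.
import numpy as np

RHO = 7700.0      # kcal per kg of body-weight change (rule-of-thumb value)
EPSILON = 22.0    # kcal/day of maintenance expenditure per kg of body weight (assumed)
E0 = 500.0        # kcal/day of weight-independent expenditure (assumed)

def implied_intake(weights_kg, years):
    """Intake (kcal/person/day) implied by a weight trajectory, from
    RHO * dW/dt = I - (E0 + EPSILON * W)  =>  I = RHO * dW/dt + E0 + EPSILON * W."""
    days = np.asarray(years, dtype=float) * 365.25
    w = np.asarray(weights_kg, dtype=float)
    dwdt = np.gradient(w, days)              # kg per day
    return RHO * dwdt + E0 + EPSILON * w

# Hypothetical numbers, purely for illustration (not NHANES or USDA data):
years       = np.array([1974, 1984, 1994, 2003])
mean_weight = np.array([72.0, 75.0, 79.0, 83.0])           # kg
food_avail  = np.array([3000.0, 3200.0, 3500.0, 3750.0])   # kcal/person/day

energy_gap = food_avail - implied_intake(mean_weight, years)
print(energy_gap)  # a gap that grows over time would suggest increasing waste
```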

I didn't want to believe the result at first, but the numbers were the numbers.  We have gone from wasting about 900 kcal per person per day in 1974 to 1400 kcal in 2003.  It takes about 3 kcal to make 1 kcal of food, so the energy in the wasted food amounts to about 4% of total US oil consumption.  The wasted food also accounts for about 25% of all fresh water use.  Ten percent of it could feed Canada.  The press has taken some interest in our result.  Our paper was covered by CBC News, Kevin and I were interviewed by Science, and Kevin was interviewed on Voice of America.
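As a rough sanity check on the oil figure, here is the back-of-the-envelope arithmetic. The conversion factors (kcal per barrel of oil and barrels per day of US consumption) are my own approximate values, not numbers taken from the paper.

```python
# Back-of-the-envelope check of the oil comparison (approximate factors, my own):
waste_kcal_per_person_day = 1400.0   # per-capita waste in 2003, from the result above
production_kcal_per_food_kcal = 3.0  # ~3 kcal of input energy per kcal of food
population = 3.0e8                   # ~300 million people (rough)

kcal_per_barrel = 1.46e6             # ~6.1 GJ per barrel of oil equivalent (assumed)
us_oil_barrels_per_day = 2.0e7       # ~20 million barrels/day circa 2003 (assumed)

embodied_kcal_per_day = waste_kcal_per_person_day * production_kcal_per_food_kcal * population
fraction_of_oil = embodied_kcal_per_day / (us_oil_barrels_per_day * kcal_per_barrel)
print(f"{fraction_of_oil:.1%}")      # lands in the ballpark of a few percent
```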

[1] Chow CC, Hall KD (2008) The Dynamics of Human Body Weight Change. PLoS Comput Biol 4(3): e1000045. doi:10.1371/journal.pcbi.1000045

New paper on transients

A new paper, Competition Between Transients in the Rate of Approach to a Fixed Point, SIAM J. Appl. Dyn. Syst. 8, 1523 (2009), by Judy Day, Jonathan Rubin, and me appears today. The official journal link to the paper is here and the PDF can be obtained here.  This paper came about because of a biological phenomenon known as tolerance.  When the body is exposed to a pathogen or toxin, there is an inflammatory response.  This makes you feel ill and signals the immune system to mount a defense.  In some cases, if you are hit with a second dose of the toxin you'll get a heightened response.  However, there are situations where you can have a decreased response to a second dose, and that is called tolerance.

Judy Day was my last graduate student at Pitt.  When I left for NIH, Jon Rubin stepped in to advise her.  Her first project on tolerance was to simulate a reduced four-dimensional model of the immune system and see if tolerance could be observed in the model [1]. She found that it did occur in certain parameter regimes. What she showed was that if you watch a particular inflammatory marker, its response can be damped if a preconditioning dose is administered first.

The next step was to understand mathematically how and why it occurred. The result, after several starts and stops, was this paper. We realized that tolerance boiled down to a question regarding the behavior of transients, i.e. how quickly an orbit approaches a stable fixed point from different initial conditions. For example, consider two orbits with initial conditions (x1,y1) and (x2,y2) with x1 > x2, where y represents all the other coordinates. Tolerance occurs if the x coordinate of orbit 1 ever becomes smaller than the x coordinate of orbit 2, independent of what the other coordinates do. From continuity arguments, you can show that if tolerance occurs at a single point in space or time, it must occur in a neighbourhood around those points. In our paper, we showed that tolerance could be understood geometrically and that for linear and nonlinear systems with certain general properties, tolerance is always possible, although the theorems don't say which orbits in particular will exhibit it.  However, regions of tolerance can be calculated explicitly in two-dimensional linear systems and estimated for nonlinear planar systems.
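Here is a minimal numerical sketch of tolerance as a competition between transients. It is not one of the systems analyzed in the paper, just a stable planar linear system chosen so that the orbit starting with the larger x coordinate is eventually overtaken because its second coordinate drives x down faster.

```python
# Two orbits of a stable planar linear system dz/dt = A z; "tolerance" here means
# the x coordinate of orbit 1 (which starts larger) ever drops below that of orbit 2.
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-1.0, -2.0],    # x decays and is suppressed by y
              [ 0.0, -0.5]])   # y decays slowly; both eigenvalues are negative

def rhs(t, z):
    return A @ z

t_eval = np.linspace(0.0, 10.0, 500)
orbit1 = solve_ivp(rhs, (0.0, 10.0), [2.0, 1.0], t_eval=t_eval).y  # larger x and larger y
orbit2 = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.1], t_eval=t_eval).y  # smaller x, little y

tolerant = np.any(orbit1[0] < orbit2[0])   # does orbit 1's x ever fall below orbit 2's?
print(tolerant)                            # True for these initial conditions
```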

[1] Day J, Rubin J, Vodovotz Y, Chow CC, Reynolds A, Clermont G (2006) A reduced mathematical model of the acute inflammatory response II. Capturing scenarios of repeated endotoxin administration. Journal of Theoretical Biology 242(1): 237-256.

Screening for terrorists

The recent tragedy at Fort Hood has people griping about missed signals that could have been used to prevent the attack.  However, I will argue that it is likely impossible to ever have a system that can screen out all terrorists without also flagging a lot of innocent people.  The calculation is a simple exercise in probability theory that is often given in first-year statistics classes.

Suppose we have a system in place that gives a yes (Y) or no response as to whether a person is a terrorist (T).  Let P(T) be the probability that a given person is a terrorist, and P(T|Y) be the probability that a person is a terrorist given that the test said yes.  Thus P(~T|Y)=1-P(T|Y) is the probability that one is not a terrorist even though the test said yes.  Using Bayes' theorem, we have

P(~T|Y)=P(Y|~T) P(~T)/P(Y)  (*)

where P(Y)=P(Y|T)P(T) + P(Y|~T)P(~T) is the probability of getting a yes result.  Now, the probability of being a terrorist is very low: out of the 300 million or so people in the US, only a small number are likely to be potential terrorists, and the US military has over a million people on active service.  Hence P(T) is tiny and the probability of not being a terrorist, P(~T), is very close to one.

From (*), we see that in order to have a low probability of flagging an innocent person we need P(Y|~T)P(~T) << P(Y|T)P(T), or P(Y|~T) << P(Y|T) P(T)/P(~T).  Since P(T) is very small, P(T)/P(~T) ~ P(T), so if the true positive probability P(Y|T) were near one (i.e. a test that catches all terrorists), we would need the false positive probability P(Y|~T) to be much smaller than the probability of being a terrorist, which means we would need a test that gives false positives at a rate of less than about one in a million.  The problem is that the true positive and false positive probabilities are correlated.  The more sensitive the test, the more likely it is to give a false positive.  So if you set your threshold very low so that P(Y|T) is very high (i.e. you make sure you never miss a terrorist), then P(Y|~T) will almost certainly be high as well.  I doubt you'll ever have a test where P(Y|T) is near one while P(Y|~T) is less than one in a million.  So basically, if we want to catch all the terrorists, we'll also have to flag a lot of innocent people.
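A quick numerical illustration of the base-rate effect described above; the sensitivity and false positive numbers are hypothetical, chosen only to show how the argument plays out.

```python
# P(~T | Y): probability that a flagged person is actually innocent, via Bayes' theorem.
def prob_innocent_given_flag(p_t, sens, fpr):
    p_y = sens * p_t + fpr * (1.0 - p_t)   # P(Y) = P(Y|T)P(T) + P(Y|~T)P(~T)
    return fpr * (1.0 - p_t) / p_y         # P(Y|~T) P(~T) / P(Y)

# Say 300 potential terrorists among 300 million people, and a very good test:
p_t = 300 / 300_000_000   # P(T) = 1e-6
sens = 0.99               # P(Y|T): catches nearly every terrorist
fpr = 0.001               # P(Y|~T): only 1 innocent in 1000 flagged

print(prob_innocent_given_flag(p_t, sens, fpr))  # ~0.999: almost every flag is innocent
```

Even with a false positive rate of one in a thousand, virtually everyone the test flags is innocent, because innocents outnumber terrorists roughly a million to one.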

Evidence based medicine

The New York Times Magazine's headline story this Sunday is on evidence-based medicine.  It talks about how a physician, Brent James, has been developing objective, empirical means to measure outcomes and using the data to design medical protocols.  This is a perfect example of the migration from a highly skilled and highly paid profession (what I called an NP job in a recent post) to a more algorithmic and mechanical one (a P job).  Here are some excerpts from the story:

For the past decade or so, a loose group of reformers has been trying to do precisely this. They have been trying to figure out how to improve health care while also holding down the growth in costs. The group includes Dr. John Wennberg and his protégés at Dartmouth, whose research about geographic variation in care has received a lot of attention lately, as well as Dr. Mark McClellan, who ran Medicare in the Bush administration, and Dr. Donald Berwick, a Boston pediatrician who has become a leading advocate for patient safety. These reformers tend to be an optimistic bunch. It's probably a necessary trait for anyone trying to overturn an entrenched status quo. When I have asked them whether they have any hope that medicine will change, they have tended to say yes. When I have asked them whether anybody has already begun to succeed, they have tended to mention the same name: Brent James.

…In the late 1980s, a pulmonologist at Intermountain named Alan Morris received a research grant to study whether a new approach to ventilator care could help treat a condition called acute respiratory distress syndrome. The condition, which is known as ARDS, kills thousands of Americans each year, many of them young men. (It can be a complication of swine flu.) As Morris thought about the research, he became concerned that the trial might be undermined by the fact that doctors would set ventilators at different levels for similar patients. He knew that he himself sometimes did so. Given all the things that the pulmonologists were trying to manage, it seemed they just could not set the ventilator consistently.

Working with James, Morris began to write a protocol for treating ARDS. Some of the recommendations were based on solid evidence. Many were educated guesses. The final document ran to 50 pages and was left at the patients’ bedsides in loose-leaf binders. Morris’s colleagues were naturally wary of it. “I thought there wasn’t anybody better in the world at twiddling the knobs than I was,” Jim Orme, a critical-care doctor, told me later, “so I was skeptical that any protocol generated by a group of people could do better.” Morris helped overcome this skepticism in part by inviting his colleagues to depart from the protocol whenever they wanted. He was merely giving them a set of defaults, which, he emphasized, were for the sake of a research trial.

… While the pulmonologists were working off of the protocol, Intermountain’s computerized records system was tracking patient outcomes. A pulmonology team met each week to talk about the outcomes and to rewrite the protocol when it seemed to be wrong. In the first few months, the team made dozens of changes. Just as the pulmonologists predicted, the initial protocol was deeply flawed. But it seemed to be successful anyway. One widely circulated national study overseen by doctors at Massachusetts General Hospital had found an ARDS survival rate of about 10 percent. For those in Intermountain’s study, the rate was 40 percent.
