Archive for the ‘Medicine’ Category

Journal Club

May 20, 2015

Here’s the paper I will be covering in Journal Club tomorrow:

Neurons for hunger and thirst transmit a negative-valence teaching signal


Homeostasis is a biological principle for regulation of essential physiological parameters within a set range. Behavioural responses due to deviation from homeostasis are critical for survival, but motivational processes engaged by physiological need states are incompletely understood. We examined motivational characteristics of two separate neuron populations that regulate energy and fluid homeostasis by using cell-type-specific activity manipulations in mice. We found that starvation-sensitive AGRP neurons exhibit properties consistent with a negative-valence teaching signal. Mice avoided activation of AGRP neurons, indicating that AGRP neuron activity has negative valence. AGRP neuron inhibition conditioned preference for flavours and places. Correspondingly, deep-brain calcium imaging revealed that AGRP neuron activity rapidly reduced in response to food-related cues. Complementary experiments activating thirst-promoting neurons also conditioned avoidance. Therefore, these need-sensing neurons condition preference for environmental cues associated with nutrient or water ingestion, which is learned through reduction of negative-valence signals during restoration of homeostasis.

New paper on steroid-regulated gene expression

May 19, 2015

I am extremely pleased that the third leg of our theory on steroid-regulated gene expression is finally published.

Theory of partial agonist activity of steroid hormones
Abstract: The different amounts of residual partial agonist activity (PAA) of antisteroids under assorted conditions have long been useful in clinical applications but remain largely unexplained. Not only does a given antagonist often afford unequal induction for multiple genes in the same cell but also the activity of the same antisteroid with the same gene changes with variations in concentration of numerous cofactors. Using glucocorticoid receptors as a model system, we have recently succeeded in constructing from first principles a theory that accurately describes how cofactors can modulate the ability of agonist steroids to regulate both gene induction and gene repression. We now extend this framework to the actions of antisteroids in gene induction. The theory shows why changes in PAA cannot be explained simply by differences in ligand affinity for receptor and requires action at a second step or site in the overall sequence of reactions. The theory also provides a method for locating the position of this second site, relative to a concentration limited step (CLS), which is a previously identified step in glucocorticoid-regulated transactivation that always occurs at the same position in the overall sequence of events of gene induction. Finally, the theory predicts that classes of antagonist ligands may be grouped on the basis of their maximal PAA with excess added cofactor and that the members of each class differ by how they act at the same step in the overall gene induction process. Thus, this theory now makes it possible to predict how different cofactors modulate antisteroid PAA, which should be invaluable in developing more selective antagonists.

Steroids are crucial hormones in the body that are involved in development and homeostasis. They regulate gene expression by first binding to nuclear receptors that float freely in the cytosol. The receptor-steroid complex is somehow activated and transported to the nucleus, where it binds to a hormone response element and initiates transcription. Steroids can either induce or repress genes in a dose-dependent way, and the dose-response function is generally a linear-fractional function. In our work, we modeled the whole sequence of events as a complex-building biochemical reaction sequence and showed that a linear-fractional dose response can only arise under some specific but biophysically plausible conditions. See here, here, and here for more background.
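To make "linear-fractional" concrete, here is a toy first-order example (my illustration, not the model from the paper; the function name and numbers are made up):

```python
def dose_response(s, amax, ec50, basal=0.0):
    """First-order linear-fractional dose response: activity rises from
    `basal` toward `amax`, reaching halfway at steroid concentration ec50."""
    return basal + amax * s / (ec50 + s)

# At a steroid concentration equal to the EC50, the response is
# halfway between basal and maximal activity.
half_maximal = dose_response(10.0, amax=100.0, ec50=10.0)  # -> 50.0
```

In the theory, cofactors shift the maximal activity and the EC50 of such curves, and the pattern of those shifts is what identifies where in the reaction sequence they act.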

Given the importance of steroids and hormones, several important drugs target these receptors, including tamoxifen, raloxifene, and RU486. These drugs are partial agonists in that they bind to nuclear receptors and either block, reduce, or even increase gene expression. However, it was not really known how partial agonists or antagonists work. In this paper, we show that they work by altering the affinity of some reaction downstream of receptor-ligand binding, and thus they can do this in a gene-specific way. We show that the activity of a given partial agonist can be reversed by some other downstream transcription factor, provided it acts after this reaction. The theory also explains why receptor-ligand binding affinity has no effect on partial agonist activity. The theory makes specific predictions about the mechanisms of partial agonists based on how the maximal activity and the EC50 of the dose response change as you add various transcription factors.

The big problem with these drugs is that nuclear receptors act all over the body and thus the possibility of side effects is high. I think our theory could be used as a guide for developing new drugs or combinations of drugs that can target specific genes and reduce side effects.

The Ebola response

October 25, 2014

The real failure of the Ebola response is not that a physician went bowling after returning from West Africa but that there are not more doctors in West Africa containing the epidemic where it is needed. Infected patients do not shed virus until they become symptomatic, and the virus is transmitted through bodily fluids. The New York physician monitored his temperature daily and reported immediately to a designated Ebola hospital the moment he detected a high fever. We should not be scapegoating physicians who are trying to make a real difference in containing this outbreak and thereby protecting the rest of the world. This outbreak was identified in the spring of 2014, but there was no international response until late summer. We know how to contain Ebola: identify patients and isolate them. That is what we should be doing instead of making emotional and unhelpful policy decisions.

The ultimate pathogen vector

March 31, 2014

If civilization succumbs to a deadly pandemic, we will all know what the vector was. Every physician, nurse, dentist, hygienist, and health care worker is bound to check their smartphone sometime before, during, or after seeing a patient, and they are not sterilizing it afterwards. The fully hands-free smartphone could be the most important invention of the 21st century.

Optimizing food delivery

March 25, 2014

This EconTalk podcast with Frito-Lay executive Brendan O'Donohoe from 2011 gives a great account of how optimized the production and marketing system for potato chips and other salty snacks has become. The industry has a lot of very smart people trying to figure out how to maximize food consumption, from how to peel potatoes to how to stack store shelves with bags of chips. This increased efficiency is our hypothesis (e.g. see here) for the obesity epidemic. However, unlike before, when I attributed the increase in food production to changes in agricultural policy, I now believe it is mostly due to the vastly increased efficiency of food production. The podcast shows the extent of the optimization after the produce leaves the farm, but the efficiency improvements on the farm are just as dramatic. For example, farmers now use GPS to line up their crops optimally.

The Stephanie Event

January 14, 2014

You should read this article in Esquire about the advent of personalized cancer treatment for a heroic patient named Stephanie Lee. Here is Steve Hsu's blog post. The cost of sequencing is almost at the point where everyone can have their normal and tumor cells completely sequenced to look for mutations, as Stephanie did. The team at Mt. Sinai Hospital in New York described in the article inserted some of her mutations into fruit flies and then checked to see which drugs killed the flies. The Stephanie Event was the oncology board meeting at Sinai where the treatment for Stephanie Lee's colon cancer, which had spread to her liver, was discussed. They decided on a standard protocol but would use the individualized therapy based on the fly experiments if the standard treatments failed. The article is beautifully written, combining a compelling human story with science.

Fred Sanger 1918 – 2013

November 21, 2013

Perhaps the greatest biologist of the twentieth century and two-time Nobel prize winner, Fred Sanger, has died at the age of 95. He won his first Nobel in 1958 for determining the amino acid sequence of insulin and his second in 1980 for developing a method to sequence DNA.  An obituary can be found here.

TB, streptomycin, and who gets credit

September 4, 2013

The Science Show recently did a feature story about the discovery of streptomycin, the first antibiotic to treat tuberculosis, which had killed 2 billion people in the 18th and 19th centuries. Streptomycin was discovered in 1943 by graduate student Albert Schatz, who worked in the lab of Professor Selman Waksman at Rutgers. Waksman was the sole winner of the 1952 Nobel Prize for this work. The story is narrated by the author of the book Experiment Eleven, who paints Waksman as the villain and Schatz as the victim. Evidently, Waksman convinced Schatz to sign away his patent rights to Rutgers but secretly negotiated a deal to obtain 20% of the royalties. When Schatz discovered this, he sued Waksman and obtained a settlement. However, the lawsuit turned the scientific community against Schatz, and he was forced out of microbiology into science education. To me, this is just more evidence that prizes and patents are incentives for malfeasance.

New paper on childhood growth and obesity

August 1, 2013

Kevin D Hall, Nancy F Butte, Boyd A Swinburn, Carson C Chow. Dynamics of childhood growth and obesity: development and validation of a quantitative mathematical model. Lancet Diabetes and Endocrinology, 2013.

You can read the press release here.

In order to curb childhood obesity, we need a good measure of how much food kids should eat. Although people like Claire Wang have proposed plausible quantitative models in the past, Kevin Hall and I have insisted that this is a hard problem because we don't fully understand childhood growth. Unlike adults, who are more or less in steady state, growing children are a moving target. After a few fits and starts, we finally came up with a satisfactory model that modifies our two-compartment adult body composition model to incorporate growth. That previous model partitioned excess energy intake into fat and lean compartments according to the Forbes rule, which basically says that the ratio of added fat to added lean is proportional to how much fat you have, so the more fat you have, the more excess Calories go to fat. The odd consequence of that model is that the steady-state body weight is not unique but falls on a one-dimensional curve. Thus there is a whole continuum of possible body weights for a fixed diet and lifestyle. I actually don't believe this and have a modification to fix it, but that is a future story.

What puzzled me about childhood growth was: how do we know how much more to eat as we grow? After some thought, I realized that we could simply eat enough to maintain the fraction of body fat at some level, perhaps using leptin as a signal, and then tap the energy stored in fat when we need to grow. So just as we know how much gasoline (petrol) to add by simply filling the tank when it's empty, we simply eat to keep our fat reserves at some level. In terms of the model, this is a symmetry-breaking term that transfers energy from the fat compartment to the lean compartment. In my original model, I made this term a constant, had food intake increase to maintain the fat-to-lean ratio, and showed using singular perturbation theory that this would yield growth qualitatively similar to the real thing. This then sat languishing until Kevin had the brilliant idea to make the growth term time dependent and fit it to actual data that Nancy Butte and Boyd Swinburn had taken. We could then fit the model to normal-weight and obese kids to quantify how much more obese kids eat, which is more than previously believed. Another nice thing is that when the child stops growing, the model automatically reduces to the adult model!
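For readers who want the flavor of the underlying adult model, here is a minimal sketch of two-compartment, Forbes-rule energy partitioning. Everything here is illustrative: the parameter values are round textbook numbers, not the values fitted in the paper, and the time-dependent growth term discussed above is omitted:

```python
# Minimal sketch of a two-compartment (fat/lean) energy-partition model
# with Forbes-rule partitioning. Parameter values are rough textbook
# numbers for illustration only, not the fitted values from the paper.
RHO_F = 9400.0   # energy density of fat, kcal/kg (approximate)
RHO_L = 1800.0   # energy density of lean tissue, kcal/kg (approximate)
C_FORBES = 10.4  # Forbes constant, kg

def step(fat, lean, imbalance_kcal, dt=1.0):
    """Advance fat and lean mass (kg) by dt days, given a daily energy
    imbalance (intake minus expenditure) in kcal/day."""
    p_lean = C_FORBES / (C_FORBES + fat)  # Forbes rule: fraction to lean
    fat += (1.0 - p_lean) * imbalance_kcal * dt / RHO_F
    lean += p_lean * imbalance_kcal * dt / RHO_L
    return fat, lean

# A sustained 500 kcal/day surplus: as fat mass grows, the Forbes
# fraction to lean falls, so an ever larger share of the surplus
# is stored as fat.
F, L = 30.0, 50.0
for _ in range(365):
    F, L = step(F, L, 500.0)
```

Note that the partition fraction depends only on current fat mass, which is what makes the steady state non-unique: many (fat, lean) pairs are consistent with the same fixed diet.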

New paper on measuring gastric acid output

July 16, 2013

This paper started many years ago when Steve Wank, of the Digestive Diseases Branch of NIDDK, had the idea of using the new wireless pH-sensing SmartPill, which you swallow, to determine how much acid your stomach is producing. There really was no noninvasive way to monitor how well medications would work for certain reflux diseases. What he wanted was a model of gastric acid output based on the dynamics of pH when a buffer is added, in order to design a protocol for the experiment. I came up with a simple mass-action model of acid buffering and made some graphs for him. We then tested the model out in a beaker. He thought the model worked better than I did, but it was somewhat useful to him in designing the experiment.
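As a rough idea of what such a mass-action buffering model looks like, here is a toy version with invented rate constants and concentrations (my sketch, not the model or parameters from the paper):

```python
import math

# Toy mass-action model of gastric acid meeting a buffer: acid is
# secreted at a constant rate and neutralized by buffer with
# second-order (mass-action) kinetics. All rates and concentrations
# are invented for illustration.
def simulate_ph(acid_rate, buffer0, k=1.0, h0=1e-7, dt=1e-3, steps=20000):
    h, b = h0, buffer0           # [H+] and buffer concentration (mol/L)
    for _ in range(steps):
        react = k * h * b        # mass-action neutralization rate
        h += (acid_rate - react) * dt
        b = max(b - react * dt, 0.0)
    return -math.log10(h)        # pH at the end of the run

# With more buffer, the same secretion rate yields a higher pH, which
# is the basic idea behind inferring acid output from pH dynamics.
ph_low_buffer = simulate_ph(acid_rate=1e-4, buffer0=0.01)
ph_high_buffer = simulate_ph(acid_rate=1e-4, buffer0=0.05)
```

Running it backwards is the point of the method: given the measured pH trajectory and the known buffering capacity of the meal, one can estimate the secretion rate.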

Weinstein et al.  A new method for determining gastric acid output using a wireless pH-sensing capsule.  Aliment Pharmacol Ther 37: 1198 (2013)


BACKGROUND: Gastro-oesophageal reflux disease (GERD) and gastric acid hypersecretion respond well to suppression of gastric acid secretion. However, clinical management and research in diseases of acid secretion have been hindered by the lack of a non-invasive, accurate and reproducible tool to measure gastric acid output (GAO). Thus, symptoms or, in refractory cases, invasive testing may guide acid suppression therapy.

AIM: To present and validate a novel, non-invasive method of GAO analysis in healthy subjects using a wireless pH sensor, SmartPill (SP) (SmartPill Corporation, Buffalo, NY, USA).

METHODS: Twenty healthy subjects underwent conventional GAO studies with a nasogastric tube. Variables impacting liquid meal-stimulated GAO analysis were assessed by modelling and in vitro verification. Buffering capacity of Ensure Plus was empirically determined. SP GAO was calculated using the rate of acidification of the Ensure Plus meal. Gastric emptying scintigraphy and GAO studies with radiolabelled Ensure Plus and SP assessed emptying time, acidification rate and mixing. Twelve subjects had a second SP GAO study to assess reproducibility.

RESULTS: Meal-stimulated SP GAO analysis was dependent on acid secretion rate and meal-buffering capacity, but not on gastric emptying time. On repeated studies, SP GAO strongly correlated with conventional basal acid output (BAO) (r = 0.51, P = 0.02), maximal acid output (MAO) (r = 0.72, P = 0.0004) and peak acid output (PAO) (r = 0.60, P = 0.006). The SP sampled the stomach well during meal acidification.

CONCLUSIONS: SP GAO analysis is a non-invasive, accurate and reproducible method for the quantitative measurement of GAO in healthy subjects. SP GAO analysis could facilitate research and clinical management of GERD and other disorders of gastric acid secretion.

Houghton opines on the unfairness of prizes

July 12, 2013

I recently wrote about Michael Houghton declining the prestigious Gairdner prize because it left out two critical contributors to the discovery of the hepatitis C virus. Houghton has now written an opinion piece in Nature Medicine arguing that prizes should relax the restriction to three awardees, an arbitrary number I've never understood. After all, one could argue that Freeman Dyson had a reasonable claim on the Nobel Prize awarded to Feynman, Schwinger, and Tomonaga for QED. I've quoted the entire piece below.

Nature Medicine: Earlier this year, I was greatly honored with the offer of a 2013 Canada Gairdner International Award for my contributions to the discovery of the hepatitis C virus (HCV). I was selected along with Harvey Alter, chief of clinical studies in the Department of Transfusion Medicine at the US National Institutes of Health’s Clinical Center in Bethesda, Maryland, and Daniel Bradley, a consultant at the US Centers for Disease Control and Prevention in Atlanta, both of whom had a vital role in the research that eventually led to the identification and characterization of the virus.

My colleagues accepted their awards. However, I declined my C$100,000 ($98,000) prize because it excluded two other key contributors who worked with me closely to successfully isolate the viral genome for the first time. I felt that given their crucial inputs, it would be wrong of me to keep accepting major prizes just ‘on their behalf’, a situation that has developed because major award foundations and committees around the world insist that prizes be limited to no more than three recipients per topic.

HCV was identified in 1989 in my laboratory at the Chiron Corporation, a California biotechnology firm since purchased by the Swiss drug company Novartis. The discovery was the result of seven years of research in which I worked closely, both intellectually and experimentally, with Qui-Lim Choo, a member of my own laboratory, and George Kuo, who had his own laboratory next door to mine at Chiron. We finally identified the virus using a technically risky DNA-expression screening technique through which we isolated a single small nucleic acid clone from among many millions of such clones from different recombinant libraries. This was achieved without the aid of the still-evolving PCR technology to amplify the miniscule amounts of viral nucleic acid present in blood. We ultimately proved that this clone derived from a positive-stranded viral RNA genome intimately associated with hepatitis, but one not linked to either the hepatitis A or B viruses. The finding represented the first time any virus had been identified without either prior visualization of the virus itself, characterization of its antigens or viral propagation in cell culture.

The high-titer infectious chimpanzee plasma used for our molecular analyses at Chiron was provided in 1985 by Bradley, an expert in chimpanzee transmission of HCV and in the virus’s basic properties and cellular responses, with whom I had an active collaboration since 1982. The proposed aim of the collaboration was for my laboratory to apply contemporary molecular cloning methodologies to a problem that had proven intractable since the mid-1970s, when Alter and his colleagues first demonstrated the existence of non-A, non-B hepatitis (NANBH), as it was then known. Alter’s team went on to define the high incidence and medical importance of NANBH, including the virus’s propensity to cause liver fibrosis, cirrhosis and cancer. They also identified high-titer infectious human plasma in 1980 and were instrumental in promoting the adoption of surrogate tests for NANBH by blood banks to reduce the incidence of post-transfusion infection.

With regrets to the Gairdner Foundation—a generous and altruistic organization—I felt compelled to decline the International Gairdner Award without the addition of Kuo and Choo to the trio of scientists offered the award. In 1992, all five of us received the Karl Landsteiner Memorial Award from the American Association of Blood Banks. But subsequent accolades given in honor of HCV’s discovery have omitted key members of the group: only Bradley and I received the 1993 Robert Koch Prize, and only Alter and I won the 2000 Albert Lasker Award for Clinical Medical Research—in both cases, despite my repeated requests that the other scientists involved in the discovery be recognized. With the exclusion once more of Kuo and Choo from this year’s Gairdner Award, I decided that I should not continue to accept major awards without them. In doing so, I became the first person since the Gairdner’s inception in 1959 to turn down the prize.

I hope that my decision helps bring attention to a fundamental problem with many scientific prizes today. Although some awards, such as the Landsteiner, are inclusionary and emphasize outstanding team accomplishments, the majority of the world’s prestigious scientific awards—including the Gairdner, Lasker and Shaw prizes, which all seem to be modeled on the Nobel Prize and indeed are sometimes known as the ‘baby Nobels’—are usually restricted to at most three individuals per discovery. Unsurprisingly, this limitation often leads to controversy, when one or more worthy recipients are omitted from the winners list.

Perhaps what may help this situation is for awards committees to solicit, and then be responsive to, input from potential recipients themselves prior to making their final decisions. Some of the recipients are best placed to know the full and often intricate history of the discovery and collaborative efforts, and such input should help committees better understand the size of the contributing team from which they can then choose recipients according to each award’s particular policy.

With this information in hand, award organizers should be willing to award more than three researchers. As knowledge and technology grows exponentially around the world and with an increasing need for multidisciplinary collaborations to address complex questions and problems, there is a case to be made for award committees adjusting to this changing paradigm. Moreover, it is inherently unfair to exclude individuals who played a key part in the discovery. Why should they and their families suffer such great disappointment after contributing such crucial input? Some award restructuring could also be inspirational to young scientists, encouraging them to be highly interactive and collaborative in the knowledge that when a novel, long-shot idea or approach actually translates to scientific success, all key parties will be acknowledged appropriately.

In this vein, I am happy to note that the inaugural Queen Elizabeth Prize for Engineering, a new £1 million ($1.6 million) prize from the UK government, was awarded at a formal ceremony last month to five individuals who helped create the internet and the World Wide Web, even though the original guidelines stipulated a maximum of three recipients. If the Queen of England—the very emblem of tradition—can cast protocol aside, clearly other institutions can too. I hope more awards committees will follow Her Majesty’s lead.

Body weight simulator iPhone app

June 19, 2013

The body weight simulator, originally a web based java application, is now also an iPhone app (see here in iTunes).  The simulator is based on the human metabolism model developed by Kevin Hall, myself, and collaborators.  The exact model is given in detail in our Lancet paper, which is listed here along with other related references.  The app predicts the time course of your body weight given your baseline parameters and your new diet and/or new physical activity.  It will also give a suggested daily caloric intake to attain a new weight over a specified period of time along with the diet required to maintain that weight.  The model uses parameters calibrated to the average American so your own mileage will vary.  Also, I basically wrote the app in my spare time over the past year so it is pretty primitive as far as apps go but it does the job.  Please try it out and give me feedback.
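As a rough caricature of how such a simulator behaves (the actual model in our Lancet paper tracks fat and lean mass separately and includes much more physiology), body weight relaxes toward a steady state where intake equals expenditure. All numbers below are round illustrative values, not the calibrated parameters:

```python
# One-compartment caricature of an energy-balance weight model:
# expenditure rises linearly with weight, so a permanent change in
# intake moves weight toward a new steady state. Illustrative numbers
# only, not the calibrated model behind the app.
RHO = 7700.0    # approx energy density of body-weight change, kcal/kg
GAMMA = 22.0    # kcal/day of expenditure per kg of body weight

def simulate_weight(w0, intake, days, k=900.0):
    """Euler-integrate dW/dt = (intake - (k + GAMMA*W)) / RHO in daily steps."""
    w = w0
    for _ in range(days):
        expenditure = k + GAMMA * w
        w += (intake - expenditure) / RHO
    return w

# Steady state satisfies intake = k + GAMMA*W, i.e. W* = (intake - k) / GAMMA.
w_final = simulate_weight(w0=90.0, intake=2500.0, days=5 * 365)
```

The slow time constant (RHO/GAMMA, roughly a year here) is why the app reports a time course rather than an instant answer: the same diet change plays out over years.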


Genes can no longer be patented

June 13, 2013

The US Supreme Court ruled today that human genes cannot be patented. Here is the link to the New York Times article. The specific case regards Myriad Genetics, which held a patent that controlled the rights to all tests for the BRCA1 and BRCA2 genes implicated in breast cancer. The patent essentially blocked most research on the BRCA genes. The immediate effect will be that genetic testing will become cheaper and more widespread. People will argue that not allowing genes to be patented will discourage further innovation. I doubt it. Most discoveries, like genes, come from basic federally funded research. Any company can now develop a test for any newly discovered gene. Patent law has been broken for decades and this is just one small step to correcting it.

The demise of Arbaclofen for Fragile X

June 7, 2013

Seaside Therapeutics recently announced that it is withdrawing its Fragile X drug arbaclofen (STX209) from further clinical trials (see here). The drug had already reached Phase 3 and showed promise in some patients, but probably not enough to secure funding to continue or to guarantee FDA approval. See the New York Times story for some personal accounts of the impact of this decision. The drug is a GABA-B agonist similar to baclofen, which is used to treat muscle spasms. Fragile X syndrome, which has symptoms similar to autism, is caused by a mutation of the FMR1 gene that silences production of the FMRP protein. As with most proteins, it is not exactly clear what FMRP does, except that it may be involved in protein translation and affects synaptic plasticity in mouse models. One hypothesis for the cause of autism and Fragile X is that there is an overabundance of synaptic excitation.

My paper with Shashaank Vattikuti explored the effects of such imbalances in a cortical circuit model and showed that it could reproduce some psychophysical experiments (see here for a summary of the paper). It is thus a plausible hypothesis that a GABA-B agonist, which enhances inhibitory transmission, may alleviate some of the symptoms, and I believe that it does in some patients. However, such a blunt instrument would probably not work in all patients. One reason is that not all imbalances between excitation and inhibition are equal, i.e. too much excitation may not be the same as too little inhibition. A neural circuit with very high excitation and inhibition balancing each other can behave very differently from one with low amounts of each balancing each other. The high circuit would have high gain and be very responsive to perturbations, while the low circuit would have low gain. It is also not clear that simply increasing inhibition everywhere will result in a net increase in inhibition, because of the multiple feedback loops in the system: increasing inhibition between inhibitory neurons could decrease the net inhibition on excitatory neurons.
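The gain point can be illustrated with a generic two-unit rate model (a standard textbook construction, not the circuit model from our paper): two circuits with identically balanced excitation and inhibition, and identical linear stability, respond very differently to the same kick when the coupling is strong. The weights below are arbitrary:

```python
# Toy two-population rate model: one excitatory (E) and one inhibitory (I)
# unit with perfectly balanced coupling of strength w. Both circuits are
# stable, but the strongly coupled one transiently amplifies a kick.
def peak_response(w, dt=0.001, t_max=10.0):
    """Kick the excitatory unit (E=1) and return its peak activity under
    dE/dt = -E + w*E - w*I,  dI/dt = -I + w*E - w*I (Euler integration)."""
    e, i, peak = 1.0, 0.0, 1.0
    for _ in range(int(t_max / dt)):
        de = -e + w * e - w * i
        di = -i + w * e - w * i
        e += de * dt
        i += di * dt
        peak = max(peak, e)
    return peak

weak = peak_response(w=1.0)    # low-gain balanced circuit: kick just decays
strong = peak_response(w=8.0)  # high-gain balanced circuit: big transient
```

Both circuits have the same eigenvalues, so "balanced and stable" does not pin down the response; the strongly coupled circuit amplifies the perturbation several-fold before it decays.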

The trials seem to show that about a third of the patients improved with arbaclofen. They are probably the ones that have too little inhibition, so increasing inhibition helps. I think this case suggests that we may need a new model for FDA approval of drugs. Perhaps we should not insist that drugs treat specific illnesses; they could also be approved if they are shown to have some biological effect and do no harm. I believe that there are many drugs that have failed to obtain FDA approval that actually do work and could help some patients. Instead of waiting until we can figure out ahead of time which patients will benefit from a given drug before we approve it, we could approve it for restricted use and try it on patients to see what happens. The danger, of course, is that it may be difficult to know whether a drug works, and desperate patients, especially parents, will insist on using a drug even if the physician believes it has no effect. This could cause harm and increase the cost of medical care. One thing we could do is have the government or nonprofit companies take over failed but safe drugs and provide them at low cost under some regulation. Actually, I think we need to completely revamp how drugs are developed, but I will leave that to a future post.

Most of neuroscience is wrong

May 20, 2013

John Ioannidis has a recent paper in Nature Reviews Neuroscience arguing that many results in neuroscience are wrong. The argument follows his previous papers on why most published results are wrong (see here and here) but emphasizes the abundance of studies with small sample sizes in neuroscience. Small samples both reduce the chances of finding true positives and increase the chances of obtaining false positives. Underpowered studies are also susceptible to what is called the "winner's curse," where the effect sizes of true positives are artificially amplified. My take is that any phenomenon with a small effect should be treated with caution even if it is real. If you really want to find what causes a given disease, then you probably want to find something that is associated with all cases, not just a small percentage of them.
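The winner's curse is easy to demonstrate by simulation. In this made-up example, studies of 10 subjects estimate a true effect of 0.3 (in units of the noise standard deviation); averaging only the studies that clear a significance-like cutoff badly inflates the estimate:

```python
import random
import statistics

# Toy simulation of the "winner's curse": when small studies are filtered
# by a significance threshold, the surviving effect estimates are inflated.
# The true effect, sample size, and cutoff are all invented for illustration.
random.seed(0)

def mean_observed_effect(true_effect=0.3, n=10, sims=5000, threshold=None):
    """Average observed effect over simulated studies of n subjects each.
    If threshold is set, average only "significant" studies whose sample
    mean exceeds it (0.52 is roughly a one-sided p < 0.05 cutoff for n=10)."""
    kept = []
    for _ in range(sims):
        sample_mean = statistics.mean(
            random.gauss(true_effect, 1.0) for _ in range(n))
        if threshold is None or sample_mean > threshold:
            kept.append(sample_mean)
    return statistics.mean(kept)

all_studies = mean_observed_effect()                # unbiased, near 0.3
significant = mean_observed_effect(threshold=0.52)  # inflated well above 0.3
```

With n = 10 the study is underpowered, so only the lucky, overestimated draws clear the cutoff; increasing n shrinks both the false-positive rate and the inflation.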

New paper on fat

April 19, 2013

Sex-Associated Differences in Free Fatty Acid Flux of Obese Adolescents.

Diane C Adler-Wailes, Vipul Periwal, Asem H Ali, Sheila M Brady, Jennifer R McDuffie, Gabriel I Uwaifo, Marian Tanofsky-Kraff, Christine G Salaita, Van S Hubbard, James C Reynolds, Carson C Chow, Anne E Sumner, Jack A Yanovski

Section on Growth and Obesity (D.C.A.-W., A.H.A., S.J.R.M., G.I.U., M.T.-K., J.A.Y.), Program in Developmental Endocrinology and Genetics, Eunice Kennedy Shriver National Institute of Child Health and Human Development; Mathematical Cell Modeling Section (V.P., C.C.C.), Division of Extramural Activities (C.G.S.), Division of Nutrition Research Coordination (V.S.H.), and Laboratory of Endocrinology and Receptor Biology (A.E.S.), National Institute of Diabetes and Digestive and Kidney Diseases; and Nuclear Medicine Department (J.C.R.), Hatfield Clinical Research Center, National Institutes of Health, U.S. Department of Health and Human Services, Bethesda, Maryland 20892.

The Journal of Clinical Endocrinology and Metabolism, 02/2013. DOI: 10.1210/jc.2012-3817

ABSTRACT Context: In obesity, increases in free fatty acid (FFA) flux can predict development of insulin resistance. Adult women release more FFA relative to resting energy expenditure (REE) and have greater FFA clearance rates than men. In adolescents, it is unknown whether sex differences in FFA flux occur. Objective: Our objective was to determine the associations of sex, REE, and body composition with FFA kinetics in obese adolescents. Participants: Participants were from a convenience sample of 112 non-Hispanic white and black adolescents (31% male; age range, 12-18 years; body mass index SD score range, 1.6-3.1) studied before initiating obesity treatment. Main Outcome Measures: Glucose, insulin, and FFA were measured during insulin-modified frequently sampled iv glucose tolerance tests. Minimal models for glucose and FFA calculated insulin sensitivity index (SI) and FFA kinetics, including maximum (l0 + l2) and insulin-suppressed (l2) lipolysis rates, clearance rate constant (cf), and insulin concentration for 50% lipolysis suppression (ED50). Relationships of FFA measures to sex, REE, fat mass (FM), lean body mass (LBM) and visceral adipose tissue (VAT) were examined. Results: In models accounting for age, race, pubertal status, height, FM, and LBM, we found sex, pubertal status, age, and REE independently contributed to the prediction of l2 and l0 + l2 (P < .05). Sex and REE independently predicted ED50 (P < .05). Sex, FM/VAT, and LBM were independent predictors of cf. Girls had greater l2, l0 + l2 and ED50 (P < .05, adjusted for REE) and greater cf (P < .05, adjusted for FM or VAT) than boys. Conclusion: Independent of the effects of REE and FM, FFA kinetics differ significantly in obese adolescent girls and boys, suggesting greater FFA flux among girls.

Slides for ACP talk

April 9, 2013

I just gave a talk on obesity at a diabetes course at the American College of Physicians meeting in San Francisco.  My slides are here.

Hepatitis C and the folly of prizes

April 3, 2013

The scientific world was set slightly aflutter when Michael Houghton turned down the prestigious Gairdner Award for the discovery of hepatitis C. Harvey Alter and Daniel Bradley were the two other recipients. Houghton, who had previously received the Lasker Award with Alter, felt he could not accept one more award because two colleagues, Qui-Lim Choo and George Kuo, did not receive either of these awards, even though their contributions were equally important.

Hepatitis, which literally means inflammation of the liver, was characterized by Hippocrates and has been known to be infectious since the 8th century. The disease was postulated to be viral at the beginning of the 20th century, and by the 1960s two viruses, termed hepatitis A and hepatitis B, had been established. However, there still seemed to be another unidentified infectious agent, which was termed non-A, non-B hepatitis (NANBH).

Michael Houghton, George Kuo, and Qui-Lim Choo were all working at the Chiron Corporation in the early 1980s. Houghton started a project to discover the cause of NANBH in 1982, with Choo joining a short time later. They made significant progress in generating mouse monoclonal antibodies with some specificity for NANBH-infected materials from chimpanzee samples received from Daniel Bradley at the CDC. They used the antibodies to screen cDNA libraries from infected materials but had not isolated an agent. George Kuo had his own lab at Chiron working on other projects but would interact with Houghton and Choo. Kuo suggested that they try blind cDNA immunoscreening on serum derived from actual NANBH patients. This approach was felt to be too risky, but Kuo made a quantitative assessment that showed it was viable. After two years of intensive and heroic screening by the three of them, they identified one clone that was clearly derived from the NANBH genome and not from human or chimp DNA. This was definitive proof that NANBH was a virus, now called hepatitis C. Kuo then developed a prototype of a clinical hepatitis C antibody detection kit and used it to screen a panel of NANBH blood provided by Harvey Alter of the NIH. Kuo's test was a resounding success, and the blood test that came out of that work has probably saved 300 million or more people from hepatitis C infection.

The question then is who deserves the prizes: Bradley and Alter, who did careful and diligent work obtaining samples, or Houghton, Choo, and Kuo, who did the heroic experiments that isolated the virus? For completely unknown reasons, the Lasker was awarded to just Houghton and Alter, which primed the pump for more prizes for these two. With the Lasker and Gairdner prizes now decided, that leaves just the Nobel Prize. The scientific community could get it right this time and award it to Kuo, Choo, and Houghton.


Addendum added 2013-5-2:  I should add that many labs from around the world were also trying to isolate the infective agent of NANBH and all failed to identify the correct samples from Alter’s panel.  It is not clear how long it would have been and how many more people would have been infected if Kuo, Choo, and Houghton had not succeeded when they did.

Epipheo video

February 1, 2013

The narration of the video comes from an interview with me.


A meal for a day

January 17, 2013

The Center for Science in the Public Interest has some examples of restaurant meals that contain a whole day's caloric requirements. And you doubted the push hypothesis for the obesity epidemic.

