Archive for the ‘Opinion’ Category

Unintended consequences

September 10, 2014

Here is a true story. A young man is trained to hit people as hard as possible and to react immediately to any provocation with unhindered aggression. He signs a five-year contract for 35 million dollars to do this 16 times a year, or more if he and his colleagues are very successful at doing it. One day he gets upset with his fiancée and strikes her in the head so hard that she is knocked unconscious in a public place. This creates a minor stir, so the employer mandates that he apologize and prohibits him from smashing into people for 2 of the 16 times he is scheduled to do so. The fiancée-now-spouse also refuses to press charges because she doesn't want to jeopardize the 27 million owed to the man over the next 3 years. However, a video of the incident is made public, creating a huge uproar, so the employer abruptly fires and condemns the man, since he is no longer financially useful to the employer.

The public now feels vindicated that such a despicable man is no longer employed and that domestic violence is finally given the attention it deserves. However, the spouse is very unhappy because her comfortable lifestyle has just been pulled out from under her. Now, other spouses who live with violent but rich men will be even more silent about abuse because they fear losing their livelihoods too.

If we really cared about victims of domestic violence, we would force the employer to set up a fund to ensure that spouses who come forward are compensated financially. We would also force them to support institutions that help the many more victims of domestic abuse who are not married to rich and famous people. This young man is probably an upstanding citizen most of the time. Now he is unemployed and potentially even angrier. He should not be thrown out onto the street but given a chance to redeem himself. The employers and the system that trained and groomed these young men need to look at themselves.

Saving large animals

January 11, 2013

One story in the news lately is the dramatic increase in the poaching of African elephants (e.g. New York Times). Elephant numbers have plunged dramatically in the past few years and their outlook is not good. This is basically true of most large animals: whales, pandas, rhinos, bluefin tuna, whooping cranes, manatees, sturgeon, etc. However, one large animal has done extremely well while the others have languished. In the US it had a population of zero 500 years ago and now it's probably around 100 million. That animal, as you have probably guessed, is the cow. While wild animals are being hunted to extinction or dying due to habitat loss and climate change, domestic animals are thriving. We have no shortage of cows, pigs, horses, dogs, and cats.

Given that current conservation efforts are struggling to save the animals we love, we may need to try a new strategy. A complete ban on ivory has not stopped the ivory trade, just as a ban on illicit drugs has not stopped drug use. Prohibition does not seem to be a sure way to curb demand. It may just be that starting some type of elephant farming is the only way to save the elephants. It could raise revenue to help protect wild elephants and could drop the price of ivory sufficiently to make poaching less profitable. It could also backfire and increase the demand for ivory.

Another counterintuitive strategy may be to sanction limited hunting of some animals. The introduction of wolves into Yellowstone Park has been a resounding ecological success, but it has also angered some people, like ranchers and deer hunters. The backlash against the wolf has already begun. One ironic way to save wolves could be to legalize hunting them. This would give hunters an incentive to save and conserve wolves. Given that the sets of hunters and ranchers often have a significant intersection, this could dampen the backlash. There is a big difference in attitudes towards conservation when people hunt to live versus hunting for sport. When it's a job, we tend to hunt to extinction, as with the buffalo, cod, elephants, and bluefin tuna. However, when it's for sport, people want to ensure the species thrives. While I realize that this is controversial and many people have a great disdain for hunting, I would suggest that hunting is no less humane, and perhaps more so, than factory abattoirs.

Complete solutions to life’s little problems

September 25, 2012

One of the nice consequences of the finiteness of human existence is that there can exist complete solutions to some of our problems.  For example, I used to leave the gasoline (petrol for non-Americans) cap of my car on top of the gas pump every once in a while.  This has now been completely solved by the ludicrously simple solution of tethering the cap to the car.  I could still drive off with the gas cap dangling but I wouldn't lose it.  The same goes for locking myself out of my car; the advent of remote control locks has eliminated this problem.  Because human reaction time is finite, there is also an absolute threshold for internet bandwidth above which the web browser will seem instantaneous for loading pages and simple computations.  Given our finite lifespan, there is also a threshold for the amount of disk space required to store every document, video, and photo we will ever want.  The converse is that there are also more books in existence than we can possibly read in a lifetime, although there will always be just a finite number of books by specific authors that we may enjoy.  I think one strategy for life is to make finite as many things as possible because then there is a chance for a complete solution.

The flipside of Medicare efficiency

August 26, 2012

The selection of Paul Ryan as the Republican vice presidential candidate for the upcoming US federal election has brought health care reform back into the spotlight.  While the debate has been highly acrimonious, the one point that everyone seems to agree on is that the rate of increase in health care spending, and in particular Medicare spending, is unsustainable.  Health care is currently one sixth of the economy and it will take up an increasing share if the growth is not reduced.  I think that a really expensive health care system may actually be a good thing.  What people tend to forget is that there are two sides to a cost.  When we pay for healthcare, that money goes to someone.  Making something more efficient means producing the same amount of stuff with fewer people.

The official unemployment rate is currently about 8%, but the actual fraction of people who don't work or wish they had more work is much higher.  Efficiency eliminates jobs.  People like Tom Friedman of the New York Times think (e.g. see here) that this will just free us up to do "creative" jobs.  However, what if you are a person who doesn't want to, or is unable to, do a "creative" job?  My guess is that as we become more efficient, more and more people will be left with nothing to do.  The solution is either to have a massive welfare system or to become less efficient.

However, not all inefficiencies are equal. We wouldn't want monopolies, where all the money flows to a small number of individuals.  What we need is a highly stochastic form of inefficiency that involves lots of people. Healthcare may be just what we need. It is highly decentralized, it affects everyone, and it can't be easily outsourced. I've argued before that having 80% of the economy devoted to healthcare doesn't seem that outlandish.  After all, how many flat-screen TVs do you need?

Is abstract thinking necessary?

August 1, 2012

Noted social scientist Andrew Hacker wrote a provocative opinion piece in Sunday's New York Times arguing that we relax mathematics requirements for higher education. Here are some excerpts from his piece:

New York Times: A TYPICAL American school day finds some six million high school students and two million college freshmen struggling with algebra. In both high school and college, all too many students are expected to fail. Why do we subject American students to this ordeal? I’ve found myself moving toward the strong view that we shouldn’t.

…There are many defenses of algebra and the virtue of learning it. Most of them sound reasonable on first hearing; many of them I once accepted. But the more I examine them, the clearer it seems that they are largely or wholly wrong — unsupported by research or evidence, or based on wishful logic. (I’m not talking about quantitative skills, critical for informed citizenship and personal finance, but a very different ballgame.)

…The toll mathematics takes begins early. To our nation’s shame, one in four ninth graders fail to finish high school. In South Carolina, 34 percent fell away in 2008-9, according to national data released last year; for Nevada, it was 45 percent. Most of the educators I’ve talked with cite algebra as the major academic reason.

The expected reaction from some of my colleagues was understandably negative. After all, we live in a world that is becoming more complex, requiring more mathematical skills, not less. Mathematics is as essential to one's education as reading. In the past, I too would have wholeheartedly agreed. However, over the past few years I have started to think otherwise. Just to clarify, neither Hacker nor I believe that critical thinking is unimportant. He argues forcefully that all citizens should have a fundamental grounding in the concepts of arithmetic, statistics and quantitative reasoning. I have even posted before (see here) that I thought mathematics should be part of the accepted canon of what an educated citizen should know, and I'm not backing away from that belief. Hacker thinks we should be taught a "citizen's statistics" course. My suggested course was "Science and mathematics survival tools for the modern world."  The question is whether or not we should expect all students to master the abstract reasoning skills necessary for algebra.

I'll probably catch a lot of flak for saying this, but from my professional and personal experience, I believe that there is a significant fraction of the population that is either unable or unwilling to think abstractly.  I also don't think we can separate lack of desire from lack of ability. The willingness to learn something may be just as "innate" as the ability to do something. I think everyone can agree that on the abstract thinking scale almost everyone can learn to add and subtract but only a select few can understand cohomology theory.  In our current system, we put high school algebra as the minimum threshold, but is this a reasonable place to draw the line? What we need to know is the distribution of people's maximum capacity for abstract thinking. The current model requires that the distribution be almost zero to the left of algebra with a fat tail on the right. But what if the actual distribution is broad with a peak somewhere near calculus?  In this case, there would be a large fraction of the population to the left of algebra. This is pure speculation, but there could even be a neurophysiological basis to abstract thinking in terms of the fraction of neural connections within higher cortical areas versus connections between cortical and sensory areas. There could be a trade-off between abstract thinking and sensory processing. This need not even be purely genetic. As I posted before, not all the neural connections can be set by the genome, so most are either random or arise through plasticity.
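
To make the thresholding point concrete, here is a minimal sketch (with an entirely hypothetical "abstraction" scale and made-up parameters) of how the fraction of the population left of an algebra threshold depends on the assumed distribution:

```python
# Hypothetical sketch: fraction of the population falling left of an
# "algebra threshold" under two assumed distributions of abstract-thinking
# capacity. The scale and all numbers are made up for illustration only.
from scipy.stats import norm

algebra = 1.0   # arbitrary position of algebra on the abstraction scale
calculus = 2.0  # arbitrary position of calculus

# Current model: the distribution is almost zero left of algebra
# (peaked far to the right, with a thin left tail).
print(norm.cdf(algebra, loc=3.0, scale=0.8))  # ~0.6% left of algebra

# Alternative: a broad distribution peaked near calculus leaves a
# sizable fraction of the population left of algebra.
print(norm.cdf(algebra, loc=calculus, scale=1.0))  # ~16% left of algebra
```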

To me, the most important issue that Hacker brings up is not whether we should make everyone learn algebra, but what we should do about the people who don't, and who as a result are denied the opportunity to attend college and secure a financially stable life. Should we devote our resources to teaching it to them better, or should we develop alternative ways for these people to be productive in our society? I really think we should re-evaluate the goal that everyone goes to college. In fact, given the exorbitant cost and the rise of online education, the trend away from traditional college may have already begun. We should put more emphasis on apprenticeship programs and community colleges. Given the rapid rate of change in the job market, education and training should be thought of as a continual process instead of the current model of four years and out. I do believe that a functional democracy requires an educated citizenry. However, college attendance has been steadily increasing for the past few decades, and one would be hard pressed to argue that democracy has concomitantly improved. A new model may be in order.

A new model for publishing

January 2, 2012

Two months ago in a guest editorial for DSWeb (see here), I expressed some dismay that while we have had great innovation in many aspects of our work lives, the current (broken) publication model has remained relatively unchanged.  Now my colleagues at NIH, Dwight Kravitz and Chris Baker, have published a stimulating and provocative article (see here) highlighting the many problems with the current situation, especially the wasteful treadmill of trying to get something into a "high impact" journal, and proposing a new model.  Although this will mostly have salience for people in fields that try to publish in journals like Nature and Science, I recommend that anyone who publishes read the paper and form their own opinion.  Here is mathematician Kreso Josic's take on the paper.

From my view as a physicist cum mathematician cum biologist, I've seen publishing from several perspectives.  The theoretical physics/applied math world seems to have a good system already in place, where everyone posts their papers on the arXiv and then publishes in an "obvious" physics or math journal like one of the Physical Review or SIAM ones.  These journals are fairly low cost (though not cheap) for authors who forgo colour figures and physical reprints, and they have a nice system of automatically transferring rejected papers to sister journals, so the review process is efficient.  However, publishing in the biology world is more of a nightmare, as is well documented by Dwight and Chris in their paper.  Here, getting into a high impact journal like Nature or Science can make or break your career, and the chances of getting in are slim.  Authors spend a lot of their time and energy trying to get their work published, and if you have little name recognition in a field it is extremely difficult just to get your paper reviewed by the more prestigious journals. Dwight and Chris have some excellent ideas for how to fix this system, which I think have a lot of merit.  The one thing I would like to see is for the cost to authors to be as low as possible so that it doesn't impede poorly funded labs.

The pitfalls of obesity and cancer drugs

November 23, 2011

The big medical news last week was that the US FDA revoked the approval of the drug Avastin for the treatment of breast cancer.  The reason was that any potential efficacy did not outweigh the side effects.  Avastin is an anti-angiogenesis drug that blocks the formation of blood vessels by inhibiting the vascular endothelial growth factor VEGF-A.  This class of drugs is a big money-maker for the biotechnology firm Genentech and has been used in cancer treatments and for macular degeneration, where it is called Lucentis.  Avastin will still be allowed for colorectal and lung cancer, and physicians can still prescribe it off-label for breast cancer.  The strategy of targeting blood delivery as an anti-tumour strategy was pioneered by Judah Folkman.  He and collaborators also showed that adipose tissue mass (i.e. fat cells) can be regulated by controlling blood vessel growth (Rupnick et al., 2002), and this has been proposed as a potential therapy for obesity (e.g. Kolonin et al., 2004; Barnhart et al., 2011).  However, the idea will probably not go very far because of potentially severe side effects.

I think this episode illustrates a major problem in developing any type of drug for obesity and, to some degree, cancer.  I've posted on the basic physiology and physics of weight change multiple times before (see here) so I won't go into details here, but suffice it to say that we get fat because we eat more than we burn.  Consider this silly analogy: suppose we have a car with an expandable gas tank and we seem to be overfilling it all the time so that it's getting really big and heavy.  What should we do to lighten the car?  Well, there are three basic strategies: 1) we can put a hole in the gas tank so that gas leaks out as we fill it, 2) we can make the engine more inefficient so it burns gas faster, or 3) we can put less gas in the car.  If you look at it this way, the first two strategies seem completely absurd, but they are pursued all the time in obesity research.  The drug Orlistat blocks absorption of fat in the intestines, which basically makes the gas tank (and your bowels) leaky.  One of the most celebrated recent findings in obesity research was that human adults have brown fat.  This is a type of adipocyte that converts food energy directly into heat.  It is abundant in small mammals like rodents and in babies (that's why your newborn is nice and warm) but was thought to disappear in adults. Now various labs are trying to develop drugs that activate brown fat.  In essence, they want to make us less efficient and turn us into heaters.  The third strategy of reducing input has also been tried and has failed various times.  Stimulants such as methamphetamines were found very early on to suppress appetite, but turning people into speed addicts wasn't a viable strategy.  A recent grand failure was Rimonabant, a blocker of the cannabinoid receptor CB-1.  It worked on the principle that since cannabis seems to enhance appetite, blocking the receptor should suppress appetite. It did work, but it also caused severe depression and suicidal thoughts.  Also, given that CB-1 is important in governing synaptic strengths, I'm sure there would have been bad long-term effects as well. I won't bother telling the story of fen-phen.

It's easy to see why almost all obesity drug therapies will fail: they must target some important component of metabolism or neural function.  While we seem to have some unconscious controls of appetite and satiety, we can also easily override them (as I plan to do tomorrow for Thanksgiving).  Hence, any drug that targets some mechanism will likely either cause bad side effects or be compensated for by other mechanisms.  This also applies to some degree to cancer drugs, which must kill cancer cells while sparing healthy cells.  This is why I tend not to get overly excited whenever another new discovery in obesity research is announced.

Guest editorial

October 28, 2011

I have a guest editorial in the SIAM dynamical systems online magazine DSWeb this month.  The full text of the editorial is below and the link is here.  I actually had several ideas circulating in my head and didn't really know what would come out until I started to write.  This is how my weekly blog posts often go.  The process of writing itself helps to solidify inchoate ideas.  I think that too often young people want to wait until everything is under control before they write. I try to tell them never to just sit there and stare at a screen.  Just start writing and something will come.

Math in the Twenty-First Century

I was a graduate student at MIT in the late eighties. When I started, I wrote Fortran code in the Emacs editor at a VT100 terminal to run simulations on a Digital Equipment Corporation VAX computer. When I didn't understand something, I would try to find a book or paper on the topic. I spent hours in the library reading and photocopying. Somehow, I managed to find and read everything that was related to my thesis topic. Email had just become widespread at that time. I recall taking to it immediately and finding it an indispensable tool for keeping in touch with my friends at other universities and even for making dinner plans. Then I got a desktop workstation running X Windows. I loved it. I could read email and run my code simultaneously. I could also log onto the mainframe computer if necessary. As I was finishing up, my advisor got a fax machine for the office (hard to believe that they are that recent and now obsolete) and used it almost every day.

I think that immediate integration of technology has been the theme of the past twenty-five years. Each new innovation – email, the desktop computer, the fax machine, the laser printer, the world wide web, mobile phones, digital cameras, PowerPoint slides, iPods, and so forth – becomes so quickly enmeshed into our lives that we can't imagine what life would be like without them. Today, if I want to know what some mathematical term means I can just type it into Google and there will usually be a Wikipedia or Scholarpedia page on it. If I need a reference, I can download it immediately. If I have a question, I can post it on Math Overflow and someone, possibly a Fields Medalist, will answer it quickly (I actually haven't done this yet myself but have watched it in action). Instead of walking over to the auditorium, I can now sit in my office and watch lectures online. My life was not like this fifteen years ago.

Yet, despite this rapid technological change, we still publish in the same old way. Sure, we can now submit our papers on the web and there are online journals, but there has been surprisingly little innovation otherwise. In particular, many of the journals we publish in are not freely available to the public. The journal publishing industry is a monopoly with surprising staying power. If you are not behind the cozy firewall of an academic institution, much of the knowledge we produce is inaccessible to you. Math is much better off than most other sciences since people post their papers to the arXiv. This is a great thing but it is not the same as a refereed journal. Perhaps now is the time for us to come up with a new model for presenting our work – something that is refereed and public. Something new. I don't know what it should look like, but I do know that when it comes around I'll wonder how I ever got along without it.

Correlations

September 10, 2011

If I had to compress everything that ails us today into one word, it would be correlations.  Basically, everything bad that has happened recently, from the financial crisis to political gridlock, is due to undesired correlations.  That is not to say that all correlations are bad. Obviously, a system without any correlations is simply noise.  You would certainly want the activity on an assembly line in a factory to be correlated. Useful correlations are usually serial in nature, as when an invention leads to a new company. Bad correlations are mostly parallel, like all the members of Congress voting exclusively along party lines, which reduces an assembly of hundreds of people to just two. A recession occurs when everyone in the economy suddenly decides to decrease spending all at once.  In a healthy economy, people would be uncorrelated, so some would spend more when others spend less and aggregate demand would be about constant. When people's spending habits are tightly correlated and everyone decides to save more at the same time, there is less demand for goods and services in the economy, so companies must lay people off, resulting in even less demand and leading to a vicious cycle.

The financial crisis that triggered the recession was due to the collapse of the housing bubble, another unwanted correlated event.  This was exacerbated by collateralized debt obligations (CDOs), which are financial instruments that were doomed by unwanted correlations.  In case you haven't followed the crisis, here's a simple explanation. Say you have a set of loans where you think the default rate is 50%. Hence, given a hundred mortgages, you know fifty will fail but you don't know which. The way to make a triple-A bond out of these risky mortgages is to lump them together and divide the lump into tranches that have different seniority (i.e. get paid off sequentially).  The most senior tranche will be paid off first and have the highest bond rating.  If fifty of the hundred loans go bad, the senior tranche will still get paid. This is great as long as the mortgages are only weakly correlated and you know what that correlation is. However, if the mortgages fail together then all the tranches will be bad.  This is what happened when the bubble collapsed. Correlations in how people responded to the collapse made it even worse.  When some CDOs started to fail, people panicked collectively and didn't trust any CDOs, even though some of them were still okay. The market for CDOs froze, so people who had them and wanted to sell couldn't, even at a discount. This is why the federal government stepped in.  The bailout was deemed necessary because of bad correlations.  Just between you and me, I would have let all the banks fail.
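
To see how much correlations matter here, consider a minimal simulation (a sketch with made-up parameters: 100 mortgages, a 50% default rate, and a senior tranche that survives as long as no more than 60 loans fail; correlated defaults are generated with a simple one-factor model):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def senior_loss_prob(rho, n_loans=100, p_default=0.5, attach=60, trials=100_000):
    """Probability that defaults exceed the senior tranche's attachment point.

    Defaults come from a one-factor model: loan i defaults when
    sqrt(rho)*Z + sqrt(1-rho)*eps_i falls below the p_default quantile,
    where Z is a common factor shared by all loans in the pool.
    """
    threshold = norm.ppf(p_default)
    Z = rng.standard_normal((trials, 1))            # common market factor
    eps = rng.standard_normal((trials, n_loans))    # loan-specific noise
    defaults = (np.sqrt(rho) * Z + np.sqrt(1 - rho) * eps < threshold).sum(axis=1)
    return (defaults > attach).mean()

print(senior_loss_prob(rho=0.0))  # independent loans: rare (~2%)
print(senior_loss_prob(rho=0.3))  # correlated loans: over ten times more likely
```

With independent loans the number of defaults concentrates tightly around fifty, so the senior tranche is nearly safe; adding even moderate correlation fattens the tail and makes a senior-tranche loss routine.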

We can quantify the effect of correlations in a simple example, which will also show the difference between the sample mean and the population mean. Let's say you have some variable x that estimates some quantity. The expectation value (population mean) is \langle x \rangle = \mu.  The variance of x, \langle x^2 \rangle - \langle x \rangle^2=\sigma^2, gives an estimate of the square of the error. If you want to decrease the error of the estimate, you can take more measurements. So let's consider a sample of n measurements.  The sample mean is (1/n)\sum_{i=1}^n x_i. The expectation value of the sample mean is (1/n)\sum_{i=1}^n \langle x_i \rangle = (n/n)\langle x \rangle = \mu. The variance of the sample mean is

\langle [(1/n)\sum_i x_i]^2 \rangle - \langle x \rangle ^2 = (1/n^2)\sum_i \langle x_i^2\rangle + (1/n^2) \sum_{j\ne k} \langle x_j x_k \rangle - \langle x \rangle^2

Let C=\langle (x_j-\mu)(x_k-\mu)\rangle be the correlation between two measurements. Hence, \langle x_j x_k \rangle = C +\mu^2. The variance of the sample mean is thus \frac{1}{n} \sigma^2 + \frac{n-1}{n} C.  If the measurements are uncorrelated (C=0), then the variance is \sigma^2/n, i.e. the standard deviation or error is decreased by the square root of the number of samples.  However, if there are nonzero correlations, then no matter how many measurements you take, the variance can only be reduced to the level of the correlation C.  Thus, correlations set a lower bound on the error of any estimate.  Another way to think about this is that correlations reduce entropy, and lower entropy means less information.  One way to cure our current problems is to destroy parallel correlations.
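
A quick numerical check of this formula (a sketch: correlated measurements are built by adding a shared random offset with variance C to private noise with variance \sigma^2 - C):

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 10, 200_000
sigma2, C = 1.0, 0.25   # Var(x_i) = sigma2; Cov(x_j, x_k) = C for j != k

# A shared offset (variance C) plus private noise (variance sigma2 - C)
# gives measurements with exactly this variance and covariance.
shared = np.sqrt(C) * rng.standard_normal((trials, 1))
private = np.sqrt(sigma2 - C) * rng.standard_normal((trials, n))
sample_means = (shared + private).mean(axis=1)

print(sample_means.var())            # empirical variance of the sample mean
print(sigma2 / n + (n - 1) / n * C)  # formula: 0.1 + 0.9 * 0.25 = 0.325
```

Taking n to infinity in the formula leaves exactly C: more measurements cannot average away what every measurement shares.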

 

The perils of sugar

April 15, 2011

Science writer Gary Taubes has a provocative article in the forthcoming New York Times Magazine on whether sugar is toxic.  Taubes has penned two well-received books on metabolism and obesity recently – Good Calories, Bad Calories and Why We Get Fat.  In the context of the article, sugar is defined to be either sucrose, which is composed of 50% glucose and 50% fructose, or high fructose corn syrup (HFCS), which is composed of 55% fructose and 45% glucose.  The fact that sucrose (what you put in your coffee) and HFCS (which until recently had replaced sucrose in many products like soft drinks) are so similar in composition has always been sufficient evidence for me that if one of them is bad for you then the other must be as well.

Understanding why sugar could be unhealthful requires some background in human metabolism.  The energetic portions of food consist of carbohydrates, fat, or protein, which are used by the cells of our body for fuel.  However, the brain only utilizes glucose (or ketone bodies when glucose is not available), and very little glucose is stored in the body (about half a kilogram, in the form of glycogen).  Hence, glucose is tightly regulated in the blood.  When we eat glucose, it enters the bloodstream fairly rapidly.  The body responds by secreting insulin, which activates transporters in non-brain cells to take up glucose.  The cells will either burn the glucose or use it to replace depleted glycogen stores.  Glucose can also be utilized by muscle cells during intense exercise.  Any excess glucose will be taken up by the liver and converted into fat in the form of triglycerides.

The reason fructose is considered bad is that it is metabolized only in the liver, where it is converted into glucose or fat.  The fat is either stored, which is thought to be bad, or secreted into the blood inside VLDL particles, the precursors of LDL particles, which carry what is considered the bad cholesterol.  This line of reasoning is highly plausible but I think it is incomplete.  The argument is that since fructose leads to fatty liver, I must avoid fructose.  However, what you really want to avoid is excess fat in the liver and high LDL, and these can be caused by things other than fructose.  So in addition to avoiding fructose, you must also make sure you don't replace that fructose with something that is equally or almost as bad.  For instance, if you decided to replace that fructose with lots of glucose, it could still end up getting converted into fat in the liver.

To me, the real problem is not the fructose per se but the imbalance between glycogen use and carbohydrate input, because glucose or fructose will first replace depleted glycogen before getting converted to fat.  Hence, the amount of exercise one engages in cannot be ignored.  Basically, any amount of carbohydrate consumed above your glucose/glycogen utilization level will necessarily be converted into fat.  Thus, if you believe this hypothesis, what you really should believe is that the increase in obesity, insulin resistance and Type II diabetes is not just due to any specific dietary element like fructose but to a general imbalance between carbs eaten and carbs burned.
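
As a back-of-the-envelope sketch of this balance argument (all numbers below are hypothetical; only the idea that carbohydrate beyond utilization and glycogen replenishment becomes fat comes from the argument above):

```python
def carbs_to_fat(carbs_eaten_g, glucose_burned_g, glycogen_deficit_g):
    """Illustrative carbohydrate balance in grams: anything eaten beyond
    what is burned or used to refill glycogen is converted to fat.
    All numbers are hypothetical."""
    surplus = carbs_eaten_g - glucose_burned_g - glycogen_deficit_g
    return max(0.0, surplus)

# Sedentary day: 400 g of carbs eaten, 250 g burned, glycogen already full.
print(carbs_to_fat(400, 250, 0))    # 150 g of carbs headed for fat synthesis

# Same diet after exercise that depleted 150 g of glycogen.
print(carbs_to_fat(400, 250, 150))  # 0 g; the surplus refills glycogen instead
```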

Does NIH need to change?

April 1, 2011

Michael Crow, president of Arizona State University, has an opinion piece in Nature this week arguing that the NIH needs to be revamped.  He points out that although the NIH budget is $30 billion a year, there have been relatively few recent benefits for public health.  He argues that the problem is that there is no emphasis on promoting outcomes beyond basic science.  Right now the NIH consists of 27 separate institutes (I'm in NIDDK) with little coordination between them and great redundancy in their missions.  For an intramural principal investigator such as myself, the walls are invisible when it comes to science and collaborations but very apparent when it comes to regulations and navigating the bureaucracy.  Crow uses the obesity pandemic as an example of the NIH's ineffectiveness in combating a health problem.  This point really hits home since the NIH Obesity Research Task Force, which is spread out over 27 NIH components, is largely unaware of the novel work coming out of our group, the Laboratory of Biological Modeling.  Crow's solution is to drastically reorganize the NIH.  An excerpt of his article is below.

Nature: What if the NIH were reconfigured to reflect what we know about the drivers of innovation and progress in health care?

This new NIH should be structured around three institutes. A fundamental biomedical systems research institute could focus on the core questions deemed most crucial to understanding human health in all its complexity — from behavioural, biological, physical, environmental and sociological perspectives.

Take, for instance, the ‘obesity pandemic’. In the United States, medical costs related to obesity (currently around $160 billion a year) are projected to double within the decade. And by some estimates, indirect spending associated with obesity by individuals, employers and insurance payers — for example, on absenteeism, decreased productivity or short-term disability — exceeds direct medical costs by nearly threefold. The NIH conducts and supports leading research on numerous factors relevant to obesity, but efforts are fragmented: 27 NIH components are associated with the NIH Obesity Research Task Force, a programme established to speed up progress in obesity research.

Within a systems research institute, scientists could better integrate investigations of drivers as diverse as genetics, psychological forces, sedentary lifestyles and the lack of availability of fresh fruit and vegetables in socioeconomically disadvantaged neighbourhoods.

A second institute should be devoted to research on health outcomes, that is, on measurable improvements to people’s health. This should draw on behavioural sciences, economics, technology, communications and education as well as on fundamental biomedical research. Existing NIH research in areas associated with outcomes could serve as the basis for expanded programmes that operate within a purpose-built organization. If the aim is to reduce national obesity levels — currently around 30% of the US population is obese — to less than 10% or 15% of the population, for example, project leaders would measure progress against that goal rather than according to some scientific milestone such as the discovery of a genetic or microbial driver of obesity.

The third institute, a ‘health transformation’ institute, should develop more sustainable cost models by integrating science, technology, clinical practice, economics and demographics. This is what corporations have to do to be successful in a competitive high-tech world. Rather than be rewarded for maximizing knowledge production, this institute would receive funding based on its success at producing cost-effective public-health improvements.

This kind of tripartite reorganization would limit the inevitable Balkanization that has come from having separate NIH units dedicated to particular diseases. Indeed, such a change would reflect today’s scientific culture, which is moving towards convergence — especially in the life sciences, where collaboration across disciplines is becoming the norm, advances in one field influence research in others, and emerging technologies are frequently relevant across different fields.

Productivity and ability

March 11, 2011

What makes some people more productive than others?  Is it innate ability, better training, or hard work?  Although the meaning of productivity is subjective, there are quantifiable differences between researchers in measures of productivity such as the h-index.  Here I will argue that a small difference in ability or efficiency can lead to great differences in output.

Let's consider a simple and admittedly flawed model of productivity.  Suppose we take productivity to be the number of tasks you can complete, and let P represent the probability that you can accomplish a single task (i.e. your efficiency).  A task could be anything from completing an integral, to writing a program, to sticking an electrode into a cell, to finishing a paper.  The probability of completing N independent tasks is T=P^N.  Conversely, the number of tasks that can be completed with probability T is N = \log T/\log P.  Now let P = 1-\epsilon, where \epsilon is the failure probability.  For high efficiency (i.e. a low failure rate), we can expand the logarithm for small \epsilon, using \log(1-\epsilon) \approx -\epsilon, and obtain N \approx \log(1/T)\,\epsilon^{-1} \propto \epsilon^{-1}.  The number of tasks you can complete for a given probability is inversely proportional to your failure rate.

The rate of change of productivity with respect to efficiency grows even faster: differentiating N = \log T/\log P gives

\frac{dN}{dP} = -\frac{\log T}{P(\log P)^2} \propto \epsilon^{-2}

Hence, small differences in efficiency can lead to large differences in the number of tasks that can be completed, and the gain is more dramatic the higher your efficiency.  For example, if you go from being 90% efficient (i.e. \epsilon = 0.1) to 95% efficient (i.e. \epsilon = 0.05), then you will double the number of tasks you can complete. Going from 98% to 99% is also a doubling in productivity.  The model clearly disregards the fact that tasks are often correlated and have different probabilities of success.  I know some people who have great trouble revising and resubmitting papers to get published, and thus they end up having low measured productivity even though they have accomplished a lot.  However, the model does seem to indicate that it is always worth improving your efficiency, even by a small amount.
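
A quick numerical check of the scaling (a sketch; the overall success probability T is fixed at an arbitrary 0.5):

```python
import math

def tasks(epsilon, T=0.5):
    """Number of tasks N completable with overall success probability T,
    given per-task failure rate epsilon: N = log T / log(1 - epsilon)."""
    return math.log(T) / math.log(1 - epsilon)

for eps in (0.10, 0.05, 0.02, 0.01):
    print(f"failure rate {eps:.2f}: N = {tasks(eps):5.1f}")

# failure rate 0.10: N =   6.6
# failure rate 0.05: N =  13.5
# failure rate 0.02: N =  34.3
# failure rate 0.01: N =  69.0
# Halving the failure rate roughly doubles N, as the 1/epsilon scaling predicts.
```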

Retire the Nobel Prize

October 12, 2009

I've felt for some time now that perhaps we should retire the Nobel Prize.  The money could be used to fund grants, set up an institute for peace and science, or even hold a Nobel conference like TED.  The prize puts too much emphasis on individual achievement, and in many instances misplaced emphasis.  The old view of science involving the lone explorer seeking truth in the wilderness needs to be updated to a new metaphor: the sandpile, as used to describe self-organized criticality by Per Bak, Chao Tang, and Kurt Wiesenfeld.  In the sandpile model, individual grains of sand are dropped on the pile, and every once in a while there are "avalanches" in which a bunch of grains cascade down.  The distribution of avalanche sizes is a power law.  Hence, there is no characteristic scale to avalanches, and there is no grain that is more special than any other.
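
For the curious, here is a minimal sketch of the Bak-Tang-Wiesenfeld sandpile (grid size and grain count are arbitrary): a site holding four or more grains topples, shedding one grain to each neighbor, and the avalanche size is the total number of topplings triggered by a single dropped grain.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 30                               # arbitrary grid size
grid = np.zeros((L, L), dtype=int)

def drop_grain():
    """Drop one grain at a random site, relax the pile, and return the
    avalanche size (the total number of topplings)."""
    i, j = rng.integers(L, size=2)
    grid[i, j] += 1
    topplings = 0
    while True:
        unstable = np.argwhere(grid >= 4)
        if len(unstable) == 0:
            return topplings
        for i, j in unstable:
            grid[i, j] -= 4          # the site topples
            topplings += 1
            for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                if 0 <= ni < L and 0 <= nj < L:   # grains fall off the edges
                    grid[ni, nj] += 1

sizes = [drop_grain() for _ in range(50_000)]
# After a transient, a log-log histogram of the nonzero avalanche sizes is
# nearly a straight line: the power-law distribution described above.
```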

This is just like science.  The contributions of scientists and nonscientists are like grains of sand dropping on the sandpile of knowledge, and every once in a while a big scientific avalanche is triggered.  The answer to the question of who triggered the avalanche is that everyone contributed to it.  The Nobel Prize rewards a few of the grains of sand that happened to be proximally located to some specific avalanche (and sometimes not), but the rewarded work always depended on something else.


Reframing the evolution debate

September 4, 2009

I firmly believe that given the way our brains work, some arguments can never be resolved. This includes political and economic issues (e.g. efficient markets) and also the debate between evolution and creationism.  I think many scientists feel that the way to fight creationists is to challenge them at every level and try to win the debate using reason and overwhelming evidence.  If that doesn't work, then creationists should be shut down by legal and other means because they might take over and send us back to the Dark Ages. Unfortunately, if a creationist has a prior that puts zero weight on the possibility that the earth is 4.5 billion years old, then no amount of evidence can ever change their opinion.  That is why I think the Richard Dawkins strategy of equating science with atheism may not be a winning one.  I think there is a different approach that may even get creationists interested in modern biology and science as a way for them to get closer to God.


Extraterrestrial Life

May 1, 2009

There is a pervasive belief that there must be extraterrestrial life, and in particular intelligent life, in the universe.  In fact, this is usually presumed to be so obviously true that the only question people tend to ask is why we haven't heard from anyone yet.  The celebrated Drake equation, which tried to estimate the number of civilizations in the galaxy that we might be able to communicate with, ended up with the number 10.  One of the factors in the Drake equation is the fraction of earth-like planets that eventually develop life.  In 1961, when Drake wrote down the equation, he assumed this fraction was one.  After all, the earth is teeming with life, so it must be easy to develop life anywhere, right?
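
For reference, the standard form of the Drake equation multiplies a star-formation rate by a chain of fractions:

N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L

where R_* is the rate of star formation in the galaxy, f_p is the fraction of stars with planets, n_e is the number of potentially habitable planets per star with planets, f_l is the fraction of those planets on which life develops (the factor discussed above), f_i is the fraction of those that develop intelligent life, f_c is the fraction of those that release detectable signals, and L is the lifetime of a communicating civilization.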

Actually, we have no idea what the probability of forming life is. We do not know how life developed on earth.  There are several competing theories, but we have no empirical evidence to support any of them.  The probability of forming life could be very high or it could be close to zero.  It is equally likely that there are lots of civilizations out there to talk to or that there are none.  The possibility that there is no life whatsoever in the visible universe beyond earth is as likely as any other hypothesis.  We really could be alone in the universe.


Why ugly is sometimes beautiful

December 1, 2008

When Stravinsky's ballet "The Rite of Spring" debuted in Paris in 1913, it caused a riot.  The music was so complex and novel that the audience didn't know how to react.  They became agitated, jeered, argued amongst themselves, and eventually became violent.  However, by the 1920s The Rite of Spring was well accepted, and now it is considered one of the greatest works of the 20th century.  When Impressionism was introduced in the late 19th century, it was not well received.  The term itself was originally meant to be derisive of the movement.  These days, the Impressionist rooms are often the most popular and crowded in art museums.  There was strong opposition to Maya Lin's design for the Vietnam Veterans Memorial in 1981. She actually had to defend it before the US Congress and fought to keep it from being changed.  Now it is considered one of the most beautiful monuments in Washington, D.C.  There are countless other examples of icons of beauty that were initially considered offensive or ugly.  I think this is perfectly consistent with what we know about neuroscience.

