Archive for the ‘Environment’ Category

Saving large animals

January 11, 2013

One story in the news lately is the dramatic increase in the poaching of African elephants (e.g. New York Times). Elephant numbers have plunged dramatically in the past few years and their outlook is not good. The same is true of most large animals: whales, pandas, rhinos, bluefin tuna, whooping cranes, manatees, sturgeon, and so on. However, one large animal has done extremely well while the others have languished. In the US it had a population of zero 500 years ago and now it’s probably around 100 million. That animal, as you have probably guessed, is the cow. While wild animals are being hunted to extinction or dying from habitat loss and climate change, domestic animals are thriving. We have no shortage of cows, pigs, horses, dogs, and cats.

Given that current conservation efforts are struggling to save the animals we love, we may need to try a new strategy. A complete ban on ivory has not stopped the ivory trade, just as a ban on illicit drugs has not stopped drug use. Prohibition does not seem to be a sure way to curb demand. It may be that some form of elephant farming is the only way to save the elephants. It could raise revenue to help protect wild elephants and could drop the price of ivory enough to make poaching less profitable. It could also backfire and increase the demand for ivory.

Another counterintuitive strategy may be to sanction limited hunting of some animals. The introduction of wolves into Yellowstone National Park has been a resounding ecological success, but it has also angered some people, like ranchers and deer hunters. The backlash against the wolf has already begun. One ironic way to save wolves could be to legalize hunting them. This would give hunters an incentive to save and conserve wolves. Given that the set of hunters and the set of ranchers often have a significant intersection, this could dampen the backlash. There is a big difference in attitudes towards conservation when people hunt to live versus when they hunt for sport. When it’s a job, we tend to hunt to extinction, as with buffalo, cod, elephants, and bluefin tuna. However, when it’s for sport, people want to ensure the species thrives. While I realize that this is controversial and many people have a great disdain for hunting, I would suggest that hunting is no less humane, and perhaps more so, than factory abattoirs.

Weather prediction

October 31, 2012

I think it was pretty impressive how accurate the predictions for Superstorm Sandy were up to a week ahead.  The hurricane made the left-hand turn from the Atlantic into New Jersey just as predicted.  I don’t think the storm could have been hyped any more than it was.  The east coast was completely devastated but at least we did have time to prepare.  The weather models have gotten much better than they were even ten years ago.  The storm also shows just how vulnerable the east coast is to a 14-foot storm surge.  I can’t imagine what a 20-foot surge would do to New York.

April 28, 2012

I listened to two Long Now Foundation talks on my way to Newark, Delaware and back yesterday for my colloquium talk.  These podcasts tend to be quite long, so they were perfect for the drive.  The first was by environmental activist and journalist Mark Lynas and the second by National Geographic photographer Jim Anderson.  Both were much more interesting than I expected.  Lynas, who originated the anti-genetically modified organism (GMO) food movement in Europe in the 1990s, has since changed his mind and become more pragmatic.  He now advocates for a more rational environmental movement that embraces technological solutions such as GMO foods and nuclear energy.  He argues that many more people are killed by particulate matter from coal-fired generating plants in a year than over the entire history of nuclear use.  I have always felt that nuclear power is the only viable technology to reduce carbon emissions.  I have also argued previously that I’m more worried about the acidification of the ocean due to CO2 than an increase in temperature.  I think we should start building CANDU reactors now and head towards fast breeder reactors.

Jim Anderson talked about the loss of diversity of domesticated plants and animals and how they are essential for the survival of humans.  For the first 9,900 years of agriculture, we increased the diversity of our foodstuffs.  For the last hundred, we have gone in the other direction.  We used to have hundreds to thousands of varieties of fruits and vegetables and now we’re down to a handful.  There are at most 5 varieties of apples I can buy at my local supermarket, yet a hundred years ago, each orchard would produce its own variety.  This leaves us extremely vulnerable to diseases.  The world’s banana supply is dominated by one variety (the Cavendish) and it is under siege by a fungus that threatens to wipe it out.  The Irish potato famine was so severe because the Irish relied on only two varieties that were both susceptible to the same blight.  Our firewall against future blights is seed banks, where we try to preserve as many varieties as we can.  However, not all seeds can remain viable forever.  Many have to be planted every few years and new seeds harvested from the plants.  This replanting is often done by amateur horticulturists.  The podcast made me think that with the cost of genome sequencing dropping so rapidly, what we need now is for someone, like Google, to start sequencing every living being and making the sequences publicly available, like Google Books.  In fact, if sequencers become cheap enough, this could be done by amateurs.  You would find some plant or animal, document it as well as you can, and upload the sequence to the virtual seed bank.  This could be a record of both wild and domesticated species.  We could then always resurrect one if we need to.  There could also be potential for mischief with highly dangerous species like smallpox or anthrax, so we would need to have a public discussion over what should be available.

Infinite growth on finite resources

September 23, 2011

At this past summer’s Lindau meeting of Nobel Laureates, Christian Rene de Duve, who is over 90 years old, gave a talk on population growth and its potential dire effects on the world.  Part of his talk was broadcast on the Science Show.  His talk prompted me to think more about growth.  The problem is not that the population is growing per se.  Even if the population were stable, we would still eventually run out of fossil fuels if we consume energy at the same rate.  The crucial thing is that we must progressively get more efficient.  For example, consider a steady population that consumes some finite resource at the rate $t^\alpha$.  Then so long as $\alpha < -1$, we can make that resource last forever, since $\int_1^\infty t^\alpha \, dt$ is finite.  Now, if the population is growing exponentially then we would have to become exponentially more efficient with time to make the resource last.  However, making the world more efficient will take good ideas and skilled people to execute them, and that will scale with the population.  So there might be some optimal growth rate at which the idea generation rate is sufficient to increase efficiency so that we can sustain ourselves forever.
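As a numerical sanity check, here is a short Python sketch (my own, not from the talk) using the closed form $\int_1^T t^\alpha \, dt = (T^{\alpha+1}-1)/(\alpha+1)$:

```python
def total_used(alpha, T):
    """Resource consumed between t = 1 and t = T at rate t**alpha."""
    return (T ** (alpha + 1) - 1) / (alpha + 1)

# alpha < -1: total consumption approaches a finite limit (here 2.0),
# so a finite stock of at least that size lasts forever.
for T in (10, 1e3, 1e6):
    print(total_used(-1.5, T))

# alpha > -1: total consumption grows without bound (like sqrt(T) here).
for T in (10, 1e3, 1e6):
    print(total_used(-0.5, T))
```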

What’s in your sunscreen?

July 7, 2011

Here’s something to think about from Scientific American:

…And just what are the risks? According to the non-profit Environmental Working Group (EWG), there are two major types of sunscreens available in the U.S. “Chemical” sunscreens, the more common kind, penetrate the skin and may disrupt the body’s endocrine system, as their active ingredients (e.g., octylmethylcinnamate, oxybenzone, avobenzone, benzophone, mexoryl, PABA or PARSOL 1789) mimic the body’s natural hormones and as such can essentially confuse the body’s systems. Quite a risk to take, considering that the chemical varieties don’t even work for very long once applied.

Meanwhile, “mineral” sunscreens are considered somewhat safer, as their active ingredients are natural elements such as zinc or titanium. But “micronized” or “nano-scale” particles of these minerals can get below the skin surface and cause allergic reactions and other problems for some people. EWG recommends sticking with “mineral” sunscreens whenever possible but, more important, taking other precautions to avoid prolonged sun exposure altogether. “At EWG we use sunscreens, but we look for shade, wear protective clothing, and avoid the noontime sun before we smear on the cream,” the group reports.

As for spray varieties, EWG recommends avoiding them entirely: “These ingredients are not meant to be inhaled into the lungs.” With so little known about the effects of sunscreen chemicals on the body when rubbed into the skin, we may never know how much worse the effects may be when they are inhaled. But suffice it to say: When your neighbor at the beach is spraying down Junior, it’s in your best interest to turn away and cover your nose and mouth…

Miraculous technologies

April 29, 2011

This month’s Scientific American magazine has a story on 7 Radical Energy Solutions.  The link is here although you need a subscription to access the full article.  The 7 solutions are:

1) Fusion-triggered fission – using lasers to trigger fusion in small pellets to produce neutrons to ignite fission; the advantage is that a chain reaction is not necessary, so nuclear waste can be used as fuel.
2) Solar gasoline – converting solar energy directly into a carbon-based liquid fuel.
3) Quantum photovoltaics – using quantum dots to increase the efficiency of solar cells by trapping hot electrons that are lost with existing technology.
4) Heat engines – generating power by capturing waste heat using shape-memory alloys.
5) Shock-wave auto engine – a new internal combustion engine that uses shock waves to propel a turbine.
6) Magnetic air conditioners – making a fridge with no moving parts by using special magnets to replace the refrigerant and pumps.
7) Clean coal – using an ionic liquid to pull CO2 out of coal plant exhaust; the CO2 would then have to be sequestered underground.

See here for descriptions of projects funded by the US Department of Energy.

The article made me think of technology we use today that seems miraculous.  The first thing that comes to mind is the airplane.  People had dreamed of flight for centuries if not millennia, but it wasn’t until technology matured enough that the dream was realized in 1903 by the Wright brothers.  The mobile phone was just a science fiction dream to me when I was a child.  The refrigerator has always seemed miraculous to me.  Even after thermodynamics and the heat cycle were understood, it is still amazing that actual substances that could act as refrigerants were discovered.  I find bulletproof glass kind of astounding.  All of our electronic technology is based on silicon, which is made from sand.  Water itself is kind of magical.  The fact that it is so abundant and takes on three phases in a human accessible range of temperatures is astonishing (or maybe not – cf. the Anthropic Principle).  I could go on and on.  As Arthur C. Clarke once wrote: “Any sufficiently advanced technology is indistinguishable from magic.”  At any moment, there could be a technological breakthrough that changes history.

The long view

April 12, 2011

Physicist and Nobel Laureate Robert Laughlin was on EconTalk this past summer.  The link to the podcast is here.  Laughlin, who likes to take on contrarian positions and is always entertaining, talks about the future of carbon and his forthcoming book “When Coal Is Gone”.  Chapter two of his book, originally entitled “Geological Time”, was excerpted in The American Scholar under the title “What the Earth Knows” and can be obtained here.  In that chapter and on the podcast, Laughlin argues that the human age of fossil fuels and its effect on climate is but a blink of an eye in geological time.  The earth has endured much larger perturbations than humans will ever inflict.  He claims that we’ll run out of oil in about 60 years but we will still use carbon-based liquid fuels because their energy densities are without peer.  (You can’t fly an airplane without them.)  However, instead of getting it out of the ground we will manufacture it using coal or natural gas as feedstock.  In about 200 years we’ll run out of coal but we’ll still want to make fuels.  At that point, we’ll have to extract carbon out of the air or ocean, most likely using plants.  Laughlin tries to avoid taking political positions and does acknowledge that climate change could be bad for this and the next generation of humans even if it won’t matter much in the long term.  He’s confident the earth and humans will survive this crisis.  The one thing he does worry about is biodiversity loss, which is permanent.  There is a switch in topic to Laughlin’s previous book, The Crime of Reason, 50 minutes into the podcast.  In that book, Laughlin argues that the US switch from a manufacturing economy to an information economy will stifle learning and the dissemination of knowledge because if information becomes a commodity, its value depends on its scarcity.  Thus, the rate of innovation will decrease, not increase, and we will become more secretive in general.

National Elk Refuge

March 22, 2011

Right outside of Jackson, Wyoming is the National Elk Refuge, which was established in 1912.  It is the wintering ground for a herd of ten thousand elk as well as eight hundred bison.  During winter, the elk come down from the mountains to the Jackson Hole valley where the snow is thinner so they can access grass more easily.  You can take a horse-drawn sleigh right out to the herd with the Grand Tetons as the backdrop.  Here are some pictures.

The power of small

December 24, 2010

Here’s a nice story for Christmas in the New York Times. African villagers can now get electrical power using cheap solar cell units. It can make a significant difference in their lives.

The cost of commuting

November 20, 2010

It is about 45 miles (70 km) from Baltimore to the NIH campus in Bethesda, MD.  If I were to travel the entire distance using public transit it would cost over $20 for a round trip (one-way bus fare in Baltimore is $1.60, commuter rail (MARC train) fare is $7.00, and Metro fare in DC is $3.65, or $3.85 during peak hours).  That amounts to over $100 per week and $5,000 per year.  If I bought a monthly rail pass, then I could cut the cost down by 75% or so.  Now if instead I were to drive every day, ninety miles per day is equivalent to 22,500 miles per year.  A car that could travel 30 miles per gallon of gasoline would use 750 gallons a year.  At the current price of $3 per gallon, this would be $2,250 per year.  If I drive my car for ten years and it cost twenty thousand dollars, then that is an additional $2,000 per year.  Insurance, fees, maintenance and repairs probably cost another $2,000 per year, so driving would cost about $6,000 per year.  If I drove a cheaper and more efficient car then I could bring this cost down to $5,000 per year.  Thus, driving is economically competitive with public transit.  Add in the fact that I would own a car anyway even if I didn’t use it to commute to work, and driving is the less expensive choice.
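For what it’s worth, here is the arithmetic as a small Python sketch. The fares and prices are the 2010 figures quoted above; the 250 working days per year is my own assumption:

```python
work_days = 250                      # assumed: ~50 weeks x 5 days

# Public transit at full fare, round trip: bus + MARC train + Metro
transit_round_trip = 2 * (1.60 + 7.00 + 3.65)
transit_per_year = transit_round_trip * work_days
print(round(transit_per_year))       # well over $100/week

# Driving: 90 miles/day at 30 mpg and $3/gallon
gas_per_year = 90 * work_days / 30 * 3.00
car_depreciation = 20_000 / 10       # $20k car spread over ten years
other_costs = 2_000                  # insurance, fees, maintenance, repairs
driving_per_year = gas_per_year + car_depreciation + other_costs
print(round(driving_per_year))       # -> 6250, competitive with transit
```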

How is this possible?  Well, one cost that I didn’t account for is parking.  The NIH happens to have a large campus where parking is nominally free, although if I chose not to drive, I could receive a public transit subsidy of up to $110 per month, or $1,320 per year.  If the NIH were located in downtown Washington DC, parking could cost over $400 per month, or $5,000 per year.  So the real reason driving is competitive with public transit is that parking is subsidized.  If I worked in an urban center where parking is expensive, then driving would be much more expensive than public transit.  Driving is further subsidized because roads and highways are funded by tax dollars, while the cost of maintaining transit stations and tracks is only partially funded by taxes.  If all transportation infrastructure were equally publicly funded, or if the subsidies for roads and parking did not exist, then public transit would be by far the more cost-effective option.

Biomass

October 28, 2010

Since the rise of human civilization, life forms larger than ten centimetres or so have been systematically culled or eliminated from the ecosystem.  Almost all land megafauna that roamed wild a few thousand or even a few hundred years ago are either extinct or survive in small numbers in protected parks and reserves.  Macroscopic sea creatures that were reasonably plentiful just two or three decades ago may all disappear shortly.  In the meantime, the population of humans and domesticated plants and animals has exploded.

So, has there been a net gain or loss of total biomass?  I think the conventional wisdom would be that we have replaced large tracts of forest with pavement, lawns and farmland, which would seem like a huge net loss of biomass.  However, we have added extra nutrients (i.e. fertilizer) and carbon (i.e. fossil fuels) into the system.  The energy flux from the sun has also not changed significantly in the last millennium.  Hence, the capacity to support life has probably not changed, or may even have increased.  Removing all of the large wild animals may also create more opportunities for small animals.  Perhaps there are more small and microscopic creatures than there would have been had humans not existed.  I have no idea what the answer is.

Phytoplankton

July 30, 2010

I have always felt that a rise in global temperatures was the least of our worries about increasing CO2 in the atmosphere.  I’m much more concerned about how it could perturb the delicate balance that allows mammals to live, i.e. us.  One of the things that could be trouble is that CO2 dissolved in water can make the oceans more acidic by forming more carbonic acid, which could make it harder for marine creatures to make shells through calcification, which  in turn could have a large impact on the coral reefs and the ocean food chain.

Another thing I worry about is that our oxygen supply could decrease.  Although the direct effect of converting oxygen to water and CO2 through increased combustion of fossil fuels is small, the effect on the photosynthetic organisms that make our oxygen is largely unknown.  I’ve actually been somewhat optimistic on this account, thinking that since we are introducing more nutrients into the oceans and CO2 is increasing, then perhaps phytoplankton (a blanket term for photosynthetic microscopic sea organisms like cyanobacteria and dinoflagellates, and the source of much of our oxygen) might increase.  However, a paper in Nature this week says otherwise.

Metabolism of Mice and Men

July 8, 2010

In the 1930s, Swiss-American animal metabolism pioneer Max Kleiber noticed that the metabolic rate of animals scales as the body mass to the three quarters power.  There is still some controversy over whether the exponent is really three quarters or something else.  Many theories have been proposed for why the exponent should be three quarters (or two thirds) but I won’t go into that here.  The crucial thing is that it is less than one, and that implies that a large animal is more efficient than a small one.  This efficiency with size is not restricted to biological examples.  As Steve Strogatz pointed out in a New York Times column last year, the number of gas stations doesn’t grow linearly with the population of a city but rather grows in proportion to the 0.77 power of the population.  This sublinear scaling also goes for other city infrastructure like the total length of roads and electrical cables.  Large cities may in fact be more efficient than small ones.

Now a mouse weighs about 20 to 30 grams, so it is roughly a factor of 3500 less massive than an average human.  Metabolic rate scales as mass to the three quarters, so power density (e.g. watts per gram) scales as mass to the minus one quarter.  Hence, a mouse is $3500^{1/4}$, or 7 to 8, times less metabolically efficient than a human.  A colony of mice weighing as much as a human would have to eat 7 to 8 times as much food.

However, in terms of total energy utilized, first world humans are much less efficient than mice and perhaps all other organisms.  The metabolic rate of an average person is about 10 megajoules per day, or 115 watts, but according to Wikipedia, the United States uses about 10,000 watts of power per capita.  This is a factor of 90 over the metabolic rate; even after dividing out the mouse’s 7-to-8-fold metabolic handicap, an average American is still a factor of ten less efficient than a mouse.  However, a very low energy use nation like Bangladesh only consumes about twice as much energy per capita as the human metabolic rate, and thus an average Bangladeshi is more efficient than a mouse.
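The scaling arithmetic in the last two paragraphs can be checked in a few lines of Python (the 25 g mouse and 87.5 kg human are illustrative values consistent with the 3500-fold ratio above):

```python
# Kleiber's law: metabolic rate ~ mass**(3/4), so power per gram ~ mass**(-1/4)
human_mass = 87_500    # grams (illustrative)
mouse_mass = 25        # grams (illustrative)

mass_ratio = human_mass / mouse_mass
print(mass_ratio)                       # -> 3500.0

# Power per unit mass is higher for the smaller animal by mass_ratio**(1/4):
relative_inefficiency = mass_ratio ** 0.25
print(round(relative_inefficiency, 1))  # -> 7.7, the "7 to 8 times" above

# Total power per capita vs. basal metabolism for an average American:
metabolic_watts = 115
us_per_capita_watts = 10_000
print(round(us_per_capita_watts / metabolic_watts))  # -> 87, the "factor of 90"
```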

Some numbers for the BP leak

June 3, 2010

The Deepwater Horizon well is situated 1500 m below the surface of the Gulf of Mexico.  The hydrostatic pressure is approximately given by the simple formula $P_a + \rho g h$, where $P_a = 100 \ kPa$ is the pressure of the atmosphere, $\rho = 1 \ g/ml = 1000 \ kg/m^3$ is the density of water, and $g = 10 \ m/s^2$ is the gravitational acceleration.  Putting the numbers together gives $1.5\times 10^7 \ Pa$, which is $15,000 \ kPa$ or about 150 times atmospheric pressure.  Hence, the oil and natural gas must be under tremendous pressure to be able to leak out of the well at all.  It’s no wonder the Top Kill operation, where mud was pumped in at high pressure, did not work.
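A quick Python check of the pressure estimate:

```python
# Hydrostatic pressure at the wellhead: P = P_a + rho * g * h
P_a = 100e3          # atmospheric pressure, Pa
rho = 1000.0         # density of water, kg/m^3
g = 10.0             # gravitational acceleration, m/s^2
h = 1500.0           # depth, m

P = P_a + rho * g * h
print(P)                  # -> 15100000.0 Pa, i.e. ~1.5e7 Pa
print(round(P / P_a))     # -> 151, about 150 atmospheres
```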

Currently, it is estimated that the leak rate is somewhere between 10,000 and 100,000 barrels of oil per day.  A barrel of oil is 159 litres or 0.159 cubic metres.  So basically 1600 to 16000 cubic metres of oil is leaking each day.  This amounts to a cube with sides of about 11 metres for the lower value and 25 metres for the upper one, which is about the length of a basketball court.  However, assuming that the oil forms a layer on the surface of the ocean that is 0.001 mm thick, this then corresponds to a slick with an area between 1,600 to 16,000 square kilometres.  Given that the leak has been going on for almost two months and the Gulf of Mexico is 160,000 square kilometres, this implies that the slick is either very thick, oil has started to wash up on shore, or a lot of the oil is still under the surface.
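The leak-volume arithmetic, as a sketch:

```python
def leak_stats(barrels_per_day, barrel_m3=0.159, thickness_m=1e-6):
    """Daily leak volume, side of an equivalent cube, and slick area."""
    volume = barrels_per_day * barrel_m3       # m^3 leaked per day
    side = volume ** (1 / 3)                   # side of an equivalent cube, m
    area_km2 = volume / thickness_m / 1e6      # slick area at 0.001 mm thick
    return volume, side, area_km2

for bpd in (10_000, 100_000):
    v, s, a = leak_stats(bpd)
    print(round(v), round(s, 1), round(a))
# -> 1590 11.7 1590   (low estimate: ~1600 m^3/day, ~1600 km^2 slick)
# -> 15900 25.1 15900 (high estimate: ~16000 m^3/day, ~16000 km^2 slick)
```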

Nonlinear saturation

December 18, 2009

The classic Malthus argument is that populations grow exponentially until they are nonlinearly suppressed by famine or disease due to overcrowding.  However, the lesson of the twentieth century is that populations can be checked for other reasons as well.  This is not necessarily a refutation of Malthus per se, but rather an indication that the quantity limiting a population need not be food or health.  There seems to be a threshold of economic prosperity beyond which family income or personal freedom becomes the rate-limiting factor for family size.  Developed nations such as Japan and Germany are approaching zero population growth and trending towards negative growth.  Russia currently has negative growth.
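The nonlinear suppression Malthus had in mind is usually modeled with the logistic equation, $\dot p = r p (1 - p/K)$, where the carrying capacity $K$ stands in for whatever the limiting quantity happens to be. A minimal Python sketch (a simple Euler integration, purely illustrative):

```python
def logistic(p, r, K, steps, dt=0.1):
    """Euler-integrate dp/dt = r*p*(1 - p/K) and return the trajectory."""
    history = [p]
    for _ in range(steps):
        p += r * p * (1 - p / K) * dt
        history.append(p)
    return history

# Early growth is nearly exponential; it then saturates at K,
# whatever K happens to represent (food, income, freedom, ...).
pop = logistic(p=1.0, r=0.5, K=100.0, steps=400)
print(round(pop[0]), round(pop[-1]))   # -> 1 100: growth stops at K
```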

Hence, we can slow down population growth by increasing economic growth.  China is starting to see a very steep decline in population growth in the big cities like Shanghai that is independent of the one child policy.  The emerging middle class is now taking into account the cost of raising a child and how it would affect their lifestyle.  In a very poor country, the cost of raising a child is not really an issue.  In fact, if the probability of a child making it to adulthood is low and help from children is the only way for the elderly to survive then it is logical to have as many children as possible.  In this case, the classic Malthus argument with food and health (and aid) as the rate limiting quantities applies.

I bring this point up now because it is crucial for the current debate about what to do about climate change.  One way of mitigating human impact on the environment is to slow down population growth.  However, the most humane and effective method of doing that is to increase economic growth, which will then lead to an increase in emissions.  For example, in 2005, the US produced 23.5 tonnes of CO2 equivalents per person (which incidentally is not the highest in the world, and less than half that of world leader Qatar), while China produced about 5.5 tonnes and Niger 0.1 tonnes.  (This is not accounting for the extra emissions due to changes in land use.)  In absolute terms, China already produces more greenhouse gases than the US, and India is not far behind.  On the other hand, the population growth rate of Niger is 3.6%, India is 1.5% and dropping, China is 0.6% and the US is 1%.  So, when we increase economic prosperity, we can reduce population growth and presumably suffering, but we will also increase greenhouse gas emissions.
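The claim that China already out-emits the US follows from the per-capita numbers above once you multiply by population (the populations here are my own round assumptions, roughly 300 million and 1.3 billion):

```python
# Total emissions = per-capita emissions x population
# (2005 per-capita tonnes from the post; populations are assumed round figures)
us_total = 23.5 * 300e6        # tonnes CO2-equivalent per year
china_total = 5.5 * 1.3e9

# Despite a per-capita rate less than a quarter of the US's,
# China's total already comes out slightly ahead:
print(china_total > us_total)  # -> True
```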

Given that the world economy and agricultural system are entirely based on fossil fuels, it is also true that, at least in the short term, a restriction of carbon emissions will slow down or reduce economic growth.  Thus, even though climate change could have a catastrophic outcome for the future, curbing economic growth could also be bad.  For a developing nation, and even some developed nations, the choice may not be so clear cut.  It is thus no wonder that the current debate in Copenhagen is so contentious.  I think that unless the developed world can demonstrate that viable economic growth and prosperity can be achieved with reduced carbon emissions, the rest of the world and many people will remain skeptical.  I don’t think the leaders in the climate change community realize that this skepticism about carbon restrictions may not be entirely irrational.

New paper on food waste

November 25, 2009

Hall KD, Guo J, Dore M, Chow CC (2009) The Progressive Increase of Food Waste in America and Its Environmental Impact. PLoS ONE 4(11): e7940. doi:10.1371/journal.pone.0007940

This paper started out as a way to understand the obesity epidemic.  Kevin Hall and I developed a reduced model of how food intake is translated into body weight change [1].  We then decided to apply the model to the entire US population.  For the past thirty years there has been an ongoing study (NHANES) that takes a representative sample of the US population and makes anthropometric measurements like body weight and height.  The UN Food and Agriculture Organization and the USDA have also kept track of how much food is available to the population.  We thought it would be interesting to see if the food available accounted for the increase in body weight over the past thirty years.

What we found was that the available food more than accounted for all of the weight gain.  In fact, our calculations showed that the gap between predicted food intake and actual intake has diverged linearly over time.  This “energy gap” could be due to two things: 1) people were actually burning more energy than our model indicated because they were more physically active than expected (we assumed that physical activity stayed constant for the last thirty years), or 2) there has been a progressive increase of food waste.  Given that most people have argued that physical activity has gone down recently, which would make the energy gap even greater, we opted for conclusion 2).  Our estimate is also on the conservative side because we didn’t account for the fact that children eat less than adults on average.

I didn’t want to believe the result at first but the numbers were the numbers.  We have gone from wasting about 900 kcal per person per day in 1974 to 1400 kcal  in 2003.  It takes about 3 kcal to make 1 kcal of food so the energy in the wasted food amounts to about 4% of total US oil consumption.  The wasted food also uses about 25% of all fresh water use.  Ten percent of it could feed Canada.  The press has taken some interest in our result.  Our paper was covered by CBC news, Kevin and I were interviewed by Science and Kevin was interviewed on Voice of America.
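Here is a rough reconstruction of the oil-consumption comparison in Python. The wasted-calorie and 3-kcal-per-kcal figures are quoted above; the US population and oil-consumption numbers are my own ballpark assumptions, so this is a sanity check rather than the paper's actual calculation:

```python
# Energy embodied in wasted food vs. US oil consumption, circa 2003
kcal_to_joules = 4184
wasted_kcal = 1400            # wasted per person per day (from the paper)
production_factor = 3         # ~3 kcal of energy to produce 1 kcal of food
population = 300e6            # assumed US population
oil_barrels_per_day = 20e6    # assumed US oil consumption
joules_per_barrel = 6.1e9     # approximate energy content of a barrel of oil

waste_energy = wasted_kcal * production_factor * kcal_to_joules * population
oil_energy = oil_barrels_per_day * joules_per_barrel
print(round(100 * waste_energy / oil_energy, 1))  # -> 4.3, close to the ~4% figure
```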

[1] Chow CC, Hall KD (2008) The Dynamics of Human Body Weight Change. PLoS Comput Biol 4(3): e1000045. doi:10.1371/journal.pcbi.1000045

Are mass extinctions inevitable?

September 24, 2009

It is well known from the fossil record that there have been a large number of extinction events of various magnitudes.  Some famous examples include the Cretaceous-Tertiary extinction that killed off the dinosaurs 65 million years ago and the Great Dying 250 million years ago, when almost everything died.  It has been postulated that mass extinctions occur every ~30 or ~60 million years.  Most explanations for these events are exogenous – some external astrophysical or geological cataclysm like an asteroid slamming into the Yucatan 65 million years ago or large scale volcanic eruptions.  However, as I watch the news every night, I’m beginning to wonder if life itself is unstable and prone to wild fluctuations.  We are currently in the midst of a mass extinction and it is being caused by us.  However, we are not separate from the ecosystem, so in effect the system is causing its own extinction.

I listen to a number of podcasts of science radio shows (e.g. CBC’s Quirks and Quarks, ABC’s The Science Show, BBC’s The Naked Scientists, …) on my long drive home from work each day.  Each week I hear stories and interviews of scientists finding that climate change is worse than they predicted and that we’re nearing a point of no return.  (Acidification of the oceans is what scares me the most.)  However, in all of these shows there is always an optimistic undertone that implores us to do something about this, under the assumption that we have a choice in what we do.  It is at this point that I can’t help but smirk, because we really don’t have a choice.  We’re just a big dynamical (probably stochastic) system that is plunging along.  We may have the capability to experience and witness what is happening (a mystery which I actually have the privilege to think about for a living) but we don’t have control per se, as I wrote about recently.

Ecosystem ghosts

August 12, 2009

Olivia Judson’s blog post in the New York Times today talks about the fragility and robustness of ecosystems.  She talks about how we really don’t know what happens to an ecosystem when a single species goes extinct.  Can that species be restored or has another species taken over its niche?  Also, when an invasive species arrives, it can thrive or fail to thrive.  Mathematical models have found that the perturbation induced by such an arrival can cause another species to go extinct even if the invasive species itself also goes extinct.  These transient invaders are called ghosts.  She also talks about experimental ecosystems with single-cell organisms.  In these artificial settings, the equilibrium states are generally composed of a small number of organisms, and ghosts can cause established species to disappear.

Now, this brings me to something that has always puzzled me, which is why natural ecosystems are so varied and relatively robust when they are at the same time so susceptible to invasive species.  Examples include rabbits and cane toads ravaging Australia, zebra mussels clogging up the North American Great Lakes, and kudzu taking over the American southeast.  Clearly, these examples show that there were niches in the ecosystems that were not being exploited.  My guess is that if we wait long enough, these invaded ecosystems will eventually adjust and become varied again.  After all, these invasive species are held in check in their native habitats.  Thus, ecosystems may tend to evolve to a state with wide variety but also one that always leaves them vulnerable to attack.  Can we mathematically prove this?  The really interesting thing is that this fragile stability seems to require a large number of species, since experiments with small numbers tend to evolve to small communities.  Why is that?  What is the difference between a large system and a small system?  Is there a bifurcation or phase transition as you increase the size of the ecosystem?  Is there an analogy to economics or the brain?  This is why I’m so interested in large but finite interacting systems.  There seems to be something there that I just don’t understand.

The mass of humanity

June 26, 2009

Ever since Malthus, there has been a concern about overpopulation.  I thought it would be an interesting exercise to see how much space the human population actually takes up.  For example, how many oil tankers would it take to carry around the volume of humanity if converted to liquid?  Let’s say there are 6 billion people on the planet and the average mass per person is 100 kg (an overestimate).  Hence, an upper bound on the mass of humanity is $6\times 10^{11}$ kg, which we can round up to $10^{12}$ kg, or a billion metric tons.  Given that we are mostly water, this is about $10^{12}$ litres, or $10^9$ cubic metres.  Taking the cube root gives $10^3$ metres.  Thus, if we liquefied the mass of all humans, it would fit in a cube whose sides are a kilometre long.  The largest oil tankers can carry about five hundred thousand metric tons, so two thousand oil tankers could cart around all of humanity.  To put that into perspective, according to Wikipedia, the current fleet of oil tankers moves around 2 billion metric tons a year, so half the world’s fleet could carry around the world’s population.

Now, how much area would we take up if we were to stand side by side?  Let’s say 6 people can fit into a square metre of space; then we would all fit into a billion square metres, or 1000 square kilometres, about the size of Hong Kong (according to Wolfram Alpha), or we could all fit 4 to a square metre onto the island of Oahu in Hawaii.  If we each wanted about 100 square metres of space, then we would take up about 600,000 square kilometres, roughly the area of France.  Wolfram Alpha also tells me that there are about $1.5\times 10^7$ square kilometres of arable land in the world.  If we assume that a square kilometre can feed 1000 people (10 people per hectare), then that puts the capacity of the earth at 15 billion people.
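The arithmetic in these two paragraphs, as a short Python sketch (using the rounded $10^{12}$ kg upper bound on the total mass):

```python
people = 6e9
upper_mass_kg = 1e12                  # rounded upper bound on humanity's mass

# Liquefied volume at the density of water (1000 kg/m^3):
volume_m3 = upper_mass_kg / 1000
print(round(volume_m3 ** (1 / 3)))    # -> 1000: a cube 1 km on a side

# Number of 500,000-tonne supertankers needed:
print(round(upper_mass_kg / 1000 / 500_000))  # -> 2000

# Standing room at 6 people per square metre:
area_km2 = people / 6 / 1e6
print(round(area_km2))                # -> 1000 km^2, about Hong Kong

# Earth's capacity at 1000 people per km^2 of arable land:
print(round(1.5e7 * 1000 / 1e9))      # -> 15 billion people
```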