My Times column is on economic projections for the year 2100.
In the past 50 years, world per capita income roughly trebled in real terms, that is, corrected for inflation. If it continues at this rate (and globally the great recession of recent years was a mere blip), then it will be nine times as high in 2100 as it was in 2000, at which point the average person in the world will be earning three times as much as the average Briton earns today.
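For anyone who wants to check the compounding, here is a minimal sketch in Python; the 50-year trebling is the column's premise, and the rest is arithmetic:

```python
# Sketch: what a trebling of real income every 50 years implies for 2100.
# The trebling premise is the column's; everything else is arithmetic.
TREBLE_PERIOD_YEARS = 50

annual_multiplier = 3 ** (1 / TREBLE_PERIOD_YEARS)        # ~1.0222
annual_rate = annual_multiplier - 1                       # ~2.2% a year

multiple_2000_to_2100 = annual_multiplier ** 100          # 3**2 = 9 exactly
print(f"implied annual growth rate: {annual_rate:.2%}")              # 2.22%
print(f"income multiple, 2000-2100: {multiple_2000_to_2100:.1f}x")   # 9.0x
```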
I make this point partly to cheer you up on Easter Monday about the prospects for your great-grandchildren, partly to start you thinking about what that world would be like if it were to happen, and partly to challenge those who say with confidence that the future will be calamitous because of climate change or environmental degradation. The curious thing is that they can only predict disaster by assuming great enrichment. But perversely, the more enrichment they predict, the greater the chance (they also predict) that we will solve our environmental problems.
Past performance is no guide to future performance, of course, and a well-aimed asteroid could derail any projection. But I am not the one doing the extrapolating. In 2012, the Intergovernmental Panel on Climate Change (IPCC) asked the Organisation for Economic Co-operation and Development (OECD) to generate five projections for the economy of the world, and of individual countries, in 2050 and 2100.
They make fascinating reading. The average per capita income of the world in 2100 is projected to be between three and 20 times what it is today in real terms. The OECD’s “medium” scenario, known as SSP2, also known as “middle of the road” or “muddling through”, sounds pretty dull. It is a world in which, in the OECD’s words, “trends typical of recent decades continue” with “slowly decreasing fossil fuel dependency”, uneven development of poor countries, delayed achievement of Millennium Development Goals, disappointing investment in education and “only intermediate success in addressing air pollution or improving energy access for the poor”.
And yet this is a world in which by 2100 the global average income per head has increased eight-fold [corrected from 13-fold] to $60,000 [corrected from $100,000 in the original article] (in 2005 dollars) compared with $7,800 today. Britain will have trebled its income per head. According to this middling scenario, the average citizen of the Democratic Republic of Congo, who today earns $300 a year, will then earn $42,000, or roughly what an American earns today. The average Indonesian, Brazilian or Chinese will be at least twice as rich as today’s American.
Remember this is in today’s money, corrected for inflation, but people will be spending it on tomorrow’s technologies, most of which will be cleverer, cleaner and kinder to the environment than today’s — and all for the same price. Despite its very modest assumptions, it is an almost unimaginable world: picture Beverly Hills suburbs in Kinshasa where pilotless planes taxi to a halt by gravel drives (or something equally futuristic). Moreover, the OECD reckons that inequality will have declined, because people in poor countries will have been getting rich faster than people in rich countries, as is happening now. All five storylines produce a convergence, though at different rates, between the incomes of poor and rich countries.
Can the planet survive this sort of utopian plutocracy? Actually, here it gets still more interesting. The IPCC has done its own projections to see what sort of greenhouse gas emissions these sorts of world would produce, and vice versa. The one that produces the lowest emissions is the one with the highest income per head in 2100 — a 16-fold increase in income but lower emissions than today: climate change averted. The one that produces the highest emissions is the one with the lowest GDP — a mere trebling of income per head. Economic growth and ecological improvement go together. And it is not mainly because environmental protection produces higher growth, but vice versa. More trade, more innovation and more wealth make possible greater investment in low-carbon energy and smarter adaptation to climate change. Next time you hear some green, doom-mongering Jeremiah insisting that the only way to avoid Armageddon is to go back to eating home-grown organic lentils cooked over wood fires, ask him why it is that the IPCC assumes the very opposite.
In the IPCC’s nightmare high-emissions scenario, with almost no cuts to emissions by 2100, they reckon there might be north of 4 degrees of warming. However, even this depends on models that assume much higher “climate sensitivity” to carbon dioxide than the consensus of science now thinks is reasonable, or indeed than their own expert assessment assumes for the period to 2035.
And in this storyline, by 2100 the world population has reached 12 billion, almost double what it was in 2000. This is unlikely, according to the United Nations: 10.9 billion is reckoned more probable. With sluggish economic growth, the average income per head has (only) trebled. The world economy is using a lot of energy, improvements in energy efficiency having stalled, and about half of it is supplied by coal, whose use has increased tenfold, because progress in other technologies such as shale gas, solar and nuclear has been disappointing.
I think we can all agree that this is a pretty unlikely future. It’s roughly like projecting forward from 1914 to a wealthy 2000 but with more people, lots more horse-drawn carriages and coal-fuelled steamships, and no clean-air acts. But the point is that making these sorts of assumption is the only way you can get to really high levels of carbon dioxide in 2100. And even so, remember, the average person is three times as rich. If the food supply had collapsed and fossil fuels had run out, then there would hardly be 12 billion people burning ten times as much coal and living like kings, would there? You cannot have it both ways.
These IPCC and OECD reports are telling us clear as a bell that we cannot ruin the climate with carbon dioxide unless we get a lot more numerous and richer. And they are also telling us that if we get an awful lot richer, we are likely to have invented the technologies to adapt, and to reduce our emissions, so we are then less likely to ruin the planet. Go figure.
[Post-script: Bjorn Lomborg arrives at similar conclusions – that the IPCC’s own figures show clearly that the cure is worse than the disease.]
My Saturday essay in the Wall Street Journal on resources and why they get more abundant, not less:
How many times have you heard that we humans are “using up” the world’s resources, “running out” of oil, “reaching the limits” of the atmosphere’s capacity to cope with pollution or “approaching the carrying capacity” of the land’s ability to support a greater population? The assumption behind all such statements is that there is a fixed amount of stuff—metals, oil, clean air, land—and that we risk exhausting it through our consumption.
“We are using 50% more resources than the Earth can sustainably produce, and unless we change course, that number will grow fast—by 2030, even two planets will not be enough,” says Jim Leape, director general of the World Wide Fund for Nature International (formerly the World Wildlife Fund).
But here’s a peculiar feature of human history: We burst through such limits again and again. After all, as a Saudi oil minister once said, the Stone Age didn’t end for lack of stone. Ecologists call this “niche construction”—that people (and indeed some other animals) can create new opportunities for themselves by making their habitats more productive in some way. Agriculture is the classic example of niche construction: We stopped relying on nature’s bounty and substituted an artificial and much larger bounty.
Economists call the same phenomenon innovation. What frustrates them about ecologists is the latter’s tendency to think in terms of static limits. Ecologists can’t seem to see that when whale oil starts to run out, petroleum is discovered, or that when farm yields flatten, fertilizer comes along, or that when glass fiber is invented, demand for copper falls.
That frustration is heartily reciprocated. Ecologists think that economists espouse a sort of superstitious magic called “markets” or “prices” to avoid confronting the reality of limits to growth. The easiest way to raise a cheer in a conference of ecologists is to make a rude joke about economists.
I have lived among both tribes. I studied various forms of ecology in an academic setting for seven years and then worked at the Economist magazine for eight years. When I was an ecologist (in the academic sense of the word, not the political one, though I also had antinuclear stickers on my car), I very much espoused the carrying-capacity viewpoint—that there were limits to growth. I nowadays lean to the view that there are no limits because we can invent new ways of doing more with less.
This disagreement goes to the heart of many current political issues and explains much about why people disagree about environmental policy. In the climate debate, for example, pessimists see a limit to the atmosphere’s capacity to cope with extra carbon dioxide without rapid warming. So, if economic growth continues, a continuing increase in emissions will eventually accelerate warming to dangerous rates. But optimists see economic growth leading to technological change that would result in the use of lower-carbon energy. That would allow warming to level off long before it does much harm.
It is striking, for example, that the Intergovernmental Panel on Climate Change’s recent forecast that temperatures would rise by 3.7 to 4.8 degrees Celsius compared with preindustrial levels by 2100 was based on several assumptions: little technological change, an end to the 50-year fall in population growth rates, a tripling (only) of per capita income and not much improvement in the energy efficiency of the economy. Basically, that would mean a world much like today’s but with lots more people burning lots more coal and oil, leading to an increase in emissions. Most economists expect a five- or tenfold increase in income, huge changes in technology and an end to population growth by 2100: not so many more people needing much less carbon.
In 1679, Antonie van Leeuwenhoek, the great Dutch microscopist, estimated that the planet could hold 13.4 billion people, a number that most demographers think we may never reach. Since then, estimates have bounced around between 1 billion and 100 billion, with no sign of converging on an agreed figure.
Economists point out that we keep improving the productivity of each acre of land by applying fertilizer, mechanization, pesticides and irrigation. Further innovation is bound to shift the ceiling upward. Jesse Ausubel at Rockefeller University calculates that the amount of land required to grow a given quantity of food has fallen by 65% over the past 50 years, world-wide.
Ecologists object that these innovations rely on nonrenewable resources, such as oil and gas, or renewable ones that are being used up faster than they are replenished, such as aquifers. So current yields cannot be maintained, let alone improved.
In his recent book “The View from Lazy Point,” the ecologist Carl Safina estimates that if everybody had the living standards of Americans, we would need 2.5 Earths because the world’s agricultural land just couldn’t grow enough food for more than 2.5 billion people at that level of consumption. Harvard emeritus professor E.O. Wilson, one of ecology’s patriarchs, reckoned that only if we all turned vegetarian could the world’s farms grow enough food to support 10 billion people.
Economists respond by saying that since large parts of the world, especially in Africa, have yet to gain access to fertilizer and modern farming techniques, there is no reason to think that the global land requirements for a given amount of food will cease shrinking any time soon. Indeed, Mr. Ausubel, together with his colleagues Iddo Wernick and Paul Waggoner, came to the startling conclusion that, even with generous assumptions about population growth and growing affluence leading to greater demand for meat and other luxuries, and with ungenerous assumptions about future global yield improvements, we will need less farmland in 2050 than we needed in 2000. (So long, that is, as we don’t grow more biofuels on land that could be growing food.)
But surely intensification of yields depends on inputs that may run out? Take water, a commodity that limits the production of food in many places. Estimates made in the 1960s and 1970s of water demand by the year 2000 proved to be gross overestimates: the world used half as much water in 2000 as experts had projected 30 years before.
The reason was greater economy in the use of water by new irrigation techniques. Some countries, such as Israel and Cyprus, have cut water use for irrigation by adopting drip irrigation. Combine these improvements with solar-driven desalination of seawater world-wide, and it is highly unlikely that fresh water will limit human population.
The best-selling book “Limits to Growth,” published in 1972 by the Club of Rome (an influential global think tank), argued that we would have bumped our heads against all sorts of ceilings by now, running short of various metals, fuels, minerals and space. Why did it not happen? In a word, technology: better mining techniques, more frugal use of materials, and, if scarcity causes price increases, substitution by cheaper material. The gold plating on computer connectors is now 100 times thinner than it was 40 years ago. The steel content of cars and buildings keeps on falling.
Until about 10 years ago, it was reasonable to expect that natural gas might run out in a few short decades and oil soon thereafter. If that were to happen, agricultural yields would plummet, and the world would be faced with a stark dilemma: Plow up all the remaining rain forest to grow food, or starve.
But thanks to fracking and the shale revolution, peak oil and gas have been postponed. They will run out one day, but only in the sense that you will run out of Atlantic Ocean one day if you take a rowboat west out of a harbor in Ireland. Just as you are likely to stop rowing long before you bump into Newfoundland, so we may well find cheap substitutes for fossil fuels long before they run out.
The economist and metals dealer Tim Worstall gives the example of tellurium, a key ingredient of some kinds of solar panels. Tellurium is one of the rarest elements in the Earth’s crust—one atom per billion. Will it soon run out? Mr. Worstall estimates that there are 120 million tons of it, or a million years’ supply altogether. It is sufficiently concentrated in the residues from refining copper ores, called copper slimes, to be worth extracting for a very long time to come. One day, it will also be recycled as old solar panels get cannibalized to make new ones.
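Taken at face value, Mr. Worstall’s two figures pin down the consumption rate they imply. This sketch uses only the numbers quoted above:

```python
# Back out the consumption rate implied by the tellurium figures above.
reserve_tons = 120e6       # Worstall's estimate of extractable tellurium
years_of_supply = 1e6      # "a million years' supply"

implied_tons_per_year = reserve_tons / years_of_supply
print(f"implied consumption: {implied_tons_per_year:.0f} tons/year")   # 120

# Even at 100 times that rate of use, the same reserve would last
# 10,000 years on this arithmetic.
print(f"at 100x demand: {reserve_tons / (implied_tons_per_year * 100):,.0f} years")
```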
Or take phosphorus, an element vital to agricultural fertility. The richest phosphate mines, such as those on the island of Nauru in the South Pacific, are all but exhausted. Does that mean the world is running out? No: There are extensive lower-grade deposits, and if we get desperate, all the phosphorus atoms put into the ground over past centuries still exist, especially in the mud of estuaries. It’s just a matter of concentrating them again.
In 1972, the ecologist Paul Ehrlich of Stanford University came up with a simple formula called IPAT, which stated that the impact of humankind was equal to population multiplied by affluence multiplied again by technology. In other words, the damage done to Earth increases the more people there are, the richer they get and the more technology they have.
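In symbols (a reference rendering of the standard identity, not a quotation from Ehrlich):

```latex
% The IPAT identity:
I = P \times A \times T
% I: environmental impact, P: population,
% A: affluence (consumption per person),
% T: technology (impact per unit of consumption)
```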
Many ecologists still subscribe to this doctrine, which has attained the status of holy writ in ecology. But the past 40 years haven’t been kind to it. In many respects, greater affluence and new technology have led to less human impact on the planet, not more. Richer people with new technologies tend not to collect firewood and bushmeat from natural forests; instead, they use electricity and farmed chicken—both of which need much less land. In 2006, Mr. Ausubel calculated that no country with a GDP per head greater than $4,600 has a falling stock of forest (in density as well as in acreage).
Haiti is 98% deforested and literally brown on satellite images, compared with its green, well-forested neighbor, the Dominican Republic. The difference stems from Haiti’s poverty, which causes it to rely on charcoal for domestic and industrial energy, whereas the Dominican Republic is wealthy enough to use fossil fuels, subsidizing propane gas for cooking fuel specifically so that people won’t cut down forests.
Part of the problem is that the word “consumption” means different things to the two tribes. Ecologists use it to mean “the act of using up a resource”; economists mean “the purchase of goods and services by the public” (both definitions taken from the Oxford dictionary).
But in what sense is water, tellurium or phosphorus “used up” when products made with them are bought by the public? They still exist in the objects themselves or in the environment. Water returns to the environment through sewage and can be reused. Phosphorus gets recycled through compost. Tellurium is in solar panels, which can be recycled. As the economist Thomas Sowell wrote in his 1980 book “Knowledge and Decisions,” “Although we speak loosely of ‘production,’ man neither creates nor destroys matter, but only transforms it.”
Given that innovation—or “niche construction”—causes ever more productivity, how do ecologists justify the claim that we are already overdrawn at the planetary bank and would need at least another planet to sustain the lifestyles of 10 billion people at U.S. standards of living?
Examine the calculations done by a group called the Global Footprint Network—a think tank founded by Mathis Wackernagel in Oakland, Calif., and supported by more than 70 international environmental organizations—and it becomes clear. The group assumes that the fossil fuels burned in the pursuit of higher yields must be offset in the future by tree planting on a scale that could soak up the emitted carbon dioxide. A widely used measure of “ecological footprint” simply assumes that 54% of the acreage we need should be devoted to “carbon uptake.”
But what if tree planting wasn’t the only way to soak up carbon dioxide? Or if trees grew faster when irrigated and fertilized so you needed fewer of them? Or if we cut emissions, as the U.S. has recently done by substituting gas for coal in electricity generation? Or if we tolerated some increase in emissions (which are measurably increasing crop yields, by the way)? Any of these factors could wipe out a huge chunk of the deemed ecological overdraft and put us back in planetary credit.
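To see how much rides on that single assumption, combine the WWF figure quoted at the top of this essay (a footprint of 1.5 Earths) with the 54% carbon-uptake share. The subtraction below is my illustration, not the Global Footprint Network’s:

```python
# How much of the claimed 'ecological overdraft' is the carbon-uptake
# assumption? Both inputs are figures quoted in the essay.
footprint_earths = 1.5        # WWF: "50% more resources than the Earth
                              # can sustainably produce"
carbon_uptake_share = 0.54    # share of footprint assigned to "carbon uptake"

remainder = footprint_earths * (1 - carbon_uptake_share)
print(f"footprint excluding carbon uptake: {remainder:.2f} Earths")   # 0.69
# Below 1.0 Earths: on these figures the overdraft exists only if the
# tree-planting assumption is retained.
```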
Helmut Haberl of Klagenfurt University in Austria is a rare example of an ecologist who takes economics seriously. He points out that his fellow ecologists have been using “human appropriation of net primary production”—that is, the percentage of the world’s green vegetation eaten or prevented from growing by us and our domestic animals—as an indicator of ecological limits to growth. Some ecologists had begun to argue that we were using half or more of all the greenery on the planet.
This is wrong, says Dr. Haberl, for several reasons. First, the amount appropriated is still fairly low: About 14.2% is eaten by us and our animals, and an additional 9.6% is prevented from growing by goats and buildings, according to his estimates. Second, most economic growth happens without any greater use of biomass. Indeed, human appropriation usually declines as a country industrializes and the harvest grows—as a result of agricultural intensification rather than through plowing more land.
Finally, human activities actually increase the production of green vegetation in natural ecosystems. Fertilizer taken up by crops is carried into forests and rivers by wild birds and animals, where it boosts yields of wild vegetation too (sometimes too much, causing algal blooms in water). In places like the Nile delta, wild ecosystems are more productive than they would be without human intervention, despite the fact that much of the land is used for growing human food.
If I could have one wish for the Earth’s environment, it would be to bring together the two tribes — to convene a grand powwow of ecologists and economists. I would pose them this simple question and not let them leave the room until they had answered it: How can innovation improve the environment?
Latest Mind and Matter column is on why there is nothing so old as the recently new:
Watching friends learn kite-surfing last week, equipped not only with new designs of inflatable kites shaped like pterodactyls but also with new kinds of harnesses shaped like medieval chastity belts and even new helmets shaped like Elizabethan sleeping caps, it occurred to me that nothing becomes obsolete so fast as something new. For it is pretty clear that the rise of kite-surfing, invented in the late 1990s, is slowly killing wind-surfing.
Wind-surfing, invented in the 1970s, is not yet as moribund as the fax, which was invented at about the same time, but it may be heading that way. As recently as 2005, wind-surfers were scoffing at the upstart kite-surfers, arguing that their pastime was slower, more cumbersome and more dangerous. (I remember scoffing at email’s inferiority to fax in the days when you had to call to alert somebody to check for an incoming email.)
Now kite-surfing equipment packs smaller and costs less than wind-surfing’s, the skills are easier to learn, the speed is as great (greater in light winds) and it can be done on land in the form of kite-karting. People have already crossed hundreds of miles of ocean by kite-surfing, from the Canaries to Morocco, for example, and from Tasmania to Australia.
My point is that new technologies threaten young technologies more than they threaten ancient ones. Kite-surfing may kill wind-surfing, but it will not affect sailing. Email eclipsed fax more than it did letter-writing. Social networking is overtaking telephoning, but not partying. In the era of Kinect, Space Invaders is dead, but poker is thriving. In competition with jets, airships have largely died out, but ships have not. Refrigeration killed the newfangled ice trade, but old-fashioned pickling, smoking and curing continued.
It seems there is nothing so dated as the recently new. Visiting a university recently, I was shown a lecture theater that was state of the art when built just a few years before. Its proudest boast, an ethernet port at every seat, now sounds as obsolete in the wi-fi age as blotting paper and quill sharpeners.
In 1986, to celebrate the 900th anniversary of William the Conqueror’s Domesday Book, which documented every community in the Normans’ newly conquered England, the BBC began a project to redocument England on digital video. But it stored the data on videodiscs on Acorn microcomputers, both of which have long since become obsolete. As a result, the 1986 Digital Domesday Book quickly became far less accessible than the 1086 analog one, until rescued by a special academic research project to emulate the outdated software on a new platform in 2002.
This obsolescence of the new catches out politicians and educators. Government policy rarely ages as fast as when it contains pronouncements about new technology. Whereas debates about debt or defense from hundreds of years ago sound fairly familiar, the earnest promises of the early 1980s to establish strategic strengths in memory-chip manufacture sound quaint today. Europe’s 1990s policy fetishes for “teleworking,” “virtual reality” and interactive television have not aged well. When I was at school, there was a frantic push to teach us all the Fortran programming language so we could cope with the computer age. Latin lessons have survived; Fortran ones have not.
It follows that obsolescence more probably beckons for the things that have changed our life most recently, rather than for the things that are already old. My generation finds it hard to believe that email will die, but the young barely touch it, preferring Facebook, Twitter and text. I suspect that, rather than go extinct, email will evolve into something more compatible with text and social networking. And perhaps we may be permitted a wry smile at the certain prospect that the young will in turn be marooned with obsolete habits and terms like Facebook, Twitter and text.
David Middleton has an interesting essay on ocean pH here.
Like me he finds the literature replete with data suggesting that a realistic reduction in alkalinity caused by CO2 increases will do no net harm to marine ecosystems. For example:
A recent paper in Geology (Ries et al., 2009) found an unexpected relationship between CO2 and marine calcifiers. Eighteen benthic species were selected to represent a wide variety of taxa: “crustacea, cnidaria, echinoidea, rhodophyta, chlorophyta, gastropoda, bivalvia, annelida.” They were tested under four CO2/Ω-aragonite scenarios…
The effects on calcification rates for all 18 species were either negligible or positive up to 606 ppm CO2. Corals, in particular, seemed to like more CO2 in their diets…
This study alone gives the lie to the claim, made in the high priests’ critique of my Times article, that “no average net effect” can still mean harm, because it can mean (say) an ocean with more jellyfish and fewer corals. Shrimps, corals, starfish, red algae, green algae, snails, clams, and worms — all either unaffected or beneficially affected. Sounds pretty diverse to me.
What is really noticeable is that whereas the alarmists’ papers are full of words like “may” and “can” (for example,
The tolerance range and optimal value can be expected to vary between species, and may change through individual acclimation or genetically-driven selective adaptation. Nevertheless, increased frequency of extremes and/or relatively small changes in mean values can be damaging, with stress impacts that may affect health or reproduction…)
by contrast Middleton’s post is full of…graphs. Real data.
Here’s one that I’ve referred to before, showing that coccolithophores like higher CO2 levels.
This one was especially amazing. It’s a graph Middleton made from data used by a paper in Science.
Notice three things:
First, the calcification rate on the Great Barrier Reef has been rising, not falling, over recent centuries, at a time when CO2 levels have been rising. Just as I have been saying: CO2 dissolves in seawater to make bicarbonate (HCO3), and bicarbonate is the fuel corals use for calcification (the equilibria are sketched below). Other things being equal, CO2 fertilises coral growth.
Second, note that tiny little drop in the red line at the end. This is how the authors of the study describe that tiny little drop:
[Corals’] skeletal records show that throughout the GBR, calcification has declined by 14.2% since 1990, predominantly because extension (linear growth) has declined by 13.3%. The data suggest that such a severe and sudden decline in calcification is unprecedented in at least the past 400 years.
Third, note the drop-off in sample size near the end.
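For reference, the carbonate chemistry behind the first point (standard textbook equilibria, not anything from Middleton’s post):

```latex
% Dissolved CO2 forms carbonic acid, which dissociates to bicarbonate (HCO3-),
% the substrate corals draw on for calcification:
CO_2 + H_2O \rightleftharpoons H_2CO_3 \rightleftharpoons H^+ + HCO_3^-
```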
Truly, I am gobsmacked. As Middleton comments:
It is “cherry-picking” of the highest order, if that last data point really is the basis of this claim
(If it’s not, they are welcome to write in and say what is.)
The always perceptive Brendan O’Neill raises an important point about the Brisbane floods, which just may have been exacerbated by a collective institutional obsession with preparing for droughts caused by global warming (hat tip Bishop Hill).
It is worth looking at a document called ClimateSmart 2050, which was published in 2007 by the Queensland government. It outlines Queensland’s priorities for the next four decades (up to 2050) and promises to reduce the state’s greenhouse gas emissions by 60 per cent during that timeframe. The most striking thing about the document is its assumption that the main problem facing this part of Australia, along with most of the rest of the world, is essentially dryness brought about by global warming. It argues that “the world is experiencing accelerating climate change as a result of human activities”, which is giving rise to “worse droughts, hotter temperatures and rising sea levels”. We are witnessing “a tendency for less rainfall with more droughts”, the document confidently asserted.
As a consequence the government went on warning of water shortages even as the Wivenhoe dam got close to full, apparently forgetting that one of the dam’s jobs was to act as a flood shock absorber. As with British snow, the concern seems to have been asymmetric, suggesting that climate change is causing officials to forget that weather noise may still be far more important than climate signal, even in a slowly warming world.
O’Neill’s conclusion is characteristically wise:
This is not to say that “greens are to blame for Brisbane”. There’s no point joining the current clamour to find one evil person or one evil that can be held responsible for what is a very complex natural disaster. However, in a world in which the political elites increasingly spend their time fantasising over a future hot apocalypse, where it is fashionable to make Biblical predictions about mankind receiving a sweaty punishment for his wayward behaviour, it is worth raising the possibility at least that our priorities have become seriously skewed. Perhaps it is time for our leaders to come back down to Earth, and to address problems in the here and now, rather than endlessly moralising about man’s behaviour and its future impact on Mother Earth.
My latest Mind and Matter column in the Wall Street Journal is about parabolas, the evolution of throwing and angry birds:
The spectacular trajectory of the Angry Birds computer game, from obscure Finnish iPhone app to global ubiquity (there are board games, maybe even movies in the works), is probably inexplicable. Of course it’s cheap and charming, but such catapulting success must owe a lot to serendipitous, word-of-mouth luck. Yet, prompted by my friend Trey Ratcliff, who created the gaming-camera app 100 Cameras in 1, I’ve been musing on whether there’s an evolutionary aspect to its allure.
To play Angry Birds, you must use a catapult to lob little birds at structures in the hope of knocking them down on pigs. It’s the verb “lob” that intrigues me. There is something much more satisfactory about an object tracing a parabolic ballistic trajectory through space towards its target than either following a straight line or propelling itself.
Predicting parabolas is something humans just seem to find intriguing. How else do you explain golf? Or the awe in which we hold good quarterbacks in football and good spin bowlers in cricket? Our bodies are uniquely good at throwing things at targets. The trajectory must be prefigured in the brain before the projectile leaves the fingers. Our shoulders rotate, our scapulas slide, our pelvises pivot, our arms flex and our fingers extend.
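For reference, the curve being prefigured: ignoring air resistance, a projectile launched at speed v and angle theta follows the parabola

```latex
% Ballistic trajectory, drag neglected; v = launch speed, \theta = launch angle
y = x\tan\theta - \frac{g\,x^{2}}{2\,v^{2}\cos^{2}\theta}
```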
With the exception of the archer fish, which knocks insects off leaves with well-aimed jets of water, no other animal uses parabolic trajectories. A chameleon darts out its tongue in a straight line. A dog likes catching a ball but could not begin to throw it. No animal has a throwing limb like ours. A chimpanzee chucks rocks and branches when angry, but usually underarm and with the random aim of a human toddler. The closest any bird (angry or not) comes to throwing is the Egyptian vulture trying to break ostrich eggs by strewing rocks in their general direction. The parabolic ballista is ours alone.
Until 10,000 years ago, most or even all human beings relied on this talent for gathering at least some of their food: by killing it at a distance. With the arrow, the spear thrower, the blowpipe, the boomerang, the sling, the harpoon and the thrown rock, we were killing prey from fish to birds to mammoths. Not to mention each other.
The biologists Paul Bingham and Joanne Souza have argued that the ability to deal death at a distance was crucial to the development of society, because with a well-aimed throw it was now easy for even weak individuals to punish those who abused their position. Not for nothing did Damon Runyon call guns “equalizers.”
Yet throwing projectiles may be a feature of comparatively recent human evolution. For our own species, Homo sapiens, throwing rocks dates back only 80,000 years or so in Africa. By contrast, the skeletons of Neanderthals (our doomed cousins, who lived in Europe till 30,000 years ago) show a high frequency of fractures that roughly mirror the kinds of breaks rodeo riders sustain. This suggests that they were getting close to their big-game prey, stabbing horses, reindeer, bears and rhinos with spears, rather than throwing javelins at them.
Moreover, modern baseball pitchers show a characteristic backward displacement of the shoulder joint, usually only on one side. So do the skeletons of early modern European hunter-gatherers, according to Jill Rhodes and Steven Churchill of Duke University, but not Neanderthals. They apparently had no “throwing arm.”
Imagine how much keener the joy of the throw if the prize was food after a day of hunger. No wonder we still love to experience the thrill of a well-launched parabolic projectile-even a cartoon of an angry bird.
I had this article in the Times on 14 January:
The person who tips the world population over seven billion may be born this year. The world food price index hit a record high last month, according to the Food and Agriculture Organisation. Bad harvests in Russia and Australia, combined with rising oil prices, have begun to cause shortages, export bans and even riots. Does starvation loom?
No. Never has the world looked less likely to starve, or our grandchildren more likely to feed well. Never has famine been less widespread. Never has the estimated future peak of world population been lower.
It is true that the world population may pass seven billion some time in the next twelve months, but the rate of growth is decelerating. World population is now growing at just over 1% a year, down from roughly 2% in the 1960s. The actual number of people added to the world population each year has been dropping for more than 20 years.
This deceleration took demographers by surprise. As recently as 1980 many were still forecasting that the current century would see 15 billion people and rising. Only in 2002 did the United Nations realise that its models were wrong to assume that birth rates would not drop below 2 children per woman in many countries. Now the UN estimates that the population will most probably peak at 9.2 billion in about 2075 before starting a slow decline. Population quadrupled in the twentieth century; it will not even double in this.
Everywhere, the fall in the birth rate is dramatic. Countries like Iran and Sri Lanka now have total fertility rates below two children per woman. Bangladesh is now down to 2.7 from 6.8 in 1955. Nigeria’s birth rate has halved. These “demographic transitions” are proving as predictable as they are mysterious. They seem to happen because women stop fearing their babies will die, and because they move to cities, get educated, get access to birth control and get richer. In other words, the causes are benign; coercion, of the kind so many “experts” have long urged, is neither necessary nor helpful.
As for food prices, that “record high” is nothing of the kind – if you take inflation into account. Food prices are up in real terms since 2000, but they are still about 30% below the level in 1980 and 85% down since 1900. In terms of wages, the decline has been even steeper.
Despite a doubling of the population, global food production per head is 30% up on what it was in the 1950s.
Besides, the current spike in food prices is caused by prosperity, not desperation. Newly-rich Chinese and Indians are eating more meat, boosting demand for grain to feed livestock. Meanwhile still-rich Americans and Europeans are indulging their farmers and green activists by taking food and turning it into motor fuel, a policy that pushes up food prices, hurts taxpayers and encourages habitat destruction.
You can bet your farm that all over the northern hemisphere farmers are planting more acres this winter – that’s the effect high prices always have (and spare a nod of gratitude to speculators, whose antics bring forward those extra plantings). So food prices will drop again.
Farm yields have been marching upwards for decades and will continue to do so. In the past sixty years, the total harvest of the big three crops that provide the bulk of our calories – maize, wheat and rice – has trebled, yet the acreage planted has hardly changed.
This trend is going to continue partly thanks to low-tech changes already in the pipeline. Helped by Chinese investment, improved transport to get African crops to market with less waste will make a big difference. As will tractors, which boost production by 25% or so – because they free the land for human food that would otherwise be needed to feed bullocks or horses.
African farmers will start to use much more fertiliser, as western farmers do, which makes it possible to sustain yields without exhausting the soil. A few years ago environmentalists argued that fertiliser would soon run short, because it is made using natural gas, a fossil fuel. But the discovery of how to extract abundant shale gas has turned that argument on its head: there are probably many decades’ worth of natural gas now available to make fertiliser.
There are high-tech changes afoot too. Maize and rice that have been genetically modified to resist pests and use less water, soybeans with better amino acid balance for pig food, wheat that can resist rust – all these are coming. Benighted Europe may reject these GM crops for superstitious reasons but surely not for long. The environmental benefits alone are now stark: GM crops can be pest resistant without the use of sprays that kill harmless insect bystanders.
The more yields increase, the more land can be set aside from food production for reforestation and national parks. This is happening already. National parks are expanding steadily, and land that was once farmed is being returned to forest, especially in countries like Britain and America. That is a huge contrast to a century ago, when farming kept up with population only by expanding into new areas of steppe, pampas and prairie.
Don’t forget another factor. Carbon dioxide levels in the air are rising. CO2 is a raw material that plants use to make sugars, which is why many greenhouse owners pump CO2 over their crops to boost production. The results of more than 600 experiments with rice, wheat and soybeans exposed to the sort of carbon dioxide levels expected by 2050 (an extra 300 parts per million) all show remarkably consistent increases in yield of 30% or more. And the higher the CO2, the less water a plant loses in absorbing it, so water stress will ease too. Plus, if global warming happens, it is likely to produce more rainfall, so regions like the Sahel should continue to become greener, as the Sahel has in recent decades.
For all these reasons food production will probably continue to rise faster than population in the decades ahead. There will still be price spikes caused by bad weather or foolish policies, and there will be challenges: policies that encourage innovation cannot be taken for granted. Yet so long as trade is free and innovation flourishes, by 2050 it is easily possible that we can feed nine billion people with more and better food from less land.
The Edge’s Annual Question is a great compilation of brief effusions from science groupies like me. This year the question was: what scientific concept would improve everybody’s cognitive toolkit?
My answer was this:
Brilliant people, be they anthropologists, psychologists or economists, assume that brilliance is the key to human achievement. They vote for the cleverest people to run governments, they ask the cleverest experts to devise plans for the economy, they credit the cleverest scientists with discoveries, and they speculate on how human intelligence evolved in the first place.
They are all barking up the wrong tree. The key to human achievement is not individual intelligence at all. The reason human beings dominate the planet is not because they have big brains: Neanderthals had big brains but were just another kind of predatory ape. Evolving a 1200-cc brain and a lot of fancy software like language was necessary but not sufficient for civilization. The reason some economies work better than others is certainly not because they have cleverer people in charge, and the reason some places make great discoveries is not because they have smarter people.
Human achievement is entirely a networking phenomenon. It is by putting brains together through the division of labor – through trade and specialisation – that human society stumbled upon a way to raise the living standards, carrying capacity, technological virtuosity and knowledge base of the species. We can see this in all sorts of phenomena: the correlation between technology and connected population size in Pacific islands; the collapse of technology in people who became isolated, like native Tasmanians; the success of trading city states in Greece, Italy, Holland and south-east Asia; the creative consequences of trade.
Human achievement is based on collective intelligence – the nodes in the human neural network are people themselves. By each doing one thing and getting good at it, then sharing and combining the results through exchange, people become capable of doing things they do not even understand. As the economist Leonard Read observed in his essay “I, Pencil” (which I’d like everybody to read), no single person knows how to make even a pencil – the knowledge is distributed in society among many thousands of graphite miners, lumberjacks, designers and factory workers.
That’s why, as Friedrich Hayek observed, central planning never worked: the cleverest person is no match for the collective brain at working out how to distribute consumer goods. The idea of bottom-up collective intelligence, which Adam Smith understood and Charles Darwin echoed, and which Hayek expounded in his remarkable essay “The use of knowledge in society”, is one idea I wish everybody had in their cognitive toolkit.
Some of the other answers were great, including this from Sue Blackmore, of which this is an extract (and which should be compulsory reading for climate scientists):
The phrase “correlation is not a cause” (CINAC) may be familiar to every scientist but has not found its way into everyday language, even though critical thinking and scientific understanding would improve if more people had this simple reminder in their mental toolkit.
One reason for this lack is that CINAC can be surprisingly difficult to grasp. I learned just how difficult when teaching experimental design to nurses, physiotherapists and other assorted groups. They usually understood my favourite example: imagine you are watching at a railway station. More and more people arrive until the platform is crowded, and then – hey presto – along comes a train. Did the people cause the train to arrive (A causes B)? Did the train cause the people to arrive (B causes A)? No, they both depended on a railway timetable (C caused both A and B).
I soon discovered that this understanding tended to slip away again and again, until I began a new regime, and started every lecture with an invented example to get them thinking.
“Right”, I might say “Suppose it’s been discovered (I don’t mean it’s true) that children who eat more tomato ketchup do worse in their exams. Why could this be?” They would argue that it wasn’t true (I’d explain the point of thought experiments again). “But there’d be health warnings on ketchup if it’s poisonous” (Just pretend it’s true for now please) and then they’d start using their imaginations.
“There’s something in the ketchup that slows down nerves”, “Eating ketchup makes you watch more telly instead of doing your homework”, “Eating more ketchup means eating more chips and that makes you fat and lazy”. Yes, yes, probably wrong but great examples of A causes B – go on. And so to “Stupid people have different taste buds and don’t like ketchup”, “Maybe if you don’t pass your exams your Mum gives you ketchup”. And finally: “Poorer people eat more junk food and do less well at school”.
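A toy simulation makes Blackmore’s railway-timetable point concrete (the code and the variable names are my illustration, not hers): two variables can correlate strongly when a hidden third drives both.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# C is the hidden common cause (the "timetable"). A and B each track C,
# but neither causes the other.
timetable = rng.normal(size=n)                    # C
people = timetable + 0.3 * rng.normal(size=n)     # A: crowd on the platform
trains = timetable + 0.3 * rng.normal(size=n)     # B: train arrivals

r = np.corrcoef(people, trains)[0, 1]
print(f"corr(A, B) = {r:.2f}")   # ~0.92, with no causal link between A and B
```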
My Times column is on the relationship between science and technology, especially in the UK:
The chancellor, George Osborne, made a speech on science in Cambridge last week in which he contrasted Britain’s “extraordinary” scientific achievements with “our historic weakness when it comes to translating those scientific achievements into commercial gain”. It’s a recurring complaint in British science policy that we discover things, then others make money out of them.
Britain’s astonishing ability to gather scientific firsts — we are second only to the US in Nobel prizes — shows no sign of abating. We have won 88 scientific Nobel prizes, 115 if you add economics, literature and peace. This includes 12 in the past ten years and at least one in each of the past five years. But we filed fewer patents last year than the US, Japan, Germany, France, China or South Korea, and we have seen many British discoveries commercialised by others: graphene, DNA sequencing, the worldwide web, to name a few. So yes, we are good at science but bad at founding new industries.
The government’s response to this is to encourage clusters of high-tech companies, such as the one around Cambridge, with its 1,500 technology-based firms, and to support technology “catapults” to help infant industries to get going in eight technologies in which Britain has a potential lead. An example is the manufacturing centre in Coventry with its 3D printing expertise.
Most of the firms in the Cambridge cluster moved there to get close to the university; they did not spin out of the university. Technology tends to come from technology, and to use science to help the process along, rather than to be born of science. The idea that innovation happens because you put science in one end of the pipe and technology comes out the other end goes back to Francis Bacon, and it is largely wrong. History shows that (public) science is the daughter of (private) technology at least as much as it is the mother. Good universities recognise this and adjust their research programmes to what interests industry.
The steam engine led to the insights of thermodynamics, not vice versa. The dye industry drove chemistry. The centrifuge and X-ray crystallography, developed for the textile industry, led to the structure of DNA. DNA sequencing (a British technology) led to genomics (an international science). The development of mobile telephones, horizontal drilling for oil, and search engines owed almost nothing to university research. Sure, the firms that made these breakthroughs later went to universities in search of educated staff, and to help to solve problems through contracted research. But the breakthroughs owed less to the philosophical ruminations of scientists than to the tinkering of engineers.
Twenty years ago the economist Partha Dasgupta pointed out that the “republic of science”, with its insistence that results must be shared and that rewards come in the form of prizes and prestige, was very different from the privatised world of technology, where patents and profits were what mattered. In a paper with Paul David, he said: “Modern societies need to have both communities firmly in place and attend to maintaining a synergistic equilibrium between them.”
So perhaps the British problem is that we are good at the public bit, but not the private bit. Being good at science, we share our results with the world, rather than benefiting from them ourselves. It is not quite that simple. Terence Kealey, the vice-chancellor of Buckingham University, one of the foremost authorities on the economics of science, has made a strong case — recently buttressed with that badge of economic respectability, a mathematical model — that science is not a pure “public good”, like light from a lighthouse.
Although knowledge is shared among scientists, it is still not automatically accessible to “any passer-by or person of average curiosity”, say Professor Kealey and his co-author, Martin Ricketts. To join the conversation you need the tacit knowledge that comes from training in the particular field of science itself. And that’s why private firms are keen to cluster round Cambridge, to get the expertise and contacts, and to eavesdrop on the scientific chat in the university as well as in each other’s coffee rooms.
The mathematical model shows that there is a “pinch point” in research, where researchers need encouragement to start sharing knowledge with each other. There are lots of examples of government giving that encouragement and doing it well. So Professor Kealey says that inasmuch as Mr Osborne is creating new institutions of knowledge sharing and trust between previously separate entities (particular universities and industry), he is doing something that both theory and history show can be of benefit: it is one area where government action can be shown to have been a good thing.
Incidentally, history provides little support for the commonly held view that munificent funding of science by government results in faster economic growth. In the late 19th and early 20th century, France and Germany provided much public funding to science while Britain and America did not. The anglophone countries grew faster. And for every example of an unexpected spin-off from public funding of science — the worldwide web was invented by Sir Tim Berners-Lee so that physicists could share their results — there are plenty of cases of private funding having public effects. Sputnik, the pioneering Russian satellite, relied extensively on prewar research privately funded by Robert Goddard, supported by the Guggenheims.
In 2003 the OECD published a paper on “sources of growth in OECD countries” between 1971 and 1998, finding to its explicit surprise that, whereas privately funded research and development stimulated economic growth, publicly funded research had no economic impact whatsoever. It is possible that government spending on the wrong kind of science stops people working on the right kind of science as far as economic growth is concerned — too many esoteric projects of no interest to nearby industries.
Not that this means the public funding of science should cease. Given that the government takes close to half of GDP and spends it on many things, it would be a shame if none of that money found its way back to science, one of the great triumphs of our culture. The American physicist Robert Wilson, when asked in a congressional hearing during the cold war how a particle accelerator would contribute to national defence, replied that it had “nothing to do directly with defending our country except to help make it worth defending”.