The Rational Optimist has been short-listed for the Samuel Johnson prize for the best non-fiction book of 2010.
My latest Mind and Matter column for the Wall Street Journal is on how the future turns out:
Last month a crash dummy flew to 5,000 feet above ground level in a personal jet pack. The inventor, New Zealander Glenn Martin, has spent decades on the project and is ready to start selling the device for $100,000 each next year. The gasoline-driven machine can stay aloft for 30 minutes, thanks to what is, in effect, a pair of large leaf-blowers. A parachute provides partial reassurance if something should go wrong.
Mr. Martin’s achievement is a reminder that, though we often underestimate the progress of a technology, sometimes we overestimate it. Back in the 1950s it seemed almost obvious that by the 21st century jet packs would be ubiquitous and routine aids to travel. They featured in sci-fi novels and comics and television series like “Lost in Space.” A time-traveler who arrived from that era might be impressed by our Internet and mobile phones but amazed at our lack of working jet packs.
Several jet packs were indeed under development by the late 1950s. One of them, called the Jump Belt, used compressed nitrogen. Another, the Aerojet, used compressed hydrogen peroxide. A few years later, Bell Aerospace’s Rocket Belt, also using hydrogen peroxide, seemed the most promising and briefly took James Bond off the ground in “Thunderball.”
But the Bell Rocket Belt now gathers dust in the Smithsonian’s Air and Space Museum. Its exceedingly short flight time, 20 seconds, made it both impractical and unsafe. Unlike airplanes, jet packs cannot land gracefully when the power fails. Soon even the comic-book heroes of the future had begun to do without jet packs.
Just who will buy Mr. Martin’s version is still unclear. He is pinning his hopes on emergency workers who need to get somewhere in a hurry. Good luck with that.
There’s a general point here. Though communication has advanced beyond the wildest dreams of futurologists of the 1950s, transportation has underperformed. Our time-traveling visitor would be shocked at the absence of routine space travel and the general scarcity of helicopters in civilian life. He would be surprised to find that our cars aren’t really that different in mechanism or speed from the things with fins that he drove. In vain would you boast of electric windows and catalytic converters.
The reason for transportation’s relative technological stagnation, compared with communication, is diminishing returns, which have blunted our ability to squeeze more out of technology. The fuel needed to fly at supersonic speeds is prohibitively expensive, as the Concorde proved. It has been said that if the car had experienced the sort of cost reductions that computing has found through Moore’s Law (which says that the density of transistors on a chip doubles every two years), it would travel to the moon and back on a teaspoon of gasoline.
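The arithmetic behind that quip is easy to check. Here is a rough sketch in Python; the 30-miles-per-gallon baseline, the 1965 starting date and the lunar distance are my own illustrative assumptions, not figures from the column:

```python
# Back-of-envelope check of the "teaspoon of gasoline" quip.
# All inputs are illustrative assumptions: a baseline car doing 30 miles
# per US gallon, with efficiency doubling every 2 years since 1965.

TSP_PER_US_GALLON = 768            # 128 fl oz per gallon * 6 tsp per fl oz
MOON_ROUND_TRIP_MILES = 478_000    # roughly 239,000 miles each way

doublings = (2012 - 1965) / 2                     # 23.5 doublings
miles_per_tsp = (30 / TSP_PER_US_GALLON) * 2 ** doublings

print(f"After {doublings} doublings: {miles_per_tsp:,.0f} miles per teaspoon")
print(f"Moon round trip: {MOON_ROUND_TRIP_MILES:,} miles")
# ~460,000 miles per teaspoon, which is about a round trip to the moon.
```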
An elderly person alive in 1950 would have witnessed extraordinary changes in the mode, speed and availability of transport: cars, motorcycles, powered flight, jet engines, rockets, outboard motors, electric trains, hovercraft, helicopters and supersonic speed. By contrast, the telephone and telegraph had barely changed since he was born; radio and talkies had been around since his youth; only television and (yawn) telex were true novelties. (That the satellite would do more for communication than for transportation in the years ahead was emblematic of what was to come.)
Which technologies today are about to hit the brick wall of diminishing returns, and which are poised for sudden price collapses?
The big question is over health care. In recent times it has tended to invent effective but expensive new procedures (medical jet packs, though more useful) such as surgery, scanning and radiotherapy. But some think it might be on the brink of finding cheaper therapies through regenerative medicine and genome sequencing. The latter’s cost is down by 99% in about a decade, far faster than Moore’s Law.
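To see why that outpaces Moore’s Law, compare the implied halving times; a quick sketch (the round numbers are mine):

```python
import math

# Moore's Law: cost halves every 2 years, so a decade brings
# 2**5 = 32-fold cheapening, i.e. roughly a 97% cost reduction.
moore_factor = 2 ** (10 / 2)

# Sequencing: a 99% fall in ~10 years is a 100-fold cheapening,
# implying a halving time of 10 / log2(100), about 1.5 years.
seq_factor = 100
seq_halving_years = 10 / math.log2(seq_factor)

print(f"Moore's Law over a decade: {moore_factor:.0f}x cheaper")
print(f"Sequencing over a decade:  {seq_factor}x cheaper, "
      f"halving every {seq_halving_years:.1f} years")
```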
Here is an op-ed I wrote for today’s Australian newspaper:
POLLYANNA is a fool; Cassandra was wise. As a self-proclaimed “rational optimist” who argues that the world has been getting better for most people and that the future is likely to be better still, I am up against a deep prejudice towards pessimism that dominates the intelligentsia. As John Stuart Mill put it, “not the man who hopes when others despair, but the man who despairs when others hope, is admired by a large class of persons as a sage”.
What is more, pessimism has become a hallmark of the Left, chiefly because it justifies activism. Once upon a time conservatives lamented the way the world had gone to the dogs since the golden age (and some still do), while socialists championed growth, technology and innovation to liberate the working class.
Today, infected by Malthusian ecology, the Left relentlessly preaches millennial doom and technological risk: the climate is heading for catastrophe; resources are running out; population is growing too fast; farming cannot keep up; habitat is being destroyed; poverty, hunger, pollution, disease and greed are only going to get worse. A dramatic change in human stewardship of the planet is needed.
The evidence suggests that these predictions are likely to be wrong. Based on the trajectory of the past five decades, and even (or especially) if the world economy grows rapidly, this century is likely to see mild climate change, cheap and abundant resources, falling population, ample food, more wilderness, and the average person becoming gradually – though erratically – wealthier, healthier, happier, cleverer, cleaner, freer, kinder, more peaceful and more equal. Each of the past five decades has almost certainly seen records set for each of those adjectives for the world as a whole.
Yet the pessimism monster is irrepressible. No matter how many scares are proved wrong, the next set of dispatches of doom is treated with the same reverential respect.
Remember what the media said about the Y2K computer bug? “This is not a prediction, it is a certainty: there will be serious disruption in the world’s financial services industry . . . It’s going to be ugly” (The Sunday Times); “10 per cent of the nation’s top executives are stockpiling canned goods, buying generators and even purchasing handguns” (New York Times); “Army Fears Civil Chaos From Millennium Bug: Armed Forces Gearing Up To Deal With Civil Chaos” (Canada’s Globe and Mail). In the event nothing happened, but the media were soon saying the same thing about the next scare.
There’s a broad constituency for pessimism. No pressure group ever got donations by telling its donors calamity was unlikely; no reporter ever got his editor’s attention by saying that a scare was overblown; and no politician ever got on television by downplaying doom.
What is more, pessimism demands that “something must be done”, providing the excuse for businessmen and bureaucrats to hatch a plot against the public’s purse. Prophecies of doom can be profitable.
In its environmental incarnation, a pessimistic view of the world, because it diagnoses that things are going wrong, demands a top-down re-ordering of the world economy. Last January at Davos, no less a figure than UN Secretary-General Ban Ki-moon described the world economic model as a “global suicide pact” that would result in disaster if it were not reformed: “We need an environmental revolution”.
Governments all round the world are interfering with markets to try to bring about this environmental revolution. One of the policies they have adopted has taken 5 per cent of the world’s grain crop and turned it into biofuel to power motor vehicles. This has driven up food prices, increased malnutrition and encouraged the destruction of rain forest, while enriching farmers.
Yet, given that the planting and harvesting of biofuels use about as much oil as the fuels they displace, it has had precisely zero effect on carbon emissions. Nonetheless, it is considered a green, progressive policy.
Another policy is to bribe rich landowners to festoon the most picturesque landscapes with concrete pads on which are placed gargantuan steel towers topped with wind turbines containing two-tonne magnets made of an alloy of neodymium, a rare earth metal mined in Inner Mongolia by a process of boiling in acid that produces poisoned lakes filled with mildly radioactive and toxic tailings.
The cost of this policy is borne by ordinary electricity users and their would-be employers. So far, the wind industry’s contribution to cutting carbon emissions is precisely zero, because it provides less than 0.5 per cent of world energy use and even that has to be offset by keeping fossil fuel plants running for when the wind does not blow.
Oh, and wind turbines have killed so many white-tailed eagles in Norway, wedge-tailed eagles in Tasmania and golden eagles in California that local populations of the species are in increased danger of extinction. And this is a green, “clean”, progressive policy?
The biofuel and wind industries are now powerful commercial lobbies. Optimists, by contrast, have less excuse to interfere, so they cannot build constituencies of vested interests.
Market champions believe the best way to make the world rich, clean and safe is to let people trade and innovate by encouraging the international mobility of goods, services, people, ideas and technology.
The consequence would be: “great improvement in the overall health and social conditions of the majority of people, energy and mineral resources abundant . . . because of rapid technical progress, which reduces the resources needed to produce a given level of output and increases the economically recoverable reserves, rapid technological progress [that] ‘frees’ natural resources currently devoted to provision of human needs for other purposes which increases ecologic resilience”.
Is this a quote from some starry-eyed free-market zealot? No. It is the official description of one of the model economic scenarios devised by the UN Intergovernmental Panel on Climate Change to calculate carbon emissions in the 21st century. It is the scenario that produces the greatest prosperity in the most sustainable way, and by no means the highest emissions.
In other words, Ban Ki-moon’s very own organisation admits a golden future awaits the world if we trust the world’s free economic model and stop interfering. Since that would never do, pessimism prevails over common sense.
My latest Mind and Matter column in the Wall Street Journal is on cancer and evolution by natural selection:
Last week the American Cancer Society reported that death rates from cancer are falling steadily, at an annual rate of about 1.9% in men and 1.5% in women. A study published this week by the University of Colorado found that most seniors diagnosed with breast cancer live long enough to die of something else.
Prevention explains much of the decline in cancer fatalities, especially the drop in smoking. As for treatment, the most promising new options harness the very force that makes cancer so stubbornly virulent in the first place: evolution.
Adjusted for age, the incidence of some cancers has also been falling, contradicting the expectation widespread in the 1960s and 1970s that cancer rates would surge because of chemical pollution and the use of pesticides. In thrall to this view, Wilhelm Hueper, the mentor of Rachel Carson, refused to accept that lung cancer was caused by smoking. He said the data “unmistakingly suggest that cigarette smoking is not a major factor in the causation of lung cancer” and “it would be most unwise at this time to base future preventive measures of lung-cancer hazards mainly on the cigarette theory.”
Better treatments have played at least some role in the cancer death rate’s recent improvement. Surgery, radiation therapy and chemotherapy save or prolong many lives, though they’re often very blunt instruments that kill tumors only slightly faster than they kill patients.
A new generation of more effective and less vicious treatments is starting to make a difference, too. Imatinib (marketed as Gleevec), the first therapy that targets a specific enzyme expressed by cancer cells, rather than just killing cells that are dividing as most chemotherapy does, has now been saving lives for 10 years and is approved for 10 kinds of cancer. Others, such as vemurafenib for melanoma, are joining in.
Yet, by comparison with the rapid progress in rich countries against death from infectious disease, respiratory disease, heart disease and stroke, success against cancer is still far too slow and attritional. The chief reason for this is evolution. Because a tumor consists of a bunch of cells competing for the body’s resources by growing and dividing, and because it indulges in massive genetic trial and error, it experiences extremely rapid natural selection. Throw a drug at a tumor and you selectively benefit those of its cells that happen to be the most resistant to the drug. That is why most cancer treatments start out so well and then gradually lose the battle.
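The dynamic is easy to caricature in code. Below is a toy simulation, emphatically not a biological model: a drug that kills 60% of sensitive cells per cycle, a rare pre-existing resistant clone that it barely touches, and surviving cells dividing. Every number is an illustrative assumption of mine. The tumor collapses at first, then rebounds almost entirely resistant:

```python
# Toy caricature of natural selection for drug resistance in a tumor.
# All parameters are illustrative assumptions, not clinical values.

sensitive = 1_000_000.0   # drug-sensitive cells when treatment starts
resistant = 100.0         # rare pre-existing resistant clone (1 in 10,000)
DRUG_KILL = 0.6           # per-cycle death rate of sensitive cells under the drug
RES_KILL = 0.1            # the drug barely touches resistant cells
GROWTH = 1.5              # per-cycle division factor for surviving cells

for cycle in range(31):
    total = sensitive + resistant
    if cycle % 5 == 0:
        print(f"cycle {cycle:2d}: {total:12,.0f} cells, {resistant / total:6.1%} resistant")
    sensitive *= (1 - DRUG_KILL) * GROWTH   # net factor 0.6: shrinks each cycle
    resistant *= (1 - RES_KILL) * GROWTH    # net factor 1.35: grows each cycle
```

The treatment looks like a triumph for a dozen cycles; then the survivors it selected take over, which is exactly the pattern described above.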
Logically, therefore, medicine needs to harness evolution to its cause, too. The “organ” that exploits natural selection most effectively is the immune system, which uses trial and error to find suitable antibodies to counter infectious diseases. So the growth of immunotherapy for cancer seems to be a promising angle, especially in the form of therapeutic cancer vaccines.
Dendreon’s Provenge is the first to be approved, for advanced prostate cancer. The physician extracts cells from each patient’s own immune system, selectively “trains” them to attack prostate-cancer cells, then multiplies them and reintroduces them into the body. The “training” is, in effect, a form of evolution by natural selection, in which the cells multiply if they can attack an antigen specific to prostate cancers. Provenge gives patients a 50% greater chance of being alive after three years.
This year, a large-scale clinical trial begins for Prima BioMed’s CVac, which takes a similar approach to ovarian cancer. Cells from the patient’s blood are matured into dendritic cells. Those dendritic cells that can train killer cells to go after mucin-1, a protein on the surface of ovarian cancer cells, are selected and then reintroduced under the patient’s skin.
Cancer vaccines are not magic bullets, and many have failed in clinical trials after promising early results. But their most hopeful feature is that they are beginning to use cancer’s greatest defense, evolution by trial and error, against it.
Walter Russell Mead is always worth reading. Now he has written a two-part essay on Al Gore and the climate debate (part one; part two) that is, I think, very perceptive. It is angry, hard-hitting, and I don’t agree with everything in it, but it somehow gets to the core of the issue in a way that so much other commentary has not. This is the sort of old-fashioned polemic from somebody with historical perspective that has been lacking on this subject. Here’s his conclusion:
The green movement’s core tactic is not to “hide the decline” or otherwise to cook the books of science. Its core tactic is to cloak a comically absurd, impossibly complex and obviously impractical political program in the authority of science. Let anyone attack the cretinous and rickety construct of policies, trade-offs, offsets and bribes by which the greens plan to govern the world economy in the twenty-first century, and they attack you as an anti-science bigot.
New evidence has been published that the Great Barrier Reef is not in trouble from climate change. The effects of bleaching are short-lived and reversible. When I said this in my book, I was patronised from a great height by a bunch of marine biologists in New Scientist. Will they, and New Scientist, now apologise? As I keep saying, coral reefs are indeed under threat from man-made problems — pollution, overfishing, run-off — but climate change is the least of their worries. Here’s the abstract of Osborne et al.’s paper in PLoS ONE:
Coral reef ecosystems worldwide are under pressure from chronic and acute stressors that threaten their continued existence. Most obvious among changes to reefs is loss of hard coral cover, but a precise multi-scale estimate of coral cover dynamics for the Great Barrier Reef (GBR) is currently lacking. Monitoring data collected annually from fixed sites at 47 reefs across 1300 km of the GBR indicate that overall regional coral cover was stable (averaging 29% and ranging from 23% to 33% cover across years) with no net decline between 1995 and 2009. Subregional trends (10-100 km) in hard coral were diverse with some being very dynamic and others changing little. Coral cover increased in six subregions and decreased in seven subregions. Persistent decline of corals occurred in one subregion for hard coral and Acroporidae and in four subregions in non-Acroporidae families. Change in Acroporidae accounted for 68% of change in hard coral. Crown-of-thorns starfish (Acanthaster planci) outbreaks and storm damage were responsible for more coral loss during this period than either bleaching or disease despite two mass bleaching events and an increase in the incidence of coral disease. While the limited data for the GBR prior to the 1980’s suggests that coral cover was higher than in our survey, we found no evidence of consistent, system-wide decline in coral cover since 1995. Instead, fluctuations in coral cover at subregional scales (10-100 km), driven mostly by changes in fast-growing Acroporidae, occurred as a result of localized disturbance events and subsequent recovery.
Here’s what I wrote in my book.
Take coral reefs, which are suffering horribly from pollution, silt, nutrient runoff and fishing – especially the harvesting of herbivorous fishes that otherwise keep reefs clean of algae. Yet environmentalists commonly talk as if climate change is a far greater threat than these, and they are cranking up the apocalyptic statements just as they did wrongly about forests and acid rain. Charlie Veron, an Australian marine biologist: ‘There is no hope of reefs surviving to even mid-century in any form that we now recognise.’ Alex Rogers of the Zoological Society of London pledges ‘an absolute guarantee of their annihilation’. No wiggle room there. It is true that rapidly heating the water by a few degrees can devastate reefs by ‘bleaching’ out the corals’ symbiotic algae, as happened to many reefs in the especially warm El Niño year of 1998. But bleaching depends more on rate of change than absolute temperature. This must be true because nowhere on the planet, not even in the Persian Gulf where water temperatures reach 35C, is there a sea too warm for coral reefs. Lots of places are too cold for coral reefs – the Galapagos, for example. It is now clear that corals rebound quickly from bleaching episodes, repopulating dead reefs in just a few years, which is presumably how they survived the warming lurches at the end of the last ice age. It is also apparent from recent research that corals become more resilient the more they experience sudden warmings. Some reefs may yet die if the world warms rapidly in the twenty-first century, but others in cooler regions may expand. Local threats are far more immediate than climate change.
My latest Mind and Matter column in the Wall Street Journal:
Driving home the other day it occurred to me that almost none of the greenery I could see (trees, garden shrubs, grass shoulders on the highway) was going to be used by humans for food, fuel, clothing or shelter.
That would not have been true 500 years ago. The roadside grass would have fed horses, the trees would have supplied firewood and their acorns would have fed pigs. Although the England of my day has 10 times the population of Tudor times, there’s more greenery for wild nature now than there was then.
Fossil fuels have well known disadvantages, but this is one of their easily overlooked benefits. By substituting oil and coal for horses and firewood, we have relieved the pressure on greenery to supply our needs. By using gas to make fertilizer, we can feed ourselves from a smaller acreage, leaving more acres for other species.
A professor in Vienna named Helmut Haberl has been investigating this phenomenon under the acronym HANPP, or human appropriation of net primary production (a fancy phrase for greenery or biomass). He concludes that human beings currently appropriate for themselves and their domestic animals 14.2% of the world’s greenery, including farms, forests, swamps, grasslands and scrub but excluding the oceans. We destroy or prevent another 9.6% of greenery from growing, by paving or over-grazing; 76.2% remains for nature to use.
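As a budget, those three shares partition the whole; restating the paragraph above as a sanity check (category labels are mine, the percentages are his):

```python
# Haberl's partition of terrestrial net primary production (NPP),
# restating the percentages quoted above; category labels are mine.
npp_share = {
    "appropriated by humans and their domestic animals": 14.2,
    "destroyed or prevented (paving, over-grazing)": 9.6,
    "left over for the rest of nature": 76.2,
}
assert abs(sum(npp_share.values()) - 100.0) < 1e-9  # shares sum to 100%
for use, pct in npp_share.items():
    print(f"{pct:5.1f}%  {use}")
```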
So there is still some headroom for the human enterprise. But I see an even more encouraging result hidden in Mr. Haberl’s numbers. He finds that the most industrialized parts of the globe do not necessarily have the largest impact on the biomass of natural ecosystems, even when you take into account their importation of primary production from other places.
The reason is that rich economies tend to boost plant growth, especially on farms, through fertilizer and irrigation. In some cases they do so to such effect that even a large human appropriation still leaves lots for other wildlife. Italy, for example, has more forest cover than it did 50 years ago, yet produces more food.
Mr. Haberl’s native Austria more than doubles its land’s production of greenery over what it would otherwise be. Consequently, since 1830 Austria has actually reduced the proportion of biomass it pinches for human use even while increasing its consumption. Through irrigation, other places, like the Nile delta, produce more biomass for wildlife than would grow without human intervention, even after people have appropriated much of the biomass for themselves.
Here’s a homely example. Some of the birds that visit my garden feed in fields-on seeds, shoots, worms and insects. Man-made fertilizer boosts the quantity of that food, and therefore of birds, whose droppings fall in my garden, transferring that fertility to an ecosystem that will not be harvested, just admired.
(Illustration: John S. Dykes)
This means that so long as energy and water are abundant, humans might eventually aspire to make global appropriation net-negative. That is to say, our grandchildren could live wealthy consumer lifestyles consuming huge quantities of plant growth, yet actually increasing the amount of plant growth available to wildlife above what it would naturally be.
How would we get there? First, we would stop using the landscape for biofuels, a catastrophic policy mistake, and help rural Africans to switch from charcoal to kerosene or solar stoves. Second, we’d have to find abundant cheap energy with a small land footprint. That means gas, nuclear and maybe solar. Because nitrogen fertilizer is made by a reaction between natural gas and air, abundant gas means abundant fertilizer.
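For the chemically minded, the route from gas to fertilizer is the standard industrial one: steam reforming of methane to make hydrogen, then Haber-Bosch synthesis of ammonia. In outline:

```latex
\begin{align*}
\mathrm{CH_4 + H_2O} &\rightarrow \mathrm{CO + 3\,H_2}  && \text{(steam reforming of natural gas)}\\
\mathrm{CO + H_2O}   &\rightarrow \mathrm{CO_2 + H_2}   && \text{(water-gas shift, more hydrogen)}\\
\mathrm{N_2 + 3\,H_2} &\rightarrow \mathrm{2\,NH_3}     && \text{(Haber-Bosch ammonia synthesis)}
\end{align*}
```

The nitrogen comes free from the air; the hydrogen and the energy come from the gas, which is why fertilizer tracks the gas price.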
Nic Lewis’s discovery of a statistical alteration applied by the IPCC lends strong support to lukewarming
As most people know, I am a lukewarmer — somebody who accepts carbon dioxide’s full greenhouse potential, but does not accept the much more dubious evidence for net positive feedbacks on top, and who therefore thinks that a temperature rise of more than 2C in this century is unlikely.
This view just got a strong boost. Nic Lewis, the indefatigable mathematical sleuth who helped expose the mistakes in a paper about Antarctic temperature trends, has been looking at how the IPCC estimates climate sensitivity — that is, the warming expected for a doubling of CO2. He finds that the one study that estimated sensitivity entirely from experimental data — Forster and Gregory 2006 — was distorted by the IPCC when it came to presenting the results. The distortion was the imposition of a Bayesian “uniform prior” in a way that statisticians say is wholly inappropriate, because it effectively assumes a priori that strong warming is more probable than the data imply. Yet you do not even need the statistics to see the problem: it is not acceptable to take a published result, alter its graph and disclose the alteration only in an obscure footnote. A published result is a published result.
The effect was to fatten the tail of the graph, making a warming of more than 2C look much more probable.
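The mechanics are worth sketching. Forster and Gregory estimated the climate feedback parameter Y from observations, with roughly normal uncertainty; sensitivity is then S = F2x / Y, where F2x ≈ 3.7 W/m² is the forcing from doubled CO2. Restating that result under a prior uniform in S rather than in Y multiplies the probability density by a factor proportional to S², which is exactly what fattens the upper tail. Below is a Monte Carlo sketch of the effect; the Y = 2.3 ± 0.7 W/m²/K spread and the 10C truncation are illustrative choices of mine, not the paper’s or the IPCC’s exact numbers:

```python
import random

random.seed(0)
F2X = 3.7                  # W/m^2, forcing from doubled CO2
Y_MEAN, Y_SD = 2.3, 0.7    # feedback parameter Y in W/m^2/K (illustrative)
S_MAX = 10.0               # truncation of the uniform-in-S prior

# Sample the observationally constrained Y, convert to sensitivity S.
samples = []
for _ in range(200_000):
    y = random.gauss(Y_MEAN, Y_SD)
    if y > F2X / S_MAX:            # keeps S = F2X / y within (0, S_MAX]
        samples.append(F2X / y)

# Original presentation: the density implied by the data alone.
p_orig = sum(s > 2 for s in samples) / len(samples)

# Uniform-in-S restatement: reweight each sample by the Jacobian
# factor s^2, then renormalize, which inflates the high-S tail.
total_w = sum(s * s for s in samples)
p_restated = sum(s * s for s in samples if s > 2) / total_w

print(f"P(S > 2C), as published:           {p_orig:.2f}")
print(f"P(S > 2C), uniform-in-S restated:  {p_restated:.2f}")
```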
I defy you to look at that graph — the green one — and tell me that a temperature rise of more than 2C is not “unlikely” according to that study. I defy you to look at the graph — the blue one — and not conclude that whoever drew it had better have a very good argument for fattening the tail compared with what the authors had originally published.
Nic has found that the IPCC did much the same to most of the other estimates of climate sensitivity, which rely mostly on models. This mistake is central to the IPCC’s case, not peripheral. It undermines the credibility of the case for urgent action against climate change and strongly supports the argument that, other things being equal, CO2 doubling will not cause more than a mild and net beneficial warming.
Here’s Nic’s first paragraph:
The IPCC Fourth Assessment Report of 2007 (AR4) contained various errors, including the well publicised overestimate of the speed at which Himalayan glaciers would melt. However, the IPCC’s defenders point out that such errors were inadvertent and inconsequential: they did not undermine the scientific basis of AR4. Here I demonstrate an error in the core scientific report (WGI) that came about through the IPCC’s alteration of a peer-reviewed result. This error is highly consequential, since it involves the only instrumental evidence that is climate-model independent cited by the IPCC as to the probability distribution of climate sensitivity, and it substantially increases the apparent risk of high warming from increases in CO2 concentration.
Frank Dikotter’s fine — and vital — book on Mao’s Great Famine won the Samuel Johnson prize. But you can see a short film and a discussion about my book on the BBC Culture show here (from minute 17.17 onwards). It’s an honour to have made it to the …