

The United Floods of America: Cross-Checking Data on the Coming Deluge

A picture is worth a thousand words. The polar ice is melting, the glaciers are retreating. Whether or not you believe this influx of H2O is caused by anthropogenic global warming (although, seriously, it is), that won’t stop the steady rise in sea levels from here on out.

Source: http://www.globalwarmingart.com/wiki/File:Recent_Sea_Level_Rise_png

The inability to gaze into a crystal ball and see the future is what gives climate change deniers so much ammunition. While climate scientists are pretty sure what the future holds, the deniers are absolutely positive that no one knows the future. When it comes to sound bites, the deniers’ emphatic response trumps the scientists’ measured response every time. (It’s a good thing the millennials have been conditioned since Day 1 to think critically about the media; I’m counting on their rationality to help get us out of this mess.)

The point is, we need some graphic illustrations to bring home the dangers of climate change. A polar bear adrift on an iceberg may tug at the heartstrings, but it doesn’t mean much if you live in Florida:


Source: http://www.globalwarmingart.com/wiki/File:Florida_Sea_Level_Risks_png

Or Louisiana:

Source: http://www.globalwarmingart.com/wiki/File:Louisiana_Sea_Level_Risks_png

That’s why Nickolay Lamm’s digitally altered photographs are so amazing. Maybe you’ve seen these pictures. They combine sea level rise mapping data from Climate Central with photographs of beloved American landmarks and show how the creeping sea level will decimate our landscape. The 12-foot sea level rise he envisions inundates Liberty Island, leaving the Statue of Liberty alone above the waterline while the massive gift shop built in her honor sinks into the drink:


Here’s the Jefferson Memorial after a 12-foot rise in Atlantic sea level:


How about Boston?


And while we’re at it, here’s something for you West Coast people—AT&T Park in San Francisco, after the Giants become a synchronized swim team:


These Photoshopped Pictures of Doom represent the worst-case scenario of sea levels rising 12 feet, which is projected to happen in about 200 years under the climate models used by Climate Central, an independent, nonprofit organization staffed by highly regarded scientists and journalists dedicated to disseminating climate facts to the general public. So, obviously, the pictures are for illustrative purposes only—no one expects a major league stadium to last more than a couple of decades, let alone a couple of centuries.

In addition to these images, Climate Central offers interactive maps of coastal states that show threats from sea level rise and storm surge in every coastal town from Portland, Maine to Galveston, Texas on one coast, and from Seattle to San Diego on the other. The maps show how many people currently live in areas threatened by different levels of flooding. It’s important to note that pure sea level rise differs from the threat of tidal and storm surges, like those seen with Hurricane Sandy. The ocean doesn’t have to rise 12 feet for serious Katrina-like destruction; basically, most coastal areas near sea level are at risk for major damage during serious weather events, which will become more frequent as the oceans warm. But you already knew that.

However, a little fact-checking is in order here. How does Climate Central’s data stack up against the data from the National Oceanic and Atmospheric Administration (NOAA)? You can see for yourself at NOAA’s Sea Level Rise and Coastal Flooding Impacts page, which projects only up to a 6-foot sea level rise, versus Climate Central’s 12 feet. To hedge its bets, NOAA’s map also includes a Mapping Confidence function, which shows the statistical likelihood that a given area will be inundated at each rise in sea level. Upshot: Fort Lauderdale—it’s time to look at inland real estate.
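If you’re wondering what that Mapping Confidence layer actually does, the core idea is simple: compare each spot’s ground elevation to the projected water level, and flag anything within the elevation data’s margin of error as uncertain. Here’s a toy sketch of that logic in Python; the little elevation grid and the one-foot error value are invented for illustration, not taken from NOAA’s actual methodology.

import numpy as np

# Toy elevation grid in feet above the current sea level. These numbers are
# invented for illustration; real viewers use lidar-based elevation models.
elevation_ft = np.array([
    [1.0, 2.5, 4.0, 8.0],
    [0.5, 3.0, 5.5, 9.0],
    [2.0, 4.5, 7.0, 12.0],
])

sea_level_rise_ft = 6.0  # the scenario being mapped
vertical_error_ft = 1.0  # hypothetical uncertainty in the elevation data

# High confidence: below the new water level even after allowing for the error.
# Low confidence: within the error band, so the call could go either way.
high_confidence = elevation_ft <= sea_level_rise_ft - vertical_error_ft
low_confidence = (elevation_ft > sea_level_rise_ft - vertical_error_ft) & (
    elevation_ft <= sea_level_rise_ft + vertical_error_ft
)

print("Likely inundated (blue on the NOAA map):")
print(high_confidence)
print("Uncertain (yellow):")
print(low_confidence)

Swap in a real digital elevation model and a real error estimate and you have, in spirit, the blue-and-yellow maps below.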

Look at Pompano Beach, just north of Fort Lauderdale. It sits on the Atlantic Intracoastal Waterway, and tons of housing developments have been built there so that each abode is on a canal. At a 6-foot sea level rise, nearly all of this community of over 100,000 people is underwater with a “high degree of confidence,” according to the NOAA data (areas in blue are “high degree of confidence”; areas in yellow are “low degree of confidence”):


Looking at a similar area from Climate Central’s map, the data is hard to read:


I think the areas underwater are those in the “mapped in area,” and everything in white (i.e., the upper right-hand corner) is safe. But it’s hard to tell.

Furthermore, when trying to reconcile Lamm’s Photoshopped image of the Statue of Liberty with the mapped data from Climate Central, things get fuzzy. Here’s the mapped image with a sea level rise of 5 feet around Liberty Island:

I think this means that the whole island is underwater, but I’m not sure. You’d think they would somehow indicate that the island is underwater but Lady Liberty herself is not. Here’s the map at a 4-foot sea-level rise:

I take this to mean that the island is largely “safe,” but the docks are gone.

Here’s the data from NOAA with a 6-foot sea level rise at Liberty Island, which seems to coincide fairly well with the 4-foot sea level rise from Climate Central, although NOAA’s map doesn’t have as high a resolution:


None of this means that Climate Central’s data is inaccurate, only that it is a bit hard to read. Thus, if you want scare tactics, Lamm’s photos are the way to go. If you want to drill down into extremely granular data vetted by experts, NOAA can’t be beat. A corollary of this exercise is that despite politicians who yammer on about how the “jury’s still out” regarding climate change, many large government agencies have been dealing with the reality of global warming for many years and will continue to do so no matter who’s in office.

One question I have as a Michigander is the degree to which the Great Lakes will be affected by rising sea levels. So far, neither NOAA nor Climate Central has addressed this issue; their maps focus solely on the East, West, and Gulf coasts. I think it all comes down to uncertainty: many more factors are involved in water levels in the Great Lakes than in the comparatively simple prognostication of pinpointing when Venice Beach will disappear. I take solace in this; it’s comforting to live amidst such a large supply of fresh water, even if “fresh” these days is a relative term.

Kathy Wilson Peacock is a writer, editor, nature lover, and flaneur of the zeitgeist. She favors science over superstition and believes that knowledge is the best super power. Favorite secret weapon: A library card.

Posted on: July 8, 2014, 10:30 am Category: Current Issues

That Hamburger = Enough Water to Fill a Swimming Pool

You’re already pretty savvy about the environmental significance of water. You know that fresh water is a finite resource and that Americans use a lot of it for things that don’t matter in the grand scheme of evolution—vast tracts of suburban lawns, fountains in Las Vegas, etc.

You turn the faucet off when you brush your teeth and perhaps even contribute to charities that bring clean water to poor people in developing countries. You eat low on the food chain because it’s better for your own health as well as the health of the planet. However, a good barbecue every now and then is a beautiful thing.

Now it’s time to learn about your water footprint. Like your carbon footprint, which measures the greenhouse gas output of your lifestyle (how far you drive, how much energy you consume, etc.), your water footprint measures how much water use you are responsible for, not only for obvious things like toilet flushes and showers (this is known as your direct water footprint), but also for the things you consume and buy (known as your indirect water footprint). How much water did it take to grow that orange you’re eating? How much water was involved in the industrial processes that resulted in that Prius you’re driving?

The part of your indirect water footprint that you have the most control over is what you eat. No surprise here—a pound of meat takes oodles more water to produce than a single orange. But just how much more? Here’s where you can scour the Internet for a meaningful infographic and not come up with anything truly satisfying. There’s this:

It’s factual, but not very interesting. What we need is something that really drives home the point. So I decided to put together my own infographic that visually represents the water needed to grow various foods, using data from the U.S. Geological Survey (USGS).

Here it is:

Source: U.S. Geological Survey, http://ga2.er.usgs.gov/edu/activity-water-content.cfm

Now that’s information! Basically, the water footprint of a slice of bread, an orange, or a cup of coffee is infinitesimal next to the giant stomp of a hamburger. If those gallons of hamburger water were used for a swimming pool instead, you could do the backstroke in them.

Furthermore, to make sure the graph fit on this page, I used the lowest water estimate for hamburger. The USGS gives hamburger a water footprint of between 4,000 and 18,000 gallons of water for a single patty—hold the bacon, hold the cheese. This huge variance depends on where in the world the cattle are raised and numerous other factors. Let’s assume this burger is the product of a steer from a run-of-the-mill Texas cattle ranch, and not some pampered Kobe steer from Japan.

But where are the other foods? Yeah, chicken is there, at a relatively modest 500 gallons for a pound of meat, and the egg clocks in next to the chicken at 50 gallons—if you squint, maybe you can see it. That’s 150 gallons of water for a brunch-sized omelet. But where are coffee (35 gallons per cup), orange juice (13 gallons per glass), and bread (10 gallons per slice)?

Well, friends, they’re there. I entered the numbers into Excel myself. But that’s the glory of this infographic: The gap between staples like bread and orange juice on one side and red meat on the other is astronomical, even though a gallon of orange juice and a pound of hamburger are nearly equivalent in price. Their water footprint is not reflected in what we pay for them.
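If Excel isn’t your thing, the same chart falls out of a few lines of Python. This is just a sketch that re-plots the figures quoted in this post, using the low-end 4,000-gallon estimate for the hamburger, as my version does; matplotlib is the only dependency.

import matplotlib.pyplot as plt

# Gallons of water per item, as cited above (USGS estimates; the hamburger
# uses the low end of its 4,000-18,000 gallon range).
footprints = {
    "Slice of bread": 10,
    "Glass of orange juice": 13,
    "Cup of coffee": 35,
    "Egg": 50,
    "Pound of chicken": 500,
    "Hamburger (low estimate)": 4000,
}

fig, ax = plt.subplots(figsize=(8, 4))
ax.barh(list(footprints.keys()), list(footprints.values()))
ax.set_xlabel("Gallons of water")
ax.set_title("Water footprint of common foods")
fig.tight_layout()
plt.show()

Plot it on a log scale if you want to see the bread and orange juice bars at all; on a linear scale the burger flattens everything else, which is exactly the point.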

While water footprint statistics are certainly handy for the environmentally conscious among us, they also have real and significant ramifications on a geopolitical level. Each country or region has finite water resources, or water budgets, that may or may not be sufficient for its population to feed itself and live sustainably. What does this mean for a country like China, for instance, which has transformed hundreds of millions of people from vegetarians (out of necessity) into carnivores (out of preference) as its economic fortunes have skyrocketed—even as it remains one of the most water-scarce countries in the world?

The water footprint is a handy concept that allows us to see the ramifications of our choices, both as individuals and as nations in the global community. For more information on the water footprints of different kinds of food, check out this chart from Waterfootprint.org.

Kathy Wilson Peacock is a writer, editor, nature lover, and flaneur of the zeitgeist. She favors science over superstition and believes that knowledge is the best super power. Favorite secret weapon: A library card.

Posted on: June 25, 2014, 3:00 pm Category: Current Issues

The Megadrought Is Now

Forget the polar vortex, the record snowfalls in the East, and the abandoned cars on Atlanta’s iced-over freeways. The real news, by far, is the drought in the West. Yet unless you live in the middle of it, you’re probably only tangentially aware of the crisis, because it doesn’t fit snugly into the 24-hour news cycle. The drought built over three years before reaching full force in 2013; it’s a silent, slow-rolling tragedy rather than a blink-of-an-eye disaster, and it’s got California in its crosshairs:

This is the March 18, 2014 map from the USDA: See that brick-red splotch over central California? That’s “exceptional drought,” which is as high as the scale goes. The surrounding fire-engine red is “extreme drought.” All told, 38 million people live in California, and their access to water is at risk. Moreover, that brick-red region produces the lion’s share of milk, fruits, vegetables, and nuts consumed in the United States. California’s almond crop alone requires 100 billion gallons of water each year.

Here’s the national picture from the National Climatic Data Center, which, oddly, doesn’t seem to reflect the magnitude of the massive snowstorms in the Northeast this winter:

California, when you’re done with the moisturizer, pass it on to Arizona, New Mexico, and Nebraska (Texas, I’m assuming you have your own).

But how do we evaluate this picture in historical terms? According to the Guardian’s climate change reporter Andrew Freedman:

Longer-running records indicate the 13-month drought, which is part of a 3-year dry period, is equal to or worse than any other short-term drought and is among the top 10 worst droughts to hit California in the past 500 years, based on tree-ring records and instrument data. The drought is part of a broader Western drought that has lasted for roughly 13 years, raising the specter of a modern-day “megadrought” akin to events that doomed some ancient civilizations.

A megadrought is one that lasts two or more decades, or one that inflicts economic damage severe enough to trigger mass migration. Most megadroughts are associated with La Niña conditions in the Pacific Ocean. They are rare events: Even the Dust Bowl of the 1930s is not considered a megadrought. However, some scientists, including Lynn Ingram, have uncovered evidence that the 20th century was California’s wettest century in 1,300 years. During that time, dams were constructed to divert water, and population and industry developed around a supply of water far above the geographical norm—to say nothing of launching an unprecedentedly massive agriculture industry smack dab in the middle of a desert. Ingram, the author of The West Without Water: What Past Floods, Droughts, and Other Climatic Clues Tell Us about Tomorrow, believes the current drought could be the start of a dry period lasting as long as a century.

Blame It on La Niña

Most climatologists blame La Niña for the current drought. Although anthropogenic climate change may exacerbate the situation and make summer heat waves worse, overall the drought seems to have a more natural explanation. La Niña has coincided with all the major droughts of the past 100 years, some of which lasted up to 10 years, and this one is no exception. La Niña and its counterpart, El Niño, together form the Southern Oscillation climate pattern, in which the surface temperature of the eastern central Pacific Ocean varies by several degrees. A lower-than-normal ocean temperature produces La Niña; a higher-than-normal temperature produces El Niño. Typically, La Niña brings droughts to the West and cool, rainy summers to the Midwest. El Niño has the opposite effect: wet winters in the Southwest, including California, and warm, dry winters in most other places in the contiguous United States. This raises the question: Can we count on El Niño to end the drought?

Answer: Maybe. Some signs indicate that an El Niño may be triggered by the late summer of 2014 and last through next winter; the National Weather Service Climate Prediction Center gives this a 49 percent chance of happening. However, a strong El Niño will provide more relief than a mild El Niño, and there is no way to predict its strength even if it does appear.
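For the record, “La Niña conditions” isn’t a vague label. Forecasters classify the phase from sea surface temperature anomalies in the east-central Pacific, with a cutoff of roughly half a degree Celsius either way (that’s the convention behind NOAA’s Oceanic Niño Index). Here’s a minimal sketch of that bookkeeping; note that an official designation also requires the anomaly to persist across several overlapping three-month seasons, which this toy version ignores.

def enso_phase(sst_anomaly_c: float) -> str:
    """Label a Pacific sea surface temperature anomaly (degrees C vs. the
    long-term average) using the conventional +/-0.5 degree cutoff."""
    if sst_anomaly_c <= -0.5:
        return "La Niña (drought risk in the West)"
    if sst_anomaly_c >= 0.5:
        return "El Niño (wetter winters in the Southwest)"
    return "Neutral"

for anomaly in (-1.2, -0.3, 0.0, 0.8):
    print(f"{anomaly:+.1f} °C -> {enso_phase(anomaly)}")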

Until Then. . . .

California Governor Jerry Brown declared a state of emergency on January 17, 2014. The declaration requires enacting emergency plans in the event of drinking water shortages, hiring more firefighters, and launching a campaign urging citizens to reduce their water usage by 20 percent. In February, President Obama pledged $183 million in federal funds to ease the state’s water woes, with the majority earmarked for livestock disaster assistance and some $60 million dedicated to food banks.

Kathy Wilson Peacock is a writer, editor, nature lover, and flaneur of the zeitgeist. She favors science over superstition and believes that knowledge is the best super power. Favorite secret weapon: A library card.

Posted on: June 10, 2014, 6:00 am Category: Current Issues

The World’s Biggest Science Project: An Artificial Star in France

Let’s say you want to solve the world’s energy crisis. You decide to use hydrogen, a fuel source that is unlimited in quantity and inexpensive to obtain. You subject the hydrogen to a process that creates so much energy that we can close down our polluting coal mines forever and forget about the geopolitical consequences of fossil fuels. Best of all, your process generates no pollutants. Say goodbye to smog, acid rain, and greenhouse gas emissions. Before you book your flight to Oslo to accept your Nobel Peace Prize, however, consider that a bunch of scientists have already beaten you to it. The technique in question is nuclear fusion, and for years it has eluded scientists, who could not create plasma hot and dense enough to produce a net gain of energy. But that is close to changing.


Isotopes of hydrogen collide during fusion to create helium and energy.


Fusion is the physical process that powers the stars, and creating a machine to do the same thing on Earth is the goal of the International Thermonuclear Experimental Reactor (ITER), which is currently under construction in the South of France and is the largest scientific endeavor the world has ever known. Scientists from 35 countries have devoted decades of research to the project, and if all goes well, within 10 years or so they’ll flip the switch and the particles inside the giant tokamak, a doughnut-shaped device, will ramp up to 200 million degrees Celsius while being contained by magnets cooled to -269 degrees Celsius. The plasma will be hotter than the surface of the sun, resulting in the first sustained power-producing fusion reactor in the world, which could pave the way for further reactors that could produce terawatts of power with no radioactive waste for the next 30 million years, give or take.
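If you want a feel for why fusing hydrogen is worth all this trouble, the arithmetic is short: the helium nucleus and neutron that come out of a deuterium-tritium reaction weigh slightly less than the ingredients that went in, and the missing mass shows up as energy via E = mc². Here’s a back-of-the-envelope check using standard atomic masses; it’s a generic physics calculation, not anything specific to ITER’s engineering.

# Mass-energy bookkeeping for the D + T -> He-4 + n reaction.
# Atomic masses in unified atomic mass units (standard reference values).
m_deuterium = 2.014102
m_tritium = 3.016049
m_helium4 = 4.002602
m_neutron = 1.008665

MEV_PER_U = 931.494  # energy equivalent of one atomic mass unit, in MeV

mass_defect_u = (m_deuterium + m_tritium) - (m_helium4 + m_neutron)
energy_mev = mass_defect_u * MEV_PER_U

print(f"Mass defect: {mass_defect_u:.6f} u")
print(f"Energy released per reaction: {energy_mev:.1f} MeV")  # about 17.6 MeV

Roughly 17.6 MeV per reaction sounds tiny, but multiplied over the number of nuclei in even a gram of fuel, it dwarfs any chemical reaction.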


A cut-away schematic of the ITER tokamak.


Why had I never heard of this? I asked myself as I read Raffi Khatchadourian’s profile of ITER in the March 3, 2014 issue of the New Yorker. You’d think that such a spectacular endeavor would be common knowledge, since it operates under no special Manhattan Project-like veil of secrecy. You’d think that skeptical activists would be taking their protests to social media, convinced either that the project is a waste of money or that it will blow up the world. But the reality is much more mundane, and possibly more insidious. As Khatchadourian explains, the real culprit is politics:

ITER was first proposed in 1985, during a tense summit in Geneva between Ronald Reagan and Mikhail Gorbachev. . . . Since then, the cooperation has expanded to include the European Union, China, Japan, South Korea, and India. . . . No partner has full control, and there is no over-all central budget. Each country makes its primary contribution in the form of finished components, which the ITER organization will assemble in France. The arrangement could serve as a model for future collaboration—or as one to avoid.

Because there are so many cooks in the kitchen, ITER’s progress will be fraught with power plays and controversies. In a February 28, 2014 article for Science, Daniel Clery reported on an assessment of ITER that “found serious problems with the project’s leadership, management, and governance. . . . ITER leaders fear that the damning assessment . . . could cause backers to pull their funding.” It was easier to land a man on the moon than it will be to create a sun on Earth. But not necessarily more expensive: The lowest estimate for ITER is $20 billion, and the Apollo program came in around $109 billion—all of which was paid for by U.S. taxpayers, which is not the case for ITER, whose costs will be spread across the 35 participating nations. Optimists will consider this a small price to pay for the sheer knowledge the program produces.

The United States’ portion of ITER is based at Oak Ridge National Laboratory in Tennessee, the original “secret city” of the Manhattan Project. The Princeton Plasma Physics Laboratory and the Savannah River National Laboratory are also partners in the U.S. portion of the project. The tasks they have taken on include the design of the tokamak shield, the cooling water systems, electron cyclotron heating transmission lines, and exhaust processing systems.

You can find lots of great animated videos that explain how ITER and the tokamak will work, and in these cases a picture is worth way more than 1,000 words.

Kathy Wilson Peacock is a writer, editor, nature lover, and flaneur of the zeitgeist. She favors science over superstition and believes that knowledge is the best super power. Favorite secret weapon: A library card.

Posted on: March 11, 2014, 6:00 am Category: Current Issues

Climate Change and Chocolate

If melting polar ice doesn’t make you take climate change seriously, then how about the rising cost of chocolate? Those in the candy biz are forecasting a global chocolate shortage, brought on by rising consumption in developing countries, a decline in cocoa output due to adverse growing conditions, and the clamor for cocoa-intense dark chocolate among discerning consumers. Sounds like a recipe for disaster.

According to a study by Euromonitor International cited in the Boston Globe, chocolate prices in the United States are forecast to rise 45 percent in 2014 over 2012 prices. In September 2013, chocolate reached an all-time high of $12.25 a kilogram, itself a 45 percent increase over 2007 prices. While consumers can expect to pay more for their candy bars, manufacturers may also change their ingredients to keep costs down. They may substitute less expensive—and more problematic—ingredients for those that are hard to come by. Palm oil instead of cocoa butter may be a common swap. The problem is that palm oil is high in saturated fat, the kind that’s bad for you and raises your cholesterol. Artificial additives and fillers are also likely to make up a larger percentage of the ingredients in lower-quality chocolate products. Caveat emptor, purists.

Theobroma cacao plant with mature seed pods.


What’s Behind the Low Production?

Cacao trees (Theobroma cacao), like the Coffea plants responsible for our favorite caffeinated beverage, are grown in monocultures, mostly by subsistence farmers who lack the resources to invest in the fertilizers and other crop enhancements that would protect their crops and raise their yields. Impoverished Ivory Coast produces 40 percent of the world’s cocoa, with Nigeria, Cameroon, Ghana, and Indonesia following behind. In Ivory Coast specifically, a lack of rain over the past several seasons has led to a lower-than-normal harvest and thus higher prices.

Even under optimal growing conditions, many cacao farmers also lack the knowledge of sustainable growing practices and pest management techniques that are common to monoculture elsewhere. Thus, their cacao beans are susceptible to pests and disease, including a nasty condition called witches’ broom.

Cacao beans in the pod, before being roasted and processed into cocoa butter, cocoa powder, and cocoa liquor and delivered to your mouth to satisfy your craving.


Though recent price hikes are due to demand and short-term bad weather, the big picture remains grim. Farmers in Indonesia have been clear-cutting forests to grow oil palms and harvest palm oil, which has led to habitat destruction for the orangutan (among other plant and animal species), and many farmers in Ivory Coast have switched from cacao to rubber farming because it provides higher and more stable returns. In the long run, however, climate change may wreak havoc on the industry. The ideal conditions for growing the cacao plant are narrow, which explains why most of the world’s cacao comes from a handful of countries. Moving production to other areas isn’t the greatest idea, says Rachel Cernansky of Treehugger: “the ideal conditions for cocoa-growing will shift to higher altitudes—but most of West Africa is relatively flat, so there is not a lot of land at higher elevation to move to. But even where there is higher land, establishing new cocoa-producing areas could trigger the clearing of forests and important habitats for flora and fauna. Which means, yes, exacerbating climate change even further.”

Enter Science

But beleaguered cacao farmers aren’t entirely on their own. Researchers from the International Center for Tropical Agriculture (CIAT), led by Peter Laderach, have assessed likely future scenarios using conservative climate models and summarized their findings in their 2011 report Predicting the Impact of Climate Change on the Cocoa-Growing Regions in Ghana and Cote d’Ivoire.

Here’s the color-coded version of what they found:

Map of Ghana and Ivory Coast showing suitability for cocoa production in 2011.

Map of Ghana and Ivory Coast showing suitability for cocoa production in 2011.

Same map showing climate suitability for cocoa production in 2050 according to the researchers' climate prediction modeling.

Same map showing climate suitability for cocoa production in 2050 according to the researchers' climate prediction modeling.

The resolution on these maps is poor, but you get the idea: The amount of green from the first image to the second declines drastically. That’s a warning cry for action to prevent the decline of an important commodity. Doing nothing would be bad for business, and hardly an option for corporations like Nestle, Mars, and Hershey, whose continued success depends on a robust cocoa industry.

The report’s recommendations are hardly earth-shattering: It suggests that farmers adopt drought-resistant varieties of plants and install irrigation systems; provide shade for their plants; actively prevent bushfires; and diversify their crops with plants that will adapt to the changing environment, including oranges, cashews, and the dreaded oil palm. The researchers also recommend that scientists concentrate on developing drought-tolerant cocoa and become active in helping nations develop sound agricultural policy. Finally, the researchers suggest that governments extend credit to cocoa farmers, enabling them to change their practices and promote sustainability.

As usual, the headlines are eye-catching—“Global Chocolate Crisis Looms”—but the solution is much more mundane. Just about every commodity these days has entered a “crisis” phase, which in the end is nothing more than the law of supply and demand at work.

Kathy Wilson Peacock is a writer, editor, nature lover, and flaneur of the zeitgeist. She favors science over superstition and believes that knowledge is the best super power. Favorite secret weapon: A library card.

Posted on: February 25, 2014, 6:00 am Category: Current Issues