Death by Degrees

A common refrain in the climate change literature, both popular and professional, is the impending doom of human society from human-induced global warming. A temperature threshold is even suggested that will tip the world into “dangerous climate change”: 2°C (3.6°F) over pre-industrial temperature, “an upper limit beyond which the risks of grave damage to ecosystems, and of non-linear responses, are expected to increase rapidly”. The U.S. National Research Council claims that such a threshold has been passed and “animals and other species are already struggling to keep up with rapid climate shifts, increasing the risk of mass extinction that would rival the end of the dinosaurs.” But this has not always been the view of scientists working on the climate. Not long ago geologists thought that the Earth was headed into another ice age, and that very soon humans might be in danger from advancing ice, not thawing ice. (Some researchers still think this.)

In the mid-20th century, there was renewed interest in the origin of the Pleistocene “ice ages”, which waxed and waned between about 2.6 million and 12,000 years ago. During the Pleistocene, large ice sheets, in places miles thick, covered most of northern Europe and northern North America (see figure below).


Ice Sheets during the last glacial maximum, about 18,000 years ago. http://commons.wikimedia.org/wiki/File:Blakey_Pleistmoll.jpg

Deposits left by these glaciers record several ice advances and retreats, but the exact number and timing of these cycles were poorly known. Then, in the 1950s and 1960s, new tools were developed that unlocked the details of the ice ages. One scientist whose research fundamentally changed how we think about ice ages was Cesare Emiliani. Emiliani examined tiny fossils called foraminifera (“forams” for short). Forams construct a tiny shell of calcium carbonate (similar to chalk) by extracting calcium and carbonate from seawater, so their shells mirror the composition of seawater at the time they lived. When forams die and settle to the ocean bottom, they record seawater composition over time as sediment builds up. Emiliani discovered that he could use the signature of oxygen isotopes in foram shells to measure past ocean temperatures. (I will leave the details of this for a later post; it is somewhat complicated.)

Using these tiny “paleothermometers”, Emiliani was able to construct a much more detailed picture of Pleistocene ice sheet advance and retreat (link to paper). He found numerous glacial advances and retreats over hundreds of thousands of years. He also correlated the advances and retreats with the amount of the sun’s radiation hitting the Earth (termed insolation) in the Northern Hemisphere. Insolation is fundamentally controlled by changes in the tilt of the Earth’s axis, the shape of its orbit, and the precession of the equinoxes, what are termed “orbital parameters”. In 1966, Emiliani used these relationships to predict a coming glacial event: “…it is to be expected that a new glaciation will begin within a few thousand years and reach its peak about 15,000 years from now.” (figure below; link).


Changes in ocean temperature calculated from oxygen isotopes by C. Emiliani (1966). Stages on the right refer to different “ages” during the Pleistocene. Notice the large changes in temperature between glacial advances (colder) and retreats (warmer).

During the late 1960s and early 1970s, other geologists looked at the newly precise Pleistocene record and came up with similar predictions. In 1972, George Kukla summarized an international workshop on the subject: “It is likely that the present-day warm epoch will terminate relatively soon if man does not intervene”. And the National Science Board found: “Judging from the record of the past interglacial ages, the present time of high temperatures should be drawing to an end, to be followed by a long period of considerably colder temperatures leading into the next glacial age some 20,000 years from now.” These views were based on the new understanding of glacial cycles at the time, but they were also spurred on by new global climate records made possible by better instrumentation. One of these instrumental records was for temperature. What it showed in the mid-1970s was a strong decrease in temperature over the previous 40 years.

U.S. temperature record, 1895-1975.

This trend, combined with the new insights about the ice ages, led to cautious predictions of continued cooling by many scientists, including some working for NOAA. The research first gained widespread public attention when it was summarized in a story in Time on June 24, 1974 titled “Another Ice Age?”. That article presented a map of increasing ice and snow cover in the Arctic, some of the first data from newly launched environmental satellites. A year later, “The Cooling World” was published in Newsweek on April 28, 1975. That article showed a global map of mostly cooling temperatures and a time series of global temperature with the sharp drop after 1938 (a drop also visible in the figure above).

Although these articles are now commonly dismissed as poor journalism because they do not fit our current view of global warming, they were not. They were mostly accurate accounts of what many scientists thought and had published at the time.

Now let’s fast forward to 2014. Forty years after the Time article on the coming of another ice age, we have a much deeper understanding of the timing, magnitude, and duration of Pleistocene glacial cycles. Let’s look at a couple of relatively new articles to see what has changed. First, a 2012 paper by P.C. Tzedakis of University College London and four co-authors (link here). Tzedakis and his co-authors found that if CO2 concentrations did not exceed 240 ppmV (they are now about 390 ppmV), our current interglacial would end within about 1,500 years, followed by another glacial advance. They made their comparisons by matching the patterns of past interglacials to the insolation cycles of our present interglacial and by adding the ocean processes that transport heat. Their conclusion was that humans have now added too much CO2 to the atmosphere for the next glaciation to occur on schedule, if glacials and interglacials follow the same patterns and are controlled by the same processes as in the late Pleistocene. But how long would that hold up? To start to answer that question we need to look at a paper by Archer and Ganopolski published in 2005 (link here).

Archer and Ganopolski used a climate model to simulate future climates and ice sheet conditions at different concentrations of CO2. They found that the next glaciation could be postponed a very long time if enough CO2 was added to the atmosphere. Since the start of the industrial revolution, humans have released a bit more than 300 gigatons (Gt) of excess carbon into the atmosphere. Their model predicted that this amount would postpone the start of the next glaciation by about 1,500 years. But they found that if we had added about 1,000 Gt of carbon (about three times what we have added so far), the next glaciation would be put off for about 140,000 years. And if we had burned all the known reserves of fossil fuels, there would not be another glaciation for at least another 500,000 years, the length of their modeling experiments. If these simulations are correct, humans could completely negate the effects of the global processes that have controlled ice sheet advance and retreat for at least the last 2.5 million years by burning all available fossil fuels.
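To make the relationship concrete, here is a minimal Python sketch that simply encodes the three scenarios quoted above. The ~5,000 Gt value used here for “all known fossil fuel reserves” is my own round-number assumption for illustration; the post itself gives no figure for that case.

```python
# Postponement of the next glaciation vs. cumulative carbon emissions, as
# summarized in this post from Archer and Ganopolski (2005).
postponement_years = [
    (300,   1_500),     # roughly the excess carbon emitted so far (from the text)
    (1_000, 140_000),   # about three times current emissions (from the text)
    (5_000, 500_000),   # assumed total for "all known reserves"; >=500,000 yr in the text
]

def postponement(cumulative_gt_c: float) -> int:
    """Return the postponement for the largest scenario not exceeding the given emissions."""
    years = 0
    for gt_c, delay in postponement_years:
        if cumulative_gt_c >= gt_c:
            years = delay
    return years

for gt_c in (300, 1_000, 5_000):
    print(f"~{gt_c:>5} Gt C emitted -> next glaciation postponed ~{postponement(gt_c):,} years")
```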

This is a sobering example of how humans have become a major geologic force, rivaling the controls on Earth’s climate originating at a cosmic scale (orbit of the planet)!


The Goldilocks Effect and Megalopolization: Part II, Losing a Paleoclimate Legacy

Driving along I-15 through the Wasatch Front Megalopolis (see previous post), you pass some interesting features. The most obvious is the steep front of the Wasatch Range to the east. This mountain range was uplifted along a large, active fault at the base of the mountains termed the Wasatch Fault (which has some interesting geologic hazard implications for this rapidly growing area). There are two other interesting features along the front of the mountains. One is that prime housing developments (and some fancy golf courses) are very common on a series of flat surfaces (terraces) along the mountain front (see figure below). These have great views of both the mountains and the valley below, so they are prime real estate. The other is that the terraces are mined for gravel to fuel development. The terraces are extensive, extending along the entire Wasatch Front and beyond, around the edges of the Great Salt Lake Valley. The largest and most complex terraces are associated with large canyons or prominent ridges extending out from the range front. What are these features, and what can they tell us about the paleoclimate of the Wasatch Front and the role humans play in modifying the Earth’s landscapes?


Terraces (extending from the canyon in the upper right all the way to the left edge of the picture) along the Wasatch Range front. These terraces support housing developments (left center) and gravel mines (center). This image is from above Cottonwood Heights, UT, looking toward the east. Image from Google Earth, 2013.

To answer that question we need to go back to the mid-to-late 1800s and examine the work of one of the West’s most famous geologists, Grove Karl Gilbert. In 1890, G.K. Gilbert (as he is usually known) published a report on a large prehistoric lake that filled the Great Salt Lake basin thousands of years ago. This work was based on extensive field work he and explorers before him had done in the Great Salt Lake basin. He named the prehistoric lake Lake Bonneville, after the explorer Benjamin Louis Eulalie de Bonneville (1796–1878), and identified a number of features formed by the lake. At its greatest extent the lake was over 500 km long, 200 km wide, and 300 m deep, filling the Salt Lake Valley. Gilbert found evidence for Lake Bonneville in the terraces seen along the mountain fronts (figure below). He recognized that the basin had been filled with a much larger lake during the last part of the geologic epoch called the Pleistocene.


Drawing of Lake Bonneville terraces from Gilbert’s 1890 report.

These terraces and other shoreline features (deltas, spits, tombolos, and barrier bars) established that the lake existed for a long time at several different elevations, or stands. It remained at particular elevations for long periods (hundreds to thousands of years; first figure below), forming large deposits of sand and gravel. Research since Gilbert’s pioneering work has dated these stands and established the geologic history of Lake Bonneville and contemporaneous lakes throughout the Great Basin to the west of the Wasatch Range (second figure below). Lake Bonneville and these contemporaneous paleo-lakes are called Late Pleistocene lakes because they were at their high stands during the last part of that epoch. The Late Pleistocene encompasses the last major continental ice sheet advance, from about 126,000 years ago to 11,700 years ago. The North American ice sheet reached its maximum southward extent (termed the last glacial maximum, or LGM) during the Late Pleistocene. The timing of the LGM, about 30,000-17,500 years ago, encompasses the time of the high stands of Lake Bonneville. By 15,000-11,500 years ago the Earth was warming and moving into the present interglacial, the Holocene (starting about 11,700 years ago and extending to the present).


Lake Bonneville levels over the last ca. 30,000 years. The highest stands occurred during the last glacial maximum, the time when the most recent continental glaciation was at its maximum extent in North America (see map below). From: http://weberstudies.weber.edu/archive/archive%20d%20vol.%2021.2-25.2/vol.%2024.3/adolph%20yonkee%20ess.htm


Map of Late Pleistocene lakes in the western United States, ca. 17,500 years before present. From: http://en.wikipedia.org/wiki/Lake_Bonneville. Note: There is also a nice map of Lake Bonneville stands, with extensive explanation, from the Utah Geological Survey: http://geology.utah.gov/online/m/m-73.pdf; the names of the Lake Bonneville stands are explained in that publication. Interactive graphics showing Lake Bonneville levels at different times can be found here: http://geology.utah.gov/utahgeo/gsl/flash/lb_flash.htm.

The vast extent of Late Pleistocene lakes in the Great Basin, and the huge size of Lake Bonneville in particular (and of Lake Lahontan on the west side of the Great Basin), show that the climate about 20,000 years ago was very different from today’s. Some researchers think that rainfall in the region needed to be 140-280% of present values to sustain the high lake stands, while evaporation was likely about 30% lower because of decreased temperatures during glacial times (perhaps as much as 5-10°F colder). Annual mean precipitation for Salt Lake City is now about 16 inches. Late Pleistocene precipitation would therefore have been on the order of 23-46 inches, with an average of about 34 inches. The low end is about like present-day San Francisco, CA (21 inches), the high end about like Tampa, FL (46 inches), and the mean much like Seattle (38 inches). So, Salt Lake City in the Late Pleistocene was a fairly wet place compared to now! Let’s now return to the Lake Bonneville shoreline deposits and look at how humans use those deposits and what that says about our ability to modify the landscape.
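The precipitation estimate above is simple arithmetic: scale the modern annual total by the 140-280% range. Here is a quick Python sketch of that calculation; the 16 inches and the scaling factors come straight from the text, and the small difference from the 23-46 inch range quoted above is just rounding.

```python
# Scale modern Salt Lake City precipitation by the glacial-age factors quoted above.
present_precip_in = 16.0              # modern annual mean, inches (from the text)
low_factor, high_factor = 1.40, 2.80  # 140-280% of present values (from the text)

low = present_precip_in * low_factor    # ~22 in/yr
high = present_precip_in * high_factor  # ~45 in/yr
mean = (low + high) / 2                 # ~34 in/yr

print(f"Estimated Late Pleistocene precipitation: {low:.0f}-{high:.0f} in/yr (mean ~{mean:.0f} in/yr)")
```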

The higher precipitation along the Wasatch Front led to higher runoff. Larger streams eroded the surrounding mountains and transported sediment into Lake Bonneville, forming extensive shoreline deposits. Strong and persistent winds moved sediment along the shore, building the terraces, spits, and beach ridges. Where rivers and streams entered the lake, deltas formed. As the lake level dropped, sediment was spread out, forming the present-day deposits. These sediments are unconsolidated, so they are easy to excavate. They are also very close to where all the building is taking place, so they are easy to transport to building sites. Together, these factors make building roads and structures along the Wasatch Front relatively cheap: there is nearly always a gravel bar close at hand to new development. These terrace deposits are the foundation of the Wasatch Front Megalopolis. From 1994 to 2007 (the last year for which data are available), 287 million metric tons of sand and gravel were extracted in the region along the Wasatch Front (data from the U.S. Minerals Information Service; see figure below). The reconstruction of Interstate 15 and associated federal highways in the run-up to the 2002 Salt Lake City Olympics used over 12 million metric tons of sand and gravel. Construction of high-rise buildings, Olympic villages, and other associated structures used more, forming the large spike in the plot of sand and gravel mined from 1997 to 2000 (figure below).


Sand and gravel production from the district that encompasses the Wasatch Front corridor. Data is from the U.S. Minerals Information Services state data (http://minerals.usgs.gov/minerals/).

Now another building boom is happening that is not yet fully captured by the data provided by the U.S. Minerals Information Service, because their last reported data are for 2007 production. In some places the excavation of sand and gravel has removed a substantial proportion of deposits that formed over several thousand years along the shores of Lake Bonneville. Let’s look at one of those deposits, the Point of Mountain Spit.

Point of Mountain is a spectacular spot just south of Salt Lake City. A high, transverse ridge extends westward several miles from the Wasatch Range, separating the Salt Lake Valley from Utah Valley at the Jordan Narrows. Strong shoreline currents carried sediment southward toward this spot, forming a gigantic spit of sand and gravel that extends (extended) from the Salt Lake Valley into Utah Valley. When Gilbert described this spit, it was an intact feature. In 1993 much of the spit was still intact and the shoreline terrace to the north was relatively undeveloped (figure below), but some gravel mining had destroyed the end of the spit (where I-15 curves around the ridge in the figure below), likely to build the first stages of I-15.


1993 oblique view (looking south) of the Point of Mountain Spit and associated shoreline terrace. The lakeward end of the spit is partly destroyed by gravel mining (light areas), but the general shape can be seen extending into the gap between the Salt Lake Valley and Utah Valley. The yellow outline is the approximate extent of the spit and other terrace-like deposits.

Twenty years later in 2013, the spit was completely transformed (figure below).


2013 oblique view (looking south) of the Point of Mountain Spit area and associated shoreline terrace. Major excavation and development have modified the deposits. Image from Google Earth.

The entire end of the spit and much of the underlying gravels have been excavated in a huge gravel mine. Even the ends of the mountain ridge have been mined for rock. The shoreline terrace is nearly completely covered with housing developments (the undeveloped end is a paraglider park, so it was spared). Much of the farmland in the valley has been converted to housing developments and roads. This shows the accessibility of Lake Bonneville deposits to the demand for construction materials: gravel from the spit goes right into adjacent roads, houses, business parks, and shopping malls. The figure below is a closer, vertical view that better illustrates the magnitude of these excavations; all of the lighter areas are sand/gravel/rock mines and roughly outline the previous extent of the spit.


2013 vertical view of the gravel mines at the Point of Mountain Spit. From Google Earth.

Extensive Lake Bonneville deposits like the Point of Mountain Spit probably took 500-1,500 years to form (the length of a stand of the lake and the following drawdown). Humans have nearly completely excavated this one in about 20 years, a rate of destruction 25-75 times faster than the rate of construction. This shows what a tremendous force direct human actions are in the modern world. Research on human actions at the global scale shows a similar rate of “human erosion”, about 40 times geologic rates (see “Moving Dirt” in the post archives). Humans have now become the premier mover of material on the Earth’s surface, more efficient than all other geologic processes. The Wasatch Front Megalopolis is a prime example of that ability. In a little over a century we have gone a long way toward completely transforming a natural landscape that took many thousands of years to develop into a human construct. That is incredible power. Is the next stage Trantor?
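The 25-75 times figure is just the ratio of the two timescales. A quick sketch of the arithmetic, using only the numbers given above:

```python
# Ratio of the time to build the spit to the time humans took to remove most of it.
formation_years = (500, 1500)   # estimated time to form the deposit (from the text)
excavation_years = 20           # approximate time of modern excavation (from the text)

low_ratio = formation_years[0] / excavation_years    # 25
high_ratio = formation_years[1] / excavation_years   # 75
print(f"Removal is roughly {low_ratio:.0f} to {high_ratio:.0f} times faster than formation")
```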


The Goldilocks Effect and Megalopolization: Part I, Modern Climate

I recently drove through the Wasatch Front Urban Corridor (figure below) on my way from western Montana to southern Utah. This roughly 120-mile-long corridor is a spectacular example of an emerging megalopolis and of the power of humans to modify the Earth. But the Wasatch Front also has some interesting climate and paleoclimate (prehistoric climate) features that help explain this development and show how past climates can facilitate the expansion of human society.


The Wasatch Front Megalopolis (grey areas with city names), which encompasses Salt Lake City and the various cities to the north and south (north is to the left). The Wasatch Mountains are to the east (top of image); the Great Salt Lake (lower left) and Utah Lake (upper right) are to the west. The megalopolis extends about 120 miles along I-15 and is composed of nearly continuous housing developments, shopping malls, business districts, schools, and the other associated infrastructure of modern American suburbia/exurbia. From Google Earth imagery, 2013.

The spectacular fault-bounded Wasatch Range on the east and the Great Salt Lake and Utah Lake on the west hem in development along the front. Over 2.3 million people live in this corridor, and the population is growing fast (table below). About 80% of Utah’s population is concentrated along the Wasatch Front, where about 85% of Utah’s economic productivity is generated. Growth-wise, this is an amazing place!


Population of counties that encompass the Wasatch Front Megalopolis (from: http://en.wikipedia.org/wiki/Wasatch_Front).

So, why are all these people coming to the Wasatch Front? There are the usual socio-economic factors, for sure (jobs, family, etc.), but it is also a good place climatically. To show you, I will use a climate-data plotting site called WeatherSpark that I have not introduced yet. WeatherSpark is a website that uses a wide range of available data to make unique graphs of local climate. Many of these differ from the graphs you can get from the National Weather Service (which I presented in the last post) and add more useful climate parameters. In Part I of this post, I will present the climate of Salt Lake City, in the center of the Wasatch Megalopolis, to show why the Wasatch Front is such a good place climatically (at least for some people). Then in Part II I will explore the recent paleoclimate (prehistoric climate) features of the Wasatch Megalopolis and look at what this all means in the context of humans’ ability to modify the Earth.

OK, now for the recent climate of Salt Lake City, the epicenter of the Wasatch Megalopolis.

Here is a plot from WeatherSpark showing the annual maximum and minimum temperatures for Salt Lake City (SLC). Looks pretty good, as long as you do not mind coolish winters and warmish summers.


The daily average low (blue) and high (red) temperature with percentile bands (inner band from 25th to 75th percentile, outer band from 10th to 90th percentile).

But how do these temperatures “feel” on average? Here is a plot that can give you an idea:


The average fraction of time spent in various temperature bands: frigid (below 15°F), freezing (15°F to 32°F), cold (32°F to 50°F), cool (50°F to 65°F), comfortable (65°F to 75°F), warm (75°F to 85°F), hot (85°F to 100°F) and sweltering (above 100°F).

Not much hot weather and a large amount of warm, comfortable, and cool weather: “not too hot, not too cold, just about right”. At least for some folks. But here is a more useful plot, because in general people seem to like climates without much rain. Here is how SLC stacks up:


The fraction of days in which various types of precipitation are observed. If more than one type of precipitation is reported in a given day, the more severe precipitation is counted. For example, if light rain is observed in the same day as a thunderstorm, that day counts towards the thunderstorm totals. The order of severity is from the top down in this graph, with the most severe at the bottom.

Pretty nice summers, with only a quarter of the days having precipitation, and that mostly in thunderstorms. Snow in the winter, so good for skiing, but still only about 50% of the days with precipitation. But what about “humidity”? It can be not-so-hot but really humid, and therefore pretty miserable. Here is a WeatherSpark plot of dew point, which takes into account temperature and humidity and can be a better measure of how comfortable the climate is:


The daily average low (blue) and high (red) dew point with percentile bands (inner band from 25th to 75th percentile, outer band from 10th to 90th percentile).

Mostly comfortable summers, and it feels dry the rest of the year. So, if you like dry and sunny, SLC is for you! But remember, these are “average” conditions (see the post on Normal vs. Normals), not conditions that you will experience every day or even often. That is another nice feature of the WeatherSpark plots: they give you a feel for the spread of each climate parameter. The darker colored bands show the spread between the 25th and 75th percentiles, representing the central 50% of the data. The lighter colored bands span the 10th to 90th percentiles, representing the central 80% of the data. So, you can get a good feel for the spread of each parameter as well as its central tendency.
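For readers who want to build similar bands from raw station data, here is a rough Python/pandas sketch of how the percentiles could be computed. The synthetic series is only a stand-in for a real daily record, and this is my own illustration, not WeatherSpark’s actual method.

```python
import numpy as np
import pandas as pd

def percentile_bands(daily: pd.Series) -> pd.DataFrame:
    """Return the 10th/25th/75th/90th percentile temperature for each calendar day."""
    by_day = daily.groupby([daily.index.month, daily.index.day])
    bands = by_day.quantile([0.10, 0.25, 0.75, 0.90]).unstack()
    bands.columns = ["p10", "p25", "p75", "p90"]
    return bands

# Example with synthetic data standing in for a real station record of daily highs:
dates = pd.date_range("1981-01-01", "2010-12-31", freq="D")
fake_highs = pd.Series(
    60 + 25 * np.sin(2 * np.pi * (dates.dayofyear - 100) / 365.25)
    + np.random.normal(0, 8, len(dates)),
    index=dates,
)
print(percentile_bands(fake_highs).head())
```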

There is another climate control along the Wasatch Front that makes it a desirable place to live and helps drive the growth of the megalopolis. The steep front of the mountains, just east of the corridor, adds a third climate dimension. The Wasatch Range forms a steep topographic barrier to storms arriving from the southwest. Air moving eastward over the mountains rises and cools. The cooler air cannot hold as much moisture, so the excess falls as snow or rain (depending on the season). The Great Salt Lake also supplies moisture to the atmosphere through evaporation as winds blow across the lake (the “lake effect”). These factors lead to heavier precipitation in the mountains than in the Wasatch Front corridor. Looking at the figure below, we can see that the average annual precipitation within the Wasatch Front Megalopolis is about 15 to 25 inches, while in the Wasatch Range just to the east it is 40 to more than 60 inches. So, people can live in a relatively warm and dry climate but have quick (depending on traffic) access to deep, dry snow in the mountains to the east.
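As a rough illustration of why the range wrings moisture out of passing storms, here is a small Python sketch using textbook lapse rates. The rates and the ~2 km of lift are generic assumptions for illustration, not measurements for the Wasatch.

```python
# Rough illustration of how much air cools when forced up and over a mountain range.
DRY_LAPSE_C_PER_KM = 9.8     # textbook value for unsaturated air
MOIST_LAPSE_C_PER_KM = 6.0   # typical value once condensation begins

def cooling_on_ascent(rise_km: float, saturated: bool) -> float:
    """Temperature drop (deg C) for air lifted rise_km kilometers."""
    rate = MOIST_LAPSE_C_PER_KM if saturated else DRY_LAPSE_C_PER_KM
    return rate * rise_km

rise_km = 2.0  # assumed rise from the valley floor to the crest, a round number
print(f"Dry ascent:   ~{cooling_on_ascent(rise_km, False):.0f} deg C of cooling")
print(f"Moist ascent: ~{cooling_on_ascent(rise_km, True):.0f} deg C of cooling")
```

Either way, the lifted air ends up much colder than when it left the valley, which is why so much of its moisture ends up as mountain snow.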


Map of the average annual precipitation (30-year normal) for northern Utah. From the PRISM Climate Group, Oregon State University, http://www.prism.oregonstate.edu/gallery/.

Because the Wasatch Range snow is deep and dry, it makes for excellent skiing, so the range is home to some of the country’s best skiing and other winter sports. Unfortunately, the proximity of the roughly 2.3 million people of the Wasatch Front Megalopolis also makes it some of the most crowded (and most expensive) resort terrain as well. But, at least for those 2.3 million people, this area has a Goldilocks Effect: “not too hot, not too cold, just right”. For some things, that is; easy driving is not one of them! Nor is living in a natural landscape, as the megalopolis gobbles it up. That is the topic of the next post.

 


Normal vs. Normals

We hear it all the time, but do we really know what it means? The news anchor pans to the meteorologist/weather person and says something like, “Well, Dana, what does the weekend weather hold for us? Will this rain continue?” And Dana answers, “Clint, we are in for a surprise. The skies will clear and temperatures on Saturday will be 15 degrees above normal. So, get out those shorts for at least one more great fall weekend.” When we hear this we conjure up our dictionary definition of normal: normal (adjective): conforming to a standard; usual, typical, or expected: “it’s quite normal for puppies to bolt their food”; “normal working hours” (Oxford English Dictionary).

So, the temperatures on the weekend will be above typical, higher than what is expected for that day in the fall. But that is not exactly the meaning of “climate normals”, which is what the media meteorologist is referring to. Climate Normals are a calculation by the National Weather Service and have a very precise meaning, different from the everyday meaning we think of when we hear the term. Here is the definition according to the National Weather Service, which calculates normals for each weather station in the U.S.: Climate Normals (proper noun) are three-decade averages of climatological variables, including temperature and precipitation. The product is produced once every 10 years.

So, the 1981–2010 U.S. Climate Normals are the latest release. (There are past Climate Normals for 1971-2000, 1961-1990, etc.) This dataset contains the calculated daily and monthly averages (the mean of all the values for that day or month over those three decades) of temperature and precipitation, plus other climate parameters such as snowfall, heating and cooling degree days, frost/freeze dates, and growing degree days. Climate Normals are provided for most of the weather stations across the country. Below is an example (Figure 1) of what these look like for temperature and precipitation for Hamilton, MT.


Figure 1: 1981-2010 Climate Normals for Hamilton, MT. From the Western Regional Climate Center at http://www.wrcc.dri.edu/cgi-bin/cliMAIN.pl?mt3885. You can also get tabular data from this site so you can easily compare Climate Normals to measured temperatures for any day of the year.

Climate Normals are calculations of temperatures over the three decades; they are not the typical values, and you may never have experienced them on that day even if you lived through all three decades. They are calculated averages, not measured values. This is something critically important about climate data: it is derived from calculations based on measured data, not the measurements themselves. So, for example, the Climate Normal high for today in Hamilton is 64.1°F, while the measured high today was 75°F, about 10 degrees higher than the “normal” high temperature.
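Conceptually, a daily temperature normal is just the average of the measured values for a given calendar day over the 30-year window. Here is a minimal pandas sketch of that idea; the file name and column names are placeholders, and the official procedure involves additional quality control and smoothing that this ignores.

```python
import pandas as pd

def daily_normals(daily_tmax: pd.Series, start="1981-01-01", end="2010-12-31") -> pd.Series:
    """Average the daily maximum temperatures for each calendar day over a 30-year window."""
    window = daily_tmax.loc[start:end]
    return window.groupby([window.index.month, window.index.day]).mean()

# Usage (assuming a CSV of station data with 'date' and 'tmax' columns; names are hypothetical):
# obs = pd.read_csv("hamilton_mt_tmax.csv", parse_dates=["date"], index_col="date")["tmax"]
# normals = daily_normals(obs)
# print(normals.loc[(10, 6)])   # the calculated "normal" high for October 6
```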

The plot below (Figure 2) shows how variable the climate is at any site. It is a plot of the daily maximum temperatures (Tmax) for Hamilton, MT for the 30 years of the last Climate Normal period (1981-2010).


Figure 2: 1981-2010 daily maximum temperature records (red pluses) for Hamilton, MT. From http://cdiac.ornl.gov/cgi-bin/broker?id=243885&_PROGRAM=prog.gplot_clim_jd2013.sas&_SERVICE=default&param=TMAX&minyear=1981&maxyear=2010 .

The red pluses on the figure show the measured maximum temperature for each day over the three decades used to calculate the Climate Normals. The vertical line marks October 6 and the horizontal line is the normal maximum temperature for that day; the grey diamond marks their intersection, showing the calculated normal maximum temperature against the range of measured maximum temperatures for that day. You can see the huge spread of measured Tmax values around the calculated normal. The normal Tmax is not “typical” or “expected” in the everyday sense of normal; hardly any of the maximum temperatures measured on a particular day match the Climate Normal Tmax. So, normals are not really normal! But they are a convenient way to compare records to some average value, and there are some great ways to see these comparisons on the Weather Service sites. The graph below (Figure 3) is one of those plots. (See the note at the end of this post for how to get these plots for your area.) One catch is that these plots are not available for as many stations as the data I showed above, so in the chart below I have moved to a station 38 miles north of Hamilton, MT: Missoula, MT.


Figure 3: Daily temperature and precipitation compared to 1981-2010 Climate Normals for Missoula, MT. From, http://www.wrh.noaa.gov/climate/yeardisp.php?wfo=mso&stn=KMSO&submit=Yearly+Charts.

This chart will take some explaining, but it is well worth the effort. I will concentrate on temperature first (upper graph). The blue vertical bars (they are very thin, so they look like a jagged line) span the maximum (top of bar) and minimum (bottom of bar) temperatures recorded at the Missoula weather station each day. The upper boundary of the green band is the Climate Normal Tmax and the lower boundary is the Climate Normal Tmin (minimum temperature). So, the green band represents the range of calculated normal temperatures and the blue “line” represents the measured temperatures throughout the year. The top of the pink, jagged band is the record maximum temperature for each day (Record Max) and the bottom of the blue band is the record minimum temperature for each day (Record Min). This graph shows an incredible amount of information, allowing comparison throughout the year of measured data to calculated normals and recorded extremes. The precipitation graph (lower graph) is simpler because it shows only the measured data (the “stepped” line) and the normal precipitation. It shows cumulative precipitation through the year, so it is easy to see whether precipitation is above the Climate Normal (dark green) or below it (light brown). By looking at these plots you can say a lot about how temperature and precipitation in a specific year compare to the three decades used to calculate the normals (1981-2010 in this case).
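If you want to build a similar chart yourself, here is a bare-bones matplotlib sketch of the temperature panel. The arrays here are synthetic stand-ins; with real station data the measured and normal values would come from the records and normals described above.

```python
import matplotlib.pyplot as plt
import numpy as np

# Synthetic placeholder data: a smooth "normal" band plus noisy measured daily highs/lows.
days = np.arange(1, 366)
normal_max = 55 + 30 * np.sin(2 * np.pi * (days - 110) / 365)
normal_min = normal_max - 22
measured_max = normal_max + np.random.normal(0, 8, days.size)
measured_min = normal_min + np.random.normal(0, 8, days.size)

fig, ax = plt.subplots(figsize=(10, 4))
ax.fill_between(days, normal_min, normal_max, color="lightgreen",
                label="Climate Normals (Tmin to Tmax)")
ax.vlines(days, measured_min, measured_max, colors="steelblue", linewidths=0.8,
          label="Measured daily range")
ax.set_xlabel("Day of year")
ax.set_ylabel("Temperature (°F)")
ax.legend(loc="upper right")
plt.show()
```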

The most obvious thing is how “un-normal” the normals are, or maybe a better way to look at it is how “un-normal” the daily records are. For example, in 2014 up to October 6, there are many days that fall outside the normal range of temperatures. There are also very few days that came near or beat the extreme records: two maxima and two minima in just over nine months of records. But some months were really wild. Let’s zoom in on February using the figure below (Figure 4).


Figure 4: Daily maximum and minimum temperatures for February 2014, in Missoula, MT.

For the first third and last third of February, every day had minimum and maximum temperatures substantially lower than normal. During the middle of the month it was the opposite, with temperatures substantially (though not as extremely) above normal. There is a clear step in the weather across the boundaries between these two conditions, like a switch flipping from really cold to really warm. That switch was the jet stream, which moved across the region bringing in warm or cold air. But that story is for another post.

Note

To get to the graphics in the last two plots takes some patience with the Weather Service site. The easiest way (but not the only way) is to follow these directions:

1. Start at this web site: http://www.nws.noaa.gov/climate/

2. Click on the region of the state you are interested in on the U.S. map. This will take some experimenting to get where you want. When the right region comes up, select the station you want. I selected Missoula, MT (mso), so I got this site: http://www.nws.noaa.gov/climate/index.php?wfo=mso.

3. Then click on the tab (upper part of the page) titled, “Local Data/Records”.

4. This brings up another page. Under “Climate Graphs” select “Graphical Climate Summaries for…” where the “…” is the list of stations in the region you selected.


Warming Complexities in the News

Winds of Warming

September has been a great month for climate science in the news. Many major news outlets noticed a paper published in the Proceedings of the National Academy of Sciences (PNAS) by Drs. James A. Johnstone and Nathan J. Mantua titled “Atmospheric controls on northeast Pacific temperature variability and change, 1900–2012“. Their research found that “changes in winds over the eastern Pacific Ocean explain most of the warming trend along the West Coast of North America in the last century”, not increased radiative forcing from rising greenhouse gas concentrations in the atmosphere. For details you should read the summary of the article on the Southwest Fisheries Science Center’s web site (https://swfsc.noaa.gov/news.aspx?Division=FED&ParentMenuId=54&id=19504), where Dr. Mantua works. Also read Prof. Cliff Mass’ (University of Washington) article on the importance of the findings compared to other work on the region. You can find that article on the Cliff Mass Weather Blog (http://cliffmass.blogspot.com).

Considering the wealth of papers finding that global temperature increases in the last few decades result mostly from greenhouse enhancement, at first glance the Johnstone and Mantua result is surprising and, as many news outlets said, controversial. But not really. It shows that the climate is complex and that local and regional variability can be large, even larger than global forcing. The global greenhouse signal is embedded in the West Coast temperature record; it is just swamped by natural variability. This variability comes from the atmospheric systems (large-scale wind patterns) in the North Pacific Ocean that drive major weather systems across the Pacific Northwest and transport heat through the atmosphere. The importance of the Johnstone and Mantua work is that it helps develop “…a fuller understanding of natural and anthropogenic changes…”. The take-home message is that the world is a complicated place and regional variability is large. Their results can be seen best in a figure in the supplemental information that accompanies the paper (their figure S4, below). In this figure you can see the temperature trends for the Pacific Coast states before the Pacific atmospheric forcing is removed (left side) and after it is removed (right side). Once the effects of the wind/pressure processes described in the paper are removed, temperature trends diminish substantially across the region. However, Southern California shows persistent, significant upward trends (warming) even with the atmospheric effects removed. One question is how far onto the continent such controls extend; you can see that even areas inland of the coastal mountain ranges show strong responses to oceanic conditions. This should not be a surprise, given the well-known effects of El Niño (and other ocean-atmosphere processes) on climate throughout the world. But the quantification of this process’s effect on temperature trends is definitely important and needs to be incorporated into modeling and predictions of how the climate will change with the continued addition of greenhouse gases and other human modifications of the landscape and waterscape.


Figure S4 from Johnstone and Mantua, 2014.

The other interesting climate science publishing event was the release of “Explaining Extreme Events of 2013 from a Climate Perspective”, published by the American Meteorological Society (AMS). For the last few years the AMS has published a retrospective on the major climate events that affected the world in some dramatic way: droughts, floods, heat waves, etc. In the latest publication, several groups of meteorologists and climate scientists write about the same event from different perspectives, so it is a great volume for seeing a range of ideas on a topic, real science without the homogenization of “consensus”! You can read the report here (http://www2.ametsoc.org/ams/index.cfm/publications/bulletin-of-the-american-meteorological-society-bams/explaining-extreme-events-of-2013-from-a-climate-perspective/) and read a New York Times story on it here (http://www.nytimes.com/2014/09/30/science/earth/human-related-climate-change-led-to-extreme-heat-scientists-say.html). Over eighty authors in 22 articles examine droughts, heat waves, hurricanes, downpours, cold snaps, and blizzards that occurred in 2013 around the world. The most interesting papers were the two sets on the California drought (four papers) and on the Australian heat wave and drought. These papers give a great overview of the complexity of these events and of how many different aspects of the climate come together to cause them, including some of the same large-scale atmospheric processes that Johnstone and Mantua present in their paper.

The thing that really stands out in the California papers is the severity of this drought within the historical record (figure below, from Swain et al.).


12-month (one-sided) moving average precipitation in California from 1895 to 2014. Major historical droughts highlighted. Swain et al. 2014, Bull. Amer. Meteor. Soc., 95 (9), S1–S96.

There are lots of detailed plots and discussion of outcomes in these papers, all in a short format. They are dense but well laid out, with a short and informative introduction, followed by brief (but jargony) results and a concise conclusion. You can get a clear picture of what the authors did, why they did it, and what they came up with. It will take much more reading of the cited literature to understand the background the authors rely on for their conclusions, but the papers are a great presentation of the science of studying extreme climate events. I especially liked the paper by Hoerling and his eight co-authors on the extreme rain and flooding event in northeastern Colorado (around Boulder, CO). They examined very carefully the importance of the local situation in the broader context of regional and global weather and climate. A very nice job! There is a topic for everyone in this volume, so it is worth the read.

Links:

Johnstone and Mantua article: www.pnas.org/cgi/doi/10.1073/pnas.1318371111

BAMS Collection: http://www2.ametsoc.org/ams/index.cfm/publications/bulletin-of-the-american-meteorological-society-bams/explaining-extreme-events-of-2013-from-a-climate-perspective/

News stories:

Seattle Times: http://seattletimes.com/html/localnews/2024601865_climateweatherstudyxml.html

Los Angeles Times: http://www.latimes.com/science/la-sci-pacific-warming-20140923-story.html

International Business Times: http://www.ibtimes.com/climate-change-manmade-or-natural-west-coast-warming-linked-changing-winds-1693746

CBS News: http://www.cbsnews.com/news/west-coast-warming-blamed-on-natural-causes-not-human-activity/


Moving Dirt

An Abandoned Open Pit Copper Mine, Butte, MT

The discussion of human-induced climate change commonly revolves around indirect effects of human activity. For example, we increase greenhouse gases, which warms the planet, which changes climate processes in myriad ways. But there are also more direct effects. Since I completed my post on human population (February 2, 2010), nearly 9 million people have been added to the planet, for a total of about 6.809 billion. That is a lot of people to house, feed, and transport around, and so requires a lot of dirt to be moved. One interesting question is how all the earth moving associated with human development compares to the amount of material moved by geologic processes like mountain building, glaciation, and erosion by wind, waves, and rivers.

Until 16 years ago most geologists would have thought that humans were no match for the Earth. That all changed in 1994, when Dr. Roger Hooke, then a professor at the University of Minnesota, published a paper in the Geological Society of America’s news magazine, GSA Today (see Hooke, R. LeB., 1994, On the efficacy of humans as geomorphic agents: GSA Today, v. 4, no. 9, p. 217, 224-225, and this link). In his article, Hooke did an elegant first-order analysis of resource and construction data, along with geologic data, and estimated that humans moved about 45 billion tons of material a year (45 Gt/yr), essentially equivalent to all the material moved by rivers, glaciers, waves, wind, and continental mountain building. Direct human actions moved as much material as all the geologic processes on Earth. Phenomenal!

In his analysis, Hooke left out the material moved by agriculture. He included agriculture in another paper in 2000 (see Hooke, R. LeB., 2000, On the history of humans as geomorphic agents: Geology, v. 28, no. 9, p. 843-846) and found that humans move almost twice the material moved by geologic processes on the continents (80 Gt/yr vs. 45 Gt/yr). In the last 10 years, more detailed and complete analyses by other geologists have found even higher “human erosion rates”. Dr. Bruce Wilkinson, of the University of Michigan, constructed a detailed historical time series of direct human movement of earth material (see Geology, March 2005, v. 33, no. 3, p. 161-164). He found that by approximately 1000 AD human earth moving equaled geologic processes (see figure below). By 2000 AD, humans were moving upwards of 40 times the geologic erosion rate. Wilkinson summed this up with this quote: “At these rates, this amount of material would fill the Grand Canyon in 50 yr.” To put this in perspective, it took geologic erosion about 6 million years to form the Grand Canyon. Humans have become the largest geologic force on Earth’s continents, bar none. Phenomenal!

Human Erosion vs. Geologic Erosion (from Wilkinson, 2005)


The Carbon Cost of Human Reproduction

The human population is rarely discussed in the context of human-induced climate change. As I write this, on February 2, 2010, at noon Mountain Standard Time, the world population is estimated at 6,800,295,487 people (see figure above).

You can find the population at the U.S. Census Bureau’s web site; check it as you start to read this post and then again when you finish to get an idea of how rapidly the population is growing (http://www.census.gov/ipc/www/popclockworld.html).

What is astounding is that in the next hour, about 8,600 people will be added to the Earth. In the next year, over 75 million will be added, about two “Californias” (see Table 1 below). That means that water and food and jobs, all the things that go into living in the 21st century, have to be provided for all these people. And it is not just next year; it is every subsequent year. And the numbers keep rising. In only about four years we will have added over 300 million people, another “United States”!

Time unit       Births      Deaths       Increase
Year       131,940,516   56,545,138    75,395,378
Month       10,995,043    4,712,095     6,282,948
Day            361,481      154,918       206,563
Hour            15,062        6,455         8,607
Minute             251          108           143
Second             4.2          1.8           2.4

Table 1. From U.S. Census Bureau (link above in text).
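A quick sanity check in Python of the per-hour and multi-year figures, derived only from the annual net increase in Table 1:

```python
# Derive the smaller time-unit figures from the annual net increase in Table 1.
annual_increase = 75_395_378                 # net people added per year (Table 1)

per_day = annual_increase / 365              # ~206,600 per day
per_hour = annual_increase / (365 * 24)      # ~8,600 per hour
in_four_years = annual_increase * 4          # ~302 million, roughly another "United States"

print(f"Per day: {per_day:,.0f}")
print(f"Per hour: {per_hour:,.0f}")
print(f"Added in four years: {in_four_years:,.0f}")
```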

As we add people, they use energy, both directly and to produce all the extra food, water, and goods. Because energy is mostly derived from fossil fuels, the release of carbon to the atmosphere increases. We can think of this as humans’ “carbon legacy”. A few years ago, Paul Murtaugh and Michael Schlax of Oregon State University published an article titled “Reproduction and the carbon legacies of individuals” (Global Environmental Change, v. 19, p. 14-20, doi: 10.1016/j.gloenvcha.2008.10.007). They found that “Under current conditions in the United States, for example, each child adds about 9441 metric tons of carbon dioxide to the carbon legacy of an average female, which is 5.7 times her lifetime emissions.” This article, and a report written by the London School of Economics, are highlighted in a Washington Post article (September 15, 2009: http://www.washingtonpost.com/wp-dyn/content/article/2009/09/14/AR2009091403308.html). The London report found that investing in preventing births is far cheaper than supplying extra energy for extra people and far cheaper than developing new solar or wind power ($7/ton vs. $24-$51/ton). This shows how closely linked population growth and energy use are.
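The 5.7 multiple implies a particular lifetime emission per person; here is the back-calculation from the two numbers quoted above (a rough check, nothing more).

```python
# Back out the implied lifetime emissions from the Murtaugh and Schlax figures quoted above.
child_carbon_legacy_t = 9441        # metric tons of CO2 per child (quoted in the text)
multiple_of_lifetime = 5.7          # the child legacy is 5.7x the mother's lifetime emissions

implied_lifetime_t = child_carbon_legacy_t / multiple_of_lifetime
print(f"Implied lifetime CO2 emissions of an average U.S. female: ~{implied_lifetime_t:,.0f} metric tons")
```

That works out to roughly 1,700 metric tons of CO2 over a lifetime, which is in the right ballpark for several decades of typical U.S. per-capita emissions.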

Oh, and as I finished writing this post, I checked the Population Clock:

Over 10,000 people have been added to the world!
