The Goldilocks Phenomena and Megalopolization: Part II, Losing a Paleoclimate Legacy

While driving along I-15 through the Wasatch Front Megalopolis (see previous post), you pass some interesting features. The most obvious is the steep front of the Wasatch Range to the east. This mountain range was uplifted along a large, active fault at the base of the mountains termed the Wasatch Fault (which has some interesting geologic hazard aspects for this rapidly growing area). There are two other interesting features along the front of the mountains. One is that prime housing developments (and some fancy golf courses) are very common on a series of flat surfaces (terraces) along the mountain front (see figure below). These have great views of both the mountains and the valley below and so are prime real estate. The other is that the terraces are mined for gravel to fuel development. The terraces are extensive, extending along the entire Wasatch Front and beyond, around the edges of the Great Salt Lake Valley. The largest and most complex terraces are associated with large canyons or prominent ridges extending out from the range front. What are these features, and what can they tell us about the paleoclimate of the Wasatch Front and the role humans play in modifying the Earth's landscapes?

Cottonwood Heights Delta

Terraces (extending from the canyon in the upper right all the way to the left edge of the picture) along the Wasatch Range front. These terraces support housing developments (left center) and gravel mines (center). This image is from above Cottonwood Heights, UT, looking towards the east. Image from Google Earth, 2013.

To answer that question we need to go back to the mid-to-late 1800s and examine the work of one of the West's most famous geologists, Grove Karl Gilbert. In 1890, G.K. Gilbert (as he is mostly known) published a report on a large prehistoric lake that filled the Great Salt Lake basin thousands of years ago. This work was based on extensive field work he and explorers before him had done in the basin. He named the prehistoric lake Lake Bonneville, after explorer Benjamin Louis Eulalie de Bonneville (1796–1878), and identified a number of features formed by the lake. At its greatest extent it was over 500 km long, 200 km wide, and 300 m deep, filling the Great Salt Lake basin. Gilbert found evidence for Lake Bonneville in the terraces seen along the mountain fronts (figure below). He recognized that the basin was filled by this much larger lake during the last part of the geologic epoch called the Pleistocene.


Drawing of Lake Bonneville terraces from Gilbert’s 1890 report.

These terraces and other shoreline features (deltas, spits, tombolos and barrier bars) established that the lake existed for a long time at several different elevations, or stands. It remained at particular elevations for long periods of time (hundreds to thousands of years; first figure below), forming large deposits of sand and gravel. Research since Gilbert's pioneering work has dated these stands and established the geologic history of Lake Bonneville and contemporaneous lakes throughout the Great Basin to the west of the Wasatch Range (second figure below). Lake Bonneville and these contemporaneous paleo-lakes are called Late Pleistocene lakes because they were at their high stands during the last part of that epoch. The Late Pleistocene encompasses the last major continental ice sheet advance, from about 126,000 years ago to 11,700 years ago. The North American ice sheet reached its maximum southward extent (termed the last glacial maximum, or LGM) during the Late Pleistocene. The LGM, about 30,000-17,500 years ago, encompasses the time of the high stands of Lake Bonneville. By 15,000-11,500 years ago the Earth was warming and moving into the present interglacial, the Holocene, which began about 11,700 years ago.

Bonneville levels

Lake Bonneville levels over the last ca. 30,000 years. The highest stands occurred during the last glacial maximum, the time when the most recent continental glaciation was at its maximum extent in North America (see map below). From:

GB Pleist Lakes

Map of Late Pleistocene lakes in the western United States, ca. 17,500 years before present. From: Note: there is also a nice map of Lake Bonneville stands, with extensive explanation, from the Utah Geological Survey:; the names of the Lake Bonneville stands are explained in that publication. Interactive graphics showing Bonneville lake levels at different times can be found here:

The vast extent of Late Pleistocene lakes in the Great Basin, and the huge size of Lake Bonneville in particular (and of Lake Lahontan on the west side of the Great Basin), show that the climate was very different 20,000 years ago compared to now. Some researchers think that rainfall in the region needed to be 140-280% of present values to produce the high lake stands, while evaporation was likely about 30% lower due to decreased temperatures during glacial times (maybe as much as 5-10ºF lower). Annual mean precipitation for Salt Lake City is now 16 inches. Late Pleistocene precipitation would therefore be on the order of 23-46 inches, for an average of about 34 inches. The low end is about like present-day San Francisco, CA (21 inches), the high end about like Tampa, FL (46 inches), and the mean pretty much like Seattle (38 inches). So, Salt Lake City in the Late Pleistocene was a fairly wet place compared to now!

Let's now return to the Lake Bonneville shoreline deposits and look at how humans utilize those deposits and what that says about our ability to modify the landscape. The higher precipitation along the Wasatch Front led to higher runoff. Greater flow in the streams eroded more material, which was transported into Lake Bonneville, forming large deltas. Strong and persistent winds transported sediments along the shore, forming the terraces, spits and beach ridges. As the lake level dropped, sediment was spread out into the extensive deposits that now exist. These sediments are unconsolidated, so they are easy to excavate. They are also very close to where all the building is taking place, so they are easy to transport to building sites. They make building roads and structures along the Wasatch Front relatively cheap: there is nearly always a gravel bar close at hand to new development. These terrace deposits are the foundation of the Wasatch Front Megalopolis.
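The Late Pleistocene precipitation estimate above can be checked with quick arithmetic. A minimal sketch using only the numbers from the text (the 140-280% range is the researchers' estimate; the results round to roughly the figures quoted):

```python
# Back-of-the-envelope check of the Late Pleistocene precipitation estimate.
modern_precip_in = 16.0          # Salt Lake City annual mean, inches (from the text)
low_factor, high_factor = 1.4, 2.8   # 140% and 280% of present values

pleistocene_low = modern_precip_in * low_factor    # lower bound, inches
pleistocene_high = modern_precip_in * high_factor  # upper bound, inches
pleistocene_mean = (pleistocene_low + pleistocene_high) / 2

print(f"Estimated range: {pleistocene_low:.0f}-{pleistocene_high:.0f} in, "
      f"mean ~{pleistocene_mean:.0f} in")
```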
From 1994-2007 (the last year for which data are available), 287 million metric tons of sand and gravel were extracted along the Wasatch Front (data from the U.S. Minerals Information Service; see figure below). The reconstruction of Interstate 15 and associated federal highways in the run-up to the 2002 Salt Lake City Olympics used over 12 million metric tons of sand and gravel. Construction of high-rise buildings, Olympic villages and other associated structures corresponds to the large spike in sand and gravel production from 1997-2000 in the plot below. Now another building boom is happening that is not yet captured by the U.S. Minerals Information Service data (their last reported state data are for 2007 production). In some places the sand and gravel produced is a substantial proportion of deposits that formed over several thousand years along the shores of Lake Bonneville. Let's look at one of those deposits, the Point of Mountain Spit.


Sand and gravel production from the district that encompasses the Wasatch Front corridor. Data are from the U.S. Minerals Information Service state data (

Point of Mountain is a spectacular spot just south of Salt Lake City. A high, transverse ridge extends westward several miles from the Wasatch Range, separating the Salt Lake Valley from Utah Valley. Strong shoreline currents carried sediment southward toward this spot, forming a gigantic spit of sand and gravel that extends (extended) from the Salt Lake Valley into Utah Valley. When Gilbert described this spit, it was an intact feature. In 1993 much of the spit was still intact and the shoreline terrace to the north was relatively undeveloped (figure below), but some gravel mining had destroyed the end of the spit (where I-15 curves around the ridge in the figure below).

Point of Mtn Spit 1993 Obl-Outline

1993 oblique view (looking south) of the Point of Mountain Spit and associated shoreline terrace. The lakeward end of the spit is partly destroyed by gravel mining (light areas), but the general shape can be seen extending into the gap between Salt Lake Valley and Utah Valley. The yellow outline is the approximate extent of the spit and other terrace-like deposits.

Twenty years later in 2013, the spit was completely transformed (figure below).

Point of Mtn Spit 2013 Obl

2013 oblique view (looking south) of the Point of Mountain Spit area and associated shoreline terrace. Major excavation and development has modified the deposits. Image from Google Earth.

The entire end of the spit and much of the underlying gravels have been excavated in a huge gravel mine. Even the ends of the mountain ridge have been mined for rock. The shoreline terrace is nearly completely covered with housing developments (the undeveloped end is a paraglider park, so it was saved). Much of the farmland in the valley has also been transformed into housing developments and roads. This shows how accessible the Lake Bonneville deposits are to the demand for construction materials. Gravel from the spit goes right into adjacent roads, houses, business parks and shopping malls. The figure below is a closer, vertical view that better illustrates the magnitude of these excavations: all of the lighter areas are gravel mines and roughly outline the previous extent of the spit.

Point of Mtn Spit 2013

2013 vertical view of the gravel mines at the Point of Mountain Spit. From Google Earth.

Extensive Lake Bonneville deposits like the Point of Mountain Spit probably took 500-1500 years to form (the length of a stand of the lake and the following drawdown). Humans have nearly completely excavated it in about 20 years: a rate of destruction 25-75 times faster than the rate of formation. This shows what a tremendous force direct human actions are in the modern world. Research looking at human actions at the global scale shows a similar rate of "human erosion", about 40 times geologic rates (see "Moving Dirt" in the post archives). Humans have now become the premier mover of material on the Earth's surface, exceeding all other geologic processes. The Wasatch Front Megalopolis is a prime example of that ability. In a little over a century we have come a long way toward completely transforming a natural landscape, one that took many thousands of years to develop, into a human construct. That is incredible power. Is the next stage Trantor?
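The destruction-versus-formation ratio works out as follows; a minimal sketch using the ranges from the text:

```python
# Ratio of human excavation rate to natural formation rate for the
# Point of Mountain Spit, using the ranges quoted in the text.
formation_years_low, formation_years_high = 500, 1500  # time for deposit to form
excavation_years = 20                                  # time humans took to remove it

ratio_low = formation_years_low / excavation_years
ratio_high = formation_years_high / excavation_years
print(f"Humans removed the deposit {ratio_low:.0f}-{ratio_high:.0f} times "
      "faster than nature built it")
```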

Posted in Global Change

The Goldilocks Phenomena and Megalopolization: Part I, Modern Climate

I recently drove through the Wasatch Front Urban Corridor (figure below) on my way from western Montana to southern Utah. This roughly 120-mile-long corridor is a spectacular example of an emerging megalopolis and of the power of humans to modify the Earth. But the Wasatch Front also has some interesting climate and paleoclimate (prehistoric climate) features that help explain this development and show how past climates can facilitate the expansion of human society.

Wasatch Front Megalopolis

The Wasatch Front Megalopolis (grey areas with city names), which encompasses Salt Lake City and the various cities to the north and south (north is to the left). The Wasatch Mountains are to the east (top of image); the Great Salt Lake (lower left) and Utah Lake (upper right) are to the west. The Wasatch Megalopolis extends about 120 miles along I-15 and is composed of nearly continuous housing developments, shopping malls, business districts, schools, and other associated infrastructure of modern American suburbia/exurbia. From Google Earth imagery, 2013.

The spectacular fault-bounded Wasatch Range on the east and the Great Salt Lake and Utah Lake on the west hem in development along the front. Over 2.3 million people live in this corridor, and the population is growing fast (table below). About 80% of Utah's population is concentrated along the Wasatch Front, where about 85% of Utah's economic productivity is generated. This is an amazing place, growth-wise!

Table 1

Population of counties that encompass the Wasatch Front Megalopolis (from:

So, why are all these people coming to the Wasatch Front? There are the usual socio-economic factors for sure (jobs, family, etc.), but it is also a good place climatically. To show you, I will use a climate data plotting site called WeatherSpark that I have not yet introduced. WeatherSpark is a website that uses a wide range of available data to make unique graphs of local climate. Many of these differ from those that you can get from the U.S. Weather Service (which I presented in the last post) and add more useful climate parameters. In Part I of this post, I will present the climate of Salt Lake City, in the center of the Wasatch Megalopolis, to show why the Wasatch Front is such a good place climatically (at least for some people). Then in Part II I will explore the recent paleoclimate (prehistoric climate) features of the Wasatch Megalopolis and look at what this all means in the context of humans' ability to modify the Earth.

OK, now for the recent climate of Salt Lake City, the epicenter of the Wasatch Megalopolis.

Here is a plot from WeatherSpark showing the daily high and low temperatures through the year for Salt Lake City (SLC). Looks pretty good, as long as you do not mind coolish winters and warmish summers.


The daily average low (blue) and high (red) temperature with percentile bands (inner band from 25th to 75th percentile, outer band from 10th to 90th percentile).

But how do these temperatures "feel" on average? Here is a plot that can give you an idea:


The average fraction of time spent in various temperature bands: frigid (below 15°F), freezing (15°F to 32°F), cold (32°F to 50°F), cool (50°F to 65°F), comfortable (65°F to 75°F), warm (75°F to 85°F), hot (85°F to 100°F) and sweltering (above 100°F).

Not much hot weather and a large amount of warm, comfortable, and cool weather: "not too hot, not too cold, just about right." At least for some folks. But here is a more useful plot, because in general people seem to like climates without much rain. Here is how SLC stacks up:


The fraction of days in which various types of precipitation are observed. If more than one type of precipitation is reported in a given day, the more severe precipitation is counted. For example, if light rain is observed in the same day as a thunderstorm, that day counts towards the thunderstorm totals. The order of severity is from the top down in this graph, with the most severe at the bottom.

Pretty nice summers, with only a quarter of the days having precipitation, and that mostly in thunderstorms. Snow in the winter, so good for skiing, but still only 50% of the days with precipitation. But what about "humidity"? It can be not-so-hot but really humid, and so pretty miserable. Here is a WeatherSpark plot of dew point, which reflects both temperature and moisture and can be a better measure of how comfortable the climate is:


The daily average low (blue) and high (red) dew point with percentile bands (inner band from 25th to 75th percentile, outer band from 10th to 90th percentile).

Mostly comfortable summers, and it feels dry the rest of the year. So, if you like dry and sunny, SLC is for you! But remember, these are "average" conditions (see the post on Normal vs. Normals) and not conditions that you will experience every day or even often. That is another nice feature of the WeatherSpark plots: they give you a feel for the spread of each climate parameter. The darker colored bands are the spread between the 25th and 75th percentiles, representing the central 50% of the data. The lighter colored bands span the 10th to 90th percentiles, the central 80% of the data. So, you can get a good feel for the spread of each parameter as well as its central tendency.
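For readers curious how such percentile bands come about, here is a hypothetical sketch of computing them from a station's daily records with NumPy (the data here are synthetic; WeatherSpark's actual pipeline may differ):

```python
import numpy as np

# Synthetic stand-in for ~30 years of high temperatures recorded on one
# calendar day (say, every July 15 on record), in degrees F.
rng = np.random.default_rng(0)
july_15_highs = rng.normal(loc=92.0, scale=6.0, size=30)

# WeatherSpark-style bands: inner band 25th-75th percentile (central 50%),
# outer band 10th-90th percentile (central 80%).
p10, p25, p75, p90 = np.percentile(july_15_highs, [10, 25, 75, 90])
mean_high = july_15_highs.mean()

print(f"mean {mean_high:.1f}F, inner band {p25:.1f}-{p75:.1f}F, "
      f"outer band {p10:.1f}-{p90:.1f}F")
```

Repeating this for each of the 365 calendar days gives the bands drawn across the year.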

There is another climate control along the Wasatch Front that makes it a desirable place to live and helps drive the growth of the megalopolis. The steep front of the mountains, just east of the corridor, adds a third climate dimension. The Wasatch Range forms a steep elevation barrier to storms arriving from the southwest. Air moving eastward over the mountains rises and cools. The cooler air cannot hold as much moisture, so moisture falls out as snow or rain (depending on the season). The Great Salt Lake also supplies moisture to the atmosphere through evaporation as winds blow across the lake (called a "lake effect"). These climatic factors lead to heavier precipitation in the mountains than in the Wasatch Front corridor. Looking at the figure below, we can see that the average annual precipitation within the Wasatch Front Megalopolis is about 15 to 25 inches, while in the Wasatch Range just to the east it is from 40 to more than 60 inches. So, people can live in a relatively warm, dry climate but have quick (depending on traffic) access to deep, dry snow in the mountains to the east.
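The claim that cooler air holds less moisture can be illustrated with the Tetens approximation for saturation vapor pressure, a standard empirical formula; the temperature values below are purely illustrative:

```python
import math

def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Tetens approximation: saturation vapor pressure (hPa) over water."""
    return 6.1078 * math.exp(17.27 * temp_c / (temp_c + 237.3))

# Air at a valley-floor temperature vs. the same air cooled by forced
# ascent over the range (illustrative values, not measurements).
valley_temp_c = 10.0
ridge_temp_c = -5.0

e_valley = saturation_vapor_pressure_hpa(valley_temp_c)
e_ridge = saturation_vapor_pressure_hpa(ridge_temp_c)
print(f"Moisture capacity drops by {100 * (1 - e_ridge / e_valley):.0f}% "
      "as the air rises and cools")
```

A 15-degree cooling cuts the air's moisture capacity by well over half, which is why the excess falls out as precipitation on the windward slopes.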


Map of the annual average precipitation (30-year normal) of northern Utah. From the PRISM Climate Group, Oregon State University,

Because the Wasatch Range snow is deep and dry, it makes for excellent skiing, and the range is home to some of the country's best skiing and other winter sports. Unfortunately, the proximity of about 2.3 million people in the Wasatch Front Megalopolis also makes these some of the most crowded (and expensive) resorts as well. But, at least for those 2.3 million people, this area has a Goldilocks Effect: "not too hot, not too cold, just right." For some things, that is; easy driving is not one of them! Nor is living in a natural landscape, as the megalopolis gobbles it up. That is the topic of the next post.


Posted in Global Change

Normal vs. Normals

We hear it all the time, but do we really know what it means? The news anchor pans to the meteorologist/weather person and says something like, "Well, Dana, what does the weekend weather hold for us? Will this rain continue?" And Dana answers, "Clint, we are in for a surprise. The skies will clear and temperatures on Saturday will be 15 degrees above normal. So, get out those shorts for at least one more great fall weekend." When we hear this we conjure up our definition of normal: Normal (adjective): conforming to a standard; usual, typical, or expected: "it's quite normal for puppies to bolt their food"; "normal working hours" (Oxford English Dictionary).

So, the temperatures on the weekend will be above typical, higher than what is expected for that day in the fall. But that is not exactly the meaning of "climate normals," which is what the media meteorologist is referring to. Climate Normals are a calculation by the National Weather Service and have a very precise meaning, different from the one we think of when we hear the term. Here is the definition according to the National Weather Service, which calculates Normals for weather stations across the U.S.: Climate Normals (proper noun) are three-decade averages of climatological variables, including temperature and precipitation. This product is produced once every 10 years.

So, the 1981–2010 U.S. Climate Normals are the latest release. (There are past Climate Normals for 1971-2000, 1961-1990, etc.) This dataset contains the calculated daily and monthly averages (the mean of all the values for that day over those three decades) of temperature and precipitation, plus other climate parameters such as snowfall, heating and cooling degree days, frost/freeze dates, and growing degree days. Climate Normals are provided for most of the weather stations across the country. Below is an example (Figure 1) of what these look like for temperature and precipitation for Hamilton, MT.


Figure 1: 1981-2010 Climate Normals for Hamilton, MT. From the Western Regional Climate Center at You can also get tabular data from this site so you can easily compare Climate Normals to measured temperatures for any day of the year.

Climate Normals are calculations of temperatures over three decades; they are not typical values, and you may never have experienced them on a given day, even if you lived through all three decades. They are calculated averages, not measured values. This is something critically important about climate data: it is derived from calculations based on measured data, not the measurements themselves. So, for example, the Climate Normal high for today in Hamilton is 64.1 degrees F, while the measured high today was 75 degrees F, about 11 degrees higher than the "normal" high temperature.
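The "calculated, not measured" distinction can be made concrete with a sketch of how a daily normal is derived: group three decades of observations by calendar day and average them. The data below are synthetic, and the real NWS/NCEI computation includes additional quality control and smoothing, but the core idea is this:

```python
from collections import defaultdict
from statistics import mean

# Synthetic records: (year, month, day, tmax_f). Thirty years of October 6
# highs; note that none of the observed values needs to equal the "normal".
records = [(1981 + i, 10, 6, 55 + (i * 7) % 20) for i in range(30)]

# Group observations by calendar day, then average across the 30 years.
by_calendar_day = defaultdict(list)
for year, month, day, tmax in records:
    by_calendar_day[(month, day)].append(tmax)

normals = {md: mean(temps) for md, temps in by_calendar_day.items()}
print(f"Oct 6 normal Tmax: {normals[(10, 6)]:.1f} F")
```

Here every observed high is a whole number, yet the resulting normal is a fraction that was never actually measured on any October 6.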

The plot below (Figure 2) shows how variable the climate is at any site. This is a plot of the daily maximum temperatures (Tmax) for Hamilton, MT for the 30 years of the last Climate Normal period (1981-2010).


Figure 2: 1981-2010 daily maximum temperature records (red pluses) for Hamilton, MT. From .

The red pluses on the figure show the measured maximum temperature for each day over the three decades used to calculate the Climate Normals. The vertical line is October 6 and the horizontal line is the normal maximum temperature for that day; the grey diamond marks the intersection of the two, showing the calculated normal maximum temperature compared to the range of measured maximum temperatures for that day. You can see the huge range of measured Tmax values around the calculated normal. The normal Tmax is not "typical" or "expected" in the everyday sense of normal. That is, hardly any of the maximum temperatures measured for a particular day match the Climate Normal Tmax. So, normals are not really normal! But they are a convenient way to compare records to some average value. And there are some great ways to see these comparisons on the Weather Service sites. The graph below (Figure 3) is one of those plots. (See the note at the end of this post for how to get these plots for your area. One catch is that these plots are not available for as many stations as the data I showed above; in the chart below I have moved to a station 38 miles north of Hamilton: Missoula, MT.)


Figure 3: Daily temperature and precipitation compared to 1981-2010 Climate Normals for Missoula, MT. From,

This chart will take some explaining, but it is well worth the effort. I will concentrate on temperature first (upper graph). The blue vertical bars (they are just very thin, so they look like a jagged line) are the maximum (top of bar) and minimum (bottom of bar) temperatures recorded at the Missoula weather station. The upper boundary of the green band is the Climate Normal Tmax and the lower boundary is the Climate Normal Tmin (minimum temperature). So, the green band represents the band of calculated normal temperatures and the blue "line" represents the measured temperatures throughout the year. The top of the pink, jagged band is the extreme maximum temperature recorded for that day (Record Max) and the bottom of the band is the extreme minimum temperature recorded for that day (Record Min). So, this graph shows an incredible amount of information, allowing comparison throughout the year of measured data to calculated normals and recorded extremes. The precipitation graph is simpler because it shows only the measured data (the "stepped" line) and the "normal" precipitation. The graph shows cumulative precipitation through the year, so it is easy to see if precipitation is above the Climate Normal (dark green) or below normal (light brown). By looking at these plots you can say a lot about how temperature and precipitation in a specific year compare to the three decades used to calculate the Normals (1981-2010 in this case).

The most obvious thing is how "un-normal" the normals are, or maybe a better way to look at it is how "un-normal" the daily records are. For example, in 2014 up to October 6, there are many days that fall outside the normal range of temperatures. There are also very few days that neared or beat the extreme records: two maxima and two minima out of just over nine months of records. But some months were really wild. Let's zoom in on February using the figure below (Figure 4).


Figure 4: Daily maximum and minimum temperatures for February 2014, in Missoula, MT.

For the first third and last third of February, every day had minimum and maximum temperatures substantially lower than normal. During the middle of the month it was the opposite, with temperatures substantially (but not as extremely) above normal. There is a clear step in the weather across the boundaries between these two conditions, like a switch turning from really cold to really warm. That switch was the jet stream as it moved across the region, bringing in warm or cold air. But that story is for another post.


To get to the graphics in the last two plots takes some patience with the Weather Service site. The easiest way (but not the only way) is to follow these directions:

1. Start at this web site:

2. Click on the region of the state you are interested in on the U.S. map. This will take some messing with to get where you want. When the right region comes up, select the station you want. I selected Missoula, MT (mso), and got this site:

3. Then click on the tab (upper part of the page) titled, “Local Data/Records”.

4. This brings up another page. Under “Climate Graphs” select “Graphical Climate Summaries for…” where the “…” is the list of stations in the region you selected.

Posted in Global Change

Warming Complexities in the News

Winds of Warming

September has been a great month for climate science in the news. Many major news media noticed a paper published in the Proceedings of the National Academy of Sciences (PNAS) by Drs. James A. Johnstone and Nathan J. Mantua titled "Atmospheric controls on northeast Pacific temperature variability and change, 1900–2012". Their research found that "changes in winds over the eastern Pacific Ocean explain most of the warming trend along the West Coast of North America in the last century", not increased radiative forcing from rising greenhouse gases in the atmosphere. For details you should read the summary of the article on the Southwest Fisheries Science Center's web site (, where Dr. Mantua works. Also read Prof. Cliff Mass' (University of Washington) article on the importance of the findings compared to other work on the region. You can find that article on the Cliff Mass Weather Blog (

Considering the wealth of papers finding that global temperature increases in the last few decades result mostly from greenhouse enhancement, at first glance the Johnstone and Mantua result is surprising and, as many news media said, controversial. But not really. It shows that the climate is complex and that local/regional variability can be large, even larger than global forcing. The global greenhouse enhancement is embedded in the west coast temperature signal; it is just swamped by natural variability. This variability comes from the atmospheric systems (large-scale wind patterns) in the North Pacific Ocean that drive major weather systems across the Pacific Northwest and transport heat through the atmosphere. The importance of the Johnstone and Mantua work is that it helps develop "…a fuller understanding of natural and anthropogenic changes…". The take-home message is that the world is a complicated place and regional variability is large. Their results can be seen best in a figure in the supplemental information that goes along with the paper (click on their figure S4 below). In this figure you can see the temperature trends for the Pacific Coast states before the Pacific atmospheric forcing is removed (left side) and after it is removed (right side). Once the effects of the wind/pressure processes described in the paper are removed, temperature trends diminish substantially across the region. However, Southern California shows persistent, significant upward trends (warming) even with the atmospheric effects removed. One question is how far onto the continent such controls extend. You can see that even areas interior of the coastal mountain ranges show strong responses to the oceanic conditions. This should not be a surprise given the well-known effects of El Niño (and other ocean-atmosphere processes) on climate throughout the world. But the quantification of this process's effect on temperature trends is definitely important and needs to be incorporated into modeling and predictions of how the climate will change in the future with the continued addition of greenhouse gases and other human modifications of the landscape and waterscape.
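The idea of "removing" an atmospheric forcing from a temperature record can be sketched with a toy regression example. This is entirely synthetic data and a simplified analogue, not the paper's actual method: regress the temperature series on a circulation index and examine the trend of the residual.

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1900, 2013)

# Synthetic circulation index (a pressure/wind mode) with its own drift,
# plus a temperature series = small background trend + strong circulation
# imprint + weather noise. All coefficients are made up for illustration.
circulation = 0.01 * (years - years[0]) + rng.normal(0, 0.5, years.size)
temperature = (0.003 * (years - years[0]) + 1.5 * circulation
               + rng.normal(0, 0.2, years.size))

# Remove the circulation-congruent part by least-squares regression.
slope, intercept = np.polyfit(circulation, temperature, 1)
residual = temperature - (slope * circulation + intercept)

raw_trend = np.polyfit(years, temperature, 1)[0]
residual_trend = np.polyfit(years, residual, 1)[0]
print(f"trend before removal: {raw_trend:.4f} deg/yr, "
      f"after: {residual_trend:.4f} deg/yr")
```

As in the paper's figure S4, the residual trend is much smaller than the raw trend once the circulation-congruent variability is taken out.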

Figure S4 from Johnstone and Mantua, 2014.


The other interesting climate science publishing event was the release of "Explaining Extreme Events of 2013 from a Climate Perspective," published by the American Meteorological Society (AMS). For the last few years the AMS has published a retrospective on the major climate events that affected the world in some dramatic way: droughts, floods, heat waves, etc. In the latest publication, several groups of meteorologists and climate scientists write about the same event from different perspectives, so it is a great volume for seeing a range of ideas on a topic–real science without the homogenization of "consensus"! You can read the report here ( and read a New York Times story on it here (. Over eighty authors in 22 articles examine droughts, heat waves, hurricanes, downpours, cold snaps, and blizzards that occurred in 2013 around the world. The most interesting papers were the two sets on the California drought (four papers) and on the Australian heat wave and drought. These papers give a great overview of the complexity of these events and of how many different aspects of the climate come together to cause them, including some of the same large-scale atmospheric processes that Johnstone and Mantua present in their paper.

The thing that really stands out in the California papers is the severity of this drought within the historical record (see the figure below, from Swain et al.).


12-month (one-sided) moving average precipitation in California from 1895 to 2014. Major historical droughts highlighted. Swain et al. 2014, Bull. Amer. Meteor. Soc., 95 (9), S1–S96.

There is a lot of detailed plotting and discussion of outcomes in these papers, all in a short format. They are dense but well laid out, with a short and informative introduction, followed by brief (but jargony) results and then a concise conclusion. You get a clear picture of what the authors did, why they did it and what they came up with. It will take much more reading of the cited literature to understand the background the authors rely on for their conclusions, but the papers are a great presentation of the science of studying extreme climate events. I especially liked the paper by Hoerling and his eight co-authors on the extreme rain and flooding event in NE Colorado (around Boulder, CO). They examined very carefully the importance of the local situation in the broader context of regional/global weather and climate. A very nice job! There is a topic for everyone in this volume, so it is worth the read.


Johnstone and Mantua article:

BAMS Collection:

News stories:

Seattle Times:

Los Angeles Times:

CBS News:

Posted in Global Change

Moving Dirt

An Abandoned Open Pit Copper Mine, Butte, MT

The discussion of human-induced climate change commonly revolves around indirect effects of human actions. For example, we increase greenhouse gases, which leads to warming of the planet, which changes climate processes in myriad ways. But there are also more direct effects. Since I completed my post on human population (February 2, 2010), nearly 9 million people have been added to the planet, for a total of 6.809 billion people now. That is a lot of people to house, feed and transport around, and doing so requires lots of dirt to be moved. One interesting question is: how does all the earth moving associated with human development compare to the amount of material moved by geologic processes like mountain building, glaciation, and erosion by wind, waves and rivers?

Until 16 years ago most geologists would have thought that humans were no match for the Earth. But that all changed in 1994, when Dr. Roger Hooke, then a Professor at the University of Minnesota, published a paper in the Geological Society of America’s news magazine, GSA Today (see Hooke, R. LeB., 1994, On the efficacy of humans as geomorphic agents: GSA Today, v. 4, no. 9, p. 217, 224-225). In his article, Hooke did an elegant first-order analysis of resource and construction data along with geologic data and estimated that humans moved about 45 billion tons of material a year (45 Gt/y), essentially equivalent to all the material moved by rivers, glaciers, waves, wind, and continental mountain building. Direct human actions moved as much material as all the geologic processes on Earth—phenomenal!

In his analysis, Hooke left out the material moved by agriculture. He included agriculture in another paper in 2000 (see Hooke, R. LeB., 2000, On the history of humans as geomorphic agents: Geology, v. 28, no. 9, p. 843-846) and found that humans moved almost 2 times the material moved by geologic processes on the continents (80 Gt/y vs. 45 Gt/y). In the last 10 years more detailed and complete analyses by other geologists have found even higher “human erosion rates”. Dr. Bruce Wilkinson, from the University of Michigan, constructed a detailed historical time series of direct human movement of earth material (see Geology, March 2005, v. 33, no. 3, p. 161-164). He found that by approximately 1000 AD humans equaled geologic processes (see figure below). In 2000 AD, humans moved upwards of 40 times the geologic erosion rate. Wilkinson summed this up: “At these rates, this amount of material would fill the Grand Canyon in 50 yr.” To put this in perspective, it took geologic erosion about 6 million years to form the Grand Canyon—humans have become the largest geologic force on Earth’s continents, bar none. Phenomenal!

Human Erosion vs. Geologic Erosion (from Wilkinson, 2005)

Posted in Global Change

The Carbon Cost of Human Reproduction

The human population is rarely discussed in the context of human-induced climate change. As I write this, on February 2, 2010 at noon Mountain Standard Time, the World population is estimated at 6,800,295,487 people (see figure above).

You can find the population at the U.S. Census Bureau’s web site; check it as you start to read this post and then again when you finish to get an idea of how rapidly the population is growing.

What is astounding is that in the next hour, about 8600 people will be added to Earth. In the next year over 75 million will be added–about two “Californias” (see Table 1 below). That means that water and food and jobs, all the things that go into living in the 21st century, have to be created for all these people. And it is not just next year; it is each subsequent year, and the numbers keep rising exponentially. In only 4 years we will have added 309 million people, another “United States”!

Time unit        Births       Deaths     Increase
Year        131,940,516   56,545,138   75,395,378
Month        10,995,043    4,712,095    6,282,948
Day             361,481      154,918      206,563
Hour             15,062        6,455        8,607
Minute              251          108          143
Second              4.2          1.8          2.4

Table 1. From U.S. Census Bureau (link above in text).
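The rates in Table 1 are easy to sanity-check: the rounded per-second increase, multiplied out over a year, should land near the yearly figure. A quick sketch in Python:

```python
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

# Rounded per-second rates from Table 1
births_per_sec = 4.2
deaths_per_sec = 1.8
increase_per_sec = births_per_sec - deaths_per_sec  # 2.4 people per second

yearly_increase = increase_per_sec * SECONDS_PER_YEAR
print(f"{yearly_increase:,.0f}")  # within ~0.5% of Table 1's 75,395,378
```

The small mismatch comes entirely from rounding the per-second rates to one decimal place.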

As we add people they use energy—directly and to produce all the extra food, water, and goods. Because energy is mostly derived from fossil fuels, release of carbon to the atmosphere increases. We can think of this as humans’ “carbon legacy”. A few years ago, Paul Murtaugh and Michael Schlax from Oregon State University published an article titled Reproduction and the carbon legacies of individuals (Global Environmental Change, v. 19, p. 14-20, doi: 10.1016/j.gloenvcha.2008.10.007). They found that “Under current conditions in the United States, for example, each child adds about 9441 metric tons of carbon dioxide to the carbon legacy of an average female, which is 5.7 times her lifetime emissions.” This and a report written by the London School of Economics are highlighted in an article in the Washington Post (September 15, 2009). The London report found that it is far cheaper to invest in preventing births than in supplying extra energy for extra people, and far cheaper than developing new solar or wind power ($7/ton vs. $24-$51/ton). This shows how closely linked population growth and energy use are.

Oh, and as I finish writing this post, I checked the Population Clock:

Over 10,000 people have been added to the world!

Posted in Global Change | 1 Comment

Measuring Global Temperature: Part II–Spatial Averaging

Having discussed how thermometer data are used to construct time series of temperature at a station (see Part I), we now turn to how we compare data among all the weather stations in a region and on the Earth. Again, this seems simple at first glance: just add up all the data, calculate an average, and plot that up. But it turns out to be very complicated. The first problem is the disparity in the distribution of weather stations. Some regions have a high number of stations while others have very few. This is true at many levels. In the United States, urban areas have a large number of stations compared to rural areas. Mountainous regions have fewer than plains regions. If we simply averaged all these stations there would be large biases: urban areas would dominate the data and mountainous regions would be underrepresented. There are even greater discrepancies globally (Figure 1), where some countries/regions/continents have dense networks of stations while others have very sparse networks (compare the U.S. with Africa in Figure 1). There are also more stations in the Northern Hemisphere than in the Southern Hemisphere. All of these issues complicate the calculation of a global average temperature. We will work through how the global calculation is made by starting locally and working up to the entire Earth.

Figure 1. The 7280 global meteorological stations used by NOAA to calculate the global mean temperature. From:

We determined how meteorologists calculate average temperatures at a specific station. Now think about how they would start to aggregate station data into a global average. The first problem is that temperatures at different stations can be very different for particular days or longer time periods because local conditions differ substantially among sites. For example, the temperatures at a high-elevation site will be quite different from one near sea level, even though they may show the same trends over time. One way to address this disparity is to calculate a temperature anomaly. An anomaly is calculated using a long-term mean temperature for each station, based on either the entire record or some portion of it. The annual average temperatures at each station are averaged over this time period, and that mean is then subtracted from each of the yearly averages. This transforms the data from direct degrees to an anomaly, i.e., degrees away from the mean. Positive values are greater than the long-term mean and negative values are less than it. This results in plots like that below (Figure 2).

Figure 2: Temperature anomaly for Hamilton, MT, from 1885 to 2009. The blue bars are the anomaly values above (positive) or below (negative) the long-term average (zero).
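As a concrete sketch of the anomaly transformation (using made-up yearly temperatures, not Hamilton’s actual record), it is just a subtraction of the long-term mean:

```python
# Hypothetical annual mean temperatures (°C) for a single station
temps = {1998: 8.9, 1999: 8.4, 2000: 8.7, 2001: 9.1, 2002: 9.0}

# Long-term mean over the chosen baseline period (here, the whole record)
baseline = sum(temps.values()) / len(temps)  # 8.82 °C

# Subtract the mean from each yearly value: degrees away from the mean
anomalies = {year: round(t - baseline, 2) for year, t in temps.items()}
# 1999 comes out negative (cooler than the mean); 2001 comes out positive (warmer)
```

The choice of baseline period shifts every anomaly up or down by the same amount, so the trends are unaffected.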

Calculating anomaly data for each station allows easy comparison among stations and emphasizes change in temperature, commonly called delta-T (∆T), instead of temperature (T). The anomaly transformation makes it easier to visualize changes in temperatures among stations, but those stations still need to be averaged to get a global mean without incorporating the bias from the different station densities in different regions. To do this, the anomaly data are gridded.

First the globe is divided into cells, commonly based on longitude and latitude. A typical cell is 5 by 5 degrees. This results in cells of different sizes near the equator versus near the poles: there are an equal number of cells around the equator as at any other latitude, they are just smaller at higher latitudes. Then the stations that fall within a cell are identified and their mean temperature anomalies averaged (monthly or yearly) to give a value for the cell. Some cells may have many stations and some may have few or none. This approach tries to minimize the bias caused by differences in station numbers that would affect a simple arithmetic average of all stations. The result is an anomaly number for each year (or month, if monthly averages are used) for each grid cell on the Earth (Figure 3). Where there are no data the cells are left empty (another approach is to estimate a value for a cell based on the cells around it).

Figure 3: Year 2008 temperature anomalies (°C) for 5 x 5 degree cells of a global grid from the NOAA National Climatic Data Center. Red circles are values above the long-term average and blue circles are below the long-term average. From:
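A minimal sketch of that gridding step (the station locations and anomaly values below are hypothetical, chosen only to show how multiple stations collapse into one cell value):

```python
from collections import defaultdict

def cell_key(lat, lon, size=5):
    """Index of the size x size degree cell containing a point."""
    return (int(lat // size), int(lon // size))

# Hypothetical stations: (latitude, longitude, anomaly in °C)
stations = [
    (46.2, -114.1, 0.4),   # three stations fall in the same 5 x 5 cell...
    (47.5, -111.3, 0.6),
    (46.9, -113.8, 0.2),
    (34.0, -118.2, 1.1),   # ...one falls in a different cell
]

# Group station anomalies by the cell they fall in
cells = defaultdict(list)
for lat, lon, anomaly in stations:
    cells[cell_key(lat, lon)].append(anomaly)

# One averaged value per cell, no matter how many stations it holds
gridded = {key: sum(vals) / len(vals) for key, vals in cells.items()}
```

The dense cluster of three stations contributes exactly one number to the grid, the same as the lone station, which is the point of the exercise.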

More complete data coverage and interpolation between stations can produce more “filled” grid cells for the globe (Figure 4), but there are areas where no data are available, leaving holes in the global grid.

Figure 4: Global temperature anomalies for January 1969 for a 5 x 5 degree grid. These data are in °C (not °F), as in Figure 3. From: Brohan et al., 2006, Journal of Geophysical Research, v. 111, D12106, doi:10.1029/2005JD006548.

Although there is only a sparse record from weather stations in the ocean (there are not many islands), a long dataset from ships’ records allows gridded data to be developed for the oceans (Figure 5).

Figure 5: Ocean and land gridded data for January 1969. Sources same as above.

Once we have data for the global grid like that in Figure 5, the next step is straightforward: average them all for each year (or month) to get the global mean temperature anomaly. There are still biases: there are more holes in the grid at high latitudes, and there is a lingering bias from the large differences in the number of stations in each grid cell. But these can be assessed (see the paper cited in Figure 4 for the details) by producing error plots like those below (Figure 6).

Figure 6: Time series of global mean annual temperature anomalies (black) showing the estimated error (colors); see the article cited in Figure 4 for details. Notice how the error increases back through time and tightens more recently: recent years have more stations doing a better job of measuring temperature.
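One wrinkle in that final averaging step: because 5 x 5 degree cells shrink toward the poles, a careful global mean weights each cell by its area (proportional to the cosine of its latitude) rather than counting every cell equally. A sketch of that weighted average, with made-up cell values:

```python
import math

# Hypothetical cell anomalies (°C), keyed by (center latitude, center longitude)
cells = {(-62.5, 2.5): 0.1, (2.5, 2.5): 0.3, (62.5, 2.5): 0.8}

def weight(lat):
    """Relative area of a fixed-degree cell at this latitude."""
    return math.cos(math.radians(lat))

global_anomaly = (sum(a * weight(lat) for (lat, _), a in cells.items())
                  / sum(weight(lat) for (lat, _) in cells))
# The weighted mean sits closer to the equatorial cell's value
# than a plain average of the three numbers (0.4) would
```

Empty cells simply drop out of both sums, which is how the holes in the grid are handled in this sketch.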

Determining global average temperatures relies on a series of careful analyses. Although it seems complex, it is really just a bookkeeping problem: one has to keep track of many numbers and the uncertainty behind those numbers. The plot above used over 4200 stations and millions of individual data points. But when done carefully, these calculations are a good measure of global temperature (the best we can do), based on measured temperatures at individual stations. They are the basis for understanding short-term (the last 150 years or so) temperature changes on the Earth.

Posted in Global Change