Posted by: climanova | March 20, 2010

Moving Dirt

An Abandoned Open Pit Copper Mine, Butte, MT

The discussion of human-induced climate change commonly revolves around the indirect effects of humans: for example, we increase greenhouse gases, which warms the planet, which in turn changes climate processes in myriad ways. But humans also have more direct effects. Since I completed my post on human population (February 2, 2010), nearly 9 million people have been added to the planet, bringing the total to 6.809 billion. That is a lot of people to house, feed, and transport around, and it requires a lot of dirt to be moved. One interesting question is how all the earth moving associated with human development compares to the amount of material moved by geologic processes such as mountain building, glaciation, and erosion by wind, waves, and rivers.

Until 16 years ago most geologists would have thought that humans were no match for the Earth. That all changed in 1994, when Dr. Roger Hooke, then a professor at the University of Minnesota, published a paper in the Geological Society of America’s news magazine, GSA Today (see Hooke, R. LeB., 1994, On the efficacy of humans as geomorphic agents: GSA Today, v. 4, no. 9, p. 217, 224-225, and this link). In his article, Hooke made an elegant first-order analysis of resource and construction data along with geologic data and estimated that humans move about 45 billion tons of material a year (45 Gt/y), essentially equivalent to all the material moved by rivers, glaciers, waves, wind, and continental mountain building. Direct human actions moved as much material as all the geologic processes on Earth combined. Phenomenal!

In that analysis, Hooke left out the material moved by agriculture. He included agriculture in a second paper in 2000 (see Hooke, R. LeB., 2000, On the history of humans as geomorphic agents: Geology, v. 28, no. 9, p. 843-846) and found that humans move almost twice the material moved by geologic processes on the continents (80 Gt/y vs. 45 Gt/y). In the last 10 years, more detailed and complete analyses by other geologists have found even higher “human erosion rates”. Dr. Bruce Wilkinson, from the University of Michigan, constructed a detailed historical time series of direct human movement of earth material (see Geology, March 2005, v. 33, no. 3, p. 161-164). He found that by approximately 1000 AD human earth moving equaled that of geologic processes (see figure below), and that by 2000 AD humans were moving upwards of 40 times the geologic erosion rate. Wilkinson summed this up with this quote: “At these rates, this amount of material would fill the Grand Canyon in 50 yr.” To put that in perspective, it took geologic erosion about 6 million years to carve the Grand Canyon. Humans have become the largest geologic force on Earth’s continents, bar none. Phenomenal!

Human Erosion vs. Geologic Erosion (from Wilkinson, 2005)
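A quick way to make these rates more concrete is to divide them by the current population quoted in the February post. The per-capita figures below are my own back-of-envelope arithmetic on the numbers cited above (45 and 80 Gt/y; 6.809 billion people), not values reported by Hooke or Wilkinson.

    # Rough per-capita check of the earth-moving rates quoted above.
    # Rates from Hooke (1994, 2000); population from the Feb. 2, 2010 post.
    population = 6.809e9       # people
    rate_no_ag = 45e9          # tons/year moved by humans, excluding agriculture
    rate_with_ag = 80e9        # tons/year moved by humans, including agriculture

    for label, rate in [("excluding agriculture", rate_no_ag),
                        ("including agriculture", rate_with_ag)]:
        per_capita = rate / population
        print(f"{label}: about {per_capita:.1f} tons moved per person per year")
    # Roughly 7 and 12 tons of earth per person per year, respectively.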

Posted by: climanova | February 3, 2010

The Carbon Cost of Human Reproduction

The human population is rarely discussed in the context of human-induced climate change. As I write this, on February 2, 2010 at noon Mountain Standard Time, the world population is estimated at 6,800,295,487 people (see figure above).

You can find the population at the U.S. Census Bureau’s web site; check it as you start to read this post and again when you finish to get an idea of how rapidly the population is growing (http://www.census.gov/ipc/www/popclockworld.html).

What is astounding is that in the next hour about 8,600 people will be added to the Earth, and over the next year more than 75 million will be added, roughly two “Californias” (see Table 1 below). That means that water and food and jobs and all the things that go into living in the 21st century have to be provided for all these people. And it is not just next year; it is each subsequent year, and the numbers keep rising exponentially. In only 4 years we will have added about 309 million people, another “United States”!

Time unit       Births      Deaths    Increase
Year       131,940,516  56,545,138  75,395,378
Month       10,995,043   4,712,095   6,282,948
Day            361,481     154,918     206,563
Hour            15,062       6,455       8,607
Minute             251         108         143
Second             4.2         1.8         2.4

Table 1. From U.S. Census Bureau (link above in text).
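As a sanity check on the comparisons above, the short sketch below (Python) recomputes the hourly increase from the yearly figure in Table 1 and the time needed to add another “California” or “United States”. The California (~37 million) and U.S. (~309 million) populations are assumed round 2010 figures, not numbers taken from the table.

    # Back-of-envelope check of the growth comparisons in the text,
    # using the yearly net increase from Table 1 (U.S. Census Bureau).
    yearly_increase = 75_395_378           # people added per year
    hours_per_year = 365.25 * 24

    print(f"Added per hour: {yearly_increase / hours_per_year:,.0f}")  # ~8,600

    # Assumed 2010 populations (round numbers, not from the table):
    california = 37e6
    united_states = 309e6

    print(f"'Californias' added per year: {yearly_increase / california:.1f}")        # ~2.0
    print(f"Years to add a 'United States': {united_states / yearly_increase:.1f}")   # ~4.1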

As we add people, they use energy, both directly and to produce all the extra food, water, and goods. Because that energy is mostly derived from fossil fuels, the release of carbon to the atmosphere increases. We can think of this as humanity’s “carbon legacy”. Last year Paul Murtaugh and Michael Schlax of Oregon State University published an article titled “Reproduction and the carbon legacies of individuals” (Global Environmental Change, v. 19, p. 14-20, doi: 10.1016/j.gloenvcha.2008.10.007). They found that “Under current conditions in the United States, for example, each child adds about 9441 metric tons of carbon dioxide to the carbon legacy of an average female, which is 5.7 times her lifetime emissions.” This study and a report written by the London School of Economics are highlighted in a Washington Post article (September 15, 2009: http://www.washingtonpost.com/wp-dyn/content/article/2009/09/14/AR2009091403308.html). The London report found that it is far cheaper to invest in preventing births than to supply extra energy for extra people, and far cheaper than developing new solar or wind power ($7/ton vs. $24-$51/ton).
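Two numbers in the Murtaugh and Schlax quote imply a third worth pulling out: if each child adds about 9441 t of CO2 to the mother’s legacy, and that is 5.7 times her own lifetime emissions, then average per-person lifetime emissions in the U.S. work out to roughly 1,660 t of CO2. The sketch below just does that division; it is my arithmetic on the quoted figures, not a number reported in the paper.

    # Implied average U.S. lifetime CO2 emissions, from the figures quoted above.
    child_legacy_t = 9441      # metric tons of CO2 added per child (Murtaugh & Schlax)
    legacy_multiple = 5.7      # that legacy as a multiple of the mother's own emissions

    lifetime_emissions_t = child_legacy_t / legacy_multiple
    print(f"Implied lifetime emissions: about {lifetime_emissions_t:,.0f} t CO2 per person")
    # ~1,656 t of CO2, i.e. on the order of 20 t per year over an 80-year life.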

Oh, and as I finished writing this post, I checked the Population Clock:

Over 10,000 people have been added to the world!

Posted by: climanova | January 15, 2010

Measuring Global Temperature: Part II–Spatial Averaging

Having discussed how thermometer data are used to construct a time series of temperature at a single station (see Part I), we now turn to how data are compared among all the weather stations in a region and on the Earth. Again, this seems simple at first glance: just add up all the data, calculate an average, and plot it. But it turns out to be very complicated. The first problem is the disparity in the distribution of weather stations. Some regions have many stations while others have very few, and this is true at many levels. In the United States, urban areas have a large number of stations compared to rural areas, and mountainous regions have fewer than plains regions. If we simply averaged all these stations there would be large biases: urban areas would dominate the data and mountainous regions would be under-represented. There are even greater discrepancies globally (Figure 1), where some countries, regions, and continents have dense networks of stations while others have very sparse networks (compare the U.S. with Africa in Figure 1). There are also more stations in the Northern Hemisphere than in the Southern Hemisphere. All of this complicates the calculation of a global average temperature. We will work through how the global calculation is made by starting locally and working up to the entire Earth.

Figure 1. The 7280 global meteorological stations used by NOAA to calculate the global mean temperature. From: http://www.ncdc.noaa.gov/oa/climate/ghcn-monthly/index.php

In Part I we determined how meteorologists calculate average temperatures at a specific station. Now think about how they would start to aggregate station data into a global average. The first problem is that the temperatures at different stations can be very different for particular days or longer time periods because local conditions differ substantially among sites. For example, temperatures at a high-elevation site will be quite different from those at a site near sea level, even though the two may show the same trends over time. One way to address this disparity is to calculate a temperature anomaly. An anomaly is calculated using a long-term mean temperature for each station, based on either the entire record or some portion of it. The annual average temperatures at each station are averaged over this baseline period, and that baseline mean is then subtracted from each of the yearly averages. This transforms the data from direct degrees to an anomaly, i.e., degrees away from the mean: positive values are greater than the long-term mean and negative values are less than it. This results in plots like the one below (Figure 2).

Figure 2: Temperature anomaly for Hamilton, MT from 1885 to 2009. The blue bars are the anomaly values above (positive) or below (negative) the long-term average (zero).
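A minimal sketch (Python) of the anomaly calculation described above, assuming a station’s annual mean temperatures are already in hand and that the whole record shown is used as the baseline period; the years and values are made up for illustration.

    # Convert a station's annual mean temperatures into anomalies
    # relative to a baseline (long-term) mean, as described above.
    from statistics import mean

    # Hypothetical annual means (deg F) keyed by year -- illustrative values only.
    annual_means = {2005: 44.1, 2006: 45.3, 2007: 45.9, 2008: 45.0, 2009: 44.6}

    baseline = mean(annual_means.values())   # long-term mean over the chosen period
    anomalies = {yr: t - baseline for yr, t in annual_means.items()}

    for yr, anom in sorted(anomalies.items()):
        print(f"{yr}: {anom:+.2f} deg F relative to the {baseline:.2f} deg F baseline")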

Calculating anomaly data for each station allows easy comparison among stations and emphasizes the change in temperature, commonly called delta-T (∆T), rather than the temperature (T) itself. The anomaly transformation makes it easier to visualize changes in temperature among stations, but those stations still need to be averaged to get a global mean without incorporating the bias from different station densities in different regions. To do this, the anomaly data are gridded.

First the globe is divided into cells, commonly based on longitude and latitude; a typical cell is 5 by 5 degrees. This results in cells of different sizes near the equator versus near the poles: there are an equal number of cells around the equator as there are at any other latitude, they are just smaller at higher latitudes. Then the stations that fall within a cell are identified and their temperature anomalies averaged (monthly or yearly) to give a value for the cell. Some cells may have many stations and some may have few or none. This approach minimizes the bias, caused by differences in station numbers, that would affect a simple arithmetic average of all stations. The result is an anomaly number for each year (or month, if monthly averages are used) for each grid cell on the Earth (Figure 3). Where there are no data, the cells are left empty (another approach is to estimate a value for a cell from the cells around it).

Figure 3: Year 2008 temperature anomalies (°C) for 5 x 5 degree cells of a global grid from NOAA’s National Climatic Data Center. Red circles are values above the long-term average and blue circles are below the long-term average. From: http://www.ncdc.noaa.gov/oa/climate/research/ghcn/ghcngrid.html
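A minimal sketch (Python) of the gridding step, assuming each station record has already been reduced to a single anomaly for the year in question; the station coordinates and anomaly values are hypothetical.

    # Bin station anomalies into 5 x 5 degree latitude/longitude cells
    # and average the stations that fall in each cell, as described above.
    from collections import defaultdict
    from statistics import mean

    CELL = 5  # cell size in degrees

    # (latitude, longitude, anomaly in deg C) -- hypothetical stations.
    stations = [
        (46.2, -114.2, 0.4),
        (47.5, -111.3, 0.6),
        (51.5,   -0.1, 0.9),
        (52.2,    0.1, 0.7),
    ]

    cells = defaultdict(list)
    for lat, lon, anom in stations:
        key = (int(lat // CELL), int(lon // CELL))   # index of the cell holding this station
        cells[key].append(anom)

    grid = {key: mean(vals) for key, vals in cells.items()}   # one value per occupied cell
    for (i, j), value in sorted(grid.items()):
        print(f"lat {i*CELL}..{i*CELL+CELL}, lon {j*CELL}..{j*CELL+CELL}: {value:+.2f} C")

Cells with no stations simply never appear in the result, which is the “left empty” case mentioned above.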

More complete data coverage and interpolation between stations can produce more “filled” grid cells for the globe (Figure 4), but there are still areas where no data are available, leaving holes in the global grid.

Figure 4: Global temperature anomalies for January 1969 on a 5 x 5 degree grid. These data are in °C (not °F), as in Figure 3. From: Brohan et al., 2006, Journal of Geophysical Research, v. 111, D12106, doi:10.1029/2005JD006548. Report available as PDF.

Although the record from weather stations in the ocean is sparse (there are not many islands), there is a long dataset from ships’ records that allows gridded data to be developed for the oceans as well (Figure 5).

Figure 5: Ocean and land gridded data for January, 1969. Sources same as above.

Once we have data for a global grid like that in Figure 5, the next step is easy: we just average the cells for each year (or month) to get the global mean temperature anomaly. There are still biases. There are more holes in the grid at high latitudes, and there is a lingering bias from the large differences in the number of stations in each grid cell, but these can be assessed (see the paper linked in Figure 4 for the details) by producing error plots like those below (Figure 6).

Figure 6: Time series of global mean annual temperature anomalies (black) showing the estimated error (colors); see the article cited in Figure 4 for details. Notice how the error increases back through time and tightens more recently: in recent years there are more stations, and they do a better job of measuring temperature.
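A minimal sketch (Python) of that final averaging step. The description above is a straight average of the filled cells; in practice the groups producing these records weight each cell by its area, which on a latitude-longitude grid is proportional to the cosine of the cell’s central latitude, so the many small high-latitude cells do not count as much as the few large equatorial ones. The grid values below are hypothetical.

    # Area-weighted global mean of gridded anomalies on a 5 x 5 degree grid.
    # Cell area scales with cos(latitude), so each anomaly is weighted by
    # the cosine of its cell's central latitude.
    from math import cos, radians

    # {(central latitude, central longitude): anomaly in deg C} -- hypothetical, sparse grid.
    grid = {
        (  2.5,    2.5): 0.30,
        ( 47.5, -112.5): 0.55,
        ( 52.5,   -2.5): 0.80,
        (-37.5,  142.5): 0.20,
    }

    num = sum(cos(radians(lat)) * anom for (lat, _lon), anom in grid.items())
    den = sum(cos(radians(lat)) for (lat, _lon), anom in grid.items())
    print(f"Area-weighted global mean anomaly: {num / den:+.2f} C")
    # Empty cells are simply absent from the dictionary and carry no weight.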

Determining global average temperatures relies on a series of careful analyses. Although it seems complex, it is really just a bookkeeping problem: one has to keep track of many numbers and the uncertainty behind those numbers. The plot above used over 4200 stations and millions of individual data points. But when done carefully, these calculations provide a good measure of global temperature (the best we can do) based on measured temperatures at individual stations, and they are the basis for understanding short-term (the last 150 years or so) temperature changes on the Earth.




Posted by: climanova | January 8, 2010

Measuring Global Temperature: Part I–Temporal Averaging

“The thermometer record shows unequivocally that Earth is warming, and provides the main evidence that this is caused by human activity.”

—von Storch & Allen, Nature, 7 January 2010, doi:10.1038/463025a

When reading the quote above, the term “thermometer record” seems very straightforward: someone is reading a thermometer and recording the temperature. These records are used to create plots like the one below (Figure 1), which have become icons of “global warming”. But what exactly goes into making a graph of the “global annual mean surface air temperature change” like that below, one that is then considered “unequivocal” evidence of changes in global temperature? It seems simple: someone just plots up all those thermometer readings. In practice, however, it is quite complicated.

Figure 1. Global annual mean surface air temperature change.

Think about how you would determine the mean temperature for a day in your back yard. You would measure the temperature with a thermometer, say every hour, and write down each number. You would then add all 24 measurements and divide by 24 to calculate a mean (average) temperature for the day. This is not a measurement but a calculation; that is the first important point. The second is that this is demanding and tedious: someone has to take all those measurements. Considering there are thousands of weather stations around the globe, it would take a huge effort to collect data every hour. Although we could use this approach now with digital, computerized weather stations, it was not possible before computers. So the average temperature is determined in a very different way in the long historical records used to make plots like the one above.

The tool used is the “minimum-maximum thermometer”, which records the lowest and highest temperatures over a day. At weather stations these numbers are recorded each day and the markers are then reset for the next day’s measurement (now mostly done by computer in rich countries). The average temperature for the day is then calculated as the mean of the high and low temperatures. Other averages can be determined from the daily averages: the average for a month is the mean of all the daily means for that month, and the yearly average can be determined by averaging the monthly data for the year or by averaging all the daily data for the year. The objective is to calculate one number for the year that depicts the annual temperature. The plot below (Figure 2) shows all the daily average temperatures (red dots) and the yearly average (blue line) for Hamilton, MT in 2008. (You can get data and make plots like these for your region from the U.S. Historical Climatology Network at http://cdiac.ornl.gov/epubs/ndp/ushcn/ushcn_map_interface.html.)

Figure 2. Daily average temperatures (red dots) and the annual average (blue line) for Hamilton, MT in 2008.
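A minimal sketch (Python) of the min-max averaging scheme described above, assuming daily minimum and maximum readings are already available; the readings here are made up for illustration.

    # Daily, monthly, and annual means from min-max thermometer readings,
    # following the averaging scheme described above.
    from collections import defaultdict
    from statistics import mean

    # (month, day, Tmin, Tmax) in deg F -- hypothetical daily readings.
    readings = [
        (1, 1, 10, 28), (1, 2, 14, 31), (1, 3,  5, 22),
        (7, 1, 48, 84), (7, 2, 51, 88), (7, 3, 50, 86),
    ]

    daily_means = defaultdict(list)
    for month, day, tmin, tmax in readings:
        daily_means[month].append((tmin + tmax) / 2)   # daily mean = (min + max) / 2

    monthly_means = {m: mean(vals) for m, vals in daily_means.items()}
    annual_mean = mean(monthly_means.values())          # annual mean = mean of monthly means

    for m, t in sorted(monthly_means.items()):
        print(f"month {m:2d}: {t:.1f} deg F")
    print(f"annual mean: {annual_mean:.1f} deg F")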

In 2008, at Hamilton, MT, the daily average temperature ranged from -7°F to 77°F, and the annual average temperature was 45°F. Note how few days fall on or near the average temperature line. You can think of this as a temporal average, which can then be used to make comparisons over a longer time period, say over the last century. The plot below (Figure 3) shows the annual average temperature at Hamilton, MT from 1885 to 2008.

Figure 3. Annual average temperature at Hamilton, MT, 1885-2008.

Each yearly point is a representation of a plot like Figure 2, so a large amount of variability is lost when such plots are made. But this simplification allows us to look at potential change over time. We just have to remember that the annual average temperature is not a measured value but a simplified representation of the complex distribution of temperature through the year. The discussion above deals with only one point on the globe; the next problem is to expand this to the entire Earth.

Posted by: climanova | January 5, 2010

Welcome to Clima Nova

FRAMEWORK FOR GLOBAL CHANGE SCIENCE

Our present economic and social systems were developed within a framework of nearly unlimited geologic and ecologic resources. Starting in the distant past, humans modified the Earth’s landscape locally or even regionally; in the 21st century we are the dominant force of global change. Humans now “control nature”, albeit sometimes in chaotic and unexpected ways. Nearly all the Earth’s systems are now affected by human endeavors: the landscape, atmosphere, climate, biodiversity, water. You name it, humans have a controlling interest in natural processes and resources. This means we can no longer always count on past experience or business as usual for future policy making and management. Nearly all decisions in the future will need to be made in an environment of change, in an era of resource constraint. To cope with this new era, we will need to transform our society into one that better understands the natural processes humans have to contend with. The foundation of this understanding is Geoscience: the study of the Earth and the processes creating the landscapes and the mineral and water resources society depends on for sustainability. Although not fully appreciated by most people and world leaders, geoscience underpins any stable system of government and management, because the type and availability of geologic resources determines the fundamental resource base for every country and region in the world, and so the geopolitics of nations.

The bottom line is that we are moving into a new world, one humans have not experienced before, a world that Thomas Friedman has termed “Hot, Flat and Crowded” and that others have termed the “Anthropocene”. We are entering the age of human dominance of Earth systems. Within the discussion of how humans should respond to the Anthropocene, there is much talk of a new human society, one that will take control of resource and energy use and build a fully sustainable system based on renewable resources (those not pumped or mined from the Earth). However, society will need to rely on traditional geologic resources to build such a system, and will need those resources to accommodate the huge increase in population and economic development expected over the next 30-50 years. And no matter how sustainable a future society may be, it will still require us to live and work within the persistent hazards generated by geologic processes that will never be controlled by human ingenuity. There will always be earthquakes, tsunamis, volcanic eruptions, changes in sea level, and floods that humans will need to contend with. There will always be a need for metals and concrete, and there will always be a need for clean water. And as population and resource use grow exponentially (a concept that is fundamental to dealing with the future), we will all need to know what the Earth can and will throw at us, because Earth processes are not always benign. We will also need to know how to develop new resources without damaging essential Earth systems even more than we already have, and how to repair those we have damaged. This is critical to sustaining an increasingly complex world society with diminishing geologic resources under expanding human domination.

One very important aspect of the new world we are facing is climate change. What roles do humans play? How do those compare to natural variability? What can we expect to happen? Can we make reliable predictions of future climate? What will those future scenarios mean for humans and the Earth’s other biota? These are important questions that require a clear understanding of Earth processes and geoscience principles. Although “climate change”, “global warming”, and similar terms now elicit pedantic rants across media outlets, it is critical to understand climate data within the geologic context and without a political motive: to look carefully at what science can and cannot tell us. There is a need to understand how we measure the climate of the Earth. There is a need to understand how modeling is used and its limitations. There is a need to know how science is done and its limitations for helping policy makers make decisions. All of these will be the purview of Clima Nova. The goal is for this blog to be a site to exchange knowledge about the basics of how the Earth works in relation to climate and how climate scientists collect and evaluate data to understand processes of change across a vast range of time and spatial scales. The objective is to inform and educate, not to preach. All material presented will be referenced, and links will be given for the data used in the posts, giving readers places to go for primary literature and data in climate science and geoscience. The major theme is “climate”, but we will also explore much more than climate to make sense of broader aspects of global change, of change at more local and regional scales, and of how humans are affecting the geophysical processes of the Earth. How the Earth works is fascinating, and I hope this blog gives a glimpse of the knowledge we now have and of what we still need to learn.
