Normal vs. Normals

We hear it all the time, but do we really know what it means? The news anchor turns to the meteorologist/weather person and says something like, "Well, Dana, what does the weekend weather hold for us? Will this rain continue?" And Dana answers, "Clint, we are in for a surprise. The skies will clear and temperatures on Saturday will be 15 degrees above normal. So get out those shorts for at least one more great fall weekend." When we hear this we conjure up our definition of normal: Normal (adjective): conforming to a standard; usual, typical, or expected: it's quite normal for puppies to bolt their food; normal working hours (Oxford English Dictionary).

So, the temperatures on the weekend will be above typical, higher than what is expected for this day in the fall. But that is not exactly the meaning of "climate normals," which is what the media meteorologist is referring to. Climate Normals are a calculation by the National Weather Service and have a very precise meaning, different from the one we think of when we hear the term. Here is the definition from the National Weather Service, which calculates normals for each weather station in the U.S.: Climate Normals (proper noun) are three-decade averages of climatological variables, including temperature and precipitation. This product is produced once every 10 years.

So, the 1981–2010 U.S. Climate Normals is the latest release. (There are past Climate Normals for 1971-2000, 1961-1990, etc.) This dataset contains the calculated daily and monthly averages (the mean of all the values for that day over those three decades) of temperature and precipitation, plus other climate parameters such as snowfall, heating and cooling degree days, frost/freeze dates, and growing degree days. Climate Normals are provided for most of the weather stations across the country. Below is an example (Figure 1) of what these look like for temperature and precipitation for Hamilton, MT.


Figure 1: 1981-2010 Climate Normals for Hamilton, MT. From the Western Regional Climate Center. You can also get tabular data from this site so you can easily compare Climate Normals to measured temperatures for any day of the year.

Climate Normals are calculations of the temperatures for the three decades; they are not typical values, and you may never have experienced them for that day even if you lived through all three decades. They are calculated averages, not measured values. This is something critically important about climate data: it is derived from calculations based on measured data, not the measurements themselves. So, for example, the Climate Normal high for today in Hamilton is 64.1°F, while the measured high today was 75°F, about 11 degrees higher than the "normal" high temperature.
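To make that distinction concrete, here is a minimal sketch (with invented October 6 readings, not Hamilton's actual record) of how a daily normal is just the mean of the measured values for that day across the 30 years:

```python
from statistics import mean

# Invented Tmax readings (°F) for October 6 in each year 1981-2010 (30 values)
tmax_oct6 = [58, 71, 62, 55, 69, 74, 60, 52, 66, 70,
             63, 57, 75, 61, 68, 54, 72, 59, 65, 67,
             50, 73, 64, 56, 70, 62, 58, 69, 61, 66]

normal_tmax = round(mean(tmax_oct6), 1)
print(normal_tmax)               # the calculated normal: 63.6
print(normal_tmax in tmax_oct6)  # False: the normal matches no measured value
```

Notice that the calculated normal need not equal any temperature anyone actually experienced on that date.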

The plot below (Figure 2) shows how variable the climate is for any site. This is a plot of the daily maximum temperatures (Tmax) for Hamilton, MT for the 30 years of the last Climate Normal period (1981-2010).


Figure 2: 1981-2010 daily maximum temperature records (red pluses) for Hamilton, MT.

The red pluses on the figure show the measured maximum temperature for each day over the three decades used to calculate the Climate Normals. The vertical line is October 6 and the horizontal line is the normal maximum temperature for that day; the grey diamond marks the intersection of the two, showing the calculated normal maximum temperature compared to the range of measured maximum temperatures for that day. You can see the huge range of measured Tmax values around the calculated normal. The normal Tmax is not "typical" or "expected" in the everyday sense of normal; hardly any of the maximum temperatures measured for a particular day match the Climate Normal Tmax. So, normals are not really normal! But they are a convenient way to compare records to some average value, and there are some great ways to see these comparisons on the Weather Service sites. The graph below (Figure 3) is one of those plots. (See the note at the end of this post for how to get these plots for your area. One catch is that these plots are not available for as many stations as the data I showed above; in the chart below I have moved to a station 38 miles north of Hamilton, MT: Missoula, MT.)


Figure 3: Daily temperature and precipitation compared to 1981-2010 Climate Normals for Missoula, MT.

This chart will take some explaining, but it is well worth the effort. I will concentrate on temperature first (upper graph). The blue vertical bars (they are just very thin, so they look like a jagged line) are the maximum (top of bar) and minimum (bottom of bar) temperatures recorded at the Missoula weather station. The upper boundary of the green band is the Climate Normal Tmax and the lower boundary is the Climate Normal Tmin (minimum temperature). So, the green band represents the band of calculated normal temperatures and the blue "line" represents the measured temperatures throughout the year. The top of the pink, jagged band is the extreme maximum temperature recorded for that day (Record Max) and the bottom of the pink band is the extreme minimum temperature recorded for that day (Record Min). So, this graph shows an incredible amount of information, allowing comparison throughout the year of measured data to calculated normals and recorded extremes. The precipitation graph is simpler because it shows only the measured data (the "stepped" line) and the normal precipitation. The graph shows cumulative precipitation through the year, so it is easy to see if precipitation is above the Climate Normal (dark green) or below normal (light brown). These plots say a lot about how temperature and precipitation in a specific year compare to the latest three decades used to calculate the normals (1981-2010 in this case).
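The comparisons the chart encodes can be sketched as a small helper function; the band boundaries and readings below are invented for illustration, not Missoula values:

```python
def classify_day(tmin, tmax, normal_min, normal_max, record_min, record_max):
    """Compare one day's measured temperatures to its normal band and record extremes."""
    return {
        "above_normal": tmax > normal_max,  # measured high tops the green band
        "below_normal": tmin < normal_min,  # measured low undercuts the green band
        "sets_record": tmax > record_max or tmin < record_min,
    }

# A mild day, a warm day, and a day that breaks the record low (all invented):
print(classify_day(28, 55, 33, 58, 10, 75))
print(classify_day(40, 72, 33, 58, 10, 75))
print(classify_day(8, 30, 33, 58, 10, 75))
```

Run day by day over a year, a tally like this is exactly what lets you say "many days fell outside the normal range, but few set records."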

The most obvious point is how "un-normal" the normals are, or maybe a better way to look at it is how "un-normal" the daily records are. For example, in 2014 through October 6, many days fall outside the normal range of temperatures. Yet very few days were near or beat the extreme records: two maximums and two minimums in just over nine months of records. But some months were really wild. Let's zoom in on February using the figure below (Figure 4).


Figure 4: Daily maximum and minimum temperatures for February 2014, in Missoula, MT.

For the first third and last third of February, every day had minimum and maximum temperatures substantially lower than normal. During the middle of the month it was the opposite, with temperatures substantially (though not as extremely) above normal. There is a clear step in the weather across the boundaries between these two conditions, like a switch turning from really cold to really warm. That switch was the jet stream, which moved across the region bringing in warm or cold air. But that story is for another post.


To get to the graphics in the last two plots takes some patience with the Weather Service site. The easiest way (but not the only way) is to follow these directions:

1. Start at this web site:

2. Click on the region of the state you are interested in on the U.S. map. This will take some fiddling to get where you want. When the right region comes up, select the station you want. I selected Missoula, MT (mso), and got this site:

3. Then click on the tab (upper part of the page) titled, “Local Data/Records”.

4. This brings up another page. Under “Climate Graphs” select “Graphical Climate Summaries for…” where the “…” is the list of stations in the region you selected.

Posted in Global Change

Warming Complexities in the News

Winds of Warming

September has been a great month for climate science in the news. Many major news media noticed a paper published in the Proceedings of the National Academy of Sciences (PNAS) by Drs. James A. Johnstone and Nathan J. Mantua titled "Atmospheric controls on northeast Pacific temperature variability and change, 1900–2012". Their research found that "changes in winds over the eastern Pacific Ocean explain most of the warming trend along the West Coast of North America in the last century", not increased radiative forcing from increasing greenhouse gases in the atmosphere. For details you should read the summary of the article on the web site of the Southwest Fisheries Science Center, where Dr. Mantua works. Also read Prof. Cliff Mass' (University of Washington) article on the importance of the findings compared to other work on the region; you can find it on the Cliff Mass Weather Blog.

Considering the wealth of papers finding that global temperature increases in the last few decades result mostly from greenhouse enhancement, at first glance the Johnstone and Mantua result is surprising and, as many news media said, controversial. But not really. It shows that the climate is complex and that local and regional variability can be large, even larger than global forcing. The global greenhouse enhancement is embedded in the west coast temperature signal; it is just swamped by natural variability. This variability comes from the atmospheric systems (large-scale wind patterns) in the North Pacific Ocean that drive major weather systems across the Pacific Northwest and transport heat through the atmosphere. The importance of the Johnstone and Mantua work is that it helps develop "…a fuller understanding of natural and anthropogenic changes…". The take-home message is that the world is a complicated place and regional variability is large. Their results can be seen best in a figure in the supplemental information that accompanies the paper (click on their figure S4 below). In this figure you can see the temperature trends from the Pacific Coast states before the Pacific atmospheric forcing is removed (left side) and after it is removed (right side). Once the effects of the wind/pressure processes described in the paper are corrected for, temperature trends diminish substantially across the region. However, Southern California shows persistent, significant upward trends (warming) even with the atmospheric effects removed. One question is how far onto the continent such controls extend; you can see that even areas interior of the coastal mountain ranges show strong responses to oceanic conditions. This should not be a surprise, given the well-known effects of El Niño (and other ocean-atmosphere processes) on climate throughout the world. But the quantification of this process's effect on temperature trends is definitely important and needs to be incorporated into modeling and predictions of how the climate will change with the continued addition of greenhouse gases and other human modifications of the landscape and waterscape.

Figure S4 from Johnstone and Mantua, 2014.

The other interesting climate science publishing event was the release of "Explaining extreme events of 2013 from a climate perspective," published by the American Meteorological Society (AMS). For the last few years, the AMS has published a retrospective on the major climate events that affected the world in some dramatic way: droughts, floods, heat waves, etc. In the latest publication, several groups of meteorologists and climate scientists write about the same event from different perspectives, so it is a great volume for seeing a range of ideas on a topic: real science without the homogenization of "consensus"! You can read the report online, along with a New York Times story on it. Over eighty authors in 22 articles examine drought, heat waves, hurricanes, downpours, cold snaps, and blizzards that occurred in 2013 around the world. The most interesting papers were the two sets on the California drought (four papers) and the Australian heat wave and drought. These papers give a great overview of the complexity of these events and how many different aspects of the climate come together to cause them, including some of the same large-scale atmospheric processes that Johnstone and Mantua present in their paper.

The thing that really stands out in the California papers is the severity of this drought within the historical record (click figure below from Swain et al.).


12-month (one-sided) moving-average precipitation in California from 1895 to 2014, with major historical droughts highlighted. From Swain et al. 2014, Bull. Amer. Meteor. Soc., 95 (9), S1–S96.

There are lots of detailed plots and discussion of outcomes in these papers despite the short format. They are dense but well laid out, with a short and informative introduction, followed by brief (but jargony) results and a concise conclusion. You get a clear picture of what the authors did, why they did it, and what they found. It will take much more reading of the cited literature to understand the background the authors rely on for their conclusions, but the papers are a great presentation of the science of studying extreme climate events. I especially liked the paper by Hoerling and his eight co-authors on the extreme rain and flooding event in NE Colorado (around Boulder, CO). They examined very carefully the importance of the local situation in the broader context of regional and global weather and climate. A very nice job! There is a topic for everyone in this volume, so it is worth the read.


Johnstone and Mantua article:

BAMS Collection:

News stories:

Seattle Times:

Los Angeles Times:

CBS News:

Posted in Global Change

Moving Dirt

An Abandoned Open Pit Copper Mine, Butte, MT

The discussion of human-induced climate change commonly revolves around indirect effects: for example, we increase greenhouse gases, which warms the planet, which changes climate processes in myriad ways. But humans also have more direct effects. Since I completed my post on human population (February 2, 2010), nearly 9 million people have been added to the planet, for a total of 6.809 billion. That is a lot of people to house, feed, and transport around, and it requires a lot of dirt to be moved. One interesting question is how all the earth moving associated with human development compares to the amount of material moved by geologic processes like mountain building, glaciation, and erosion by wind, waves, and rivers.

Until 16 years ago, most geologists would have thought that humans were no match for the Earth. But that all changed in 1994, when Dr. Roger Hooke, then a professor at the University of Minnesota, published a paper in the Geological Society of America's news magazine, GSA Today (see Hooke, R. LeB., 1994, On the efficacy of humans as geomorphic agents: GSA Today, v. 4, no. 9, p. 217, 224-225). In his article, Hooke did an elegant first-order analysis of resource and construction data along with geologic data and estimated that humans moved about 45 billion tons of material a year (45 Gt/y), essentially equivalent to all the material moved by rivers, glaciers, waves, wind, and continental mountain building. Direct human actions moved as much material as all the geologic processes on Earth. Phenomenal!

In his analysis, Hooke left out the material moved by agriculture. He included agriculture in another paper in 2000 (see Hooke, R. LeB., 2000, On the history of humans as geomorphic agents: Geology, v. 28, no. 9, p. 843-846) and found that humans moved almost twice the material moved by geologic processes on the continents (80 Gt/y vs. 45 Gt/y). In the last 10 years, more detailed and complete analyses by other geologists have found even higher "human erosion rates." Dr. Bruce Wilkinson, from the University of Michigan, constructed a detailed historical time series of direct human movement of earth material (see Geology, March 2005, v. 33, no. 3, p. 161-164). He found that by approximately 1000 AD humans equaled geologic processes (see figure below). By 2000 AD, humans moved upwards of 40 times the geologic erosion rate. Wilkinson summed this up: "At these rates, this amount of material would fill the Grand Canyon in 50 yr." To put this in perspective, it took geologic erosion about 6 million years to form the Grand Canyon. Humans have become the largest geologic force on Earth's continents, bar none. Phenomenal!

Human Erosion vs. Geologic Erosion (from Wilkinson, 2005)

Posted in Global Change

The Carbon Cost of Human Reproduction

The human population is rarely discussed in the context of human-induced climate change. As I write this, on February 2, 2010 at noon Mountain Standard Time, the World population is estimated at 6,800,295,487 people (see figure above).

You can find the population at the U.S. Census Bureau's web site; check it as you start to read this post and then again when you finish to get an idea of how rapidly the population is growing.

What is astounding is that in the next hour, about 8,600 people will be added to Earth. In the next year over 75 million will be added, about two "Californias" (see Table 1 below). That means that water and food and jobs, all the things that go into living in the 21st century, have to be created for all these people. And it is not just next year; it is each subsequent year. And the numbers keep rising exponentially. In just over four years we will have added 309 million people, another "United States"!

Time unit       Births      Deaths       Increase
Year       131,940,516   56,545,138    75,395,378
Month       10,995,043    4,712,095     6,282,948
Day            361,481      154,918       206,563
Hour            15,062        6,455         8,607
Minute             251          108           143
Second             4.2          1.8           2.4

Table 1. From U.S. Census Bureau (link above in text).
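The table's shorter time units follow from the annual totals by simple division. A sketch using a 365-day year (small rounding differences from the Census Bureau's own figures are possible):

```python
# Annual totals quoted in Table 1 above
births_per_year = 131_940_516
deaths_per_year = 56_545_138
increase_per_year = births_per_year - deaths_per_year
print(increase_per_year)   # 75,395,378, matching the table's Year row

# Divide down to the shorter time units
for label, periods in [("day", 365), ("hour", 365 * 24), ("minute", 365 * 24 * 60)]:
    print(label, round(increase_per_year / periods))
```

This reproduces the Increase column: about 206,563 per day, 8,607 per hour, and 143 per minute.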

As we add people they use energy, directly and to produce all the extra food, water, and goods. Because energy is mostly derived from fossil fuels, release of carbon to the atmosphere increases. We can think of this as humanity's "carbon legacy." A few years ago, Paul Murtaugh and Michael Schlax of Oregon State University published an article titled "Reproduction and the carbon legacies of individuals" (Global Environmental Change, v. 19, p. 14-20, doi:10.1016/j.gloenvcha.2008.10.007). They found that "Under current conditions in the United States, for example, each child adds about 9441 metric tons of carbon dioxide to the carbon legacy of an average female, which is 5.7 times her lifetime emissions." This study, and a report written by the London School of Economics, are highlighted in a Washington Post article (September 15, 2009). The London report found that investing in preventing births is far cheaper than supplying extra energy for extra people, and far cheaper than developing new solar or wind power ($7/ton vs. $24-$51/ton). This shows how closely linked population growth and energy use are.
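Murtaugh and Schlax's accounting assigns a parent half of each child's lifetime emissions, a quarter of each grandchild's, and so on down the family tree. Here is a rough sketch of that idea; the fertility rate, lifetime emissions figure, and generation cutoff are made-up illustrative inputs, not the paper's:

```python
def carbon_legacy(lifetime_emissions_t, fertility, generations):
    """Expected descendant emissions (t CO2) attributed to one ancestor."""
    legacy = 0.0
    expected_descendants = 1.0
    for g in range(1, generations + 1):
        expected_descendants *= fertility    # expected descendants in generation g
        legacy += expected_descendants * (0.5 ** g) * lifetime_emissions_t
    return legacy

# At replacement-level fertility (2 children), the halved responsibility exactly
# offsets the doubling of descendants, so every generation contributes equally:
print(carbon_legacy(1600, 2.0, 5))   # 8000.0 under these made-up inputs
```

The point the sketch makes is the paper's central one: with fertility near or above replacement, a person's legacy grows far beyond their own lifetime emissions.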

Oh, and as I finish writing this post, I checked the Population Clock:

Over 10,000 people have been added to the world!

Posted in Global Change | 1 Comment

Measuring Global Temperature: Part II–Spatial Averaging

Having discussed how thermometer data are used to construct a time series of temperature at a station (see Part I), we now turn to how we compare data among all the weather stations in a region and on the Earth. Again, this seems simple at first glance: just add up all the data, calculate an average, and plot it. But it turns out to be very complicated. The first problem is the disparity in the distribution of weather stations. Some regions have many stations while others have very few, and this is true at many levels. In the United States, urban areas have a large number of stations compared to rural areas, and mountainous regions have fewer than plains regions. If we simply averaged all these stations there would be large biases: urban areas would dominate the data and mountainous regions would be underrepresented. There are even greater discrepancies globally (Figure 1), where some countries, regions, and continents have dense networks of stations while others have very sparse ones (compare the U.S. with Africa in Figure 1). There are also more stations in the Northern Hemisphere than in the Southern Hemisphere. All of this complicates the calculation of a global average temperature. We will work through how the global calculation is made by starting locally and working up to the entire Earth.

Figure 1. The 7280 global meteorological stations used by NOAA to calculate the global mean temperature.

In Part I we determined how meteorologists calculate average temperatures at a specific station. Now think about how they would start to aggregate station data into a global average. The first problem is that temperatures at different stations can be very different for particular days or longer time periods because local conditions differ substantially among sites. For example, the temperatures at a high-elevation site will be quite different from those at one near sea level, even though they may show the same trends over time. One way to address this disparity is to calculate a temperature anomaly. An anomaly is calculated using a long-term mean temperature for each station, computed over the entire record or some portion of it. The annual average temperatures at the station are averaged over this base period, and that long-term mean is then subtracted from each of the yearly averages. This transforms the data from direct degrees to an anomaly, i.e., degrees away from the mean. Positive values are greater than the long-term mean and negative values are less. This results in plots like that below (Figure 2).
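The anomaly transformation takes only a few lines; the annual means below are invented, not a real station record:

```python
from statistics import mean

# Invented annual mean temperatures (°F) for a short base period
annual_means = {1998: 45.2, 1999: 44.1, 2000: 46.0, 2001: 44.7}

baseline = mean(annual_means.values())   # long-term mean of the base period
anomalies = {yr: round(t - baseline, 1) for yr, t in annual_means.items()}
print(anomalies)   # positive years were above the long-term mean, negative below
```

Two stations with very different absolute temperatures (say, a mountain site and a valley site) become directly comparable once both are expressed this way.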

Figure 2: Temperature anomaly for Hamilton, MT from 1885 to 2009. The blue bars are the anomaly values above (positive) or below (negative) the long-term average (zero).

Calculating anomaly data for each station allows easy comparison with other stations that emphasizes change in temperature, commonly called delta-T (∆T) instead of temperature (T). The anomaly transformation makes it easier to visualize changes in temperatures among stations, but those stations still need to be averaged to get a global mean without incorporating the bias from the different station densities in different regions. To do this the anomaly data is gridded.

First the globe is divided into cells, commonly based on longitude and latitude; a typical cell is 5 by 5 degrees. This results in cells of different sizes near the equator versus near the poles: there are an equal number of cells around the equator as at any other latitude, they are just smaller at higher latitudes. Then the stations that fall within a cell are identified and their temperature anomalies averaged (monthly or yearly) to give a value for the cell. Some cells may have many stations and some may have few or none. This approach tries to minimize the bias caused by differences in station numbers that would affect a simple arithmetic average of all stations. The result is an anomaly number for each year (or month, if monthly averages are used) for each grid cell on the Earth (Figure 3). Where there are no data, the cells are left empty (another approach is to estimate data for a cell based on the cells around it).
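A sketch of the gridding step with hypothetical stations; each cell is keyed by its southwest corner, and each cell contributes one value no matter how many stations it holds:

```python
from collections import defaultdict
from statistics import mean

# (latitude, longitude, anomaly) for a few hypothetical stations
stations = [(46.2, -114.1, 0.4), (46.9, -113.9, 0.6), (47.1, -109.0, -0.2)]

cells = defaultdict(list)
for lat, lon, anom in stations:
    key = (int(lat // 5) * 5, int(lon // 5) * 5)   # SW corner of the 5x5 cell
    cells[key].append(anom)

# One value per cell, so a dense cluster of stations cannot dominate the mean
cell_means = {key: round(mean(vals), 2) for key, vals in cells.items()}
print(cell_means)   # {(45, -115): 0.5, (45, -110): -0.2}
```

Here the first two stations share a cell and are averaged together, while the third gets its own cell: exactly the de-biasing the text describes.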

Figure 3: Year 2008 temperature anomalies (°C) for 5 x 5 degree cells of a global grid, from NOAA's National Climatic Data Center. Red circles are values above the long-term average and blue circles are values below the long-term average.

More complete data coverage and interpolation between stations can produce more “filled” grid cells for the globe (Figure 4), but there are areas where no data are available, leaving holes in the global grid.

Figure 4: Global temperature anomalies for January 1969 on a 5 x 5 degree grid. These data are in °C (not °F), as in Figure 3. From Brohan et al. 2006, Journal of Geophysical Research, v. 111, D12106, doi:10.1029/2005JD006548.

Although there is only a sparse record from weather stations in the ocean (there are not many islands), there is a long dataset from ships' records that allows gridded data to be developed for the oceans (Figure 5).

Figure 5: Ocean and land gridded data for January 1969. Sources same as above.

Once we have data for the global grid like that in Figure 5, the next step is straightforward: we average the cells for each year (or month), weighting each cell by its area (remember that cells shrink toward the poles), to get the global mean temperature anomaly. There are still biases: there are more holes in the grid at high latitudes, and there is a lingering bias from the large differences in the number of stations in each grid cell. But these can be assessed (see the paper linked above in Figure 4 for the details) by producing error plots like those below (Figure 6).
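A sketch of that final averaging, using a cosine-of-latitude weight so the smaller high-latitude cells count for less. The cells and anomalies are invented, and the real analyses handle weighting and missing cells more carefully:

```python
from math import cos, radians

# (cell-center latitude, anomaly in °C) for a few hypothetical filled cells
cells = [(2.5, 0.3), (47.5, 0.8), (-62.5, -0.1)]

weights = [cos(radians(lat)) for lat, _ in cells]   # cell area shrinks as cos(latitude)
global_anomaly = sum(w * a for w, (_, a) in zip(weights, cells)) / sum(weights)
print(round(global_anomaly, 2))   # 0.37: the near-equatorial cell counts the most
```

A simple unweighted mean of these three anomalies would be 0.33; the weighting pulls the answer toward the larger, low-latitude cells.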

Figure 6: Time series of global mean annual temperature anomalies (black) showing the estimated error (colors); see the original article cited in Figure 4 for details. Notice how the error increases back through time and tightens more recently: in recent years there are more stations doing a better job of measuring temperature.

Determining global average temperatures relies on a series of careful analyses. Although it seems complex, it is really just a bookkeeping problem: one has to keep track of many numbers and the uncertainty behind those numbers. The plot above used over 4200 stations and millions of individual data points. But when done carefully, these calculations are a good measure of global temperature (the best we can do), based on measured temperatures at individual stations. They are the basis for understanding the short-term (last 150 years or so) temperature changes on the Earth.

Posted in Global Change

Measuring Global Temperature: Part I–Temporal Averaging

“The thermometer record shows unequivocally that Earth is warming, and provides the main evidence that this is caused by human activity.”

—von Storch & Allen, Nature, 7 January 2010, doi:10.1038/463025a

When reading the quote above, the term "thermometer record" seems very straightforward: someone is reading a thermometer and recording the temperature. These records are used to create plots like that below (Figure 1), which have become icons of "global warming". But what exactly goes into making a graph of "global annual mean surface air temperature change" like the one below, a graph then considered "unequivocal" evidence of changes in global temperature? It seems simple: someone just plots up all the thermometer readings. In practice, however, it is quite complicated.

Figure 1

Think about how you would determine the mean temperature for a day in your back yard. You would measure the temperature with a thermometer, say every hour, and write down each number. You would then add all 24 measurements and divide by 24 to calculate a mean (average) temperature for the day. This is not a measurement but a calculation; that is the first important point. The second is that this is demanding and tedious: someone has to take all those measurements. Considering there are thousands of weather stations around the globe, it would demand a huge effort to collect data every hour. Although we could use this approach now with digital, computerized weather stations, it was not possible before computers. So the average temperature is determined in a very different way in the long historical records used to make plots like that above.

The tool used is the "minimum-maximum thermometer," which records the lowest and highest temperatures over a day. At weather stations, these numbers are recorded each day and the markers reset for the next day's measurement (now mostly done by computer in rich countries). The average temperature for the day is then calculated as the mean of the high and low temperatures. Other averages can then be determined from the daily averages: the average for a month is the mean of all the daily means for that month, and the yearly average can be determined by averaging the monthly data for the year or by averaging all the daily data for the year. The objective is to calculate one number that depicts the annual temperature. The plot below (Figure 2) shows all the daily average temperatures (red dots) for one year and the yearly average (blue line) for Hamilton, MT in 2008. (You can get data and make plots like these for your region from the U.S. Historical Climatology Network.)
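The averaging chain is easy to sketch with a few invented min-max readings:

```python
from statistics import mean

# A few days of invented (low, high) °F readings from a min-max thermometer
daily_min_max = [(30, 58), (28, 54), (35, 61), (33, 57)]

daily_means = [(lo + hi) / 2 for lo, hi in daily_min_max]   # daily = (high + low) / 2
monthly_mean = mean(daily_means)                            # monthly = mean of the dailies
print(daily_means)    # [44.0, 41.0, 48.0, 45.0]
print(monthly_mean)   # 44.5
```

Every number after the first step is a calculation on calculations, which is exactly the point made above: the "record" is built from derived averages, not raw readings.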

Figure 2

In 2008, at Hamilton, MT, the daily average temperature ranged from -7°F to 77°F and the annual average temperature was 45°F. Note how few days fall on or near the average temperature line. You can think about this as a temporal average which then is used to make comparisons over a longer time period, say over the last century. The plot below (Figure 3) shows the annual average temperature at Hamilton, MT from 1885-2008.

Figure 3

Each yearly point is a representation of a plot like Figure 2, so a large amount of variability is lost when such plots are made. But this simplification allows us to look at potential change over time. We just have to remember that the annual average temperature is not a measured value but a simplified representation of the complex temperature distribution through the year. The above discussion deals with only one point; the next problem is to expand this to the entire globe.

Posted in Global Change

Welcome to Clima Nova


Our present economic and social systems were developed within a framework of nearly unlimited geologic and ecologic resources. Starting in the distant past, humans modified the Earth's landscape locally or even regionally. In the 21st century we are now the dominant force of global change. Humans now "control nature," albeit sometimes in chaotic and unexpected ways. Nearly all the Earth's systems are affected by human endeavors: the landscape, atmosphere, climate, biodiversity, water. You name it, humans have a controlling interest in natural processes and resources. This means we can no longer always count on past experience or business as usual for future policy making and management. Nearly all decisions in the future will need to be made in an environment of change, in an era of resource constraint. To cope with this new era, we will need to transform our society into one that better understands the natural processes humans have to contend with. The foundation of this understanding is geoscience: the study of the Earth and the processes creating the landscapes and the mineral and water resources society depends on for sustainability. Although not fully appreciated by most people and world leaders, geoscience underpins any stable system of government and management, because the type and availability of geologic resources determine the fundamental resource base, and so the geopolitics, of every country and region in the world.

The bottom line is that we are moving into a new world, one humans have not experienced before, a world that Thomas Friedman has termed "Hot, Flat and Crowded" and that others have termed the "Anthropocene." We are entering the age of human dominance of Earth systems. Within the discussion of how humans should respond to the Anthropocene, there is much talk of a new human society, one that will take control of resource and energy use and build a fully sustainable system based on renewable resources (those not pumped or mined from the Earth). However, society will need traditional geologic resources to build such a system, and will need them to accommodate the huge increase in population and economic development expected over the next 30-50 years. And no matter how sustainable a future society is, it will still have to live and work with the persistent hazards generated by geologic processes, which will never be controlled by human ingenuity. There will always be earthquakes, tsunamis, volcanic eruptions, changes in sea level, and floods that humans will need to contend with. There will always be a need for metals and concrete, and there will always be a need for clean water. And as population and resource use grow exponentially (a concept fundamental to dealing with the future), we will all need to know what the Earth can and will throw at us, because Earth processes are not always benign. We will also need to know how to develop new resources without damaging essential Earth systems even more than we already have, and how to repair those we have damaged. This is critical to sustaining an increasingly complex world society with diminishing geologic resources under expanding human domination.

One very important aspect of the new world we are facing is climate change. What roles do humans play? How do they compare to natural variability? What can we expect to happen? Can we make reliable predictions of the future climate? What will those future scenarios mean for humans and the Earth's other biota? These are important questions that require a clear understanding of Earth processes and geoscience principles. Although "climate change," "global warming," and similar terms now elicit pedantic rants across media outlets, it is critical to understand climate data within the geologic context and without a political motive, to look carefully at what science can and cannot tell us. There is a need to understand how we measure the climate of the Earth. There is a need to understand how modeling is used and its limitations. There is a need to know how science is done and its limitations for helping policy makers make decisions. All these will be the purview of Clima Nova. The goal is for this blog to be a site to exchange knowledge about the basics of how the Earth works in relation to climate and how climate scientists collect and evaluate data to understand processes of change at a vast range of time and spatial scales. The objective is to inform and educate, not to preach. All material presented will be referenced, with links given for data used in the posts, so readers have places to go for primary literature and data in climate science and geoscience. The major theme is "climate," but we will explore much more to make sense of broader aspects of global change, change at more local and regional scales, and how humans are affecting the geophysical processes of the Earth. How the Earth works is fascinating, and I hope this blog gives a glimmer of the knowledge we now have and what we still need to learn.

Posted in Global Change