John R. Christy, PhD

Alabama State Climatologist

The University of Alabama in Huntsville

House Energy and Power Subcommittee

20 September 2012

One Page Summary

1. Extreme events, like the recent U.S. drought, will continue to occur, with or without human causation. These recent U.S. "extremes" were exceeded in previous decades.

2. The average warming rate of 38 CMIP5 IPCC models is greater than observations, suggesting models over-react to CO2. Policy based on observations will likely be far more effective than policy based on speculative models, no matter what the future climate does. Regarding Arctic sea ice loss, the average model response to CO2 engenders little confidence because the models' output fails when applied to Antarctic sea ice conditions.

3. New discoveries explain part of the warming found in popular surface temperature datasets which is unrelated to the accumulation of heat due to the extra greenhouse gases, but rather related to human development around the stations. This means popular surface datasets are limited as proxies for greenhouse warming.

4. Widely publicized consensus reports by "thousands" of scientists rarely represent the range of scientific opinion that attends our murky field of climate research. Funding resources are recommended for "Red Teams" of credentialed investigators who study low climate sensitivity and the role of natural variability. Policymakers need to be aware of the full range of scientific views, especially when it appears that one-sided science is the basis for policies which, for example, lead to increased energy costs for citizens.

5. Atmospheric CO2 is food for plants, which means it is food for people and animals. More CO2 generally means more food for all. Today, affordable carbon-based energy is a key component for lifting people out of crippling poverty. So, rising CO2 emissions are one indication of poverty-reduction, which gives hope for those now living in a marginal existence without the basic needs brought by electrification, transportation and industry. Additionally, modern, carbon-based energy reduces the need for deforestation and alleviates other environmental problems such as water pollution and deadly indoor-air pollution. Until affordable and reliable energy is developed from non-carbon sources, the world will continue to use carbon as the main energy source.


Written Statement of John R. Christy

The University of Alabama in Huntsville

Subcommittee Energy and Power, U.S. House of Representatives

20 Sep 2012

I am John R. Christy, Professor of Atmospheric Science, Alabama's State Climatologist and Director of the Earth System Science Center at The University of Alabama in Huntsville. I have served as a Lead Author and Contributing Author of IPCC assessments, have been awarded NASA's Medal for Exceptional Scientific Achievement, and in 2002 was elected a Fellow of the American Meteorological Society.

It is a privilege for me to offer my views of climate change based on my experience as a climate scientist. My research area might be best described as building datasets from scratch to advance our understanding of what the climate is doing and why. I have used traditional surface observations as well as measurements from balloons and satellites to document the climate story. Many of my datasets are used to test hypotheses of climate variability and change. In the following I will address five issues that are part of the discussion of climate change today, some of which will be assisted by the datasets I have built and published.

1. EXTREME EVENTS

Recently it has become popular to try to attribute certain extreme events to human causation. The Earth, however, is very large and the weather is very dynamic, especially at local scales, so that extreme events of one type or another will occur somewhere on the planet in every year. Since there are innumerable ways to define an extreme event (i.e. record high/low temperatures, number of days of a certain quantity, precipitation total over 1, 2, 10 … days, snowfall amounts, etc.), this essentially assures us that there will be numerous "extreme events" in every year, because every year has unique weather patterns. The following assesses some of the recent "extreme events" and demonstrates why they are poor proxies for making claims about human causation.

Midwestern Drought

To put it simply, Andreadis and Lettenmaier (2006) found that for the Midwest, "Droughts have, for the most part, become shorter, less frequent, less severe, and cover a smaller portion of the country over the last century." In other words, droughts have always happened in the Midwest and they are not getting worse (more on Midwest heat waves below and on Midwest drought in Section 2). The figure below indicates no long-term changes in drought in the primary corn and soybean belt.

Extreme High and Low Temperatures

Another extreme metric is the all-time record high temperature for each state. The occurrence of the records by decade (Figure 1.1 below) makes it obvious that the 1930s were the most extreme decade and that since 1960, there have been more all-time cold records set than hot records in each decade.

Energy and Power Subcommittee

3

John R. Christy, 20 September 2012

FIGURE 1.1 Number of State Record High and Low Temperatures by Decade (NOAA/NCDC/Extremes/SCEC)

However, there are only 50 states, and this is a number that isn’t large enough to give the best statistical results. Below in Fig. 1.2 are the year-by-year numbers of daily all-time record high temperatures that stood as of 2011 from a set of 970 weather stations with at least 80 years of record (NOAA/NCDC/USHCNv2). There are 365 opportunities in each year (366 in leap years) for each of the 970 stations to set a record high (TMax). The clear evidence is that extreme high temperatures are not increasing in frequency. The recent claims about thousands of new record high temperatures were based on stations whose length-of-record could begin as recently as 1981, thus missing the many heat waves of the 20th century. So, any moderately hot day now will be publicized as setting records for these young stations because they were not operating in the 1930s. The figure below gives what a climatologist would want to know because it uses only stations with long records.
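
The counting procedure itself is straightforward. Below is a minimal Python sketch of the idea – not the code used for Fig. 1.2 – where `obs` (an assumed, illustrative structure) maps each station and calendar day to its full list of (year, TMax) observations, and we tally the year that holds each standing record.

```python
from collections import defaultdict

def standing_records_by_year(obs):
    """For each calendar day at each station, find the year whose TMax
    still stands as the all-time record, and tally records by year."""
    counts = defaultdict(int)
    for (station, month, day), series in obs.items():
        if not series:
            continue
        # Highest TMax wins; ties here go to the earliest year
        # (conventions for handling ties vary among analyses).
        record_year, _ = max(series, key=lambda yt: (yt[1], -yt[0]))
        counts[record_year] += 1
    return dict(counts)

# Toy example: one station, one calendar day, three years of data.
toy = {("ST001", 7, 15): [(1934, 41.1), (1980, 39.5), (2011, 40.2)]}
print(standing_records_by_year(toy))  # {1934: 1} -- the 1934 value still stands
```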


 

 

FIGURE 1.2 TMax Daily Records 1895-2011 (970 USHCN stations with at least 80 years of observations)

A more meaningful result comes if we take the total record highs by overlapping ten-year totals, i.e. 1895-1904, 1896-1905, …, 2002-2011. In Figure 1.3 below are the record daily highs for 704 stations with at least 100 years of data. Note that the value for the most recent decade is less than half of what was observed in the 1930s.
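
The overlapping ten-year totals can be formed directly from the yearly counts; a short sketch follows (illustrative only, not the actual analysis code), assuming `yearly` maps each year to its count of standing TMax records.

```python
def running_totals(yearly, first=1895, last=2011, window=10):
    """Overlapping window totals: 1895-1904, 1896-1905, ..., 2002-2011,
    keyed by the last year of each window (as in Fig. 1.3)."""
    return {end: sum(yearly.get(y, 0) for y in range(end - window + 1, end + 1))
            for end in range(first + window - 1, last + 1)}

# Example with made-up counts concentrated in the 1930s:
print(running_totals({1934: 5000, 1936: 4000, 2011: 1500})[1940])  # 9000
```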

FIGURE 1.3 10-year running totals of TMax daily records, 1895-1904 to 2002-2011 (704 USHCNv2 stations with at least 100 years of data; x-axis is the last year of each 10-year total)

Regarding the heat wave of 2012, I calculated the number of record high temperatures that stand as of 2012 (Fig. 1.4) for stations in the 8 hardest-hit central states (AR-IL-IN-IA-KS-MO-NE-OK) and stations on the West Coast (CA-OR-WA). Notice that the Central-US and West Coast both felt the heat waves of the 1930s, when the highest number of events occurred for both regions. However, the current 2012 event shows high numbers in the Central-US but a dearth of record highs along the West Coast, indicating the heat wave is smaller and less severe than previous events.

Figure 1.4 Number of daily record high temperatures set in a given year that stand as of 2012, for months Jan-Aug; USHCNv2 stations with at least 80 years of record (127 in the Central US: AR-IL-IN-IA-KS-MO-NE-OK; 110 on the West Coast: CA-OR-WA)

Record cold temperatures are shown in Fig. 1.5 (TMin). Record lows were fairly evenly distributed through the 1980s, with a noticeable drop-off over the past 25 years. The cause for this drop-off is discussed in Section 3 of this testimony.

 

 

FIGURE 1.5 TMin Daily Records 1895-2011 (970 USHCN stations with at least 80 years of observations)


An interesting result is produced by taking the ratio year-by-year of the number of TMax daily records divided by the number of TMin daily records (Figure 1.6 below). The two large periods of more record highs than lows are in the 1930s and the last 15 years. The first high-ratio period in the 1930s was due to numerous TMax records while the more recent period was due to fewer TMin records. This decline in the record low temperatures (TMin) in the past 25 years is likely related to the general disturbance by human development around the thermometer stations (again, discussed in Section 3). Meehl et al., 2009 did a similar analysis, but started later, in 1950. This led to the claim of a rapidly rising ratio of record highs to record lows. Had the authors gone back only two more decades to look at a more complete climate record, and had taken into account the contamination of TMin values, the claim of rapidly increasing ratios would not hold.
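
A sketch of the ratio computation, assuming `tmax_counts` and `tmin_counts` map each year to the number of daily records set that year (names and numbers are illustrative, not the actual data):

```python
def record_ratio_by_year(tmax_counts, tmin_counts, years):
    """Ratio of TMax to TMin daily records per year; plotted on a log
    scale so 'more highs' and 'more lows' are symmetric about 1.0."""
    return {y: tmax_counts.get(y, 0) / tmin_counts[y]
            for y in years if tmin_counts.get(y, 0) > 0}

# Illustrative only: a 1930s-style year (many TMax records) and a recent
# year whose high ratio comes mainly from a deficit of TMin records.
print(record_ratio_by_year({1936: 9000, 2011: 1500},
                           {1936: 6000, 2011: 600}, [1936, 2011]))
# {1936: 1.5, 2011: 2.5}
```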

FIGURE 1.6 Ratio of TMax/TMin daily records by year, 1895-2011 (704 USHCNv2 stations with at least 100 years of observations; log scale)

Texas Drought of 2011

A recent claim that the 2011 drought in Texas was 20 times more likely due to extra greenhouse gases was based on statistics from a modeling exercise (http://www.noaanews.noaa.gov/stories2012/20120710_stateoftheclimatereport.html). However, the model overstated the warming rate of Texas, so that its statistics wouldn't apply correctly to the real world. In fact, the authors actually made that point in their study, saying the result gave very limited information about real-world impacts, and that the impact of greenhouse gases was unknown. See http://blog.chron.com/climateabyss/2012/07/twenty-times-more-likely-not-the-science/ for more explanation.

Colorado Fires

Colorado was in the news earlier this year due to a number of serious wildfires. These fires are usually started by humans, which, together with the fire-suppression activities practiced since about 1900, makes them problematic to study from a climate standpoint. Whereas there were many low-intensity fires before settlement, now there tend to be fewer but more intense fires due to the buildup of fuel. In any case, droughts are related to weather patterns that become stationary, so it is useful to ask the question: have weather patterns shown a tendency to become more stationary, thus creating the opportunity for long dry/hot or wet/cool spells?

A project which seeks to generate consistent and systematic weather maps back to 1871 (the 20th Century Reanalysis Project, http://www.esrl.noaa.gov/psd/data/20thC_Rean/) has taken a look at the three major weather patterns which are often related to extreme events. As Dr. Gilbert Compo of the University of Colorado, leader of the study, noted to the Wall Street Journal (10 Feb 2011), "… we were surprised that none of the three major indices of climate variability that we used show a trend of increased circulation going back to 1871" (Compo et al. 2011). In other words, there appears to be no supporting evidence that human factors have influenced the circulation patterns which drive the larger-scale extreme events. Again we point to natural, unforced variability (i.e. Mother Nature) as the dominant feature of events that have transpired in the past 130 years.

U.S. Drought

Though the conterminous U.S. covers only 1.8% of the globe (6% of its land area), we have good records for many weather variables. Below is the month-by-month percentage of the area that is classified as moderate to extreme for dryness and wetness, from NOAA. As can be seen, there is a tremendous amount of variability (near zero to near 80 percent), but no long-term trend.

Claims of increasing extremes

NASA's James Hansen recently claimed that the Earth is experiencing extreme hot temperature conditions whose geographical extent is far in excess of what would be expected from natural variations (Hansen et al. 2012 and a Washington Post OpEd: http://www.washingtonpost.com/opinions/climate-change-is-here--and-worse-than-we-thought/2012/08/03/6ae604c2-dd90-11e1-8e43-4a3c4375504a_story.html). In a given year, the area covered by such extremes (known as 3-sigma or 3-σ) would be less than 1 percent, and under a gradual warming trend, such as we have seen since the cold 19th century, the area should not exceed the lower single-digit percentages. However, Hansen found that the average area experiencing these high extremes in the Northern Hemisphere summer has been very large since 2006. This area, which averaged 12% for 2006-11, is shown in the solid circles (top line) of Fig. 1.7 below as calculated by Hansen et al.

To arrive at such a large area, Hansen relied on daily mean temperatures which, as shown above and below, are contaminated by nighttime warming, giving a false climate-warming signal. I have recalculated this areal coverage using only daytime high temperatures from the Berkeley Earth Surface Temperature project (BEST, Muller et al. 2012). When this climate-relevant metric is used, the area of extreme hot events drops by almost half, averaging only 6.7% (gray circles in Fig. 1.7). Then, Hansen et al. selected a very quiet period, 1951-80, as the reference from which to calculate today's extremes. This was a period with few hot events, so any hot events now would look unduly extreme by comparison. By simply adding 20 years of earlier data, i.e. picking up the extreme heat events of the 1930s, today's extremes don't appear nearly as dramatic, with an average area of only 3.6% (open circles in Fig. 1.7). Finally, if one takes the 80-year period as the reference (1931-2010, open squares in Fig. 1.7), the areal extent averages only 1.2%, the amount expected for a slow warming trend. A complete write-up may be found at http://www.drroyspencer.com/2012/08/fun-with-summer-statistics-part-2-the-northern-hemisphere-land/.
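
To make the reference-period sensitivity concrete, here is a minimal sketch of the 3-σ areal-fraction statistic (synthetic data, not Hansen's or my actual code); `anom` is an assumed (years × gridcells) array of summer temperature anomalies and `weights` the gridcell areas.

```python
import numpy as np

def hot_area_fraction(anom, years, ref_years, target_year, weights):
    """Fraction of (area-weighted) gridcells where target_year exceeds
    3 standard deviations of the chosen reference period."""
    ref = np.isin(years, ref_years)
    mu = anom[ref].mean(axis=0)
    sigma = anom[ref].std(axis=0, ddof=1)
    hot = anom[years == target_year][0] > mu + 3.0 * sigma
    return np.average(hot, weights=weights)

# Synthetic demo: unit-variance noise plus a slow warming trend.
rng = np.random.default_rng(0)
years = np.arange(1931, 2012)
anom = rng.normal(size=(years.size, 500)) + 0.004 * (years - 1931)[:, None]
w = np.ones(500)
print(hot_area_fraction(anom, years, np.arange(1951, 1981), 2011, w))
print(hot_area_fraction(anom, years, np.arange(1931, 2011), 2011, w))
```

Running the two calls shows that the longer reference period, which contains more of the trend and the earlier variability, yields a smaller extreme-area fraction – which is the point of the 1931-2010 comparison above.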


Fig 1.7 Fraction of monitored area that exceeds a 3-σ threshold, 2006-2011; reference period for statistics as indicated (Hansen et al. TMean 1951-1980; BEST TMax 1951-1980; BEST TMax 1931-1980; BEST TMax 1931-2010)

Hansen then suggests that besides extreme high temperatures, loading of the "climate dice" from "human-made global warming" means "extreme drought conditions can develop." Hansen makes this claim while not explaining evidence, such as the NOAA charts in this testimony, that droughts have not increased at all in the U.S. Hansen et al.'s claims about increasing droughts were shown to be false when applied to the U.S. by Dr. Patrick Michaels, who used actual observations to assess the relationship between global temperatures and the magnitude and extent of U.S. droughts. In an ironic twist, Michaels shows that, if anything, global warming has led to fewer U.S. droughts (http://www.nationalreview.com/planet-gore/316541/obama-s-drought-facts-patrick-michaels). So, in summary, the expression of "worse than we thought" climate change as documented in Hansen's OpEd does not stand up to scrutiny.

Recent snowfall in the United States

Snowfall reached record levels in 2009-10 and 2010-11 in some eastern US locations, and also in a few western locations in 2010-11. NOAA's Climate Scene Investigators committee issued the following statement regarding this, indicating, again, that natural, unforced variability (again, Mother Nature) explains the events.

Specifically, they wanted to know if human-induced global warming could have caused the snowstorms due to the fact that a warmer atmosphere holds more water vapor. The CSI Team’s analysis indicates that’s not likely. They found no evidence — no human “fingerprints” — to implicate our involvement in the snowstorms. If global warming was the culprit, the team would have expected to find a gradual increase in heavy snowstorms in the mid-Atlantic region as temperatures rose during the past century. But historical analysis revealed no such increase in snowfall.

I have looked closely at the snowfall records of the Sierra Nevada mountains of California from the earliest records gathered by the Southern Pacific Railroad beginning in 1878. Long-term trends in snowfall (and thus water resources) in this part of California are essentially zero, indicating no change in this valuable resource to the state (Christy and Hnilo, 2010, Christy 2012.)

From the broad perspective, where we consider all the extremes above, we should see a warning – that the climate system has always had within itself the capability of causing devastating events, and these will certainly continue with or without human influence on the climate. Thus, societies should plan for infrastructure projects to withstand the worst that we already know has occurred, and recognize, in such a dynamical system, that even worse events should be expected. In other words, the set of measured extreme events from the small climate history we have, since about 1880, does not represent the full range of extreme events that the climate system (i.e. Mother Nature) can actually generate. The most recent 130 years is simply our current era's small sample of the long history of climate. Records are made to be broken. For example, one would assume that about 10 percent of the record extremes that occur over a thousand-year period ending in 2100 should occur in the 21st century. Are we prepared to deal with events even worse than we've seen so far? Spending which is directed to creating resiliency to these sure-to-come extremes, particularly drought/flood extremes, seems rather prudent to me – since there are no human means to make them go away.

Looking at the longer record of climate patterns

Climatologists realize that the period of time over which we have had instruments to measure the climate (~130 years) is very brief compared to the history of the current 10,000-year interglacial period. Taking a look at the larger picture shows the capability of Mother Nature to produce extreme situations.

Megadroughts of the past 1000+ years

There are several types of records from the flora and fauna of the past 1000 years that provide evidence that droughts of extreme duration (decades) occurred in our nation, primarily in the Great Plains westward to the Pacific Coast.

California

At right are photos from Lindstrom (1990) in which trees grew on dry ground around 900 years ago in what is now a Sierra Nevada alpine lake. This indicates that a drastic but natural change to a much drier climate must have lasted for at least a century for trees to have grown to these sizes on dry ground.

Rocky Mountains

A 500-year history of moisture in the upper Colorado River basin (below) indicates the past century was quite moist, while major multi-decadal droughts occurred in all four prior centuries (Piechota et al. 2004). Indeed, the conclusion of Piechota et al., after examining the paleo-record, is that present-day droughts "could be worse." These and other lines of evidence point to the real probability that water supply in the West will see declines simply as a matter of the natural variability of climate.

Great Plains

In the Great Plains, the period from 3000 to 1500 years ago saw a drier and warmer climate, during which a significant parabolic sand dune ecosystem developed, especially in western Nebraska and NE Colorado (Muhs 1985). In other words, the Great Plains resembled a desert. Many of these areas experienced dune "reactivation" during Medieval times (900-1300 AD). Then, the climate moistened and cooled beginning around 1300 AD to support the short-grass prairie seen today, though "reactivation" is possible at any time (Schmeisser, 2009). Indeed, by examining the Great Plains environment of just the past 150 years, Muhs and Holliday (1995) found that dune reactivation can occur on decadal time scales from extended drought.


With the massive use of ground water for irrigation, the High Plains Aquifer has declined an average of 12.8 ft, with some areas in the Texas panhandle down over 150 ft. The key point here is that the Plains is subject to natural (and sobering) long-term droughts that would very likely tax the current water management system (ground-water withdrawals) while not replenishing the aquifer, producing a situation of reduced agricultural productivity, especially in its southern reaches.

Why extreme events are poor metrics for studying global changes

In the examples above, we don't see increases in extreme events (which is also true for tornadoes, hurricanes, floods, etc. – see my House testimony of 31 March 2011), but we must certainly be ready for more to come as part of nature's variability. I want to illustrate how one might use extreme events to conclude (improperly, I believe) that the weather in the USA is becoming less extreme and/or colder.

Going back to Fig. 1.1 (the number of all-time state records) we see the following. About 75 percent of the states recorded their hottest temperature prior to 1955, and over 50 percent of the states experienced their record cold temperatures after 1940. Overall, only a third of the records (hot or cold) have been set in the second half of the whole period. One could conclude, if one were so inclined, that the climate of the US is becoming less extreme because the occurrence of state extremes of hot and cold has diminished dramatically since 1955. Since 100 of anything appears to be a fairly large sample (2 values for each of 50 states), this on the surface seems a reasonable conclusion.

Then, one might look at the more recent record of extremes and learn that no state has achieved a record high temperature in the last 15 years (though one state has tied theirs). However, five states have observed their all-time record low temperature in these past 15 years, plus one tie. This includes last year's record low of 31°F below zero in Oklahoma, breaking their previous record by a rather remarkable 4°F. If one were so inclined, one could conclude that the weather that people worry about (extreme cold) is getting worse in the US. (Note: this lowering of absolute cold temperature records is nowhere forecast in climate model projections, nor is a significant drop in the occurrence of extreme high temperature records.)

I am not using these statistics to prove the weather in the US is becoming less extreme and/or colder. My point is that extreme events are poor metrics to use for detecting climate change. Indeed, because of their rarity (by definition), using extreme events to bolster a claim about any type of climate change (warming or cooling) runs the risk of setting up the classic "non-falsifiable hypothesis." For example, we were told by the IPCC that "milder winter temperatures will decrease heavy snowstorms" (TAR WG2, 15.2.4.1.2.4). After the winters of 2009-10 and 2010-11, we are told the opposite by advocates of the IPCC position: "Climate Change Makes Major Snowstorms More Likely" (http://www.ucsusa.org/news/press_release/climate-change-makes-snowstorms-more-likely-0506.html).

The non-falsifiable hypothesis can be stated this way: "whatever happens is consistent with my hypothesis." In other words, there is no event that would "falsify" the hypothesis. As such, these assertions cannot be considered science, or in any way informative, since the hypothesis' fundamental prediction is "anything may happen." In the example above, if winters become milder or if they become snowier, the non-falsifiable hypothesis stands. This is not science.


As noted above, there are innumerable types of events that can be defined as extreme events – so for the enterprising individual (unencumbered by the scientific method), weather statistics can supply an unlimited, target-rich environment in which to discover a "useful" extreme event. Thus, when the enterprising individual observes an unusual weather event, it may be tempting to define it as a once-and-for-all extreme metric to "prove" a point about climate change – even if the event was measured at a station with only 30 years of record. This works both ways with extremes. If one had been prescient enough to predict in 1996 that over the next 15 years five states would break all-time record cold temperatures while none would break record high temperatures, as evidence for cooling, would that prove CO2 emissions have no impact on climate? No. Extreme events happen, and their causes are intricately tied to the semi-unstable dynamical situations that can occur out of an environment of natural, unforced variability.

Science checks hypotheses (assertions) by testing specific, falsifiable predictions implied by those hypotheses. The predictions are to be made in a manner that, as much as possible, is blind to the data against which they are evaluated. It is the testable predictions from hypotheses, derived from climate model output, that run into trouble as shown in Section 2. Before going on to that test, the main point here is that extreme events do not lend themselves as being rigorous metrics for convicting human CO2 emissions of being guilty of causing them.

2. RECENT CLIMATE MODEL SIMULATIONS

One of the key questions policymakers ask is what will happen with the Earth's weather in the decades to come. More importantly, they want to know how things might change specifically for their constituents. One pathway to seek answers is to examine the output of climate models that attempt to predict likely outcomes. If one has confidence in the model projections that terrible weather is on the horizon, then it is tempting to devise policy that the same models indicate would somehow mitigate that problem.

In Figure 2.1 below, I display the results from 38 of the latest climate model simulations of global temperature that will be used in the upcoming IPCC AR5 assessment on climate change (KNMI Climate Explorer). All of the data are given a reference of 1979-1983, i.e. the same starting line. Along with these individual model runs I show their average (thick black line) and the results from observations (symbols). The two satellite-based results (circles, UAH and RSS) have been proportionally adjusted so they represent surface variations for an apples-to-apples comparison. The evidence indicates the models on average are over-warming the planet by quite a bit, implying there should be little confidence that the models can answer the question asked by policymakers. Basing policy on the circles (i.e. real data) seems more prudent than basing policy on the thick line of model output. Policies based on the circles would include adaptation to extreme events that will happen because they’ve happened before (noted above and below) and since the underlying trend is relatively small.
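
The apples-to-apples setup is simple: each series is shifted so that its 1979-1983 mean is zero, and then trends can be compared. A sketch with hypothetical series follows (not the actual CMIP5 or observational data):

```python
import numpy as np

def rebase(series, years, base=(1979, 1983)):
    """Shift a series so its mean over the base period is zero."""
    sel = (years >= base[0]) & (years <= base[1])
    return series - series[sel].mean()

def trend_per_decade(series, years):
    """Least-squares linear trend, expressed per decade."""
    return np.polyfit(years, series, 1)[0] * 10.0

years = np.arange(1979, 2013)
model_mean = 0.025 * (years - 1979)   # hypothetical model-mean series
obs = 0.014 * (years - 1979)          # hypothetical observed series
for name, s in [("models", model_mean), ("obs", obs)]:
    print(name, round(trend_per_decade(rebase(s, years), years), 3), "degC/decade")
```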


Fig 2.1 Global, CMIP5 RCP4.5, 38 models, annual surface air temperature (°C); reference base 1979-1983, 7-yr running averages, 1975-2020. Shown are the individual model runs, the model mean ±1 standard deviation, and observations (NCDC, NASA, HadCRUT4, RSS LT surface-adjusted, UAH LT surface-adjusted)

Arctic Sea Ice Loss

At present, the sea ice extent in the Arctic is at the lowest areal coverage since satellites began monitoring the extent over 30 years ago. In an area with extremely large natural variations, the question is: How much of the loss might be due to extra greenhouse gas warming relative to other causes? We know that there has been warming in the Arctic since the 1960s from all data sets. To explain this observation, Wallace et al. 2012 examined the different patterns of atmospheric circulation that can contribute to a warmer Arctic versus what might be expected from the extra warming due to the additional greenhouse gases being added to the atmosphere. They report:

These results support the notion that the enhanced wintertime warming over high northern latitudes from 1965 to 2000 was mainly a reflection of unforced variability of the coupled climate system. Some of the simulations exhibit an enhancement of the warming along the Arctic coast, suggestive of exaggerated feedbacks.

In other words, natural variations of the circulation patterns that create warmer Arctic temperatures explain most of the warming that is detected, according to this study (see also Liu and Curry 2004 and Curry's analysis using the notion of "climate shifts," in which combinations of natural modes of variability can lead to large changes in ice coverage: http://judithcurry.com/2011/03/19/pondering-the-arctic-ocean-part-i-climate-dynamics/). However, there is another non-greenhouse factor that may contribute to Arctic sea ice loss too. When particles from incomplete combustion of carbon fuels, i.e. black carbon aerosols, are transported to the Arctic, they can settle on the ice, making the ice darker and more absorbent (less reflective) of the sun's energy (Jacobson 2006, Hansen et al. 2007). This extra energy absorbed by the ice speeds up the melting process. It has been suggested that reducing black carbon aerosols may be the quickest way to slow the Arctic sea ice loss.

The question remains as to the contribution of the extra greenhouse gases to warming of the Arctic and the associated sea ice reduction. In Fig. 2.2 below is the Arctic surface temperature simulated by the same 38 models now being studied for the next IPCC report compared with four observational datasets.


Figure 2.2 Arctic Polar Cap (70N to 90N), CMIP5 RCP4.5, 38 models, surface temperature (°C); reference base 1979-1983, 7-yr running averages, 1975-2020, with the model mean ±1 standard deviation and observations (NCDC, HadCRUT4, RSS LT surface-adjusted, UAH LT surface-adjusted)

The main cause for the upward trend in Figs. 2.1 and 2.2 is the way models react to added greenhouse gases. In Fig. 2.1 (global) the evidence indicates models are over-reacting, leading to more warming than is being observed. In the Arctic, the model average matches one of the data sets (HadCRUT4) while still being much warmer than the satellite datasets. Is this evidence that the Arctic warming is due to greenhouse gases? As a check, we can do the same analysis for the southern hemisphere sea ice area, shown in Fig. 2.3. Now we see what was generally demonstrated in Fig. 2.1: the model average depicts a rising temperature trend in this sea ice band which is much greater than seen in observations. Indeed, all observational datasets portray a slight downward trend in temperature, which is consistent with an increase in the extent of the sea ice there (near record coverage as we speak). Recently, Zwally et al. 2012 also find that the balance of ice on the continent of Antarctica has been positive, i.e. gains are larger than losses. This is in contrast to many recent reports that the balance is negative, i.e. that Antarctica is losing ice. What is probably most important here is that the changes in Antarctica are so small relative to its size that determining whether there is gain or loss is not quite within our ability to measure precisely.

Figure 2.3 Antarctic Sea Ice Band (75S-60S), CMIP5 RCP4.5, 38 models, surface temperature (°C); reference base 1979-1983, 7-yr averages, 1975-2020, with the model mean ±1 standard deviation and observations (NCDC, HadCRUT4, RSS LT surface-adjusted, UAH LT surface-adjusted)

The conclusion drawn from Figs. 2.1-2.3 is that the closer agreement of Arctic model temperatures and observations is likely right for the wrong reasons. In other words, the extra greenhouse gases induce strongly rising temperatures in the models for all of these regions, while observations show only one region (the Arctic) actually coming somewhat close to that result. This relates to the general failings of CMIP5 models to depict the way actual climate patterns vary over time (see also Driscoll et al. 2012 for a study on how none of the CMIP5 models they examined reproduced the large-scale features of the climate response to volcanoes). Thus, the main reasons for the loss of Arctic sea ice seem to be its large natural variability and perhaps the darkening of the ice due to black carbon. The greenhouse explanation doesn't hold up when examined for other regions.

I often describe climate science as a murky science, and nowhere is this more true than in the field of paleoclimate, where efforts are made to reconstruct the long-term climate history from evidence such as tree rings, sediments, ice cores, etc. Results are often contradictory, as in, for example, the magnitude of the medieval warm period.

As to sea ice extent, Kinnard et al. 2011 reconstructed the last 1450 years using several types of proxy data, including ice cores and tree rings, showing the largest extent around the year 1430 and the second largest extent around 1920. At the very end of the time series, Kinnard et al. 2011 show a dramatic drop (as in an inverted hockey stick), with the most recent values below any of the early warm periods of the last 1450 years. However, there seems to be confusion with the tree-ring data, and an examination of the long oxygen-isotope proxy component of this time series suggests only a long-term increase in sea ice (http://climateaudit.org/2011/12/03/kinnard-and-the-darrigo-wilson-chronologies/ and http://climateaudit.org/2011/12/05/kinnard-arctic-o18-series/). Indeed, Esper et al. 2012 separately found that many previous tree-ring-based proxy studies underestimated the impact of solar forcing in the high latitudes. They produced a 2000+ year record showing a decline in high-latitude temperature since Roman times. Esper et al. suggest that previous studies "may underestimate the pre-instrumental temperatures including the warmth during Medieval and Roman times." Antoniades et al. 2011 also report warmer Arctic temperatures 1400 to 800 years ago, and before 3000 years ago, based on the paleo conditions of the largest Arctic ice shelf, on Ellesmere Island. This implies that temperatures consistently warmer than we experience today occurred in the Arctic within the relatively short history of Western civilization.

Southeastern Growing Season Rainfall

A more relevant question for those of us in the Southeast or Midwest is what might happen to our growing-season rainfall – a key variable for our economies. Figure 2.4 below shows what 34 models depict for March-to-July rainfall (7-year running averages), with the circles being the observations. It is apparent, first of all, that the models are generally too dry. Secondly, there really is no information for policy here. The trend in the average of the models is so close to zero as to be uninformative (+0.8 inches/century for 1980-2100), with results varying from 3.7 inches/century wetter to 1.6 inches/century drier. Neither of these rates is important, because the year-to-year variations in rainfall from observations show a range from 14.9 to 30.7 inches. It is apparent that for a critical quantity such as precipitation, one cannot have confidence in model projections, nor in their attempts to demonstrate what might happen with control strategies for carbon dioxide. Again, an examination of the historical record of rainfall (circles) gives considerable information on what might be expected in terms of variability, and thus a pathway to plan to accommodate the droughts and floods that are sure to come, since they've happened in the past.
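
The comparison being made here is between a century-scale trend and the year-to-year spread. A sketch with synthetic rainfall totals (illustrative only, not the actual data) shows how the two quantities are computed:

```python
import numpy as np

def trend_inches_per_century(precip, years):
    """Least-squares linear trend, expressed in inches per century."""
    return np.polyfit(years, precip, 1)[0] * 100.0

rng = np.random.default_rng(1)
years = np.arange(1895, 2012)
# Hypothetical Mar-Jul totals: mean ~22 inches, large interannual noise.
precip = 22.0 + rng.normal(scale=3.5, size=years.size)
print("trend:", round(trend_inches_per_century(precip, years), 2), "in/century")
print("range:", round(precip.min(), 1), "to", round(precip.max(), 1), "inches")
```

With noise of this size, the fitted trend is tiny compared with the interannual range – the same contrast drawn from the observations above.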


 

 

 

FIGURE 2.4 SE USA Mar-Jul precipitation (inches), 1860-2100: 34 CMIP5 RCP4.5 models (7-year running averages), with the model median and observations

Central U.S. Growing Season Precipitation

A similar exercise was done for the Midwest region (100W-85W, 37.5N-45N) in Figure 2.5 below, since crop losses in 2012 are in the news. The character of the actual precipitation shows a clear rise in total amount through the years. However, the same comments regarding the model results for the Southeast apply to the Midwest too, as the models indicate an average trend (1980-2100) of a tiny +0.9 inches/century, which really comes down to a shift around 2020 with steady values thereafter. The natural range for this region varies wildly, from 8.7 to 26.7 inches from one growing season to the next. Once again, policies which deal with the large year-to-year variations, which cause the most problems for the economy, would address a real threat that will continue to occur regardless of the human effects on climate change. The model output provides no information for substantive policy (see also Stephens et al. 2010, whose title is self-explanatory: "The dreary state of precipitation in global models").

 

 

 

FIGURE 2.5 Midwest USA Mar-Jul precipitation (inches), 1860-2100: 34 CMIP5 RCP4.5 models (7-year running averages), with the model median and observations

3. NEW STUDIES ON SURFACE TEMPERATURE PROCESSES

In general, the issue of global warming is dominated by considering the near-surface air temperature (Tsfc) as if it were a standard by which one might measure the climate impact of the extra warming due to increases in greenhouse gases. Fundamentally, the proper variable to measure is heat content, i.e. the amount of heat energy (measured in joules) in the climate system. Thus the basic measurement for detecting greenhouse warming is how many more joules of energy are accumulating in the climate system over what would have occurred naturally. This is a truly "wicked" problem (see House Testimony, Dr. Judith Curry, 17 Nov 2010) because we do not know how much accumulation can occur naturally.


Unfortunately, discussions about global warming focus on Tsfc even though it is affected by many more processes than greenhouse gas increases. This means that using Tsfc as a proxy for heat content (the real greenhouse variable) can lead to an overstatement of greenhouse warming if the two are assumed to be too closely related.

A new paper by my UAHuntsville colleague Dr. Richard McNider (McNider et al. 2012) looked at reasons for the fact that daytime high temperatures (TMax) are really not warming much while nighttime low temperatures (TMin) show significant warming. This has been known for some time and found in several locations around the world (e.g. California – Christy et al. 2006; East Africa – Christy et al. 2009). Without going into much detail, the bottom line is that as humans disturb the surface (cities, farming, deforestation, etc.), this disrupts the normal formation of the shallow surface layer of cooler air during the night, when TMin is measured. In a complicated process, due to these local changes, there is greater mixing of the naturally warmer air above down into the shallow nighttime cool layer. This makes TMin warmer, giving the appearance of warmer nights over time. The subtle consequence of this phenomenon is that TMin will show warming, but this warming is caused by a turbulent process which redistributes heat near the surface, not by the accumulation of heat related to greenhouse warming of the deep atmosphere. The importance of this is that many of the positive feedbacks that amplify the CO2 effect in climate models depend on warming of the deep atmosphere, not of the shallow nighttime layer.

During the day, the sun heats the surface, and so air is mixed through a deep layer. Thus, the daily high temperature (TMax) is a better proxy for the heat content of the deep atmosphere, since that air is being mixed more thoroughly down to where the thermometer station is. The relative lack of warming in TMax indicates that the rate of warming due to the greenhouse effect is smaller than models project (Section 2).

The problem with the popular surface temperature datasets is they use the average of the daytime high and nighttime low (i.e. (TMax+TMin)/2). But if TMin is not representative of the greenhouse effect, then the use of TMin with TMax will be a misleading indicator of the greenhouse effect. TMax should be viewed as a more reliable proxy for the heat content of the atmosphere and thus a better indicator of the enhanced greenhouse effect. This exposes a double problem with models. First of all, they overwarm their surface compared with the popular surface datasets (the squares in Fig. 2.1). Secondly, the popular surface datasets are likely warming too much to begin with. This is why I include the global satellite datasets of temperature which are not affected by these surface problems and more directly represent the heat content of the atmosphere (see Christy et al. 2010, Klotzbach et al. 2010).
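
A toy calculation makes the arithmetic of the contamination plain (all trend numbers below are invented for illustration): whatever spurious component TMin carries, the conventional (TMax+TMin)/2 average inherits half of it.

```python
# Illustrative numbers only -- not measured trends.
tmax_trend = 0.05          # degC/decade; taken here as the better heat-content proxy
tmin_trend = 0.05 + 0.10   # same greenhouse signal plus a spurious local component
tmean_trend = (tmax_trend + tmin_trend) / 2.0  # the conventional average
print(f"TMean trend = {tmean_trend:.3f} degC/decade "
      f"vs TMax-only trend = {tmax_trend:.3f} degC/decade")
```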

4. CONSENSUS SCIENCE

The term "consensus science" is often appealed to in arguments about climate change to bolster an assertion. This is a form of "argument from authority." Consensus, however, is a political notion, not a scientific notion. As I testified to the Inter-Academy Council in June 2010, wrote in Nature that same year (Christy 2010), and documented in my written House Testimony last year (House Space, Science and Technology, 31 Mar 2011), the IPCC and other similar assessments do not represent for me a consensus of much more than the consensus of those selected to agree with a particular consensus. The content of these climate reports is actually under the control of a relatively small number of individuals – I often refer to them as the "climate establishment" – who through the years, in my opinion, came to act as gatekeepers of scientific opinion and information rather than brokers. The voices of those of us who object to various statements and emphases in these assessments are by and large dismissed rather than accommodated. This establishment includes the same individuals who become the "experts" called on to promote IPCC claims in government reports, such as the endangerment finding by the Environmental Protection Agency. As outlined in my House Testimony, these "experts" become the authors and evaluators of their own research, relative to research which challenges their work. But with the luxury of having the "last word" as "expert" authors of the reports, alternative views vanish.

I've often stated that climate science is a "murky" science. We do not have laboratory methods of testing our hypotheses as many other sciences do. As a result, what passes for science includes opinion, arguments-from-authority, dramatic press releases, and fuzzy notions of consensus generated by preselected groups. This is not science.

I noticed the House passed an amendment last year to de-fund the U.N.'s Intergovernmental Panel on Climate Change (IPCC). We know from the Climategate emails and many other sources that the IPCC has had problems with those who take different positions on climate change than what the IPCC promotes. There is another way to deal with this, however. Since the IPCC activity is funded by US taxpayers, I propose that five to ten percent of the funds be allocated to a group of well-credentialed scientists to produce an assessment that expresses legitimate, alternative hypotheses that have been (in their view) marginalized, misrepresented or ignored in previous IPCC reports (and thus in EPA and National Climate Assessments). Such activities are often called "Red Team" reports and are widely used in government and industry. Decisions regarding funding for "Red Teams" should not be placed in the hands of the current "establishment" but in panels populated by credentialed scientists who have experience in examining these issues. Some efforts along this line have arisen from the private sector (i.e. the Non-governmental International Panel on Climate Change at http://nipccreport.org/ and Michaels (2012), ADDENDUM: Global Climate Change Impacts in the United States). I believe policymakers, with the public's purse, should actively support the assembly of all the information that is vital to addressing this murky and wicked science, since the public will ultimately pay the cost of any legislation alleged to deal with climate.

Topics to be addressed in this "Red Team" assessment would include, for example, (a) evidence for a low climate sensitivity to increasing greenhouse gases, (b) the role and importance of natural, unforced variability, (c) a rigorous and independent evaluation of climate model output, (d) a thorough discussion of uncertainty, (e) a focus on metrics that most directly relate to the rate of accumulation of heat in the climate system, (f) analysis of the many consequences, including benefits, that result from CO2 increases, and (g) the importance that affordable and accessible energy has for human health and welfare. What this proposal seeks is to provide to the Congress and other policymakers a parallel, scientifically based assessment regarding the state of climate science which addresses issues that have heretofore been un- or under-represented in previous taxpayer-funded, government-directed climate reports. In other words, our policymakers need to see the entire range of findings regarding climate change.


5. IMPACT OF EMISSION CONTROL MEASURES

The evidence above suggests that climate models over-react to greenhouse gas increases. Also, there is a lack of evidence to blame humans for an increase in extreme events. One cannot convict CO2 of causing any of these events, because they've happened in the past, before CO2 levels rose. Even so, using these climate model simulations we can calculate that the theoretical impact of legislation on the global temperature is essentially imperceptible (Christy JR, House Ways and Means Testimony, 25 Feb 2009). In such calculations we simply run the model with and without the proposed changes in greenhouse gases to see the difference in the models' climates. The result is that such actions will not produce a measurable climate effect that is attributable or predictable with any level of confidence, especially at the regional level.

When I testified before the Energy and Commerce Oversight and Investigations subcommittee in 2006, I provided information on an imaginary world in which 1,000 1.4-GW nuclear power plants would be built and operating by 2020. This, of course, will not happen. Even so, this Herculean effort would result in at most a 10 percent reduction in global CO2 emissions, and thus exert a tiny impact on whatever the climate is going to do. The results today are still the same. Indeed, with the most recent estimates of low climate sensitivity, the impact of these emission-control measures will be even tinier, since the climate system doesn't seem to be very sensitive to CO2 emissions. The recent switch to natural gas represents a partial move to decarbonize our energy production, since methane has four hydrogen atoms for every carbon atom. Thus, there are now even fewer U.S. CO2 emissions to legislate away.


The Energy Information Administration lists 190 countries by CO2 emissions and Gross Domestic Product. This can be used to answer the question: how much in goods and services does a country generate per ton of CO2 emissions? In terms of this efficiency, the U.S. is ranked 81st, near Australia (91st) and Canada (78th), two other geographically large and well-advanced countries with considerable natural resources. China is 186th, but France is 9th, due to the fact that over 80 percent of its electricity comes from nuclear power rather than carbon. A different way to look at this is to realize that the U.S. produces 29 percent of the world's goods while emitting only 18 percent of the world's CO2 emissions (EIA 2009 values). In other words, the U.S. ranks rather well considering the energy-intensive industries of farming, manufacturing, mining, metals processing, etc. that are performed here, the goods of which are sold to the world. So, we produce quite a bit relative to our emissions – the kind of products and services that the world wants to buy. With the recent shift to more natural gas, U.S. efficiency continues to rise. I suppose if one wanted to reduce U.S. emissions, one could legislate what the world should and should not buy. This, of course, is not a serious idea.
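
The efficiency measure is just GDP divided by emissions; a sketch of the ranking calculation follows, with placeholder figures (not the actual EIA values):

```python
# GDP per ton of CO2, ranked high to low. GDP is in billions of USD and
# emissions in million metric tons; all values below are placeholders.
countries = {
    "Country A": {"gdp_busd": 15000, "co2_mt": 5500},
    "Country B": {"gdp_busd": 2800,  "co2_mt": 400},
    "Country C": {"gdp_busd": 7000,  "co2_mt": 9000},
}
ranked = sorted(countries.items(),
                key=lambda kv: kv[1]["gdp_busd"] / kv[1]["co2_mt"],
                reverse=True)
for name, d in ranked:
    dollars_per_ton = d["gdp_busd"] * 1e9 / (d["co2_mt"] * 1e6)
    print(f"{name}: ${dollars_per_ton:,.0f} of GDP per ton of CO2")
```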

When thinking about policy regarding CO2, one cannot ignore the immense benefits produced directly by CO2 or indirectly through its relationship to low-cost energy. It is a simple fact that CO2 is plant food, and the world around us evolved when levels of CO2 were five to ten times what they are today. Our green world is a consequence of atmospheric CO2. And food for plants means food for people. The extra CO2 we are putting into the atmosphere not only invigorates the biosphere, but also enhances the yields of our food crops. In my view, this is a tremendous benefit to nature and to us.


Now, with all due respect to former President Bush, in my opinion he was not accurate to say in 2006 that we are "addicted to oil." Oil and other carbon-based energies are simply the affordable means by which we satisfy our true addictions – long life, good health, plentiful food, internet services, freedom of mobility, comfortable homes with heating, cooling, lighting and even colossal entertainment systems, and so on. Carbon energy has made these possible.

A rising CO2 concentration is thus an indicator of human progress in health, welfare and security provided by affordable carbon-based energy. As someone who has lived in a developing country, I can assure the committee that without energy, life is brutal and short. At present, hundreds of millions of people are dependent on low-grade biomass (tree branches, dung, etc.) for energy. These sources place a huge burden, literally, on people to find, cut and carry the material where needed. Landscapes are deforested and waterways contaminated by these activities. And tragically, the U.N. estimates about 2 million children die each year due to diseases fostered by the toxic fumes produced when burning wood and dung in the homes. Higher density sources of fuel such as coal and natural gas utilized in centrally-produced power stations actually improve the environmental footprint of the poorest nations while at the same time lifting people from the scourge of poverty.

Coal use, which generates a major portion of CO2 emissions, will continue to rise, as indicated by the Energy Information Administration's chart below. Developing countries in Asia already burn more than twice the coal that North America does, and that discrepancy will continue to expand. The fact that our legislative actions will be inconsequential in the grand scheme of things can be seen by noting that these actions attempt to bend the blue curve for North America down a little, and that's all. So, downward adjustments to North American coal use will have virtually no effect on global CO2 emissions (or the climate), no matter how sensitive one thinks the climate system might be to the extra CO2 we are putting back into the atmosphere.

Thus, if the country deems it necessary to de-carbonize civilization's main energy sources, then compelling reasons beyond human-induced climate change need to be offered that must address, for example, ways to help poor countries develop affordable energy. Climate change alone is a weak leg on which to stand to justify a centrally-planned, massive change in energy production, infrastructure and cost.

Thank you for this opportunity to offer my views and research on climate change.


References

Andreadis, K.M. and D.P. Lettenmaier, 2006: Trends in 20th century drought over the continental United States. Geophys. Res. Lett., 33, L10403, doi:10.1029/2006GL025711.

Antoniades, D., P. Francus, R. Pienitz, G. St-Onge and W. Vincent, 2011: Holocene dynamics of the Arctic's largest ice shelf. Proc. Nat. Acad. Sci., doi:10.1073/pnas.1106378108.

Christy, J.R., 2012: Searching for information in 133 years of California snowfall observations. J. Hydro. Met. DOI:10.1175/JHM-D-11-040.1.

Christy, J.R., B. Herman, R. Pielke, Sr., P. Klotzbach, R.T. McNider, J.J. Hnilo, R.W. Spencer, T. Chase and D. Douglass, 2010: What do observational datasets say about modeled tropospheric temperature trends since 1979? Remote Sens. 2, 2138-2169. Doi:10.3390/rs2092148.

Christy, J.R. and J.J. Hnilo, 2010: Changes in snowfall in the southern Sierra Nevada of California since 1916. Energy & Env., 21, 223-234.

Christy, J.R., 2010: Open Debate: Wikipedia Style. The IPCC: Cherish it, Tweak it, or Scrap it. Nature, 463, 730-732.

Christy, J.R., W.B. Norris and R.T. McNider, 2009: Surface temperature variations in East Africa and possible causes. J. Clim. 22, DOI: 10.1175/2008JCLI2726.1.

Christy, J.R., W.B. Norris, K. Redmond and K. Gallo, 2006: Methodology and results of calculating central California surface temperature trends: Evidence of human-induced climate change? J. Climate, 19, 548-563.

Compo, G.P. et al. 2011. Review Article: The Twentieth Century Reanalysis Project. Q. J. R. Meteorol. Soc., 137, 1-28.

Driscoll, S., A. Bozzo, L.J. Gray, A. Robock and G.L. Stenchikov, 2012: Coupled Model Intercomparison Project 5 (CMIP5) simulations of climate following volcanic eruptions. J. Geophys. Res., 117, doi:10.1029/2012JD017607.

Esper, J., D.C. Frank, M. Timonen, E. Zorita, R.J.S. Wilson, J. Luterbacher, S. Holzkamper, N. Fischer, S. Wagner, D. Nievergelt, A. Verstege and U. Buntgen, 2012: Orbital forcing of tree-ring data. Nature, Clim. Chg. DOI:10.1038/nclimate1589.

Fall, S., A. Watts, J. Nielsen-Gammon, E. Jones, D. Niyogi, J.R. Christy, and R.A. Pielke Sr., 2011: Analysis of the impacts of station exposure on the U.S. Historical Climatology Network temperatures and temperature trends. J. Geophys. Res., 116, D14120, doi:10.1029/2010JD015146.

Hansen, J., M. Sato and R. Ruedy, 2012: Perception of climate change. Proc. Nat. Acad. Sci. doi:10.1073/pnas.1205276109.

Hansen, J., et al., 2007: Climate change and trace gases. Phil. Trans. R. Soc. A, 365, 1925-1954.

Jacobson, M.Z., 2006: Effects of externally-through-internally-mixed soot inclusions within clouds and precipitation on global climate. J. Phys. Chem. A, 110, 6860-6873.

Klotzbach, P.J., R.A. Pielke Sr., R.A. Pielke Jr., J.R. Christy and R.T. McNider, 2010: Correction to "An alternative explanation for differential temperature trends at the surface and in the lower troposphere." J. Geophys. Res., doi:10.1029/2009JD013655.

Leroy, M., 1999: Classification d’un site, Note Tech. 35, 12 pp. Dir. Des Syst. D’Obs., Meteo-France, Trappes, France.

Leroy, M., 2010: Siting classification for surface observing stations on land. JMA/WMO Workshop on Quality Management in Surface, Climate, and Upper-air Observations, Tokyo, Japan, 27-30 July 2010.

Lindstrom, Susan G. 1990. Submerged Tree Stumps as Indicators of Mid-Holocene Aridity in the Lake Tahoe Region. Journal of California and Great Basin Anthropology. 12(2):146-157.

Liu, J. and J.A. Curry, 2004: Recent Arctic sea ice variability: Connections to the Arctic Oscillation and ENSO. Geophys. Res. Lett., 31, doi:10.1029/2004GL019858.


McNider, R.T., G.J. Steeneveld, A.A.M. Holtslag, R.A. Pielke Sr., S. Mackaro, A. Pour-Biazar, J. Walters, U. Nair and J.R. Christy, 2012: Response and sensitivity of the nocturnal boundary layer over land to added longwave radiative forcing. J. Geophys. Res., in press.

Meehl, G.A., C. Tebaldi, G. Walton, D. Easterling, and L. McDaniel, 2009: The relative increase of record high maximum temperatures compared to record low minimum temperatures in the U.S. Geophys. Res. Lett.

Michaels, P., Editor, 2012: ADDENDUM: Global Climate Change Impacts in the United States. CATO Institute. 213 pp.

Muhs, D.R., 1985: Age and paleoclimatic significance of Holocene sand dunes in northeastern Colorado. Annals Assoc. Amer. Geographers, 75, 566-582.

Muhs, D.R. and V.T. Holliday, 1995: Evidence of active dune sand on the Great Plains in the 19th century from accounts of early explorers. Quaternary Res., 43, 198-208.

Muller, R.A., J. Wurtele, R. Rohde, R. Jacobsen, S. Perlmutter, A. Rosenfeld, J. Curry, D. Groom and C. Wickham, 2012: Earth atmospheric land surface temperature and station quality in the contiguous United States. J. Geophys. Res., submitted.

Piechota, T., J. Timilsena, G. Tootle and H. Hidalgo, 2004: The western U.S. drought: How bad is it? EOS Transactions, AGU, 85, 301-308.

Schmeisser, R.L., 2009: Reconstruction of paleoclimate conditions and times of the last dune reactivation in the Nebraska Sand Hills. University of Nebraska – Lincoln. Paper AAI3352250.

Stephens, G., et al., 2010: The dreary state of precipitation in global models. J. Geophys. Res., 115, doi:10.1029/2010JD014532.

Wallace, J.M., Q. Fu, B.V. Smoliak, P. Lin and C.M. Johanson, 2012: Simulated versus observed patterns of warming over the extratropical Northern Hemisphere continents during the cold season. Proc. Nat. Acad. Sci., doi:10.1073/pnas.1204875109.

Zwally, H.J., J. Li, J. Robbins, J.L. Saba, D. Yi, A. Brenner, and D. Bromwich, 2012: Mass gains of the Antarctic ice sheet exceed losses. SCAR ISMASS Workshop, 14 July 2012.

