OKIsItJustMe's Journal

James Hansen Webinar: An Intimate Conversation with Leading Climate Scientists To Discuss New Research on Global Warming
NREL: The Four Phases of Storage Deployment: A Framework for the Expanding Role of Storage in the U.S. Power System
(Please note, this is a publication of a national research lab. Copyright concerns are nil.)
Storage Futures Study
The Four Phases of Storage Deployment: A Framework for the Expanding Role of Storage in the U.S. Power System (PDF)
Preface
This report is one in a series of NREL's Storage Futures Study (SFS) publications. The SFS is a multiyear research project that explores the role and impact of energy storage in the evolution and operation of the U.S. power sector. The SFS is designed to examine the potential impact of energy storage technology advancement on the deployment of utility-scale storage and the adoption of distributed storage, and the implications for future power system infrastructure investment and operations. The research findings and supporting data will be published as a series of publications. The table on the next page lists the planned publications and specific research topics they will examine under the SFS.
This report, the first in the SFS series, explores the roles and opportunities for new, cost-competitive stationary energy storage with a conceptual framework based on four phases of current and potential future storage deployment, and presents a value proposition for energy storage that could result in substantial new cost-effective deployments. This conceptual framework provides a broader context for consideration of the later reports in the series, including the detailed results of the modeling and analysis of power system evolution scenarios and their operational implications.
The SFS series provides data and analysis in support of the U.S. Department of Energy's Energy Storage Grand Challenge, a comprehensive program to accelerate the development, commercialization, and utilization of next-generation energy storage technologies and sustain American global leadership in energy storage. The Energy Storage Grand Challenge employs a use case framework to ensure storage technologies can cost-effectively meet specific needs, and it incorporates a broad range of technologies in several categories: electrochemical, electromechanical, thermal, flexible generation, flexible buildings, and power electronics.
More information, any supporting data associated with this report, links to other reports in the series, and other information about the broader study are available at https://www.nrel.gov/analysis/storage-futures.html.
Executive Summary
The U.S. electricity system currently has about 24 GW of stationary energy storage, most of it in the form of pumped storage hydropower (PSH). Given changing technologies and market conditions, the deployment expected in the coming decades is likely to include a mix of technologies. Declining costs of energy storage are increasing the likelihood that storage will grow in importance in the U.S. power system. This work uses insights from recent deployment trends, projections, and analyses to develop a framework that characterizes the value proposition of storage, helping utilities, regulators, and developers prepare for the role storage might play and understand the need for careful analysis to ensure cost-optimal storage deployment.
To explore the roles and opportunities for new cost-competitive stationary energy storage, we use a conceptual framework based on four phases of current and potential future storage deployment (see Table ES-1). The four phases, which progress from shorter to longer duration, link the key metric of storage duration to possible future deployment opportunities, considering how the cost and value vary as a function of duration.
The 23 GW of PSH in the United States was built mostly before 1990 to provide peaking capacity and energy time-shifting for large, less flexible capacity. The economics of PSH allowed for deployment with multiple hours of capacity that allowed it to provide multiple grid services. These plants continue to provide valuable grid services that span the four phases framework, and their use has evolved to respond to a changing grid. However, a variety of factors led to a multidecade pause in new development with little storage deployment occurring from about 1990 until 2011.¹
Changing market conditions, such as the introduction of wholesale electricity markets, and new technologies suggest that storage deployment since 2011 may follow a somewhat different path, diverging from the earlier deployment of exclusively 8+ hour PSH. Instead, more recent storage deployment has largely begun with shorter durations, and we anticipate that new storage deployment will follow a trend of increasing durations.
We characterize this trend in our four phases framework, which captures how both the cost and value of storage change as a function of duration. Many storage technologies have a significant cost associated with increasing the duration, or the energy stored per unit of power capacity. In contrast, the value of most grid services does not necessarily increase with asset duration: it may stop increasing beyond a certain duration, or increase at a rapidly diminishing rate. As a result, the economic performance of most storage technologies will decline rapidly beyond a certain duration. In current U.S. electricity markets, the value of many grid services can be captured by relatively short-duration storage (less than 1 hour for most operating reserves, or 4 hours for capacity).
Together, the increasing cost of storage with duration and the lack of incremental value with increasing storage duration will likely contribute to growth of storage in the U.S. power sector that is characterized by a progression of deployments that aligns duration with specific services and storage technologies.
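This cost-versus-value saturation can be sketched with toy numbers. Everything below is a hypothetical illustration, not NREL's model or data: a fixed power-related cost, a linear per-kWh duration cost, and a capacity value that stops growing once duration covers an assumed 4-hour peak.

```python
# Illustrative sketch of storage duration economics.
# All numbers are hypothetical assumptions, not NREL figures.

POWER_COST = 300       # $/kW, power-related equipment (assumed)
ENERGY_COST = 150      # $/kWh, cost of each added hour of duration (assumed)
CAPACITY_VALUE = 1000  # $/kW, value of firm peaking capacity (assumed)
PEAK_HOURS = 4         # assumed length of the peak-demand period

def cost(duration_h):
    """Cost of 1 kW of storage rises linearly with duration."""
    return POWER_COST + ENERGY_COST * duration_h

def value(duration_h):
    """Capacity value stops growing once duration covers the peak."""
    return CAPACITY_VALUE * min(duration_h / PEAK_HOURS, 1.0)

net = {h: value(h) - cost(h) for h in range(1, 9)}
best = max(net, key=net.get)  # net value is highest when duration matches the peak
```

Under these assumptions, net value peaks at 4 hours and falls for every hour beyond the peak, which is the saturation effect the report describes.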
The four phases conceptual framework introduced in this work is a simplification of a more complicated evolution of the stationary energy storage industry and the power system as a whole. While we present four distinct phases, the boundaries between each phase will be somewhat indistinct and transitions between phases will occur at different times in different regions as various markets for specific services are saturated, and phases can overlap within a region. These transitions and the total market sizes are strongly influenced by the regional deployment of variable renewable energy (VRE) as well as hybrid deployments. However, we believe it is a useful framework to consider the role of different storage technologies, and particularly the importance of duration in driving adoption in each phase.
Phase 1, which began around 2011, is characterized by the deployment of storage with 1-hour or shorter duration, and it resulted from the emergence of restructured markets and new technologies that allow for cost-competitive provision of operating reserves, including regulating reserves. Potential deployment of short-duration storage in Phase 1 is bounded by the overall requirements for operating reserves, which is less than 30 GW in the United States even when including regulating reserves, spinning contingency reserves, and frequency responsive reserves, some of which are not yet widely compensated services.
Phase 2 is characterized by the deployment of storage with 2–6 hours of discharge duration to serve as peaking capacity. Phase 2 has begun in some regions, with lithium-ion batteries becoming cost-competitive where durations of 2–6 hours are sufficient to provide reliable peaking capacity. As prices continue to fall, batteries are expected to become cost-competitive in more locations. These storage assets derive much of their value from the replacement of traditional peaking resources (primarily natural gas-fired combustion turbines), but they also derive value from time-shifting/energy arbitrage of energy supply. The potential opportunities of Phase 2 are limited by the local or regional length of the peak demand period and have a lower bound of about 40 GW. However, the length of peak demand is strongly affected by the deployment of VRE, specifically solar photovoltaics (PV), which narrows the peak demand period. Phase 2 is characterized in part by the positive feedback between PV increasing the value of storage (increasing its ability to provide capacity) and storage increasing the value of PV (increasing its energy value by shifting its output to periods of greater demand). Thus, greater deployment of solar PV could extend the storage potential of Phase 2 to more than 100 GW in the United States in scenarios where 25% of the nation's electricity is derived from solar.
Phase 3 is less distinct, but is characterized by lower costs and technology improvements that enable storage to be cost-competitive while serving longer-duration (4–12 hour) peaks. These longer net-load peaks can result from the addition of substantial 2–6 hour storage deployed in Phase 2. Deployment in Phase 3 could include a variety of new technologies and could also see a reemergence of pumped storage, taking advantage of new technologies that reduce costs and siting constraints while exploiting the 8+ hour durations typical of many pumped storage facilities. The technology options for Phase 3 include next-generation compressed air and various thermal or mechanical-based storage technologies. Also, storage in this phase might provide additional sources of value, such as transmission deferral and additional time-shifting of solar and wind generation to address diurnal mismatches of supply and demand. Our scenario analysis identified 100 GW or more of potential opportunities for Phase 3 in the United States, in addition to the existing PSH that provides valuable capacity in several regions. Of note for both Phases 2 and 3 is a likely mix of configurations, with some stand-alone storage, but also a potentially significant fraction of storage deployments associated with hybrid plants, where storage can take advantage of tax credits, or shared capital and operating expenses. As in Phase 2, additional VRE, especially solar PV, could extend the storage potential of Phase 3, enabling contributions of VRE exceeding 50% on an annual basis.
Phase 4 is the most uncertain of our phases. It characterizes a possible future in which storage with durations from days to months is used to achieve very high levels of renewable energy (RE) in the power sector, or as part of multisector decarbonization. Technology options in this space include production of liquid and gaseous fuels, which can be stored in large underground formations that enable extremely long-duration storage with very low loss rates. This low loss rate allows for seasonal shifting of RE supply and generation of a carbon-free fuel for industrial processes and feedstocks. Phase 4 technologies are generally characterized by high power-related costs associated with fuel production and use, but very low duration-related costs. Thus, traditional metrics such as cost per kilowatt-hour of storage capacity are less useful, and this, combined with the potential use of fuels for non-electric-sector applications, makes comparison of Phase 4 technologies with other storage technologies more difficult. The potential opportunities for Phase 4 technologies measure in the hundreds of gigawatts in the United States, and these technologies could address the residual demand that is very difficult or expensive to meet with the RE resources and storage deployed in Phases 1–3.
Our four phases framework is intended to describe a plausible evolution of cost-competitive storage technologies, but more importantly, it identifies key elements needed for stakeholders to evaluate alternative pathways for both storage and other sources of system flexibility. Specifically, an improved characterization of the various grid services needed, including capacity and duration, could help provide a deeper understanding of the tradeoffs between various technologies and non-storage resources such as responsive demand. Such a characterization would help ensure the mix of flexibility technologies deployed is robust to an evolving grid, which will ultimately determine the amount of storage and flexibility the power system will need.
James Hansen: To Understand and Protect the Home Planet
To Understand and Protect the Home Planet (PDF)
James Hansen
"Global Warming in the Pipeline" will be published next week in Oxford Open Climate Change, an Oxford University Press journal. The paper describes a perspective on global climate change that is an alternative to that of the Intergovernmental Panel on Climate Change (IPCC), which provides scientific advice on climate change to the United Nations.
Our paper may be read as being critical of IPCC. But we have no criticism of individual scientists, who include world-leading researchers volunteering their time to produce IPCC reports. Rather we are questioning whether the IPCC procedure and product yield the advice that the public, especially young people, need to understand and protect their home planet.
Discussion of our paper will likely focus on differences between our conclusions and those of IPCC. I hope, however, that it may lead to consideration of some basic underlying matters.
Three-pronged analysis. IPCC climate analysis leans heavily on GCMs (global climate models), too heavily in my opinion. We prefer a comparable weight on (1) information from Earth's paleoclimate history, (2) GCMs, and (3) observations of ongoing climate processes and climate change. This 3-pronged approach can result in rather complex papers, but so, too, is the real world complex. We used this 3-pronged approach both in the heavily peer-reviewed paper "Ice Melt, Sea Level Rise, and Superstorms," published in 2016, and in our present "Global Warming in the Pipeline" (these papers hereinafter abbreviated as Ice Melt and Pipeline, respectively). Below I note specific travails and consequences for the Ice Melt paper that resulted from the fact that our 3-pronged approach differed from that of IPCC. I hope that some explanation here may help avoid a similar fate for Pipeline, as the world is running short on time to develop a strategy to preserve a propitious climate for today's young people and their children.
The Department of Energy Organization Act of 1977
https://www.energy.gov/sites/prod/files/2017/10/f38/DOE%20Organization%20Act%20in%20U.S.C..pdf

§7111. Congressional findings
The Congress of the United States finds that:
- the United States faces an increasing shortage of nonrenewable energy resources;
- this energy shortage and our increasing dependence on foreign energy supplies present a serious threat to the national security of the United States and to the health, safety and welfare of its citizens;
- a strong national energy program is needed to meet the present and future energy needs of the Nation consistent with overall national economic, environmental and social goals;
- responsibility for energy policy, regulation, and research, development and demonstration is fragmented in many departments and agencies and thus does not allow for the comprehensive, centralized focus necessary for effective coordination of energy supply and conservation programs; and
- formulation and implementation of a national energy program require the integration of major Federal energy functions into a single department in the executive branch.

(Pub. L. 95–91, title I, §101, Aug. 4, 1977, 91 Stat. 567.)
§7112. Congressional declaration of purpose
The Congress therefore declares that the establishment of a Department of Energy is in the public interest and will promote the general welfare by assuring coordinated and effective administration of Federal energy policy and programs. It is the purpose of this chapter:

(Pub. L. 95–91, title I, §102, Aug. 4, 1977, 91 Stat. 567; Pub. L. 101–510, div. C, title XXXI, §3163, Nov. 5, 1990, 104 Stat. 1841.)
- To establish a Department of Energy in the executive branch.
- To achieve, through the Department, effective management of energy functions of the Federal Government, including consultation with the heads of other Federal departments and agencies in order to encourage them to establish and observe policies consistent with a coordinated energy policy, and to promote maximum possible energy conservation measures in connection with the activities within their respective jurisdictions.
- To provide for a mechanism through which a coordinated national energy policy can be formulated and implemented to deal with the short-, mid- and long-term energy problems of the Nation; and to develop plans and programs for dealing with domestic energy production and import shortages.
- To create and implement a comprehensive energy conservation strategy that will receive the highest priority in the national energy program.
- To carry out the planning, coordination, support, and management of a balanced and comprehensive energy research and development program, including
- assessing the requirements for energy research and development;
- developing priorities necessary to meet those requirements;
- undertaking programs for the optimal development of the various forms of energy production and conservation; and
- disseminating information resulting from such programs, including disseminating information on the commercial feasibility and use of energy from fossil, nuclear, solar, geothermal, and other energy technologies.
- To place major emphasis on the development and commercial use of solar, geothermal, recycling and other technologies utilizing renewable energy resources.
- To continue and improve the effectiveness and objectivity of a central energy data collection and analysis program within the Department.
- To facilitate establishment of an effective strategy for distributing and allocating fuels in periods of short supply and to provide for the administration of a national energy supply reserve.
- To promote the interests of consumers through the provision of an adequate and reliable supply of energy at the lowest reasonable cost.
- To establish and implement through the Department, in coordination with the Secretaries of State, Treasury, and Defense, policies regarding international energy issues that have a direct impact on research, development, utilization, supply, and conservation of energy in the United States and to undertake activities involving the integration of domestic and foreign policy relating to energy, including provision of independent technical advice to the President on international negotiations involving energy resources, energy technologies, or nuclear weapons issues, except that the Secretary of State shall continue to exercise primary authority for the conduct of foreign policy relating to energy and nuclear nonproliferation, pursuant to policy guidelines established by the President.
- To provide for the cooperation of Federal, State, and local governments in the development and implementation of national energy policies and programs.
- To foster and assure competition among parties engaged in the supply of energy and fuels.
- To assure incorporation of national environmental protection goals in the formulation and implementation of energy programs, and to advance the goals of restoring, protecting, and enhancing environmental quality, and assuring public health and safety.
- To assure, to the maximum extent practicable, that the productive capacity of private enterprise shall be utilized in the development and achievement of the policies and purposes of this chapter.
- To provide for, encourage, and assist public participation in the development and enforcement of national energy programs.
- To create an awareness of, and responsibility for, the fuel and energy needs of rural and urban residents as such needs pertain to home heating and cooling, transportation, agricultural production, electrical generation, conservation, and research and development.
- To foster insofar as possible the continued good health of the Nation's small business firms, public utility districts, municipal utilities, and private cooperatives involved in energy production, transportation, research, development, demonstration, marketing, and merchandising.
- To provide for the administration of the functions of the Energy Research and Development Administration related to nuclear weapons and national security which are transferred to the Department by this chapter.
- To ensure that the Department can continue current support of mathematics, science, and engineering education programs by using the personnel, facilities, equipment, and resources of its laboratories and by working with State and local education agencies, institutions of higher education, and business and industry. The Department's involvement in mathematics, science, and engineering education should be consistent with its main mission and should be coordinated with all Federal efforts in mathematics, science, and engineering education, especially with the Department of Education and the National Science Foundation (which have the primary Federal responsibility for mathematics, science, and engineering education).
NREL: 100% Clean Electricity by 2035 Study
(Please note: this is a publication of the National Renewable Energy Laboratory (NREL). Copyright concerns are nil.)
This is the goal. No major technological breakthroughs are required, just commitment and a lot of work. Four paths are explored. Pick your favorite, or a combination.
https://www.nrel.gov/analysis/100-percent-clean-electricity-by-2035-study.html
An NREL study shows there are multiple pathways to 100% clean electricity by 2035 that would produce significant benefits exceeding the additional power system costs.
For the study, funded by the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy, NREL modeled technology deployment, costs, benefits, and challenges to decarbonize the U.S. power sector by 2035, evaluating a range of future scenarios to achieve a net-zero power grid by 2035.
The exact technology mix and costs will be determined by research and development, among other factors, over the next decade. The results are published in Examining Supply-Side Options To Achieve 100% Clean Electricity by 2035.
Scenario Approach
To examine what it would take to achieve a net-zero U.S. power grid by 2035, NREL leveraged decades of research on high-renewable power systems, from the Renewable Electricity Futures Study, to the Storage Futures Study, to the Los Angeles 100% Renewable Energy Study, to the Electrification Futures Study, and more.
NREL used its publicly available flagship Regional Energy Deployment System capacity expansion model to study supply-side scenarios representing a range of possible pathways to a net-zero power grid by 2035, from the most to the least optimistic availability and costs of technologies.
The scenarios apply a carbon constraint to:
- Achieve 100% clean electricity by 2035 under accelerated demand electrification
- Reduce economywide, energy-related emissions by 62% in 2035 relative to 2005 levels, a steppingstone to economywide decarbonization by 2050.
Key Findings
Technology Deployment Must Rapidly Scale Up
In all modeled scenarios, new clean energy technologies are deployed at an unprecedented scale and rate to achieve 100% clean electricity by 2035. As modeled, wind and solar energy provide 60%–80% of generation in the least-cost electricity mix in 2035, and overall generation capacity grows to roughly three times the 2020 level by 2035, including a combined 2 terawatts of wind and solar.
To achieve those levels would require rapid and sustained growth in installations of solar and wind generation capacity. If there are challenges with siting and land use to be able to deploy this new generation capacity and associated transmission, nuclear capacity helps make up the difference and more than doubles today's installed capacity by 2035.
Across the four scenarios, 5–8 gigawatts of new hydropower and 3–5 gigawatts of new geothermal capacity are also deployed by 2035. Diurnal storage (2–12 hours of capacity) also increases across all scenarios, with 120–350 gigawatts deployed by 2035 to ensure demand for electricity is met during all hours of the year.
Seasonal storage becomes important when clean electricity makes up about 80%–95% of generation and there is a multiday-to-seasonal mismatch between variable renewable supply and demand. Across the scenarios, seasonal storage capacity in 2035 ranges from about 100 to 680 gigawatts.
Significant additional research is needed to understand the manufacturing and supply chain associated with the unprecedented deployment envisioned in the scenarios.
Significant Additional Transmission Capacity
In all scenarios, significant transmission is also added in many locations, mostly to deliver energy from wind-rich regions to major load centers in the eastern United States. As modeled, the total transmission capacity in 2035 is one to almost three times today's capacity, which would require between 1,400 and 10,100 miles of new high-capacity lines per year, assuming new construction starts in 2026.
Climate and Health Benefits of Decarbonization Offset the Costs
NREL finds in all modeled scenarios the health and climate benefits associated with fewer emissions offset the power system costs to get to 100% clean electricity.
Decarbonizing the power grid by 2035 could add $330 billion to $740 billion in power system costs, depending on restrictions on new transmission and other infrastructure development. However, petroleum use in transportation and natural gas use in buildings and industry fall substantially by 2035. As a result, up to 130,000 premature deaths are avoided by 2035, worth between $390 billion and $400 billion in avoided mortality costs.
When factoring in the avoided cost of damage from floods, drought, wildfires, and hurricanes due to climate change, the United States could save over an additional $1.2 trillion, totaling an overall net benefit to society ranging from $920 billion to $1.2 trillion.
Necessary Actions To Achieve 100% Clean Electricity
The transition to a 100% clean electricity U.S. power system will require more than reduced technology costs. Several key actions will need to take place in the coming decade:
- Dramatic acceleration of electrification and increased efficiency in demand
- New energy infrastructure installed rapidly throughout the country
- Expanded clean technology manufacturing and the supply chain
- Continued research, development, demonstration, and deployment to bring emerging technologies to the market.
Failing to achieve any of the key actions could increase the difficulty of realizing the scenarios outlined in the study.
I have decided that the "birth control"/"climate change" meme is racist
The US has a relatively low total fertility rate (births per woman) compared to Africa:
Map of countries by fertility rate (2018), according to CIA World Factbook
Yet, the US is the primary source of carbon dioxide emissions:
Countries by carbon dioxide emissions in thousands of tonnes per annum, via the burning of fossil fuels (blue the highest and green the lowest).
The reason is our very high per capita CO₂ emissions:
Birth rates clearly are not the cause of "climate change."
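Rough arithmetic makes the point: total emissions are population times per-capita emissions, and the per-capita term dominates the comparison. The population and per-capita figures below are approximate, order-of-magnitude values used only for illustration.

```python
# Rough, order-of-magnitude arithmetic: total emissions = population x
# per-capita emissions. Figures are approximate illustrations.

population_millions = {"United States": 330, "Africa": 1300}
tonnes_co2_per_person = {"United States": 15.0, "Africa": 1.1}  # approx.

total_mt_co2 = {
    region: population_millions[region] * tonnes_co2_per_person[region]
    for region in population_millions
}
# US ~4,950 Mt vs. Africa ~1,430 Mt: roughly a quarter of the population,
# several times the emissions.
```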
By harping about "birth control," US citizens can blame Africans for "climate change" because their birth rate is so high. If our per capita CO₂ emissions matched Africa's, we wouldn't be in this predicament.
That's why I have decided the meme is racist in nature.
It's also a convenient excuse not to do something difficult, like cutting our per capita emissions. After all, such efforts are useless if "those Africans" are going to keep breeding. (Right?)
New Studies Increase Confidence in NASA's Measure of Earth's Temperature
(Please note: Story from NASA; copyright concerns are nil.)
https://climate.nasa.gov/news/2876/
By Jessica Merzdorf,
NASA's Goddard Space Flight Center
Earth's long-term warming trend can be seen in this visualization of NASA's global temperature record, which shows how the planet's temperatures are changing over time, compared to a baseline average from 1951 to 1980. The record is shown as a running five-year average. Credit: NASA's Scientific Visualization Studio/Kathryn Mersmann.
A new assessment of NASA's record of global temperatures revealed that the agency's estimate of Earth's long-term temperature rise in recent decades is accurate to within less than a tenth of a degree Fahrenheit, providing confidence that past and future research is correctly capturing rising surface temperatures.
The most complete assessment ever of statistical uncertainty within the GISS Surface Temperature Analysis (GISTEMP) data product shows that the annual values are likely accurate to within 0.09 degrees Fahrenheit (0.05 degrees Celsius) in recent decades, and 0.27 degrees Fahrenheit (0.15 degrees C) at the beginning of the nearly 140-year record.
This data record, maintained by NASA's Goddard Institute for Space Studies (GISS) in New York City, is one of a handful kept by major science institutions around the world that track Earth's temperature and how it has risen in recent decades. This global temperature record has provided one of the most direct benchmarks of how our home planet's climate has changed as greenhouse gas concentrations rise.
The study also confirms what researchers have been saying for some time now: that Earth's global temperature increase since 1880 (about 2 degrees Fahrenheit, or a little more than 1 degree Celsius) cannot be explained by any uncertainty or error in the data. Going forward, this assessment will give scientists the tools to explain their results with greater confidence.
GISTEMP is a widely used index of global mean surface temperature anomaly: it shows how much warmer or cooler than normal Earth's surface is in a given year. "Normal" is defined as the average during a baseline period of 1951-80.
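The anomaly idea itself is simple to state in code. This is a minimal sketch with invented temperatures; GISTEMP's real processing is far more involved.

```python
# Minimal anomaly calculation: observed temperature minus the mean of a
# baseline period. Temperatures here are made up for illustration.

baseline_temps_c = [13.9, 14.1] * 15   # pretend 1951-1980 annual means (invented)
baseline_mean = sum(baseline_temps_c) / len(baseline_temps_c)  # 14.0

def anomaly(observed_c):
    """Positive = warmer than the 1951-80 baseline; negative = cooler."""
    return observed_c - baseline_mean

anomaly(15.0)  # about +1.0 degC, the scale of warming since 1880
```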
NASA uses GISTEMP in its annual global temperature update, in partnership with the National Oceanic and Atmospheric Administration. (In 2019, NASA and NOAA found that 2018 was the fourth-warmest year on record, with 2016 holding the top spot.) The index includes land and sea surface temperature data back to 1880, and today incorporates measurements from 6,300 weather stations, research stations, ships and buoys around the world.
Previously, GISTEMP provided an estimate of uncertainty accounting for the spatial gaps between weather stations. Like other surface temperature records, GISTEMP estimates the temperatures between weather stations using data from the closest stations, a process called interpolation. Quantifying the statistical uncertainty present in those estimates helped researchers to be confident that the interpolation was accurate.
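As a simplified illustration of what interpolating between stations means (this is the generic idea only, not GISTEMP's actual algorithm), one can weight each nearby station's reading by the inverse of its distance to the target point. The station coordinates and values below are invented.

```python
# Simplified inverse-distance interpolation between stations.
# An illustration of the general idea, not GISTEMP's actual method.

def interpolate(target, stations):
    """Estimate a value at `target` (x, y) from (x, y, value) stations,
    weighting each station by 1/distance."""
    num = den = 0.0
    for x, y, value in stations:
        d = ((x - target[0]) ** 2 + (y - target[1]) ** 2) ** 0.5
        if d == 0.0:
            return value  # exactly at a station: use its reading
        weight = 1.0 / d
        num += weight * value
        den += weight
    return num / den

stations = [(0, 0, 1.0), (2, 0, 2.0)]  # two stations, invented anomalies
interpolate((1, 0), stations)  # midpoint -> 1.5
```

Points closer to a station land closer to that station's value, which is the behavior the quantified uncertainty has to account for in the gaps.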
"Uncertainty is important to understand because we know that in the real world we don't know everything perfectly," said Gavin Schmidt, director of GISS and a co-author on the study. "All science is based on knowing the limitations of the numbers that you come up with, and those uncertainties can determine whether what you're seeing is a shift or a change that is actually important."
The study found that individual and systematic changes in measuring temperature over time were the most significant source of uncertainty. Also contributing was the degree of weather station coverage. Data interpolation between stations contributed some uncertainty, as did the process of standardizing data that was collected with different methods at different points in history.
After adding these components together, GISTEMP's uncertainty value in recent years was still less than a tenth of a degree Fahrenheit, which is "very small," Schmidt said.
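When independent uncertainty components are combined, a standard approach is to add them in quadrature (root-sum-square) rather than summing them directly. A sketch with invented component values, not NASA's actual estimates:

```python
import math

# Independent uncertainty components are commonly combined in
# quadrature (root-sum-square). The component values below are
# invented for illustration; they are not NASA's actual numbers.

def combined_uncertainty(components):
    """Root-sum-square of independent uncertainty components."""
    return math.sqrt(sum(c * c for c in components))

# hypothetical components (deg F): station changes, coverage,
# interpolation between stations
total = combined_uncertainty([0.05, 0.04, 0.03])
print(round(total, 3))  # ~0.071, i.e. under a tenth of a degree
```

Note that the quadrature total is smaller than the plain sum of the components, because independent errors partially cancel.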
The team used the updated model to reaffirm that 2016 was very probably the warmest year in the record, with an 86.2 percent likelihood. The next most likely candidate for warmest year on record was 2017, with a 12.5 percent probability.
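Statements like "86.2 percent likelihood of being the warmest year" can be illustrated with a toy Monte Carlo: treat each year's anomaly as a Gaussian centered on its estimate, sample repeatedly, and count how often each year comes out on top. The anomalies and uncertainty below are invented, and the real analysis is far more careful than this sketch:

```python
import random

# Toy Monte Carlo of "probability a given year was the warmest":
# sample each year's anomaly from a Gaussian and tally the winner.
# Anomaly values and sigma are invented for illustration.

def warmest_year_probs(anomalies, sigma, trials=20000, seed=0):
    """anomalies: {year: estimated anomaly}. Returns {year: win fraction}."""
    rng = random.Random(seed)
    wins = {year: 0 for year in anomalies}
    for _ in range(trials):
        draws = {y: rng.gauss(a, sigma) for y, a in anomalies.items()}
        wins[max(draws, key=draws.get)] += 1
    return {y: w / trials for y, w in wins.items()}

probs = warmest_year_probs({2016: 1.02, 2017: 0.92, 2015: 0.90}, sigma=0.05)
# 2016, with the highest central estimate, wins most of the time
```

The win fractions sum to 1, and the year with the highest central estimate dominates unless the uncertainty is large relative to the gaps between years.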
"We've made the uncertainty quantification more rigorous, and the conclusion to come out of the study was that we can have confidence in the accuracy of our global temperature series," said lead author Nathan Lenssen, a doctoral student at Columbia University. "We don't have to restate any conclusions based on this analysis."
Another recent study evaluated GISTEMP in a different way that also added confidence to its estimate of long-term warming. A paper published in March 2019, led by Joel Susskind of NASA's Goddard Space Flight Center, compared GISTEMP data with that of the Atmospheric Infrared Sounder (AIRS), onboard NASA's Aqua satellite.
GISTEMP uses air temperature recorded with thermometers slightly above the ground or sea, while AIRS uses infrared sensing to measure the temperature right at the Earth's surface (or skin temperature) from space. The AIRS record of temperature change since 2003 (which begins when Aqua launched) closely matched the GISTEMP record.
Comparing two measurements that were similar but recorded in very different ways ensured that they were independent of each other, Schmidt said. One difference was that AIRS showed more warming in the northernmost latitudes.
"The Arctic is one of the places we already detected was warming the most. The AIRS data suggests that it's warming even faster than we thought," said Schmidt, who was also a co-author on the Susskind paper.
Taken together, Schmidt said, the two studies help establish GISTEMP as a reliable index for current and future climate research.
"Each of those is a way in which you can try and provide evidence that what you're doing is real," Schmidt said. "We're testing the robustness of the method itself, the robustness of the assumptions, and of the final result against a totally independent data set."
In all cases, he said, the resulting trends are more robust than what can be accounted for by any uncertainty in the data or methods.
Access the paper here.
Energy Flow Diagrams
https://flowcharts.llnl.gov/commodities/energy
2018: United States
2018 Fourth Warmest Year in Continued Warming Trend, According to NASA, NOAA
(Please note, this is a NASA press release. Copyright concerns are nil.)
https://www.nasa.gov/press-release/2018-fourth-warmest-year-in-continued-warming-trend-according-to-nasa-noaa
RELEASE 19-002
2018 Fourth Warmest Year in Continued Warming Trend, According to NASA, NOAA
Earth's global surface temperatures in 2018 were the fourth warmest since 1880, according to independent analyses by NASA and the National Oceanic and Atmospheric Administration (NOAA).
Global temperatures in 2018 were 1.5 degrees Fahrenheit (0.83 degrees Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's Goddard Institute for Space Studies (GISS) in New York. Globally, 2018's temperatures rank behind those of 2016, 2017 and 2015. The past five years are, collectively, the warmest years in the modern record.
"2018 is yet again an extremely warm year on top of a long-term global warming trend," said GISS Director Gavin Schmidt.
Since the 1880s, the average global surface temperature has risen about 2 degrees Fahrenheit (1 degree Celsius). This warming has been driven in large part by increased emissions into the atmosphere of carbon dioxide and other greenhouse gases caused by human activities, according to Schmidt.
Earth's long-term warming trend can be seen in this visualization of NASA's global temperature record, which shows how the planet's temperatures are changing over time, compared to a baseline average from 1951 to 1980. The record is shown as a running five-year average.
Credits: NASA's Scientific Visualization Studio/Kathryn Mersmann
Download high-definition video and still imagery here.
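The "running five-year average" used to smooth the record is a centered moving mean over consecutive five-year windows. A minimal sketch with toy anomaly values (real GISTEMP data are available from data.giss.nasa.gov):

```python
# Centered moving average over a fixed window, as used to smooth a
# yearly anomaly series. Toy values only; not real GISTEMP data.

def running_mean(values, window=5):
    """Return the mean of each full consecutive window of `window` values."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

smooth = running_mean([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])  # -> [0.3, 0.4]
```

Smoothing this way damps year-to-year weather noise (such as El Niño swings) so the underlying trend is easier to see.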
Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2018 annual mean temperature for the contiguous 48 United States was the 14th warmest on record.
Warming trends are strongest in the Arctic region, where 2018 saw the continued loss of sea ice. In addition, mass loss from the Greenland and Antarctic ice sheets continued to contribute to sea level rise. Increasing temperatures can also contribute to longer fire seasons and some extreme weather events, according to Schmidt.
"The impacts of long-term global warming are already being felt in coastal flooding, heat waves, intense precipitation and ecosystem change," said Schmidt.
NASA's temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.
This line plot shows yearly temperature anomalies from 1880 to 2018, with respect to the 1951-1980 mean, as recorded by NASA, NOAA, the Japan Meteorological Agency, the Berkeley Earth research group, and the Met Office Hadley Centre (UK). Though there are minor variations from year to year, all five temperature records show peaks and valleys in sync with each other. All show rapid warming in the past few decades, and all show the past decade has been the warmest.
Credits: NASA's Earth Observatory
These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heat island effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
Because weather station locations and measurement practices change over time, the interpretation of specific year-to-year global mean temperature differences has some uncertainties. Taking this into account, NASA estimates that 2018's global mean change is accurate to within 0.1 degree Fahrenheit, with a 95 percent certainty level.
NOAA scientists used much of the same raw temperature data, but with a different baseline period and a different interpolation into the Earth's polar and other data-poor regions. NOAA's analysis found 2018 global temperatures were 1.42 degrees Fahrenheit (0.79 degrees Celsius) above the 20th century average.
NASA's full 2018 surface temperature data set and the complete methodology used to make the temperature calculation are available at:
https://data.giss.nasa.gov/gistemp
GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.
NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based monitoring, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
For more information about NASA's Earth science missions, visit:
https://www.nasa.gov/earth
The slides for the Feb. 6 news conference are available at:
https://www.nasa.gov/sites/default/files/atoms/files/noaa-nasa_global_analysis-2018-final_feb6.pdf
NOAA's Global Report is available at:
http://bit.ly/Global201812
-end-
Steve Cole
Headquarters, Washington
202-358-0918
stephen.e.cole@nasa.gov
Last Updated: Feb. 6, 2019
Editor: Sean Potter
Diffusing the methane bomb: We can still make a difference
http://www.iiasa.ac.at/web/home/about/news/190206-Tundra-methane.html
The Arctic is warming twice as fast as the rest of the planet, causing the carbon-containing permafrost that has been frozen for tens or hundreds of thousands of years to thaw and release methane into the atmosphere, thereby contributing to global warming. The findings of a study that included researchers from IIASA, however, suggest that it is still possible to neutralize this threat.
In their analysis, the researchers quantified the upper-range value for natural methane emissions that can be released from the Arctic tundra, which allows it to be compared with the much larger release of methane emissions from human activities. Although estimates of methane release from natural Arctic sources and from human activity have been presented separately in previous studies, this is the first time that the relative contributions of the two sources to global warming have been quantified and compared.
According to the researchers, their findings confirm the urgency of a transition away from a fossil-fuel-based society as well as the importance of reducing methane emissions from other sources, in particular livestock and waste. The results indicate that man-made emissions can be reduced sufficiently to limit methane-caused climate warming by 2100 even in the case of an uncontrolled natural Arctic methane emission feedback. This will, however, require a committed, global effort toward substantial, but feasible, reductions.
"In essence, we want to convey the message that the release of methane from human activities is something we can do something about, especially since the technology for drastic reductions is readily available, often even at a low cost. If we can only get the human emissions under control, the natural emissions should not have to be of major concern," concludes Höglund-Isaksson.