When Will the Obama Administration Possibly Approve the Keystone XL Pipeline?


The Keystone XL pipeline permit application has been delayed for multiple years. Despite changes to reroute the pipeline around the most environmentally sensitive watershed areas and approval from the affected State, the Administration required a new, expanded environmental review and implemented a new presidential permit process that pushed the permit decision well past the 2012 elections. The added Supplemental Environmental Impact Statement (SEIS), which should address all of the significant environmental issues, the change in project scope to mitigate major environmental risks, and the new focus on ‘national interest’ impacts took most of 2012 and all of 2013 to complete. The draft SEIS was recently issued, and the formal review and Public comment process should begin soon. The pending Obama Administration decision on whether to approve the Keystone XL pipeline project is now expected in mid-2014 at the earliest. That decision could, however, once again be delayed until after the November 2014 elections. When will the Obama Administration make a final decision on the Keystone XL project?

History of the Keystone XL Pipeline Permitting Process - The original Keystone XL pipeline permit application was submitted back in 2008. The Obama Administration delayed and eventually rejected the TransCanada Keystone XL cross-border project in early 2012. To supposedly make the permit review process more efficient, the Administration issued Executive Order 13604, which in practice added more complexity to the previous permit process. Before this new, modified permit process, pipeline reviews were the responsibility of 10 major Federal Agencies, primarily the DOT, DOE, DHS, BLM and the EPA. Executive Order 13604 directed the Secretary of State to address ‘national interests’ in addition to the normal environmental, safety and economic issues. The presidential permit process must now cover many additional factors, including energy security, cultural impacts, and foreign policy. Making this determination now requires engaging the DOD, DOJ, DOI and DOC in addition to the 10 Federal Agencies that had primary responsibility for pipeline project permit reviews before 2012.

Immediately following the Obama Administration’s denial of the original 2008 Keystone XL cross-border pipeline permit at the beginning of 2012, TransCanada modified the project scope and reapplied for a new pipeline construction permit. TransCanada changed the pipeline routing to avoid the most environmentally sensitive areas identified with the original project, primarily by bypassing the Nebraska Ogallala Aquifer watershed area. These changes were approved by the Nebraska Governor in January 2013. Following Nebraska’s approval, the Obama Administration initially decided to delay any Keystone XL decision until at least the second quarter of 2013. The primary reasons for this delay were the requirement to develop a new SEIS covering the environmental issues apparently not addressed in the original Environmental Impact Statement, and the evaluation of the project’s ‘national interest’ impacts. Slow development of the SEIS pushed the Keystone XL permit decision into 2014.

The draft Keystone XL SEIS was recently issued. The SEIS must next go through the Public and State/Federal Agency review process, which will take many months to complete: the initial 45-day Public comment period, followed by addressing the significant comments and new issues raised by the Public and the Agencies, and making the needed changes to the final report and analysis. The Administration should then have all the information necessary to make a final permit decision. Completing this process will likely take until at least mid-2014.

Major Keystone XL Pipeline Issues of Concern - The major issues surrounding the decision to approve or not approve the Keystone XL project are apparently environmental, climate change, economic, energy security, and foreign policy:

Environmental Impacts - Oil spills are normally considered the largest risk of transporting crude. Pipeline systems are most often used to minimize this risk since pipelines are the most reliable, safest and most efficient mode of transportation. Over 80% of all crude and petroleum oils (>20 million barrels per day; MBD) are routinely transported by pipeline within the U.S. year-round. The original (2008) Keystone pipeline routing crossed large sections of the Nebraska Sandhills and Ogallala Aquifer areas that were determined to be at highest risk from a possible future pipeline leak or failure. To avoid these and other potential environmental risks the pipeline routing was changed and shortened. As a result of the modified routing, the Sandhills region was bypassed, the number of waterways crossed was reduced by over 80%, and the pipeline length was reduced by almost 40%. Re. Project Evaluation Factsheet.

Climate Change - Increased carbon dioxide equivalent (CO2e) emissions are the primary concern with increased production and consumption of Canadian Oil Sands crude. Determining the net change in full lifecycle CO2e emissions is a fairly complex analysis. Canadian Oil Sands or bitumen crude oils are very heavy, essentially solid, which requires strip mining for initial recovery and production. Converting the heavy bitumen into pipeline-pumpable oil for Refining feedstocks requires upgrading or blending into lighter crude oils called syncrudes, or synbit and dilbit. Synbit and dilbit are commonly classified as ‘unconventional’ crudes since this type of crude oil production and refining is more energy intensive and costly compared to lighter ‘conventional’ crudes. My recent detailed analysis (Re. Climate Change Impacts table) shows that substituting synbit/dilbit Canadian Oil Sands crude for average U.S. crudes could increase total U.S. annual CO2e emissions by up to 5-10 million metric tons (MMT). This represents less than a 0.2% increase in recent total U.S. CO2e emissions.
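
As a rough sanity check on that last percentage, here is a minimal arithmetic sketch in Python; the figure used for total recent annual U.S. CO2e emissions (about 6,500 MMT) is an assumed round number for illustration, not a value taken from this analysis.

# Rough check of the "less than 0.2%" claim above. The U.S. total
# (~6,500 MMT CO2e/year) is an assumed round figure, not from the article.
us_total_co2e_mmt = 6500.0        # assumed recent annual U.S. total, MMT CO2e
oil_sands_increase_mmt = 10.0     # upper end of the 5-10 MMT range cited above
share = oil_sands_increase_mmt / us_total_co2e_mmt
print(f"Share of U.S. total CO2e: {share:.2%}")   # ~0.15%, i.e. under 0.2%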

Another major factor that will affect changes in World CO2e emissions is the set of alternatives to the Keystone XL pipeline. Since 2010, U.S. imports of Canadian crude oil have increased by 560 thousand barrels per day (KBD), compared to the 830 KBD capacity of the Keystone XL. These increased Canadian imports have been supplied via existing pipelines, the Keystone Phase 1 pipeline completed in 2010, and increased transport by rail. This 560 KBD increase in Canadian imports since 2010 represents two-thirds of the capacity of the Keystone XL project currently under presidential permit review. These increased Canadian imports are largely synbit/dilbit crudes and could have increased CO2e emissions by similar levels since 2010. However, a combination of displacing heavy Venezuelan syncrude imports (similar to Canadian synbit/dilbit) and expanding lighter U.S. tight/shale oil production (Bakken crude) has actually resulted in minimal change in the average gravity of U.S. refined crude. Associated increases in Refining CO2e emissions are also relatively insignificant. The primary reason why U.S. Refining average feedstock gravities and associated CO2e emissions have not changed significantly, despite increased Canadian heavy crude imports, is the designs and processing constraints of existing domestic Refineries. Refineries normally require replacing existing/design crude oil feedstocks with similar-gravity blended feedstocks (heavy synbit/dilbit plus lighter Bakken crude oil, for example).

If the Obama Administration does not approve the Keystone XL pipeline, the Canadians will reroute their Oil Sands crude to one or both coasts for export into world markets, most likely Asia. Since U.S. Refineries are among the most efficient in the world (least energy/CO2e emission intensive), and marine transport of the Oil Sands crude will add significantly to the fossil fuels consumed to deliver the synbit/dilbit crudes to distant continents, the overall lifecycle CO2e emissions will actually be significantly higher (many MMT per year) for exports outside North America.

Economic Impacts - The latest pipeline project cost is estimated at $5.4 Billion. The project is projected to create 15,000+ construction labor and materials fabrication jobs and add substantially to State and Federal (tax) revenues, in addition to major contributions to overall U.S. GDP after startup. Removing a major constraint to Canadian crude imports will also benefit all Consumers and most domestic Commercial/Industrial Companies by keeping U.S. domestic crude oil and petroleum products’ costs significantly below higher average world market prices. Refer to WTI-Brent spread data section.

Energy Security - Overall U.S. Energy Security, or the risk of a crude oil import supply disruption, has improved somewhat in recent years. This has been due to a combination of large increases in domestic crude oil production and a decrease in petroleum consumption immediately following the 2007-09 economic recession. Re. recent EIA data. My previous analysis shows the recent rapid increase in domestic crude oil production is due overwhelmingly to Free Market factors, and the decrease in petroleum consumption to a combination of the slowing economy and increased energy efficiency upgrades. Unfortunately, as the U.S. slowly recovers from the 2007-09 economic recession, EIA MER 2013 data indicate that petroleum consumption has begun to increase again, exceeding recent-year efficiency improvements.

Another recent analysis shows that U.S. Energy Security has not improved significantly over the years. This is primarily due to effectively no change in the level of highest-risk OPEC Persian Gulf crude oil imports that must pass through the Strait of Hormuz. Despite the Obama Administration’s recent nuclear negotiations, the risk of a future confrontation and of Iran shutting down the Strait of Hormuz still exists. Such an event puts about 2 MBD of current U.S. crude oil imports at risk of supply disruption. This represents a real risk of losing over 10% of total U.S. crude and petroleum oil supplies should Iran shut down the Strait of Hormuz in the future. To put such an event’s impact in perspective, during the 1973 Arab OPEC oil embargo that created a historic energy shortage and crisis, the U.S. lost only 6% of total crude and petroleum oil supplies.
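
For readers who want to see the arithmetic behind that percentage, here is a quick sketch; the total U.S. crude and petroleum supply figure (about 19 MBD) is an assumed round number, not one taken from this article.

# Sketch of the "over 10% of supplies at risk" arithmetic. The total U.S.
# supply figure (~19 MBD) is an assumed round number for illustration.
strait_of_hormuz_imports_mbd = 2.0   # Persian Gulf imports cited above
us_total_supply_mbd = 19.0           # assumed total U.S. crude/petroleum supply
at_risk_share = strait_of_hormuz_imports_mbd / us_total_supply_mbd
print(f"Supply at risk: {at_risk_share:.1%}")   # roughly 10-11%, vs. the 6% lost in 1973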

The Keystone XL pipeline provides an opportunity to substantially increase U.S. Energy Security. Increasing Canadian imports by up to 830 KBD (the new pipeline’s capacity) could displace a similar amount of the OPEC Persian Gulf imports (such as Saudi Arabian heavy crudes) that are at greatest risk of disruption.

Foreign Policy - Canada is the U.S.’s largest and most important Trade Partner and Ally. Not only are the two North American economies strongly connected, but major Energy Sectors, including Power Grids and Petroleum infrastructures and supplies, are also highly integrated. Canada and the U.S. have strongly supported and defended each other both locally and internationally. Completing the Keystone XL pipeline is clearly in the best interest of both economies and consistent with the long-term relationship and mutual support between the two Countries.

Consequences of Blocking the Keystone XL Pipeline Project - Delaying and possibly not approving the Keystone XL can and will negatively impact U.S. Energy Security and the economy, will have an insignificant impact on world CO2e emissions, and could damage the important relationship with Canada. As previously described, without the Keystone XL the following negative impacts will continue to exist and could increase:

Energy Security - existing Canadian-U.S. pipeline infrastructure will remain a major barrier to increasing this most secure source of imports and will require continued reliance on higher-risk OPEC and Persian Gulf imports.

The Economy - all the benefits of building the pipeline for job creation, increased tax revenues and GDP contributions will be lost. In addition, the further possible reduction in North American crude and petroleum oil prices will be forgone, leaving petroleum costs significantly higher than the levels that would develop with increased Canadian imports via the Keystone XL. The loss of ‘shovel-ready’ pipeline jobs also appears inconsistent with the Administration’s claimed continued support for increased infrastructure jobs.

Climate Change - the impacts on CO2e emissions are relatively small, and will be directionally greater if Oil Sands crude must bypass the U.S. Without the Keystone XL, the Canadians will have no major option other than to build pipelines from the Alberta Oil Sands to one or both coasts for export into world markets. This option will result in significantly greater CO2e emissions than if the Oil Sands crudes are refined and consumed within the U.S.’s higher-efficiency and environmentally cleaner Refining & Transportation Sectors.

Canadian-U.S. Relations - this historically strong Trade, Energy systems integration, and values/military Ally relationship risks being damaged significantly. The Obama Administration is negotiating the easing and possible elimination of the sanctions previously imposed to curtail Iran’s nuclear ambitions, and these negotiations will likely remove the current constraints on Iran’s ability to expand its crude oil production and exports into world markets. Easing Iran into those markets while forcing Canada, the U.S.’s largest Trade/Energy Partner and most important Ally, to supply the same world oil markets by routing its future expanded oil production around U.S. borders appears inconsistent with reasonable treatment of that Ally.

The Politics Associated with the Keystone XL Pipeline Permit Approval - The primary campaign of Keystone XL Opponents has obviously been environmental, centered on climate change. Concerns include the impacts of Oil Sands production on Alberta’s natural forests, the increased full lifecycle CO2e emissions of unconventional synbit/dilbit crudes vs. lighter conventional crudes, and the assumed impact of those increased CO2e emissions on climate change or global warming. The overall strategy and assumption of Keystone XL Opponents is that if the pipeline is not built, this will bottleneck and prevent the development of future increased Oil Sands production. This ‘block the pipeline and Oil Sands production’ strategy assumes that no possible alternatives to the Keystone XL pipeline exist or could be developed, which is not likely to be realistic if the Obama Administration fails to approve the project permit.

While the SEIS very extensively reviews all of the significant environmental and national interest issues (Re. executive summary), those Environmental groups that strongly oppose the Keystone XL pipeline project, period, are not likely to be persuaded by any new analysis or data that could lead to the project’s approval. Many Environmentalists, such as the Sierra Club’s Mr. Brune, are beginning to sharpen their opposition rhetoric, stating for example that if the President approves the Keystone XL it will be “the Vietnam of his presidency”. These and other verbal attacks on a possible Administration approval will likely be followed by more anti-pipeline demonstrations, anti-project/supporter communiqués, and, very likely, follow-up legal actions to stop the project permit if it is approved by the Obama Administration.

When Will the Administration Make a Final Decision on the Keystone XL? - In the President’s 2014 State of the Union speech, no mention was made of the Keystone XL. There was reference to his ‘all-of-the-above’ energy strategy, which he claims led to increased domestic oil production exceeding imports and to America being closer to energy independence than in the past several decades. He stated the need for developing first-class jobs in support of building first-class infrastructure. He promised to streamline the permitting process for key projects, so we can get more construction workers on the job as fast as possible. He also referenced the importance of supporting America’s trade partners and allies in Europe and the Asia-Pacific, but made no mention of Canada.

Based on this latest State of the Union speech, the Keystone XL pipeline project appears fairly consistent with many of the stated needed improvements (jobs, energy independence/security, improving the economy, etc.), but is apparently still not a priority for the Administration. It was disappointing that the President made no mention of the importance and opportunity of building on the economic and energy relationship with Canada, while apparently putting more priority on easing sanctions that will enable Iran to increase its oil production. Continuing to constrain the most secure source of U.S. oil imports, Canada, forces America to keep relying on the highest-risk imports from OPEC and the Persian Gulf.

So, when will the Obama Administration make a final decision on the Keystone XL pipeline project permit? Given the total omission of the pipeline’s benefits from the President’s 2014 State of the Union speech, despite their obvious consistency with many of its major talking points, your guess is as good as mine. What do you think? Should the President continue to delay this decision past the 2014 elections, possibly for political leverage, or should he make a final decision in mid-2014, when all the necessary information and review processes should be complete? At some point the Canadians will have to assume the pipeline will not be approved and more aggressively pursue their options for placing their Oil Sands crudes into world markets, where Developing Countries such as China are apparently much more receptive to facilitating the development and purchase of these crude oil supplies.


The Pros and Cons of Exporting US Crude Oil


  • Calls for an end to the effective ban on exporting most crude oil produced in the US are based on a growing imbalance in domestic crude quality.
  • At least recently, the ban has likely benefited refiners more than consumers. Assessing the impact of its repeal on energy security requires further study. 

Senator Lisa Murkowski (R-AK), the ranking member of the Senate Energy & Natural Resources Committee, issued a white paper earlier this month calling for an end to the current ban on US crude oil exports. Her characterization of existing regulations in this area as "antiquated" is spot on; the policy is a legacy of the 1970s Arab Oil Embargo. However, not everyone sees it the same way, either in Congress or the energy industry.

This isn't just a matter of politics, or of self-interest on the part of those benefiting from the current rules. Questions of economics and energy security must also be considered. The main reason these restrictions are still in place is that for much of the last three decades US oil production was declining. The main challenges for the US oil industry were slowing that decline while ensuring that US refineries were equipped to receive and process the increasingly heavy and "sour" (high sulfur) crudes available in the global market. The shale revolution has sharply reversed these trends in just a few years.

No one would suggest that the US has more oil than it needs. Despite the recent revival of production, the US still imported, on a net basis, around 48% of its crude oil requirements last year. Even when production reaches its previous high of 9.6 million barrels per day (MBD), as the Energy Information Administration now projects will occur by 2017, the country is still expected to import a net 38% of refinery inputs, or 25% of total liquid fuel supply. The US is a long way from becoming a net oil exporter.

The driving force behind the current interest in exporting US crude oil is quality, not quantity, coupled with logistics. If the shale deposits of North Dakota and Texas yielded oil of similar quality to what most US refineries have been configured to process optimally, exports would be unnecessary; US refiners would be willing to pay as much for the new production as any non-US buyer might. Instead, the new production is mainly what Senator Murkowski's report refers to as "LTO"--light tight oil. It's too good for the hardware in many US refineries to handle in large quantities, and for most that can process it, its better yield of transportation fuels doesn't justify as large a price premium as for international refineries with less complex equipment.

As a result, and with exports to most non-US destinations other than Canada or a few special exceptions effectively barred, US producers of LTO must discount it to sell it to domestic refiners. Based on recent oil prices and market differentials, producers might be able to earn as much as $5-10 per barrel more by exporting it. Meanwhile the refiners currently processing this oil are enjoying something of a buyer's market and are able to expand their margins. The export issue thus pits shale oil producers and large, integrated companies (those with both production and refining) such as ExxonMobil against independent refiners like Valero.

Producers are justified in claiming that these regulations penalize them and threaten their growth as available domestic refining capacity for LTO becomes saturated. Additional production is forced to compete mainly with other LTO production, rather than with imports and OPEC.

I believe producers are also largely correct that claims that crude exports would raise US refined product prices are mistaken. The US markets for gasoline, diesel fuel, jet fuel and other refined petroleum products have long been linked to global markets, with prices especially near the coasts generally moving in sync with global product prices, plus or minus freight costs. I participated in that trade myself in the 1980s and '90s. What's at stake here isn't so much pump prices for consumers as US refinery margins and utilization rates.

Petroleum product exports have become a major factor in US refining profitability, and refiners are reportedly investing and reconfiguring to enhance their export capabilities. This provides a hedge against tepid domestic demand. Nationally, refined products have become the largest US export sector and contributed to shrinking the US trade deficit to its lowest level in four years.  If prices for light tight oil rose to world levels US refineries might be unable to sustain their current export pace. It's up to policymakers to assess whether that risk is merely of concern to the shareholders of refining companies or a potential threat to US GDP and employment.

The quest to capture the "value added"--the difference between the value of manufactured products and raw materials--from petroleum production is not new. It helped motivate the creation of the integrated US oil companies more than a century ago and impelled national oil companies such as Saudi Aramco, Kuwait Petroleum Company, and Venezuela's PdVSA to purchase or buy into refineries in Europe, North America and Asia in the 1980s and '90s.

On the whole, OPEC's producers probably would have been better off investing in T-bills or the stock market, because the return on capital employed in refining has frequently averaged at or below the cost of capital over the last several decades. It's no accident most of the major oil companies have reduced their exposure to this sector. When today's US refiners argue that it is in the national interest to preserve the advantage that discounted LTO gives them they are swimming against the tide of oil industry history.

The energy security case for crude exports looks harder to make. An excellent article from the Associated Press quoted Michael Levi of the Council on Foreign Relations as saying, "It runs against the conventional wisdom about what oil security means. Something seems upside-down when we say energy security means producing oil and sending it somewhere else."  The argument hinges on whether allowing US crude exports would simultaneously promote more production and increase the pressure on global oil prices. That makes sense to me as a former crude oil and refined products trader, but it will be a harder sell to Senators, Members of Congress, and their constituencies back home.

The politics of exports may be easing somewhat, though, as a Senate vacancy in Montana could lead to a new Chair at Energy & Natural Resources who would be a natural partner for Senator Murkowski on this issue. (That shift may incidentally be part of a strategy to help Democrats retain control of the Senate.) Will that be enough to overcome election-year inertia and the populist arguments arrayed against it?

As for logistics, the administration could ease the pressure on producers without opening the export floodgates by exempting the oil output from the Bakken, Eagle Ford and other shale deposits from the Jones Act requirement to use only US-flag tankers between US ports. That could open up new domestic markets for today's light tight oil, while allowing Congress the time necessary to debate the complex and thorny export question.

Senator Murkowski wasn't alone in calling for an end to the oil export ban. In his annual State of American Energy speech, presented the same day as the Senator's remarks, Jack Gerard, CEO of the American Petroleum Institute, noted, "We should consider and review quickly the role of crude exports along with LNG exports and finished products exports, because of the advantages it creates for this country and job creation and in our balance of payments." In a similar address on Wednesday, the head of the US Chamber of Commerce stated, "I want to lift the ban. It's not going to happen overnight, but it's going to happen." I'd wager he at least has the timing right.

A different version of this posting was previously published on the website of Pacific Energy Development Corporation.


Authored by:

Geoffrey Styles

Geoffrey Styles is Managing Director of GSW Strategy Group, LLC, an energy and environmental strategy consulting firm. Since 2002 he has served as a consultant and advisor, helping organizations and executives address systems-level challenges. His industry experience includes 22 years at Texaco Inc., culminating in a senior position on Texaco's leadership team for strategy development, ...



How the Smart Grid Helps Address Climate Change in California


Last week’s article on the California Energy Commission’s 2013 Integrated Energy Policy Report (IEPR) identified how climate changes impact energy needs and create new challenges for the state of California’s electricity, natural gas, and transportation fuel sectors. Heat and precipitation are two of the major climate changes that have outsized impacts on the state’s energy sector. That should influence the ongoing design and deployment of Smart Grid technologies and policies. For one thing, harkening back to my ten Smart Grid and Smart City predictions for 2020, infrastructure like a grid or a community can’t be called smart if it lacks resiliency. Climate changes will require that we create more resilient critical infrastructures, whether in the design and management of energy and water, or in the policies that determine how responsively governmental agencies meet their citizens’ needs in times of disruption.

How can the Smart Grid address these challenges and threats?  Here are five suggestions.

1) A Smart Grid delivers grid resiliency by putting more reliance on distributed generation (DG). A comprehensive DG strategy locates generation assets close to demand. This strategy also reduces reliance on vulnerable transmission lines that might fry in the next wildfire conflagration. California already has a good start on DG with the rapid growth of rooftop solar. Technology and financial innovations are in place to enable continued growth. Policy innovations should focus on defining clear benefits for utilities to encourage investment in generation sited at the distribution grid level and in the technologies to manage diverse assets, and on encouraging partnerships with third parties that can help accelerate DG deployments.

2) Deploy applicable monitoring and telemetry technologies for leak detection to the aging water infrastructure, which is in dismaying disrepair and suspected to be leaking like a sieve. The emphasis is on the word suspected: lacking reliable data or visibility into pipeline health means that everyone is offering educated guesses about the overall infrastructural integrity of our water systems. This activity won’t create more water, but it will help the state and communities manage existing water supplies with intelligence that is lacking today. Smart water management can deliver situational awareness about operations and support proactive rather than reactive policies and plans, similar to the benefits the Smart Grid delivers to the electricity infrastructure. And let’s acknowledge the energy/water nexus: when we save water, we save electricity.

3) Deploy water meters across the state, which contains a surprising number of communities that don’t have water meters.  Just like we’ve demonstrated with smart meters for electricity, simple awareness of water consumption can reduce usage.

4) Study the possibilities of instituting Time of Use rates for water that are tied to energy use. Using water during times of peak electricity demand simply increases overall electricity needs. Timing water consumption to off-peak times saves electricity.

5) Rationalize the varying municipal and county codes about water consumption, conservation, and gray water use. A Sierra Club volunteer effort highlighted great disparities in permit fees for rooftop solar across Silicon Valley communities, resulting in state legislation that set limits on those fees and created standards for fee computations. State officials need to similarly understand the difficulties that our extremely fragmented water utility sector has in putting together programs that must accommodate multiple jurisdictions. There’s plenty of process friction that could be reduced or eliminated through such rationalizations.

We can’t stop human-caused climate change, but we can mitigate its worst effects by continuing Smart Grid solution deployments in the electrical grid and applying these solutions in the water grid. We have no choice but to adapt to the impacts of climate change.  Smart Grid technologies and policies can certainly help accelerate economic and societal adaptations as well as support creative mitigation strategies.


Authored by:

Christine Hertzog

Christine Hertzog is a consultant, author, and a professional explainer focused on Smart Grid technologies and solutions. She provides strategic advisory services to startups and established companies that include corporate development, market development, and funding strategies. She is the author of the Smart Grid Dictionary, now in its 5th Edition, the first and only dictionary that ...



The Four Men Who Caused The Majority Of Global Warming


The climate crisis of the 21st century has been caused largely by just four men, who between them invented machines responsible for the majority of the greenhouse gas emissions generated since the dawning of the industrial age.

Left to right: Frank Whittle, Rudolf Diesel, Nicolaus August Otto and Charles Algernon Parsons

Prime movers, machines that turn thermal energy into electrical or mechanical energy, play a fundamental role in the global economy. Without them you would not be able to get from London to New York in seven hours, ride the subway to work, transport your iPhone from Shenzhen to Los Angeles, or even read this sentence. And the world of prime movers is dominated by a small number of machines: the steam turbine, the diesel engine, the petrol engine and the gas turbine. Not only are these machines of great economic importance, they are responsible for almost all of the carbon dioxide emissions from electricity generation and transport.

The steam turbine was invented by Charles Algernon Parsons in Newcastle, England in 1884. These are incredible machines, with the biggest capable of providing enough electricity to power a couple of million homes. They also provide the majority of the planet's electricity. In 1900 the cheapest way to generate electricity was to burn coal and use a steam turbine. Things obviously have not changed much. When China started to build over 50 gigawatts worth of electrical capacity each year, it decided to do so almost entirely with coal and steam turbines. And the carbon dioxide produced by one of these machines is impressive: running at typical capacity factors, a 1 GW machine will produce five million tonnes of carbon dioxide each year. The phenomenal growth of carbon emissions in China is very much a steam turbine-driven affair. And this is all the result of the work of Parsons.
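
For readers who want to see where a figure like that comes from, here is a minimal back-of-the-envelope sketch in Python; the capacity factor and emissions intensity are assumed typical values for a coal unit, not figures from this article.

# Rough reconstruction of the "five million tonnes of CO2 per year" figure
# for a 1 GW coal-fired steam turbine. Capacity factor and emissions
# intensity are assumed typical values, not numbers from the article.
capacity_mw = 1000            # 1 GW unit
capacity_factor = 0.65        # assumed typical utilisation for a coal unit
hours_per_year = 8760
tonnes_co2_per_mwh = 0.9      # assumed emissions intensity of coal generation
generation_mwh = capacity_mw * capacity_factor * hours_per_year
annual_co2_tonnes = generation_mwh * tonnes_co2_per_mwh
print(f"Generation: {generation_mwh / 1e6:.1f} TWh/year")          # ~5.7 TWh
print(f"CO2: {annual_co2_tonnes / 1e6:.1f} million tonnes/year")   # ~5.1 million tonnes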

Without container shipping the modern globalised economy would probably be fundamentally different. The spread of container shipping was dependent on the simple but disruptive idea of putting cargo in a box, the development of complex logistics, and above all the diffusion of diesel engines. Invented by Rudolf Diesel in 1893, the diesel engine gradually took over the market for marine engines, reaching 50% market penetration in the 1950s, and now accounts for almost 100% of marine engines. Diesels also dominate in heavy duty vehicles, such as trucks and buses, and a high percentage of trains are powered by diesels.

The diesel engine, however, still holds only a small share of the car market, despite its higher efficiency. Cars, whether they are Hummers or Honda Civics, are still overwhelmingly powered by the petrol engine, invented by Nicolaus August Otto in 1876. Attempts by luxury car companies to save the planet aside, close to 100% of new cars in North America use old fashioned petrol engines. Only Europeans have even started to transition away from them to diesels. Moving people and stuff around on land and sea therefore is still completely reliant on two machines invented before 1900.

There are few more reliable machines than a gas turbine. And there is no greater example of this reliability and efficiency than the engines of a Boeing 747. At peak thrust they use power equivalent to 290 megawatts, thirty times larger than the capacity of the world's largest wind turbine. After its invention by Frank Whittle in 1936 the gas turbine spread faster than almost any prime mover in history, and today it dominates global aviation. In electricity generation, CCGT power plants, which couple a gas turbine with a steam turbine, offer incredibly high power density, efficiency (typical thermal efficiency is 60%, compared with 40% for coal power plants), and high flexibility. As a result gas power plants now make up more than 30% of electricity generation in many modernised countries, and continue to grow.
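
To see why that efficiency gap matters so much for emissions, here is a minimal sketch comparing CO2 per kilowatt-hour of electricity for the two plant types; the fuel carbon intensities are assumed standard values, while the 60% and 40% efficiencies come from the paragraph above.

# CO2 per kWh of electricity: a 60%-efficient CCGT burning gas vs. a
# 40%-efficient coal steam plant. Fuel carbon intensities are assumed
# standard values (kg CO2 per kWh of fuel energy), not from the article.
gas_kg_co2_per_kwh_fuel = 0.20    # assumed for natural gas
coal_kg_co2_per_kwh_fuel = 0.34   # assumed for coal
ccgt_efficiency = 0.60            # cited above
coal_efficiency = 0.40            # cited above
print(f"CCGT: {gas_kg_co2_per_kwh_fuel / ccgt_efficiency:.2f} kg CO2/kWh")   # ~0.33
print(f"Coal: {coal_kg_co2_per_kwh_fuel / coal_efficiency:.2f} kg CO2/kWh")  # ~0.85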

Here then is a summary. The vast majority of carbon emissions from electricity generation are from the steam turbine and gas turbine, the vast majority from aviation comes from the gas turbine, and the vast majority from transport comes from the petrol and diesel engine. We can therefore only conclude that the majority of global warming can be laid at the hands of four men: Charles Algernon Parsons, Nicolaus August Otto, Frank Whittle and Rudolf Diesel.

Or perhaps more enlightened attitudes can prevail over such logic.

Authored by:

Robert Wilson

Robert Wilson is a PhD Student in Mathematical Ecology at the University of Strathclyde.

His secondary interests are in energy and sustainability, and writes on these issues at The Energy Collective.


Email: robertwilson190@gmail.com



How About a Wiki for Clean Energy to Share Best Practices for Improving Energy Efficiency, Boosting Renewables and Reducing Emissions


There is no shortage of ideas on how businesses, governments and households in the U.S. and other industrialized countries can become more energy efficient; the same applies to growing cleaner supplies of energy while lowering harmful greenhouse gas emissions, and doing so in ways that create sustainable jobs.

There are so many ideas, in fact, how do policymakers, engaged business leaders, informed citizens, stakeholders and the media make sense of them all? Where and how can someone track what’s been proposed? How about ideas that have been adopted and the impact they are having? What lessons might we take away from laws that aren’t working as intended?

During a press briefing last week in Washington, DC designed to begin discussing 200 such ideas, it dawned on me THIS is what every industrialized country deserves: a searchable clearinghouse of laws, policies and public proposals to scale up efficiency, produce energy that is cleaner, more cost-effective and safer while tallying the new jobs they create.

With a healthy array of non-profit, academic and myriad foundations stirring the policy pot with their own ideas, who out there is willing to create such a clearinghouse?

This, of course, would be one huge undertaking.  But I think it’s doable in a way that can draw from all energy / environmental / economic points of view. It might even enable critics of policies to contribute their thoughts and invite questions that deserve to be answered.

The 200 ideas came from the Center for a New Energy Economy at Colorado State University. Founded by former Colorado Gov. Bill Ritter, the Center released an extensive menu of options that do not require Congressional action. Given the stalemate there, the Center’s Powering Forward: Presidential and Executive Agency Actions to Drive Clean Energy in America, now online, could be an extremely helpful head start.

More than 100 leaders from private industry, utilities, academia, non-profit organizations, think tanks and others contributed their ideas with the promise their identities would be kept secret. For a handful of them, that did not matter. We’re talking about leading thinkers such as Moray Dewhurst, Vice Chairman and Chief Financial Officer of NextEra Energy, a large utility holding company and major renewable energy developer; Dennis Beal, Vice President of Global Vehicles at FedEx; and energy consultant Susan Tierney, who served as Deputy Energy Secretary under President Clinton. Among those who helped Ritter were Dan Esty, Commissioner of Connecticut’s Department of Energy and Environmental Protection, and Heather Zichal, President Obama’s former Deputy Assistant on Energy and Climate Change (who has yet to decide on her next gig).

Among the recommendations, Ritter and colleagues urged the President to:

·          Direct the Environmental Protection Agency to explain to states how they can be credited for reducing greenhouse gas emissions from existing fossil-fuel power plants with early adoption of new energy efficiency and renewable energy measures.

·          Request that the IRS use its existing authority to issue rulings and interpretations of the tax code that increase incentives for private investors to capitalize clean energy technologies.  The idea here, Ritter said, is to make the tax code “more fair by offering clean energy the same investment tools and tax benefits now given to fossil fuels.”

·          More clearly define the President’s criteria for what he’s called “responsible” natural gas production. This would require that oil and gas companies use best available production practices on federal lands. States could then require these practices to be used within their borders.

·          Compare full life-cycle benefits and costs of each energy resource as White House energy programs are implemented. A report could distinguish carbon-rich and low-carbon resources consistent with the President’s goals for minimizing greenhouse gas emissions most responsible for climate change.

You can read the Center’s full report here.

Using the U.S. as an example, how about if we combine ideas from this Powering Forward collaboration with the 70 or so ideas put forth by the President’s Climate Action Plan? Next, we could call on the American Council for an Energy-Efficient Economy (ACEEE) to weigh in with their best ideas to further incentivize energy efficiency. The analysts there have done an enviable job of tracking and rating efficiency initiatives in all 50 states and the District of Columbia.

The faculty, students and staff managing the DSIRE database at North Carolina State University in Raleigh could pitch in with the laws and other policies in place to develop sources of renewable energy in states throughout the U.S. An organization such as Resources for the Future, which has weighed in with recommendations for more even-handed ways to regulate hydraulic fracturing of shale natural gas, could begin by reflecting on the policies in states such as Texas, Pennsylvania and others that are producing the most natural gas with the fewest safety mishaps while controlling methane emissions.

These and countless other ideas could be submitted using a spreadsheet, web form, or some other template designed to organize the policies, laws, and thinking consistently into a public database.

A few energy policy experts I shared this with expressed a range of responses, from shaking their heads in disbelief (over why I think this is even possible) to reactions like those of Ritter, Presidential Climate Action Project Executive Director Bill Becker, and Michael Northrup, who directs sustainability programs for the Rockefeller Brothers Fund. Each of them saw the value but also implied the herculean effort it would entail. But they did not say no.

We gotta start somewhere. Who’s up to it?

Authored by:

Jim Pierobon

As a career-long advocate for cleaner, safer and more secure energy solutions, Jim creates and helps execute digital campaigns for a variety of trade association/NGO, government agency, smart grid, renewable energy and utility clients through Pierobon & Partners. He provides updates on these columns at TheEnergyFix.com. Among other positions, he has co-managed the energy and environmental ...



What Is The Most Dangerous Impact Of Climate Change?

NOAA concluded in 2011 that “human-caused climate change [is now] a major factor in more frequent Mediterranean droughts.” Reds and oranges highlight lands around the Mediterranean that experienced significantly drier winters during 1971-2010 than the comparison period of 1902-2010.

What is the most dangerous climate change impact? That is a question Tom Friedman begins to get at in his must-read NY Times column, “WikiLeaks, Drought and Syria.” The piece is about a “WikiLeaks cable that brilliantly foreshadowed how environmental stresses would fuel the uprising” in Syria.

One of Friedman’s key arguments is that “Syria’s government couldn’t respond to a prolonged drought when there was a Syrian government. So imagine what could happen if Syria is faced by another drought after much of its infrastructure has been ravaged by civil war.” Thanks to human-caused climate change, that is all but inevitable.

The 2008 cable from the U.S. Embassy in Damascus to the State Department details the prescient warnings from Syria’s U.N. food and agriculture representative, Abdullah bin Yehia:

“Yehia told us that the Syrian minister of agriculture … stated publicly that economic and social fallout from the drought was ‘beyond our capacity as a country to deal with.’ What the U.N. is trying to combat through this appeal, Yehia says, is the potential for ‘social destruction’ that would accompany erosion of the agricultural industry in rural Syria. This social destruction would lead to political instability.”

The cable emerged as part of the research for Showtime’s landmark climate change TV series on the experiences and personal stories of people whose lives have been touched by climate change, Years Of Living Dangerously. Friedman is one of the correspondents, and he travels to Syria to witness first-hand the devastation wrought by warming-driven drought.

Friedman explains:

Yehia was prophetic. By 2010, roughly one million Syrian farmers, herders and their families were forced off the land into already overpopulated and underserved cities. These climate refugees were crowded together with one million Iraqi war refugees. The Assad regime failed to effectively help any of them, so when the Arab awakenings erupted in Tunisia and Egypt, Syrian democrats followed suit and quickly found many willing recruits from all those dislocated by the drought.

What is the most dangerous climate change impact? The worst direct impacts to humans from climate change will probably be Dust-Bowlification and extreme weather and the resulting food insecurity.

But the most physically dangerous impact to humans may well turn out to be how Dust-Bowlification combines with the other impacts to create conditions favorable for political instability and conflict (see “Syria Today Is A Preview Of Veterans Day, 2030“).

And remember, in the future, these impacts will not be occurring only intermittently in distant lands. On that point, Friedman quotes me near the end:

And, finally, consider this: “In the future, who will help a country like Syria when it gets devastated by its next drought if we are in a world where everyone is dealing with something like a Superstorm Sandy,” which alone cost the U.S. $60 billion to clean up? asks Joe Romm, founder of ClimateProgress.org.

What I meant is that if we don’t act to reverse carbon pollution emissions trends quickly, then, by mid-century, every country in the world will be dealing with epic catastrophes simultaneously on a regular basis: drought, sea level rise, heat waves, invasive species, acidification, and super storms (see “An Illustrated Guide to the Science of Global Warming Impacts“).

This means the rich countries probably will not be offering much assistance to the poorer ones, or be willing to intervene in foreign conflicts, since we’ll be suffering at the same time. When the West and Southwest are Dust-Bowlifying, the Southeast is heating up and suffering alternately from brutal droughts and brutal floods, and the east coast is seeing a Sandy-level storm surge (or worse) every year, we’ll be devoting all our resources to our own troubles. Compassion fatigue will be replaced by compassion exhaustion.

What is the most dangerous climate change impact: drought, flooding, heat wave, or superstorm? All of them are, when they are occurring everywhere simultaneously year after year, decade after decade. The time to act was decades ago, but now is still infinitely better than later.


Authored by:

Joseph Romm

Joe Romm is a Fellow at American Progress and is the editor of Climate Progress, which New York Times columnist Tom Friedman called "the indispensable blog" and Time magazine named one of the 25 "Best Blogs of 2010." In 2009, Rolling Stone put Romm #88 on its list of 100 "people who are reinventing America." Time named him a "Hero of the Environment″ and “The Web’s most influential ...



Can Crowdfunding Help Cleantech Ride the Big Data Wave?

"Big data is a coming wave of 'green gold,'" says Nick Eisenberger of Pure Energy Partners."Information technology will be the most powerful tool to address resource challenges in our lifetime."
Eisenberger was speaking at the Markle Foundation offices in New York, where Agrion, which bills itself as the global network for energy, cleantech and corporate sustainability, convened an impressive roundtable to talk about advances in funding new ideas and uncovering innovation for energy and sustainability.

"Energy is where the most data is," said Clean.Data.Project's Jon Roberts. And the panelists agreed its where the most opportunities are in energy, improving its generation, use, and efficiency. Yet not all the necessary datasets are open.

"The biggest obstacle is generational," Eisenberger claimed. "The older generation, my generation, is not comfortable with the 'open' in open innovation."

Richard Robertson explained how GE Ventures has invested in Quirky, a crowdsourcing platform for inventors, and has turned to the crowd for help with design and product naming.

The innovation is out there, Robertson suggested. 

"Sungevity has an incubator in their Oakland facility, and Solar Mosaic [the crowfunding platform] is in that incubator,” said Robertson. “Mosaic is looking at not just solar projects, but wind, energy storage. They're putting projects in the ground." 

At least one audience participant in the Agrion session, a project finance expert, called into question some of the deals Mosaic and others are funding and the stated returns listed on the Mosaic site. "We wouldn't fund those projects," he offered. "It's like charity."

But that's precisely the point made by Mosaic and others, such as the alternative-financing renewable energy companies SolarCity, SunRun, and United Wind. They are there to fund the deals no one else will, deals that aren't big enough for the big banks, which is a pretty good-sized niche.

That is, in fact, why these new models exist.

For those not familiar with the Mosaic model: projects are listed on its online marketplace, which allows investors to pick projects to fund and then provide the capital for rooftop solar installations. The investors are repaid once the project is up and running and generating electricity. Mosaic's first projects were mostly on affordable housing in California.

The company provides capital to developers at about 5.5 percent and collects a 1 percent fee, according to some sources, returning roughly 4.5 percent.
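
Those reported numbers imply a simple spread for the investor. Here is a minimal sketch of that arithmetic; only the 5.5 percent developer rate and 1 percent fee come from the reporting above, and the investment amount is a hypothetical figure for illustration.

# Minimal sketch of the investor economics described above. Only the 5.5%
# developer rate and 1% platform fee come from the text; the investment
# amount is hypothetical.
investment = 10_000          # hypothetical amount lent to a rooftop solar project
developer_rate = 0.055       # rate charged to the solar developer
platform_fee = 0.01          # fee retained by the platform
investor_yield = developer_rate - platform_fee
print(f"Investor yield: {investor_yield:.1%}")                                    # ~4.5%
print(f"Annual income on ${investment:,}: ${investment * investor_yield:,.0f}")   # ~$450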

“To rapidly deploy clean energy, the industry needs access to low cost capital and lots of it,” Mosaic co-founder, Billy Parrish, told Bloomberg News last year. “We are able to source capital from the crowd and lend that capital to clean-energy developers at lower interest rates than they would get from banks.”

Platforms like Mosaic, which seek a large, “democratic” community of engaged funders, as opposed to the elite accredited few, got a boost last year when the SEC released its proposed guidelines about crowdfunding as it relates to the Jumpstart Our Business Startups or JOBS Act.

If the admittedly hand-picked crowd at the Agrion session is any indication, crowdfunding is here to stay. 

However, as several roundtable participants suggested, it may be best deployed as part of a mix of financing options.

Authored by:

Scott Edward Anderson

Scott Edward Anderson is currently global marketing director for cleantech at Ernst & Young. He is the founder of the popular blog, The Green Skeptic, and the VerdeStrategy consultancy. He has held management positions with Ashoka and The Nature Conservancy and is co-founder of the Cleantech Alliance Mid-Atlantic. An award-winning poet, Scott was a John Sawhill Conservation Leadership Fellow, ...



Can Geoengineering Save the Planet?


Geoengineering. It’s not the sexiest sounding topic, but a small group of scientists say it just might be able to save the world.

The basic idea behind geoengineering (or climate engineering) is that humans can artificially moderate the Earth's climate, allowing us to control temperature and thereby avoid the negative impacts of climate change. There are a number of methods suggested to achieve this scientific wizardry, including placing huge reflectors in space or injecting aerosols into the atmosphere to reflect sunlight and offset the warming caused by the carbon in the air.

It's a hugely controversial theory. One of the main counter-arguments is that promoting a manmade solution to climate change will lead to inertia around other efforts to reduce human impact. But the popularity of geoengineering is on the rise among some scientists and even received a nod from the IPCC in its recent climate change report.

In a fast-flowing and sometimes heated head-to-head, climate professors David Keith and Mike Hulme set out the for and against. Keith, a geoengineering advocate, doesn't believe that this science is a solve-all but says "it could significantly reduce climate impacts to vulnerable people and ecosystems over the next half century." Hulme, meanwhile, sets out his stall in no uncertain terms: "Solar climate engineering is a flawed idea seeking an illusory solution to the wrong problem."

Enjoy the debate and do add your comments at the end.

David Keith: Gordon McKay professor of applied physics (SEAS) and professor of public policy at Harvard Kennedy School

Deliberately adding one pollutant to temporarily counter another is a brutally ugly technical fix, yet that is the essence of the suggestion that sulphur be injected into the stratosphere to limit the damage caused by the carbon we've pumped into the air.

I take solar geoengineering seriously because evidence from atmospheric physics, climate models, and observations strongly suggest that it could significantly reduce climate impacts to vulnerable people and ecosystems over the next half century.

The strongest arguments against solar geoengineering seem to be the fear that it is a partial fix that will encourage us to slacken our efforts to cut carbon emissions. This is moral confusion. It is our responsibility to limit the impact that our cheap energy has on our grandchildren independently of the choices we make about temporary solar geoengineering.

Were we faced with a one-time choice between making a total commitment to a geoengineering program to offset all warming and abandoning geoengineering forever, I would choose abandonment. But this is not the choice we face. Our choice is between the status quo, with almost no organized research on the subject, and commitment to a serious research program that will develop the capability to geoengineer, improve understanding of the technology's risks and benefits, and open up the research community to dilute the geo-clique. Given this choice, I choose research; and if that research supports geoengineering's early promise, I would then choose gradual deployment.

Mike Hulme: professor of climate and culture in the School of Social Science & Public Policy at King's College London

David, your ambition to significantly reduce future climate impacts is of course one we can share along with many others. But I am mystified by your faith that solar climate engineering is an effective way of achieving this. More direct and assured methods would be to invest in climate adaptation measures (a short-term gain) and to invest in new clean energy technologies (a long-term gain).

My main argument against solar engineering is not the moral hazard argument you refer to. It is twofold. First, all evidence to date, from computer simulations and from the analogies of explosive volcanic eruptions, is that deliberately injecting sulphur into the stratosphere will further destabilize regional climates. It may reduce globally-averaged warming, but that is not what causes climate damage. It is regional weather that does that: droughts in the US, floods in Pakistan, typhoons in the Philippines. Solar climate engineering in short is a zero-sum game: some will win, some will lose.

Which leads me to my second argument. The technology is ungovernable. Even the gradual deployment you propose will have repercussions for all nations, all peoples, and all species. All of these affected agents therefore need representation in any decisions made and over any regulatory bodies established. But given the lamentable state in which the conventional UN climate negotiations linger on, I find it hard to envisage any scenario in which the world's nations will agree to a thermostat in the sky.

Solar climate engineering is a flawed idea seeking an illusory solution to the wrong problem.

DK: You are correct that climate impacts are ultimately felt at the local scale as changes in soil moisture, precipitation, or similar quantities. No one feels the global average temperature. Precisely because of this concern my group has studied regional responses to geoengineering.

In the first quantitative look at the effectiveness of solar geoengineering we found - to our surprise - that it can reduce changes in both temperature and precipitation on a region-by-region basis. This work has now been replicated by a much larger study using a whole set of climate models, led by Alan Robock, one of the more skeptical scientists working on the topic, and they got the same result. While there are claims in the popular press that it will "destabilize regional climates" - presumably meaning that it will increase local variability - I know of no scientific paper that backs this up.

I have no faith in geoengineering. I have some faith in empirical science and reasoned argument. It's true that we don't have mechanisms for legitimate governance of this technology. Indeed, in the worst case this technology could lead to large-scale conflict. This is exactly why I and others have started efforts to engage policymakers from around the world to begin working on the problem.

MH: David, the point here is how much faith we can place in climate models to discern these types of regional changes. As the recent report from the UN's Intergovernmental Panel on Climate Change has shown, at sub-continental scales state-of-the-art climate models do not robustly simulate the effects of greenhouse gas accumulation on climate.

What you are claiming then is that we can rely upon these same models to be able to ascertain accurately the additional effects of sulphur loading of the stratosphere. Frankly, I would not bet a dollar on such results, let alone the fate of millions.

You may say that this is exactly why we need more research - bigger and better climate models. I've been around the climate research scene long enough to remember 30 years of such claims. Are we to wait another 30 years? What we can be sure about is that once additional pollutants are injected into the skies, the real climate will not behave like the model climate at scales that matter for people.

As for getting political scientists to research new governance mechanisms for the global thermostat - you again place more faith in human rationality than I do. We have had more than 20 years of a real-world experiment in global climate governance: it's called the UN Framework Convention on Climate Change. It's hardly been a roaring success! You must be a supreme optimist to expect that a novel system of global governance can be invented and sustained over the time periods necessary for solar climate engineering to be effective.

DK: You made a very strong claim that geoengineering is zero-sum. If true, I would oppose any further work on the technology. I responded that results from all climate models strongly suggest that this is not the case. Your response was to dismiss climate models. Assume for the moment that climate models tell us nothing about regional climate response; on what, then, do you base your claim that solar geoengineering is zero-sum - that is, that it just shuffles winners and losers?

When climate skeptics rubbish models, I defend the science by agreeing that if all we had were complex models, I too would be a doubter; but, I then argue, we base our conclusions on a breadth of evidence, from basic physics and a vast range of observations to simple - auditable - models as well as full-blown three-dimensional climate models. Models of atmospheric circulation and aerosols developed for Earth make good predictions of the climates of other planets. This is a triumph of science.

The same science that shows us that carbon dioxide will change the climate shows that scattering a bit more sunlight will reduce that climate change. How do you accept one and reject the other?

On the other points: I am not excited by an endless round of climate model improvements, nor do I think that political scientists will solve this. We need less theory and more empiricism.

MH: David, I agree that we need less theory and more empiricism. This is one of the reasons why I am skeptical that climate models are able to reveal confidently what will happen to regional climates - especially precipitation - once sulphur is pumped into the stratosphere.

I don't dismiss climate models, but I discriminate between what they are good for and what they are less good for. Having spent nearly half of my professional life studying their ability to simulate regional and local rainfall - by comparing simulations against observations, empiricism if you will - I have little faith in their skill at the regional and local scales.

But let's assume for a moment that climate models were reliable at these scales. Another argument against intentional solar climate engineering is that it will introduce another reason for antagonism between nations. There are those who claim that their models are good enough to precisely attribute specific local meteorological extremes - and ensuing human damages - to greenhouse gas emissions. There will be nations that will want to claim that any damaging weather extreme following sulphur injection was aerosol-caused rather than natural- or greenhouse gas-caused. The potential for liability and counter-liability claims between nations is endless.

I am against solar climate engineering not because of some violation of nature's integrity - the argument used by some. I am against it because my reading of the scientific evidence and of collective human governance capabilities suggests to me that the risks of implementation greatly outweigh any benefits. There are surer ways of reducing the dangers of climate change.

This debate originally appeared on the Guardian Sustainable Business blog.


Commission Proposes a Market Stability Reserve for Europe's Carbon Market

EU Carbon Emissions

The European Commission has just proposed a major reform of the European carbon market. This week it released a proposal for a non-discretionary mechanism that would reduce the number of pollution allowances available to emitters in cases of exceptional drops in emission levels. The purpose of the so-called “market stability reserve” is to limit how many surplus permits can be in the market at any one time. The mechanism would likely bring greater predictability for low-carbon investments and enhanced credibility to the carbon market.

Why a market stability reserve

In the current form of the ETS, the number of pollution allowances available to emitters is determined for each phase of market operation well in advance of the phase’s beginning. During the course of a market’s phase, however, emissions levels may end up far below the cap, as they did in the wake of Europe’s financial crisis. This systemic quirk of the carbon market, whereby supply is rigid but demand is fluid, is the reason why emitters have accumulated a hefty surplus of allowances and why the European carbon price has deteriorated substantially in recent years.

The low carbon price has hurt the credibility of the emissions trading system and this has in turn prompted politicians to intervene in the market. Member states recently approved a “backloading” of allowances, which would delay some allowance auctions from the next few years to 2019 and 2020, in order to raise the carbon price in the short term. But while this move signals the continued political backing for cap-and-trade in Europe, it is not a long term solution. Discretionary interventions reflect poorly on both the credibility and predictability of climate policy.

The proposed market stability reserve will be non-discretionary. It will therefore steer clear of the politics of the day. Instead, the mechanism will provide a predictable framework for how the carbon market will respond to exceptional market surpluses.

The mechanism will be based on data regarding the market’s cumulative surplus, which will be published annually. The cumulative surplus is defined as the difference between the market supply - allowances issued and credits surrendered - and historical emissions, accumulated since 2008. When the cumulative surplus exceeds 833 million tons in any given year (say, year 0), the mechanism will withdraw 12 percent of this surplus from the auctions scheduled to take place in two years’ time (in year 2). The withdrawn allowances will be placed in a reserve. They will be gradually released to the market at a later date, when the surplus of allowances has decreased to below 400 million tons.
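To make the trigger rules concrete, here is a minimal sketch of the withdrawal and release logic as described above. This is my own illustration, not the Commission's legal text, and the size of each release tranche is an assumption since it is not specified in this summary.

```python
# Sketch of the proposed market stability reserve trigger logic (illustrative only).
UPPER_TRIGGER_MT = 833     # cumulative surplus above which allowances are withdrawn
LOWER_TRIGGER_MT = 400     # surplus below which reserved allowances are returned
WITHDRAWAL_RATE = 0.12     # share of the surplus removed from auctions two years out
RELEASE_TRANCHE_MT = 100   # assumed size of each release; not specified in this summary

def msr_adjustment(cumulative_surplus_mt, reserve_mt):
    """Return (change to auction volume in year 2, updated reserve), in Mt."""
    if cumulative_surplus_mt > UPPER_TRIGGER_MT:
        withdrawn = WITHDRAWAL_RATE * cumulative_surplus_mt
        return -withdrawn, reserve_mt + withdrawn
    if cumulative_surplus_mt < LOWER_TRIGGER_MT and reserve_mt > 0:
        released = min(RELEASE_TRANCHE_MT, reserve_mt)
        return released, reserve_mt - released
    return 0.0, reserve_mt

# Example: with a 2,000 Mt cumulative surplus, 240 Mt would be pulled from
# the auctions scheduled two years later and parked in the reserve.
print(msr_adjustment(2_000, 0))   # (-240.0, 240.0)
```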

Towards enhanced policy predictability and credibility

While the proposal has yet to be adopted, it will likely trigger a much needed debate in Europe about the stability of the bloc's carbon market. A market in which supply is fixed in advance and demand fluctuates widely with GDP can be expected to produce outsized fluctuations in the carbon price. A drastic drop in the carbon price in the short term lets investors continue to support high-carbon energy infrastructure. Such investments, however, are inconsistent with Europe's long-term ambition of decreasing emissions by 80-95 percent by 2050.

Short-term fluctuations in the carbon price can thereby create inter-temporal inefficiencies which lead to the lock-in of high-carbon infrastructure and potentially stranded assets. In this context, the proposed market stability reserve will provide a more consistent and predictable carbon price signal. Such a signal is likely to strengthen the incentives for competitive low-carbon technologies. It will also enhance the credibility of the emissions trading system by preventing a carbon price collapse that may be followed by discretionary interventions by politicians.

The choice of trigger levels

Assuming a consensus is reached on the need for flexibility in the EU's carbon market, attention will likely turn to the threshold levels at which the reserve is triggered. The upper threshold - which the Commission proposed should be 833 million tons - provides implicit support for the European carbon price. Against the backdrop of the current market oversupply, expected to reach 2.6 billion tons by 2020, a threshold of 833 million tons substantially tightens the balance between market supply and demand. This is expected to result in a significant increase in the carbon price compared with a scenario reflecting the current market design.

Just as important to the carbon price are the implications of the lower threshold of 400 million tons. By choosing this threshold, the Commission is proposing that allowances should be released from the reserve at a time when the market is still oversupplied. Using this threshold, the market stability reserve will likely return allowances to the market relatively quickly after they have been withdrawn. The choice of a lower limit of 400 million tons therefore weakens the potential of the proposal to bring about long-lasting change in the market’s surplus.

But regardless of the final design of the stability reserve, the Commission's proposal exemplifies the evolution of carbon markets. Regulators who seek to put a price on carbon are increasingly recognizing the importance of price stability in enhancing the predictability and credibility of climate policy.

The views in this article do not represent the views of Thomson Reuters Point Carbon.

Photo Credit: EU Commissions and Targets/shutterstock

Authored by:

Emil Dimantchev

I am an analyst at Thomson Reuters Point Carbon where I write market and policy analysis covering the European carbon market, aviation climate policy, and REDD.



Converting Coal to Synthetic Natural Gas in China

  • With so much attention focused on China's shale gas potential, its growing synthetic natural gas industry is a wild card.

  • In light of China's severe air quality problems, trading smog for higher CO2 emissions is an understandable choice, but one with global implications.

Coal and SNG in China

In its latest Medium-Term Coal Market Report the International Energy Agency (IEA) forecasts a slowing of coal demand growth but no retreat in its global use. That won’t surprise energy realists, but the item I wasn’t expecting was the reference in the IEA press release to growing efforts in China to convert coal into liquid fuels and especially synthetic natural gas (SNG).  It’s not hard to imagine China’s planners viewing SNG as a promising avenue for addressing the severe local air pollution in that country’s major cities, but the resulting increase in CO2 emissions could be substantial. It could also affect the economics of natural gas projects around the Pacific Rim.

Air quality in China’s cities has fallen to levels not seen in developed countries for many decades. There’s even a smartphone app to help residents and visitors avoid the worst exposures. Much of this pollution, in the form of oxides of sulfur and nitrogen and particulate matter, is the result of coal combustion in power plants. Although China is adding wind and solar power capacity at a rapid clip, after years of exporting most of their solar panel output, the scale of the country’s coal use doesn’t lend itself to easy or quick substitution by these renewables.

Natural gas offers a lower-emitting alternative to coal on a larger scale than renewables. Existing coal-fired power plants could be converted to run on gas or replaced with modern combined-cycle gas turbine power plants. Gas-fired power plants emit up to 99% fewer local, or “criteria” pollutants than coal plants, especially those with minimal exhaust scrubbing.

Unfortunately, China doesn’t have enough domestic natural gas to go around. Despite potentially world-class shale gas resources and the rapid growth of coal-bed methane and more conventional gas sources, natural gas supplies only 4% of China’s energy needs. Imported LNG can help fill the gap, but it isn’t cheap. What China has in abundance is coal. Converting some of it to SNG could boost China’s gas supply relatively quicklyâ€"perhaps faster than the country’s shale gas infrastructure and expertise can gear up.

SNG is hardly a new idea; the Great Plains Synfuels Plant has been producing it in North Dakota since the 1980s. When that facility was built, natural gas prices were volatile and rising, and greenhouse gas emissions appeared on no one’s radar. The process for making SNG from coal is straightforward, and its primary building block, the gasification unit, is off-the-shelf technology. I worked with this technology briefly in the 1980s, and my former employer, Texaco, licensed dozens of gasification units in China before the technology was eventually purchased by GE. Other vendors offer similar processes.

Gasifying coal adds a layer of complexity compared to gasifying liquid hydrocarbons, but this, too, has been demonstrated in commercial operations. Most of the output of the facilities Texaco sold to China was used to make chemicals, but the chemistry of turning syngas (hydrogen plus carbon monoxide) into pipeline-quality methane is no more challenging.
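For readers who want the underlying chemistry, a simplified textbook sketch (my addition, not drawn from the specific plant designs discussed here) is: steam gasification of coal yields syngas, the water-gas shift adjusts the hydrogen-to-carbon-monoxide ratio, and methanation then produces pipeline-quality methane.

```latex
\begin{align*}
\text{Gasification:}    && \mathrm{C + H_2O}    &\rightarrow \mathrm{CO + H_2} \\
\text{Water-gas shift:} && \mathrm{CO + H_2O}   &\rightarrow \mathrm{CO_2 + H_2} \\
\text{Methanation:}     && \mathrm{CO + 3\,H_2} &\rightarrow \mathrm{CH_4 + H_2O}
\end{align*}
```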

This effort is already under way in China. Last October Scientific American reported that the first of China's SNG facilities had started shipping gas to customers, with four more plants in various stages of construction and another five approved earlier this year. The combined capacity of China's nine identified SNG projects comes to around 3.5 billion cubic feet per day, or a bit more than the entire Barnett Shale near Dallas, Texas, produced in 2007 as US shale gas production was ramping up. It's also just over a quarter of China's total natural gas consumption in 2012, including imported LNG.

To put that in perspective, if that quantity of SNG were converted to electricity in efficient combined-cycle plants, the output would be roughly double that of China's 75,000 MW of installed wind turbines in 2012, when wind generated around 2% of the country's electricity.
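A rough back-of-envelope check of that comparison, using my own ballpark assumptions (a heating value of about 1,000 Btu per cubic foot, ~55% combined-cycle efficiency, and total 2012 Chinese generation of roughly 5,000 TWh), lands close to the "roughly double" figure:

```python
# Back-of-envelope comparison of SNG-fired power with China's 2012 wind output.
# All inputs are round-number assumptions, not figures from the article.
SNG_BCF_PER_DAY = 3.5              # combined capacity of the nine SNG projects
HEAT_VALUE_BTU_PER_CF = 1_000      # assumed pipeline-gas heating value
CCGT_EFFICIENCY = 0.55             # assumed modern combined-cycle efficiency
BTU_PER_KWH = 3_412

thermal_kwh_per_day = SNG_BCF_PER_DAY * 1e9 * HEAT_VALUE_BTU_PER_CF / BTU_PER_KWH
electric_twh_per_year = thermal_kwh_per_day * CCGT_EFFICIENCY * 365 / 1e9
print(f"SNG-fired CCGT output: ~{electric_twh_per_year:.0f} TWh/yr")   # ~206 TWh

wind_twh_2012 = 0.02 * 5_000       # ~2% of roughly 5,000 TWh generated in 2012
print(f"Ratio to 2012 wind generation: ~{electric_twh_per_year / wind_twh_2012:.1f}x")  # ~2.1x
```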

The appeal of converting millions of tons a year of dirty coal into clean-burning natural gas, in facilities located far from China’s population centers, is clear. This strategy even has some similarities to one pursued by southern California’s utilities, which for years imported power from the big coal-fired plants at Four Corners.  For that matter, the gasification process has some key advantages over the standard coal power plant technologies in the ease with which criteria pollutants can be addressed. Generating power from coal-based SNG might actually reduce total criteria pollutants, rather than just relocating them.

However, wherever these plants are built they would add around 500 million metric tons per year of CO2, or around 5% of China’s 2012 emissions, a figure that dwarfs even the most pessimistic estimates of the emissions consequences of building the Keystone XL pipeline. That’s because the lifecycle emissions for SNG-generated power have been estimated at seven times those from natural gas, and 36-82% higher than simply burning the coal for power generation.

What could possibly lead China’s government to pursue such an option, in spite of widespread concerns about climate change and China’s own commitments to reduce the emissions intensity of its economy? Having lived in Los Angeles when it was still experiencing frequent first-stage smog alerts and occasional second-stage alerts, I have some sympathy for their problem. China’s air pollution causes even more serious health and economic impacts and has been blamed for over a million premature deaths each year. By comparison the consequences of greenhouse gas emissions are more indirect, remote and uncertain. Any rational system of governance would have to put a higher priority on air pollution at China’s current levels than on CO2 emissions.

It might even turn out to be a reasonable call on emissions, if China’s planners envision carbon capture and sequestration (CCS) becoming economical within the next decade. It’s much easier to capture high-purity, sequestration-ready CO2 from a gasifier than a pulverized coal power plant. (At one time I sold the 99% pure CO2 from the gasifier at what was then Texaco’s Los Angeles refinery to companies that produced food-grade dry ice.) It should also be much easier and cheaper to retrofit a gasifier for CCS than a power plant.

In a domestic context, the trade-off that China is choosing in converting coal into synthetic natural gas is understandable. However, that perspective is unlikely to be shared by other countries that won't benefit from the resulting improvement in local air quality and view China's rising CO2 emissions with alarm. I would be surprised if the emissions from SNG were factored into anyone's projections, and nine SNG plants could be just the camel's nose under the tent.

In an environment that the IEA has described as a potential Golden Age of Natural Gas, large-scale production of SNG could also constitute an unexpected wild card for energy markets. When added to China’s shale gas potential, it’s another trend for LNG developers and exporters in North America and elsewhere to monitor closely.

A different version of this posting was previously published on Energy Trends Insider.

Photo Credit: Coal to Synthetic Natural Gas in China/shutterstock

Authored by:

Geoffrey Styles

Geoffrey Styles is Managing Director of GSW Strategy Group, LLC, an energy and environmental strategy consulting firm. Since 2002 he has served as a consultant and advisor, helping organizations and executives address systems-level challenges. His industry experience includes 22 years at Texaco Inc., culminating in a senior position on Texaco's leadership team for strategy development, ...

