Can Flexible Generation Make for a Happy Marriage between Renewables and Natural Gas?

by Bob Shively, Enerdynamics President and Instructor

Renewable power and natural gas generation have something of a love-hate relationship. On one hand, they are symbiotic: variable renewable resources need flexible generation to pick up load when the wind doesn't blow or the sun doesn't shine, and other than limited hydro resources, gas generation is best suited to provide that flexibility. On the other hand, the renewable industry realizes that cheap gas generation makes it harder for renewables to compete on price.

Gas producers and generators have their own worries about renewable portfolio standards: It appears, at least in the short term, that gas units may be run less frequently to accommodate mandated renewable generation, without the gas generators being paid for the flexibility they provide to the system.

An excellent summary of the two ways of seeing the relationship is provided by Navigant’s Richard Smead in the paper Incorporating Gas Fired Generation and Renewables – Looking at the Whole Picture.

One way to bridge the gap is for gas power plant manufacturers to design specifically for use on a system with high penetrations of renewables. One example recently announced is the GE FlexEfficiency 50 Combined Cycle Power Plant.

GE FlexEfficiency 50 Combined Cycle Power Plant

This unit is designed to work efficiently both as a baseload power plant (one that runs most hours of the day) and as a unit that ramps up and down in response to renewable variability.

In the past, units designed for baseload use operated inefficiently when frequently ramped up and down, while units designed for frequent ramping were too expensive to run at baseload. The new design has a ramp rate of 51 MW per minute, compared to a typical 17 MW per minute for a traditional baseload combined-cycle unit. Yet the unit has an expected efficiency of 60%, compared to a typical 40% for traditional gas units used for frequent ramping. The result is a unit that integrates well with variable renewables.
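
To see what those two numbers mean in practice, here is a minimal sketch using only the ramp rates and efficiencies quoted above; the plant size and the MMBtu-per-MWh conversion are my illustrative assumptions, not GE specifications.

```python
# Minimal sketch comparing a flexible combined-cycle unit to a traditional
# ramping unit using the figures quoted above. The 510 MW capacity is an
# assumed plant size for illustration only.

UNIT_CAPACITY_MW = 510
MMBTU_PER_MWH = 3.412  # energy content of one MWh expressed in MMBtu

def ramp_time_minutes(capacity_mw, ramp_mw_per_min):
    """Minutes needed to ramp from zero to full output."""
    return capacity_mw / ramp_mw_per_min

def fuel_per_mwh(efficiency):
    """Fuel input (MMBtu) needed to generate one MWh at a given efficiency."""
    return MMBTU_PER_MWH / efficiency

units = {"FlexEfficiency-style unit": {"ramp": 51, "eff": 0.60},
         "Traditional ramping unit":  {"ramp": 17, "eff": 0.40}}

for name, u in units.items():
    print(f"{name}: full ramp in {ramp_time_minutes(UNIT_CAPACITY_MW, u['ramp']):.0f} min, "
          f"{fuel_per_mwh(u['eff']):.2f} MMBtu per MWh generated")
# Roughly 10 minutes versus 30 minutes to reach full output, and 5.69 versus
# 8.53 MMBtu of gas per MWh: the flexibility and efficiency gap described above.
```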

Unfortunately for those in the U.S., GE will first introduce the new units in Europe and China because those regions have stronger long-term policies supporting renewables. If we want to see the technology available here, it is up to policy makers, regulators, and market participants to make the market changes needed to assure power plant developers of a revenue stream that reflects the benefits they bring to the market. Only then can developers finance projects that use technologies such as the FlexEfficiency 50. When that occurs, we may truly see natural gas and renewables join in a symbiotic, happy relationship.


The Business of Wind: Top Wind Power Installations of Q1 2011

by Bob Shively, Enerdynamics President and Instructor

Wind power is big business in many states and is only growing bigger as market demand for renewable power grows. According to the American Wind Energy Association:

“The first quarter of 2011 saw over 1,100 megawatts (MW) of wind power capacity installed – more than double the capacity installed in the first quarter of 2010.  The U.S. wind industry had 40,181 MW of wind power capacity installed at the end of 2010, with 5,116 MW installed in 2010 alone.  The U.S. wind industry has added over 35% of all new generating capacity over the past 4 years, second only to natural gas, and more than nuclear and coal combined.”
(source: http://www.awea.org/learnabout/industry_stats/index.cfm)

So who is building all these projects and where are they being built? While Texas, Iowa, and California currently have the most installed wind capacity in the U.S., wind generation capacity grew the most in the first quarter of 2011 in Minnesota, Illinois, Washington, Idaho and Nebraska.

The biggest projects installed in the first quarter of 2011 include:

1 – Big Sky Wind Facility: Sited in the Midwest in the northern Illinois counties of Bureau and Lee, the Big Sky Wind Facility was the country’s largest project installation of Q1 2011 in terms of nameplate rating (source: American Wind Energy Association, 1st Quarter 2011 Market Report). Developed by Chicago-based Midwest Wind Energy and owned by Edison Mission Group, the Big Sky project has a capacity of 239.4 MW and comprises 114 Suzlon-manufactured turbines spanning 13,000 acres of elevated farmland. Each turbine is rated at approximately 2.1 MW.

Additionally, an 18-mile, 138kV transmission line developed and permitted by Midwest Wind Energy connects the Big Sky project to the transmission network. Wind resources in northern Illinois boast the unique advantage of close proximity to Chicago’s huge
electric load. This means less transmission investment is required to bring wind-generated power to the electric grid. According to the Illinois Wind Energy Association, “parts of Northern Illinois are on the PJM electric grid, a regional transmission organization serving a market of over 50 million people in 13 states. Illinois has the strongest winds in the PJM market, making Illinois projects very attractive to utilities in several states.”  Big Sky Wind plans to sell into the PJM marketplace as a merchant generator.

2 – Bent Tree Wind Farm: Southern Minnesota’s strong and consistent prairie winds feed the Bent Tree Wind Farm, which began Phase One of its commercial operation in February 2011 to supplement the power needs of those in neighboring Wisconsin. Bent Tree’s Phase One nameplate rating is 201.3 MW – enough to power about 50,000 homes. If and when subsequent phases are completed, the entire farm will have the potential to deliver 400 MW of wind power.
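
As a rough sanity check on the homes-served figure, the calculation below converts nameplate capacity into an estimate of homes powered; the capacity factor and average household consumption are my assumptions, not Alliant's published numbers.

```python
# Rough check (my assumptions, not Alliant's math) of the "about 50,000 homes"
# figure for a 201.3 MW nameplate rating.

NAMEPLATE_MW = 201.3
CAPACITY_FACTOR = 0.35            # assumed typical for a good prairie wind site
AVG_HOME_KWH_PER_YEAR = 11_000    # rough U.S. average annual household use

annual_mwh = NAMEPLATE_MW * CAPACITY_FACTOR * 8760
homes_served = annual_mwh * 1_000 / AVG_HOME_KWH_PER_YEAR
print(f"Estimated output: {annual_mwh:,.0f} MWh/yr, roughly {homes_served:,.0f} homes")
# With these assumptions the estimate lands near 56,000 homes, in the same
# ballpark as the ~50,000 quoted above.
```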

Sited near the town of Albert Lea (100 miles due south of Minneapolis), Bent Tree
spans 32,500 acres. The project was developed by Wind Capital Group and Alliant Energy and is owned and operated by Alliant Energy’s subsidiary Wisconsin Power and Light Company (WPL). Each of its current 122 wind turbines was manufactured by Vestas and carries a 1.65 MW rating. According to AlliantEnergy.com: “The addition of Bent Tree allows WPL to meet and exceed Wisconsin’s existing Renewable Portfolio Standard. … A decision on the remaining 200 MW of wind capacity will likely be driven by Renewable Portfolio Standards.” Given this statement, we can conclude that the project is selling its power to WPL through bilateral agreements so that WPL can meet its state Renewable Portfolio Standard (RPS) requirements.

3 – Juniper Canyon: South central Washington’s Klickitat County is home to the
Juniper Canyon Wind Farm, a 250 MW, two-phase energy project located on roughly 28,000 acres of undulating terrain between the towns of Bickleton and Roosevelt. The first phase of the project (Juniper Canyon I) was installed in Q1 2011 and comprises 63 2.4-MW turbines manufactured by Mitsubishi. This brings Juniper Canyon I’s total capacity to 151.2 MW. When complete, the Pacific Wind Development project will include up to 128 turbines and a total proposed generating capacity of up to 250 MW – enough to provide renewable energy to 75,000 area homes.

Pacific Wind Development is a wholly owned subsidiary of Iberdrola Renewables, Inc., which independently owns and operates Juniper Canyon I (and II, when the time comes). Additionally, Iberdrola is the independent builder, owner and operator of a transmission line to the Bonneville Power Administration’s (BPA) interconnection point. BPA is purchasing the power generated by Juniper Canyon I and transmitting it to BPA’s Rock Creek substation via 230kV transmission lines.

4 – Idaho Wind Partners: Idaho Wind Partners, a collaboration among GE Energy Financial Services, Exergy Development Group, Atlantic Power Corp. and Reunion Power, began operation on its $500 million, 11-site wind project earlier this year. The portfolio of wind farms is located in southern Idaho and has the capacity to deliver 183 MW of wind power using GE Energy-made turbines, each with a 1.5 MW rating (source: http://www.nawindpower.com/e107_plugins/content/content.php?content.7296). Eight of the wind farms are located in Hagerman, Idaho, while three others are 70 miles away in Burley. Despite the distance, all operate on a unified system and all deliver power to Idaho Power Co. as part of a 20-year agreement.


Texas Outages Due to Physical Problems, not Market Failure

by Greg Stark, Enerdynamics Instructor

Back in February, I wrote a blog post titled Why Does Cold Create Outages in Texas?, which examined the series of events leading up to and the possible reasons for central Texas’ rolling blackouts of Feb. 2, 2011. Since that cold (and dark) February day, the Electric Reliability Council of Texas (ERCOT) has spent time investigating the cause of the blackouts. Part of what ERCOT sought to determine was whether the blackouts resulted from a failure in the physical system or whether market participants’ manipulation strategies and/or abuses of market power played a role in the blackouts.

The investigation’s subsequent report from the Independent Market Monitor (IMM) for ERCOT found no market manipulation during the Feb. 2 blackouts. While the report doesn’t give a complete explanation of the specific events that caused the blackouts, it does provide interesting data on how much capacity tripped offline, how quickly, and what the reserve margins were at various times. I personally wish the report had better described the physical causes of those trips; that information has been hard to come by except in bits and pieces from multiple reports.

However, the report does contain several interesting points that I feel are worthy of note and discussion. These include:

  • The IMM analyzed whether physical withholding of capacity and/or unplanned outages (intentional or not) could have been profitable. Looking at some of the biggest generators, it found that any company with more than 10% of its capacity reduced, withheld, or lost during the Feb. 2 blackouts lost so much money from penalties and contractual commitments that it came nowhere close to recovering those losses through the higher market prices on Feb. 3.
  • The day-ahead market bids for Feb. 3 (the day after the blackouts) were closing during mid-morning on Feb. 2, when the system was in very poor shape, so the Feb. 3 day-ahead LMP prices reflected an assumption of scarcity. However, by the time Feb. 3 rolled around, the system had stabilized and real-time prices were well below the day-ahead prices. This demonstrates a working market, since prices reflected the conditions known at the time they were set (a simple settlement sketch follows this list).
  • A couple of 1500 MW baseload plants were scheduled for planned outages starting Feb. 1, and a 500 MW plant was scheduled for Feb. 3. Current rules don’t give ERCOT the authority to require a plant to delay its planned outage on short notice if a potential problem might exist in the next several days. Thus, ERCOT did not ask the 1500 MW units to delay their planned maintenance outages on Jan. 31. During the height of the problem, ERCOT contacted the owner of the 500 MW plant scheduled to go offline Feb. 3; the owner voluntarily delayed its outage.
  • ERCOT does have the authority to cancel transmission outages and did cancel a number of scheduled transmission outages starting on Jan. 31 in advance of the cold weather.
  • The report contains an interesting analysis of the performance of ERCOT’s energy-only market as opposed to the capacity market approach used in other ISOs such as PJM, ISO New England, and NYISO. The energy-only market depends on the high prices that occur during occasional supply shortages to provide revenues sufficient to incentivize generation owners to build enough capacity to meet peak loads. This is in contrast to capacity markets, in which generators receive an ongoing fixed payment for providing capacity. According to the IMM, the fact that prices rose to the $3,000 cap during the event indicates the market is working as intended from the standpoint of providing sufficient revenues to generators to ensure the capacity needed for reliability.
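
To make the day-ahead versus real-time point in the second bullet concrete, here is a minimal two-settlement sketch; the prices and volumes are purely illustrative and do not come from the IMM report.

```python
# Minimal sketch (illustrative numbers, not IMM data) of two-settlement pricing:
# day-ahead prices set under scarcity expectations, real-time prices much lower.

def settle(da_mwh, da_price, actual_mwh, rt_price):
    """Generator revenue under two-settlement: day-ahead schedule at the
    day-ahead price, plus deviations settled at the real-time price."""
    da_revenue = da_mwh * da_price
    deviation_revenue = (actual_mwh - da_mwh) * rt_price
    return da_revenue + deviation_revenue

# Hypothetical Feb. 3 hour: 100 MWh sold day-ahead at a scarcity-driven price,
# but real-time conditions have stabilized and prices are far lower.
print(settle(da_mwh=100, da_price=500.0, actual_mwh=100, rt_price=60.0))  # 50,000
print(settle(da_mwh=100, da_price=500.0, actual_mwh=80,  rt_price=60.0))  # 48,800
# A generator that delivers its full day-ahead schedule keeps the day-ahead
# price; shortfalls are bought back at the (lower) real-time price.
```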

So if the markets worked, no one manipulated them, and there was sufficient capacity, why did all the lights (and heaters) go off? Setting aside ERCOT’s lack of authority to require delays to planned generation outages, the answer is that physical systems need to be upgraded to handle colder temperatures if the once-a-decade-or-so outage is going to be avoided. Whether that makes financial sense is a discussion for another time. ERCOT experienced the same cold temperatures a week later on Feb. 9-10 without incident. By that time, most of the problem plants had added temporary heaters or implemented other mitigation strategies in the areas that experienced freezing.


Mother Nature’s Wrath: Could a Devastating Hurricane Season Wreak Havoc on Natural Gas Markets?

Hurricane Katrina in the Gulf of Mexico (image via Wikipedia)

by Cynthia Ellis, Enerdynamics Sales Manager, Texas/Southeast

Watching the recent news coverage of the devastating tornadoes and their unusual size and ferocity, I couldn’t help but wonder about the 2011 hurricane season. Is it possible that this year’s hurricanes could also be larger and more destructive than we’ve seen in recent years? If so, in addition to the human toll, what might that mean for offshore gas production? With the abundance of shale gas keeping natural gas prices in the $4.50 range, does a severe hurricane on the Gulf Coast mean higher prices? Certainly it could in the short term. And if such a catastrophe were to happen before, or near, the end of the normal injection season, what then?

In 2005, hurricanes Katrina and Rita came to call, delivering a devastating one-two punch. The Minerals Management Service (MMS) estimated that 3,050 of the Gulf’s 4,000 platforms and 22,000 of the 33,000 miles of Gulf pipelines were in the direct path of either Hurricane Katrina or Hurricane Rita. In addition, 47 major natural gas processing plants and 17 natural gas liquids fractionation sites located within the 70 counties and parishes along the Gulf Coast of Texas, Louisiana, Mississippi, and Alabama were threatened by the storms’ approach.

In its final shut-in statistics report on June 21, 2006, MMS estimated that about 9.4% of daily gas production remained shut in as of June 19, 2006. Cumulative shut-in gas production from August 26, 2005, through June 19, 2006, was 803.6 Bcf, which is equivalent to about 22% of yearly gas production in the Federal OCS in the Gulf. 
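
A quick back-of-the-envelope calculation shows what those shut-in figures imply about the scale of Gulf production; the only inputs are the two numbers quoted above, and the interpretation is mine rather than MMS's.

```python
# Back-of-the-envelope check (my arithmetic, using only the figures quoted above)
# of what the cumulative shut-in implies about federal OCS Gulf production.

cumulative_shut_in_bcf = 803.6      # Aug. 26, 2005 through June 19, 2006
share_of_annual_output = 0.22       # "about 22% of yearly gas production"

implied_annual_gulf_bcf = cumulative_shut_in_bcf / share_of_annual_output
print(f"Implied annual federal OCS Gulf production: ~{implied_annual_gulf_bcf:,.0f} Bcf")
# Roughly 3,650 Bcf per year, or about 10 Bcf per day, which shows why a storm
# season that shuts in Gulf output can move the national supply/demand balance.
```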

 In 2008, hurricanes Gustav and Ike struck the Gulf Coast with similar results.

So, is it good news or bad news for the 2011 hurricane season? According to meteorologists, 2011 is mirroring the climatic conditions of 2008. In his 2011 Hurricane Season briefing, President Obama was told that there is a 70% chance of three to six major hurricanes between now and the end of November. Colorado State University meteorologists Philip J. Klotzbach and William M. Gray conclude that “Overall, conditions remain conducive for a very active hurricane season.” And as I write this, the first low-pressure system of the season is forming in the Caribbean.

So, it appears that we could have an active hurricane season. If that occurs, whether the hurricanes affect Gulf production depends on a number of factors, including the strength of the storms and whether they track over production areas or move in a different direction. Natural gas futures prices for the summer and fall months have trended upward recently, and one likely factor is the strong hurricane forecast. But the big unknown is whether the abundance of shale gas can provide supply that quickly comes online to replace any lost Gulf production. If so, any storm-induced price spikes may be short-lived: once it becomes clear to the market that replacement gas is available, prices could quickly fall back into the $4 range.


Japan Looks to Renewable Energy Technology to Maintain, Rebuild in the Wake of Disaster

by Dan Bihn, Enerdynamics Instructor

Aftershocks are still being felt in Tokyo more than two months after the magnitude 9.0 earthquake and tsunami that devastated the coastline of northeastern Japan. But the repercussions of the subsequent meltdown of the Fukushima nuclear power station are being felt even further away and will likely persist a lot longer.

The Japanese government recently announced plans to update its long-term energy policy “starting from a blank piece of paper.” There’s lots of talk about shifting to more renewable energy. Since less than 2% of Japanese electricity comes from non-hydro renewables, this shift would definitely be a long-term policy. Currently, Japan is the third-largest consumer of nuclear power in the world, just behind the United States and France, according to the EIA.

Japan's Electricity Consumption, 2009 (source: EIA)

The short-term policy is to get through this summer with as few rolling blackouts as possible. At the beginning of 2011, the summer peak load was projected to exceed 55,000 MW. But this summer’s peak load will only be about 45,000 MW – even if it turns out to be a very hot summer. That’s all the generation Japan has; load will be limited.
The only question is, will the load be limited by a few substation breakers or by millions of household switches and thermostats? I’m betting on the people. They’ll be motivated by a compelling price incentive: reduce peak demand and your company will have power. That’s the big stick the grid operators have. But little carrots are starting to pop up everywhere, and some are even candy-coated. It’s not just about rewarding people for their energy efficient behavior. It’s about giving them something cool to buy and show off while positively contributing to a national energy solution.

Such “cool” things include energy-efficient products like LED light bulbs, intelligent power strips and LCD TVs with a “Peak Shift” feature that charges an internal battery at night and then, with a click of the remote, disconnects from the grid during the peak hours of the day. On May 11th, Mitsubishi unveiled a prototype ‘model home’ that features rooftop PV (photovoltaic solar electric), a Home Energy Management System (HEMS), and a host of smart appliances. And this could be exactly how Japan rebuilds its infrastructure – one smart home at a time.

So, how can such innovations make a difference? With a traditional system, if you just put PV on your roof, you’ll reduce your peak most of the time. But when a cloud passes overhead, your air conditioner will still think it’s hot out and keep running. The net effect is that for a few minutes your house will put a big load on the grid. But Japan no longer has enough capacity for that kind of peak. So PV panels by themselves don’t help much.

That’s where the HEMS comes in. When a cloud passes overhead, your PV generation still drops dramatically, but now the HEMS immediately turns off your air conditioner compressor and maybe the refrigerator compressor. Result? Smaller peak. If the sun comes back out, things return to normal. If the sun doesn’t, the house will start cooling off naturally, so the air conditioner won’t be needed as much. Still no peak.
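
The behavior described above amounts to a simple control loop. Here is a minimal sketch of that logic; the device names, power draws, and the half-of-rated-output threshold are my assumptions for illustration, not details of any actual HEMS product.

```python
# Minimal sketch of the HEMS behavior described above: when PV output sags
# under a passing cloud, shed deferrable loads so net grid draw stays small.
# All device names, kW values, and the shedding threshold are assumptions.

DEFERRABLE_LOADS_KW = {"ac_compressor": 2.5, "fridge_compressor": 0.3}

def hems_shed_list(pv_output_kw, pv_rated_kw):
    """Shed deferrable loads whenever PV drops below half its rated output."""
    if pv_output_kw < 0.5 * pv_rated_kw:
        return list(DEFERRABLE_LOADS_KW)
    return []

def net_grid_draw(base_load_kw, pv_output_kw, shed):
    """Net household demand on the grid after PV output and load shedding."""
    load = base_load_kw - sum(DEFERRABLE_LOADS_KW[name] for name in shed)
    return max(load - pv_output_kw, 0.0)

base_kw, pv_rated_kw = 4.0, 3.5
for pv_now in (3.2, 0.8):                     # full sun, then a passing cloud
    shed = hems_shed_list(pv_now, pv_rated_kw)
    print(f"PV {pv_now} kW: shed {shed or 'nothing'}, "
          f"grid draw {net_grid_draw(base_kw, pv_now, shed):.1f} kW")
# Full sun: nothing shed, about 0.8 kW from the grid.
# Cloud: both compressors shed, about 0.4 kW from the grid instead of 3.2 kW.
```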

So the grid stays on – and Japan doesn’t need to build a lot of new fossil fuel power plants that run on imported fuel. And there is a big increase in consumer spending to help the economy through these challenging times. Of course, a smart grid with dynamic pricing and a lot of price-responsive loads can achieve the same basic load-flattening results. But somehow a PV system connected to a home energy management system seems a lot simpler. And consumers respond MUCH faster than bureaucracies.


Is the Era of Abundant, Long-term, Low-cost Natural Gas Truly Here? Part II

by Bob Shively, Enerdynamics President

In last week’s blog post (Is the Era of Abundant, Long-term, Low-cost Natural Gas Truly Here? Part I) I discussed the price volatility that has impeded growth of the natural gas industry and how many in the industry feel natural gas has finally reached a stable condition of long-term abundant supply and low prices. But that may or may not be reality. Let’s examine the dialogue and data behind the future of natural gas markets:

Low Prices Are Here to Stay

In the last few years, U.S. producers have begun to successfully exploit new shale gas resources. U.S. natural gas production in 2010 was at its highest level since 1973, and this year appears likely to hit an all-time high. A recent study by the Potential Gas Committee suggests the U.S. total natural gas resource base is 1,898 Tcf, which at current rates of use would last 80 years. Natural gas consumption in 2010 was at an all-time high, yet prices have remained low. While the number of rigs drilling for gas fell precipitously in 2009, it has started to creep back up. And total current gas reserves are approaching their all-time high. These factors have many predicting that we are in for at least a decade of low gas prices.

U.S. Dry Natural Gas Production in MMcf

U.S. Natural Gas Total Consumption in MMcf

U.S. Natural Gas Rotary Rigs in Operation

U.S. Dry Natural Gas Proved Reserves in Billion Cubic Feet

Source for above graphs: http://www.eia.doe.gov

Not So Fast

So is there anything looming that could turn the optimistic tide? In a word, yes. We need to realize that the 80 years of gas supply noted above is the estimated total resource base*, not reserves. If a standard of current proved reserves is used, we probably have about 12 years' worth of consumption in the U.S. This estimate isn't much higher than the 10 years that was oft-quoted during the period of high prices in the mid-2000s.
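
The 80-year and 12-year figures come from the same consumption rate applied to very different denominators. A rough reproduction is below; the annual consumption and proved-reserves inputs are my own approximations for that period.

```python
# Rough reproduction (my arithmetic and approximations) of the "80 years"
# resource-base number versus a proved-reserves view of supply.

ANNUAL_US_CONSUMPTION_TCF = 24      # approximate 2010 U.S. consumption, assumed
resource_base_tcf = 1_898           # Potential Gas Committee assessment cited above
proved_reserves_tcf = 280           # assumed, roughly the proved-reserves figure of that era

print(f"Resource base:   {resource_base_tcf / ANNUAL_US_CONSUMPTION_TCF:.0f} years of supply")
print(f"Proved reserves: {proved_reserves_tcf / ANNUAL_US_CONSUMPTION_TCF:.0f} years of supply")
# About 79 years versus about 12 years: the same consumption rate, very different
# denominators, which is exactly the distinction the footnote below draws.
```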

In the case of shale gas, some industry experts maintain that many producers are making an insufficient profit at current prices.  Drilling has continued partly because of investments by international players who have motives beyond pure profits (see U.S. Becoming Natural Gas Exporter? in Q1 2011 Energy Insider) and partly because the value of natural gas liquids has producers drilling based on selling liquids, not gas (see http://www.bentekenergy.com/InTheNews.aspx#Article3094).  So drilling could fall and with it, supply.  And while I believe that environmental impacts of drilling techniques such as fracking are manageable, there is certainly the possibility that public perception won’t come to that conclusion and that pressure to reduce drilling will grow.

As mentioned, demand is at all-time highs. Some major industrial consumers have recently indicated plans to ramp up consumption of natural gas, and natural gas generation capacity is expected to increase by 8% in the four years from 2010 to 2014, according to the EIA. Proposed LNG export terminals, which would access much higher-priced gas markets in Asia and Europe, could add a significant chunk of demand to the market. These factors, coupled with the production issues described above, suggest that in another year or two the U.S. could face falling production and rapidly rising demand. And we all know what that does to prices!
 
So What to Do?

Certainly, history tells us natural gas goes boom and bust, and that these cycles can happen quickly. We've taught in our classes for years that your best position is to accept that you can't know whether conventional wisdom is indeed wise. It is better to plan for uncertainty than for a certain market outlook. Wise market participants develop business strategies that are well enough hedged that they can at least survive, and hopefully do well, in a boom, a bust, or something in between.

Examples include Xcel Energy, which recently began to replace existing coal generation with over 500 MW of natural gas combined-cycle capacity. To protect ratepayers against future gas price increases while taking advantage of current low prices, Xcel signed a 10-year gas supply deal with Anadarko at an estimated price of $5.48/MMBtu. Another is major shale gas producer Chesapeake, which is actively seeking new markets such as LNG exports and even gas-to-liquids (GTL) (see http://www.pennenergy.com). These strategies seem much better than simply going along for the ride and letting fate determine the future.
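
To illustrate what a fixed-price deal like that does for a buyer, here is a simple comparison under a few future spot-price scenarios; the annual volume and the spot prices are assumptions for illustration, and only the $5.48/MMBtu figure comes from this post.

```python
# Simple illustration (assumed volume and spot prices) of what a fixed-price
# supply deal does to fuel-cost exposure. Only the $5.48 figure is from the post.

FIXED_PRICE = 5.48                  # $/MMBtu, the reported Anadarko deal price
ANNUAL_VOLUME_MMBTU = 10_000_000    # assumed annual volume for illustration

for spot in (4.00, 5.48, 9.00):     # possible future spot prices, $/MMBtu
    fixed_cost = FIXED_PRICE * ANNUAL_VOLUME_MMBTU
    spot_cost = spot * ANNUAL_VOLUME_MMBTU
    print(f"spot ${spot:.2f}/MMBtu: fixed deal vs. buying spot = "
          f"{fixed_cost - spot_cost:+,.0f} dollars per year")
# The deal costs more than spot if prices stay low, but caps the exposure if
# prices return to mid-2000s levels: the "survive a boom or a bust" hedge
# described above.
```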

* Resource base is the assumed total amount of gas, discovered or undiscovered, that can reasonably be expected to exist, while reserves are the estimated quantity of gas that analysis of geologic and engineering data demonstrates with reasonable certainty to be recoverable under existing economic and operating conditions.

 


Is the Era of Abundant Long-term Low-cost Natural Gas Truly Here? Part I

by Bob Shively, Enerdynamics’ President

In the mid-1990s, I attended a speaking engagement by Larry Bickle, then CEO of gas marketing company Tejas Power. In a room full of natural gas producers and marketers, Bickle asked, “How many people think high gas prices are good?” When almost all the hands in the room shot up, Bickle said he thought they were all wrong. What was needed, Bickle argued, were long-term moderate gas prices that would allow gas to build its market share and become the dominant fuel choice. Unfortunately, the desire for stable prices was wishful thinking.

Since 1997, U.S. Henry Hub monthly spot prices have run as low as $1.72/MMBtu and as high as $13.42/MMBtu.  But what we have not had is a stable price that facilitates long-term planning.  As a result, any consumers considering a long-term investment with a value dependent on natural gas prices must first face the question of whether gas price volatility makes the investment too risky.

Henry Hub Monthly Spot Prices Since 1997

In May 2008, Henry Hub spot prices were $11.27/MMBtu. The market perception was that we had entered an era of tight gas supply and high demand, and that the market needed to learn to live with prices at these levels. Numerous companies rushed to complete multi-billion-dollar investments in liquefied natural gas (LNG) import terminals so that cheaper international sources of gas could be moved into the U.S. And many electric utility companies began planning for new coal generation on the assumption that gas was too costly. The Energy Information Administration (EIA) projected that annual average prices over the next decade would range from $6 to $7.50 and that higher prices would hold down gas demand. Futures prices for the next year were in the range of $9 to $11.

Three short years, one economic recession, and a shale gas boom later, Henry Hub spot prices started the month (May 2011) at $4.69/MMBtu. The market perception is that we now have 100 years of natural gas supply in the U.S. and that gas prices will remain low for a long time. Companies that rushed to complete LNG import terminals are now applying to FERC to convert them to export terminals to send U.S. gas to Europe, Asia, and South America. Electric utility companies have halted almost all coal units not already under construction and have moved quickly to favor gas units instead. The EIA projects annual average prices over the next decade will range from only $4.48 to $4.87. Forecasters predict natural gas demand will grow steadily by just under 1% per year. Futures prices for the next year are in the range of $4.70 to $5.30.
 
So which is it?  Should we be building gas-fired power plants and converting our cars to natural gas? Or should we be stocking up now on cheap gas to profit when the inevitable future high prices hit?  It’s a great question, and, to be honest, no one knows the answer. Certainly no one I’ve heard is talking about $10 gas anytime soon.  But despite all the folks talking about long-term $4 to $5 gas, there are a few others suggesting that a $6 to $8 range may not be so far-fetched. 

In next week’s blog post, I’ll continue this discussion by summarizing some of the data on both sides of the argument.


Can Solar Power on Our Rooftops Compete with Existing Generation on Price?

by Bob Shively, Enerdynamics President

Imagine a world in which photovoltaic (PV) solar power installed on the rooftop of your house was a cheaper source of power than the existing fleet of coal, natural gas, nuclear and hydro generating units. 

Such a reality would fundamentally change the world of electricity: Distribution utilities would become two-way networks that moved power among consumers and into storage during the day, and then from central units to consumers at night.  Many existing generation units built to meet peak loads on hot sunny summer afternoons would no longer be needed except as backups for cloudy days.  New transmission lines would not be required since power would be generated at the point of use.  Emissions caused by electric generation would fall significantly. And electric service providers would go from a focus on buying and selling a commodity to providing financing and maintenance on PV units.

But before getting too deep into your imagination, we need to ask – is this even feasible, or is it just a fantasy?  After all, I’ve been hearing about the promise of PVs ever since I joined the electric industry in 1982! 

The average cost of retail power in the U.S. was 9.88 cents/kWh in 2010, according to the Energy Information Administration. This ranged from lows just above 6 cents/kWh in Wyoming to highs above 17 cents/kWh in Connecticut*. Current levelized costs for PV systems are around 20 to 24 cents/kWh. But they have come down a lot, by about 50% since 2004. And since PVs are electronic in nature, many in the research and development field believe that costs can keep coming down, just as they have for computers.

PV costs per kWh

In fact, the Department of Energy recently announced the SunShot program with the goal of reducing the cost of installed PV systems to $1/watt by 2017. This is equivalent to about $0.06/kWh without any subsidies. The program envisions achieving the target through cost reductions in cell manufacturing, installation, and power electronics, as well as improvements in cell efficiency. In reading the details (available here: http://www1.eere.energy.gov/solar/sunshot/about.html) and talking to experts throughout the industry, it does indeed seem feasible. Lots of brilliant minds, not just in the U.S. but in places like China, Japan, Korea, and Europe, are working on it.
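
How does $1/watt installed translate into roughly $0.06/kWh? A simplified levelized-cost calculation shows the arithmetic; the capacity factor, lifetime, discount rate, and the exclusion of O&M are my simplifying assumptions, not the SunShot program's methodology.

```python
# Simplified levelized-cost sketch of the $1/watt target quoted above.
# Capacity factor, discount rate, lifetime, and the omission of O&M costs
# are illustrative assumptions, not SunShot methodology.

def crf(rate, years):
    """Capital recovery factor: annual payment per $1 of upfront capital."""
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def simple_lcoe(cost_per_watt, capacity_factor, rate=0.07, years=25):
    """Levelized cost of energy in $/kWh (capital only, no O&M)."""
    annual_cost_per_kw = cost_per_watt * 1_000 * crf(rate, years)
    annual_kwh_per_kw = capacity_factor * 8760
    return annual_cost_per_kw / annual_kwh_per_kw

print(f"$1.00/W installed: {simple_lcoe(1.00, 0.18):.3f} $/kWh")   # about 0.05
print(f"$3.50/W installed: {simple_lcoe(3.50, 0.18):.3f} $/kWh")   # about 0.19
# At an assumed ~$3.50/W (roughly 2011 installed costs) this lands near the
# 20-24 cent range quoted earlier; at the $1/W target it falls to 5-6 cents.
```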

Does this mean it will happen?  Of course not.  But it might.  And if it does, it will fundamentally change our business.

*We are ignoring the even higher cost state of Hawaii which is a unique case.

Are Capacity Markets Necessary in Competitive Wholesale Markets?

By Bob Shively, Enerdynamics’ President

At a recent Enerdynamics seminar, I was asked if capacity markets are necessary in a competitive wholesale market.  With Entergy planning to join MISO, competitive wholesale markets are becoming an increasingly large portion of the U.S. electric business, so it is worthwhile to address this question for all of our readers.

A key principle in the design and operation of electric infrastructure is ensuring a high level of reliability. One element of this is ensuring there is enough generation available to cover demand peaks. Under the traditional market design of a vertically integrated monopoly utility, sufficient generation capacity is ensured by a regulatory requirement called resource adequacy.

Utilities are required to file resource plans with the state. These plans describe how the utility will serve projected demand peaks and propose any new construction needed to cover load growth. State regulators review the plans and approve construction of new generation as needed. The costs of new units are put into the utility's ratebase, so those costs plus a reasonable profit are recovered through customer rates. This process generally works well from a reliability standpoint, and capacity-related power outages have become rare.
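
The resource-adequacy arithmetic behind such a plan is straightforward. Here is a bare-bones sketch, with the peak load, capacity, growth rate, and reserve-margin requirement all assumed for illustration rather than taken from any actual filing.

```python
# Bare-bones sketch of resource-adequacy math: does existing capacity cover the
# projected peak plus a planning reserve margin? All numbers are illustrative.

PLANNING_RESERVE_MARGIN = 0.15     # assumed 15% requirement

def capacity_need_mw(projected_peak_mw, existing_capacity_mw):
    """New capacity (MW) needed to meet peak plus reserve margin, if any."""
    required = projected_peak_mw * (1 + PLANNING_RESERVE_MARGIN)
    return max(required - existing_capacity_mw, 0)

# Hypothetical utility: 10,000 MW peak growing 2%/yr, 12,000 MW of capacity today.
peak, capacity = 10_000, 12_000
for year in range(2012, 2017):
    peak *= 1.02
    print(year, f"new capacity needed: {capacity_need_mw(peak, capacity):,.0f} MW")
# The first year the need turns positive is when the plan must propose new
# construction (or purchases) for regulators to review.
```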

ISO RTO operating regions

But with the movement to competitive wholesale markets run by an ISO, states that broke up the vertically integrated utility no longer have control over generation construction. New generation is built by merchant generators, who only build new units when they believe wholesale power prices will support a solid return for shareholders. And if they make a mistake and build unneeded capacity, shareholders – not ratepayers – are on the hook for unrecovered costs.

For baseload units that run most hours of the year, competitive markets are friendly.  The units run and get paid the market price, which usually provides a profit since the market energy price is determined by the most expensive unit dispatched.  The issue becomes more critical when a market requires new peaking units.  These units may only run 100 hours out of the year, and in a cool summer they may not run at all.  That means if the unit owners are solely dependent upon energy revenues, those units will be losers in many years.  While they may make high profits when prices spike during shortages, such opportunities may be few and far between. 

The alternative is to provide an additional revenue stream through a capacity market in which generators are paid up front to be available should the market need their capacity at some point during the year.  With this revenue stream, new capacity can get financing to build, and shareholders are assured of at least getting some return on their investment. 
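
To see why a capacity payment matters for peaker economics, here is a stylized annual revenue comparison; all of the numbers (fixed cost, run hours, prices, and the capacity price) are assumptions chosen only to illustrate the argument in the preceding paragraphs.

```python
# Stylized comparison (all numbers assumed) of annual revenue for a new peaker
# under an energy-only market versus a market with a capacity payment.

FIXED_COST_PER_KW_YR = 70.0        # assumed annualized cost of new peaking capacity
CAPACITY_MW = 100
MARGINAL_COST = 70.0               # assumed fuel cost, $/MWh

def energy_margin(hours_run, price):
    """Operating margin ($) from energy sales alone."""
    return max(price - MARGINAL_COST, 0) * CAPACITY_MW * hours_run

fixed_cost = FIXED_COST_PER_KW_YR * CAPACITY_MW * 1_000

# A "normal" year: 100 run-hours at moderately high prices, no scarcity spike.
normal = energy_margin(100, 200.0)
# A scarcity year: 30 additional hours at a $3,000/MWh cap.
scarcity = normal + energy_margin(30, 3_000.0)
# A normal year plus an assumed $60/kW-yr capacity payment.
with_capacity = normal + 60.0 * CAPACITY_MW * 1_000

for label, rev in [("energy-only, normal year", normal),
                   ("energy-only, scarcity year", scarcity),
                   ("capacity market, normal year", with_capacity)]:
    print(f"{label}: revenue ${rev:,.0f} vs fixed cost ${fixed_cost:,.0f}")
# Energy-only revenue covers fixed costs only in years with enough scarcity
# hours; a capacity payment provides a steadier path to cost recovery.
```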

Regions like PJM, New York, and New England, where capacity has been tight, have relied on capacity markets. Others that went into deregulation with significant capacity overhangs, such as MISO and ERCOT, have gotten by with energy-only markets. And California, after finding its energy-only market resulted in rolling blackouts, returned to state-run resource adequacy.

So are capacity markets necessary? I'd say that unless you have a mechanism that allows state-run resource adequacy to ensure sufficient capacity, they eventually become necessary. And even where state regulation is available, I'd contend that competitive capacity markets are a better way to ensure capacity than depending on regulators to decide what gets built.


Was Rockies Express a Mistake? Part II

By Belinda Petty, Enerdynamics Instructor

As I discussed in Part I of this article, Rockies Express, or REX, was slated to be among the biggest, most impactful pipeline projects in U.S. natural gas history. The project, which commenced in 2004, was intended to alleviate the problem of gas trapped in the low-price Rockies region by moving huge volumes of pent-up supply toward the East Coast – the highest-priced market in the U.S. But a combination of construction delays, cost overruns, and a boom in Marcellus shale production on the East Coast turned Rockies producers' high hopes into potential failure. So what happened? And was REX a mistake?

The majority owner and operator, Kinder Morgan Energy Partners, was able to hedge both steel and certain labor costs associated with REX. Owning the pipeline would have been a financial disaster without this cost containment. Even so, right-of-way issues and rising skilled-labor costs were big factors in cost overruns.

Lawsuits combined with local and state regulatory delays were very costly in terms of extra time, dollars, and public image. Instead of partnering with stakeholders like city officials and landowners, it appears as if REX rammed the project through. While no one ever expects 100% stakeholder approval, the reputation REX earned has left a negative industry impression with some stakeholder groups. Public/customer relations could have been better managed.

From an asset owner's perspective, the project has been moderately successful. The pipeline was fully subscribed, meaning the owners have locked in the reservation rate as a minimum cash flow for the first 10 years of operation. Thus shippers who signed firm contracts, not the pipeline owners, carried much of the risk. However, given the lack of a price differential to the delivery markets, the pipeline is not being fully utilized today, and the owners are not fully recovering expected profits since revenues associated with flowing gas are not being realized.

Street forecasts do not change this picture for the next several years. In fact, one of the minority owners, ConocoPhillips, recently attempted to sell its 25% ownership but withdrew the offer for lack of interest. The expectation for the near and intermediate future is that price differentials will not support the $1.10/Dth transportation rate. As the original firm contracts expire or are released, the asset owners can expect cash flow to drop.

For Rockies producers/shippers, REX has provided a much-needed outlet to ship and sell production away from the local market. Initially, gas sellers were able to capture a higher market price in the Midwest and East that more than paid for their pipeline cost. However, as Marcellus production has ramped up, that price differential (known as basis) between the Rockies and the Midwest/East has shrunk. Today, on most days the basis covers the variable pipeline cost plus only a portion of the reservation cost. This means shippers do not collect enough extra value to cover the reservation charge they are obligated to pay.
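
A simple shipper's-eye calculation makes the point concrete; the basis values and the variable-cost split are assumptions for illustration, and only the $1.10/Dth rate comes from this article.

```python
# Simplified shipper's view (illustrative prices; only the $1.10/Dth rate is
# from the article) of whether the Rockies-to-East basis covers REX transport.

FULL_RATE = 1.10        # $/Dth, reservation plus variable, as quoted above
VARIABLE_COST = 0.10    # $/Dth, assumed fuel and commodity portion

def shipper_margin(basis):
    """Margin per Dth after all transport costs, and after variable cost only."""
    return basis - FULL_RATE, basis - VARIABLE_COST

for basis in (1.50, 0.35):   # early-period basis vs. post-Marcellus basis (assumed)
    full, variable_only = shipper_margin(basis)
    print(f"basis ${basis:.2f}: vs. full cost {full:+.2f}, vs. variable cost {variable_only:+.2f}")
# At a wide basis the full rate is covered with room to spare. At a narrow basis
# it still pays to flow gas (basis exceeds variable cost), but the reservation
# charge becomes a sunk loss, which is the situation described above.
```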

So, even though gas can move to market, that elusive higher netback has not materialized. While the pipeline may have contributed to higher prices in the Rockies than would otherwise have occurred, it also has contributed significantly to a collapse in differentials throughout the U.S. While it has given Rockies producers a big new option, the cost of transportation into a production-rich market is prohibitive. On one hand, REX has been a godsend to producers. On the other hand, REX has been just another margin bubble that has burst.

If REX had been completed just two years earlier, Rockies producers/shippers and pipeline owners would have been rewarded by capturing huge netback premiums while building market share in the eastern market. If REX had been proposed two years later, it most likely would never have been built. Since gas markets can change rapidly, the jury is still out on whether REX will be deemed a success or a very expensive mistake. But shippers on REX shouldn't feel singled out; a number of new LNG terminals in the U.S. are heavily underutilized as well.
