Court Halts Implementation of New CSAPR Power Plant Emissions Rules

by Bob Shively, Enerdynamics President and Lead Instructor

In our latest issue of Energy Insider I examined the proposed new regulations on U.S. power plant emissions in 2012 and the dramatic effect they may have on the energy business. On December 30, 2011, just two days before the new rules were to go into effect, the United States Court of Appeals for the D.C. Circuit issued a ruling staying the CSAPR pending judicial review (see http://www.epa.gov/crossstaterule/pdfs/CourtDecision.pdf). This means the CSAPR rules cannot go into effect until a further ruling by the court. The court asked the parties to submit proposed schedules that would allow hearings to be held in April 2012, but it is uncertain when a decision might be rendered. In the meantime, regulation of SO2 and NOx reverts to the existing Clean Air Interstate Rule (CAIR). For details on CAIR see http://www.epa.gov/cair/index.html.

The court decision creates major uncertainty in both electricity and natural gas markets. A trading program that was set to go into effect for power plants in 28 states is now on hold. Coal units that seemed sure to be closed gain a new, although possibly temporary, lease on life. And gas units that planned on running more hours to replace coal now may not find the markets so favorable. The decision also creates uncertainty as to whether the recently announced Mercury and Air Toxics Standards (MATS) (see http://www.epa.gov/mats/basic.html) will be able to survive court challenges. The result is that all market participants must make decisions under a very uncertain future, and long-term decision making will again fall by the wayside.


Fuel Your Career in 2012 with Energy Business Training

Happy New Year! Enerdynamics is proud to announce its lineup of 2012 public seminars aimed at helping those in the energy industry thrive in today’s uncertain energy environment.

With a comprehensive understanding of the energy business and an industry-savvy staff, Enerdynamics has helped thousands of people develop a better understanding of today’s fast-paced gas and electric industries. Our courses are carefully crafted, blending compelling graphics, interactive market simulations and participation from all attendees to create a positive and valuable learning experience. As industry professionals, our instructors offer a depth of understanding not found in other classes of this kind.

Below is our 2012 course schedule (click on each course link for details and pricing). Want some great ways to save on registration fees? Register at least three weeks prior to each seminar to receive our $200 early-bird discount, and like us on Facebook (www.Facebook.com/Enerdynamics) to get a promo code for an additional 20% off your seminar registration fee.

Enerdynamics 2012 Public Seminar Schedule

Gas and Electric Business Understanding
March 5-6, New York, NY
Click here for details and registration

Electric Market Dynamics
March 7-8, New York, NY
October 17-18, Chicago, IL
Click here for details and registration

Electric Business Understanding
April 16-17, Chicago, IL
May 23-24, Houston, TX
October 15-16, Chicago, IL
December 5-6, Washington, D.C.
Click here for details and registration

Gas Market Dynamics
April 18-19, Chicago, IL
Click here for details and registration

Gas Business Understanding
May 21-22, Houston, TX
December 3-4, Washington, D.C.
Click here for details and registration

For additional details or questions, contact us at www.enerdynamics.com, info@enerdynamics.com or 866-765-5432.


All Gas Companies are Affected by the San Bruno Pipeline Rupture

by Bob Shively, Enerdynamics President and Lead Instructor

Safety and emergency response are critical to pipeline and distribution operations.  In general, the industry has a very strong safety record.  But every so often accidents do occur, and the consequences can be dramatic.

One recent example of such an accident occurred Sept. 9, 2010, in San Bruno, Calif., when a portion of the 30-inch diameter underground natural gas transmission system owned by Pacific Gas and Electric Company (PG&E) suddenly ruptured.

According to a California Public Utilities Commission (CPUC) report on the incident: “An explosion ensued, fueled by blowing natural gas. The explosion and fire resulted in the loss of eight lives and the total destruction of 38 homes. Seventy homes sustained damage and eighteen homes adjacent to the destroyed dwellings were left uninhabitable.”[1]

An investigation by the National Transportation Safety Board (NTSB)[2] identified several factors contributing to the incident:

  • The original pipe, installed in 1956, contained a substandard and poorly welded section with a seam weld flaw that led to a crack in the pipe
  • Despite the flaw, the pipeline functioned normally from the time of its installation until the incident
  • The crack was not discovered during standard pipeline testing because the nature of the pipeline prevented use of smart pigs; the line was also never hydrostatically tested because such testing was not required under grandfathered provisions in CPUC and U.S. Department of Transportation regulations
  • The pipe was progressively weakened over time by crack growth until it ruptured at a pressure below the Maximum Allowable Operating Pressure (MAOP)
  • Concurrent electrical work at a nearby gas terminal caused limitations in the pipeline’s SCADA system, making it difficult for operators to respond promptly to the incident
  • A lack of automatic or remote-controlled shut-off valves contributed to a 95-minute delay in crews’ ability to manually shut off the pipeline

Mitigating Risk
Unfortunately, with 2.2 million miles[3] of natural gas pipelines in the U.S., much of it installed many years ago, it is likely that numerous other flawed pipes lie buried underground – accidents waiting to happen. The lead regulator for pipeline safety is the Office of Pipeline Safety (OPS) within the U.S. Department of Transportation. State agencies often also take a role in regulation, either implementing state safety legislation or working as agents for the OPS. As part of OPS regulation, transmission and distribution pipelines must implement a Pipeline Integrity Management program. These programs provide for identification of pipe located in high-consequence areas – those with high risk of human injury or property damage if an accident occurs. Pipe in high-consequence areas must then be assessed to determine its condition, the threats it faces, and the consequences should an incident occur. Pipeline operators must then develop specific mitigation plans to deal with identified threats[4]. A simplified sketch of this prioritization logic appears below.
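
To make the process concrete, here is a minimal Python sketch of how an integrity program might rank segments for assessment. The segment names, scores, and risk formula are hypothetical illustrations, not actual OPS criteria.

```python
# Hypothetical sketch of risk ranking in a Pipeline Integrity Management
# program. Segment data and the risk formula are invented for illustration;
# real programs follow detailed OPS regulations.

from dataclasses import dataclass

@dataclass
class PipeSegment:
    name: str
    high_consequence_area: bool  # near homes, businesses, gathering places
    likelihood: float            # relative chance of failure (0-1), from inspections
    consequence: float           # relative severity if the segment fails (0-1)

    @property
    def risk_score(self) -> float:
        # A common simplification: risk = likelihood x consequence
        return self.likelihood * self.consequence

segments = [
    PipeSegment("Segment A (suburban, 1956 vintage)", True, 0.30, 0.95),
    PipeSegment("Segment B (rural)", False, 0.40, 0.20),
    PipeSegment("Segment C (urban, recently inspected)", True, 0.10, 0.90),
]

# Segments in high-consequence areas are assessed first, highest risk on top
for seg in sorted(
    (s for s in segments if s.high_consequence_area),
    key=lambda s: s.risk_score,
    reverse=True,
):
    print(f"{seg.name}: risk score {seg.risk_score:.2f} -> schedule assessment")
```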

In the PG&E case, all of the above steps were followed according to regulation; however, key facts about the pipe’s condition were unknown. This resulted in a mitigation strategy that proved inadequate.

A Safer Future?
So what can be done to avoid incidents like San Bruno in the future? The answer really comes down to weighing safety against costs. Safety could be greatly enhanced by replacing all pipelines older than a certain age, but it’s highly doubtful consumers would be willing to pay the resulting higher cost of natural gas. Instead, lawmakers and regulators must decide how best to balance consumers’ desire for low gas rates with improved safety measures such as more thorough yet more costly inspections and the installation of automatic valves.

In January 2012, President Obama signed into law a new pipeline safety bill based on recommendations from OPS (see http://fastlane.dot.gov/2012/01/president-obama-signs-pipeline-safety-bill.html#more).  This bill includes:

  • doubling of potential fines for safety violations to $2 million per incident
  • an increase in the number of federal inspectors
  • mandatory automatic or remote-controlled shut-off valves for new or replaced pipelines (retrofitting these valves onto existing pipelines is not required)
  • up to $110 million per year in grants for state pipeline safety programs

However, the effect of the bill was muted a bit by an added provision stating that outside of high-consequence areas, the pipeline safety regulator cannot issue regulations establishing leak-detection requirements or expanding integrity management requirements until a study and rulemaking process is completed, which will take at least two years.

Meanwhile, California enacted a law that requires California pipeline owners to install automatic shut-off valves in vulnerable areas and to pressure test pipelines. And, in August 2011, OPS opened a new rulemaking to investigate whether rules for integrity management programs should be changed. Industry comments are due this month.

It is safe to say that the San Bruno incident, coupled with other recent pipeline incidents, has caught the attention of legislatures, regulators, and pipeline companies. More stringent pipeline testing and installation of safety devices have already begun as companies try to get ahead of the anticipated new regulation.

Will our pipeline system ever be 100 percent safe? No. That’s just not realistic. But careful evaluation of integrity issues and reasonable system upgrades can and should reduce the number of future incidents.


References and Resources 

[1] Report of the Independent Review Panel, San Bruno Explosion, prepared for the California Public Utilities Commission, revised copy, June 24, 2011, p. 1

[2] NTSB Pipeline Accident Report NTSB/PAR-11/01 (PB2011-916501), available at http://www.ntsb.gov/doclib/reports/2011/PAR1101.pdf

[3] American Gas Association: http://www.aga.org/Kc/aboutnaturalgas/consumerinfo/Pages/NGDeliverySystemFacts.aspx retrieved December 8, 2011

[4] A good visualization of the Pipeline Integrity Management process can be viewed here: http://pipelineintegrity.willbros.com/Home-651.html


What Is Cap and Trade?

by Bob Shively, Enerdynamics’ President and Lead Instructor

In the mid-1990s, environmental regulators began using a new approach that lets markets and engineers figure out the best way to reduce pollution at the lowest possible cost. This regulatory mechanism is called Cap and Trade, and it will get further use with two new regulatory actions taking effect in the U.S. in 2012.

Barring last-minute action by Congress or the courts, the Environmental Protection Agency (EPA) will implement the new Cross-State Air Pollution Rule (known as CSAPR), which will significantly tighten limits on sulfur dioxide (SO2) and nitrogen oxide (NOx) emissions in 27 states in the Eastern and Texas interconnects. Meanwhile, California is set to implement regulation of greenhouse gas emissions. (We touched on these issues in a recent blog post titled “Will New EPA Regulations Impact Gas and Electric Markets Across the U.S.?”)

Both new rules will utilize Cap and Trade.  This method is designed for flexibility in allowing the marketplace to determine the most efficient ways of meeting an emissions cap set by the regulator.

Under Cap and Trade:

  • the regulatory agency starts with a historical amount of emissions in a specific geographical area
  • the regulator then sets an allowed cap that is lower than the historical emissions
  • typically, the cap is further reduced over time, resulting in lower and lower emissions as the program is implemented over multiple years

Using the mandated cap to determine the number of allowances, the regulator allocates allowances to emit the regulated substance among market participants who own power plants. These may be allocated for free, or they may be auctioned. At the end of a compliance period, power plant owners must surrender a number of allowances equal to the amount of emissions they have put into the air during the period. Failure to do so results in significant fines or other penalties. In code form, this end-of-period true-up is a simple comparison, as sketched below.
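
The sketch below shows the compliance check in miniature; the plant names, emissions, and allowance holdings are invented for illustration.

```python
# Hypothetical sketch of the end-of-period compliance true-up described
# above. Plant names, emissions, and allowance holdings are invented.

emissions = {"Plant A": 48_000, "Plant B": 61_000}        # tons emitted this period
allowances_held = {"Plant A": 50_000, "Plant B": 55_000}  # tons authorized

for plant, emitted in emissions.items():
    held = allowances_held[plant]
    if held >= emitted:
        print(f"{plant}: compliant, {held - emitted:,} allowances to spare")
    else:
        # In a real program the shortfall triggers fines or other penalties
        print(f"{plant}: SHORT by {emitted - held:,} allowances -> penalties")
```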


To facilitate efficient solutions, market participants are permitted to trade allowances. Participants that can cost-effectively reduce emissions can sell unused allowances to other parties who face higher costs of reducing emissions. This means the market should find the least-cost way of reducing emissions to a specific cap. This methodology has been used since 1995 for SO2, and more recently for NOx, in certain regions of the U.S. In general it has proved successful in reducing emissions at low cost.
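
A toy example makes the least-cost property visible. Here two hypothetical plants face the same cap; all emissions and abatement-cost figures are invented, and marginal abatement costs are assumed constant for simplicity.

```python
# Hypothetical sketch of why allowance trading finds a least-cost outcome.
# Numbers are invented; marginal abatement costs are assumed constant.

historic = {"Plant A": 100_000, "Plant B": 100_000}   # tons emitted per year
abatement_cost = {"Plant A": 200, "Plant B": 800}     # $ per ton removed
cap = 150_000                                         # regulator's cap, in tons
required_cut = sum(historic.values()) - cap           # 50,000 tons

# Without trading: each plant must cut its pro-rata share
cost_no_trade = sum(abatement_cost[p] * required_cut / len(historic)
                    for p in historic)

# With trading: the cheaper abater makes the entire cut and sells its
# unused allowances to the plant with the higher abatement cost
cheapest = min(abatement_cost, key=abatement_cost.get)
cost_with_trade = abatement_cost[cheapest] * required_cut

print(f"Required reduction: {required_cut:,} tons")
print(f"Compliance cost without trading: ${cost_no_trade:,.0f}")
print(f"Compliance cost with trading:    ${cost_with_trade:,.0f}")
```

Total emissions end up at the cap either way; trading only changes who makes the cuts and what society pays for them.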

During discussions about a national greenhouse gas law in the U.S. over the last few years, opponents from the right argued that Cap and Trade amounted to no more than a burdensome tax. Opponents from the left argued that it was too lenient in that it allowed parties to pay to pollute.  As the new regulations implemented in 2012 play out, we will all get a further chance to evaluate the benefits and costs of Cap and Trade regulation.


Enerdynamics Instructor Examines Post-Nuclear Japan in Series for Solar Today Magazine

Dan Bihn is one of Enerdynamics’ talented instructors and primarily teaches courses relating to renewable energy and the Smart Grid. Dan is currently writing a series of articles on “Post-Nuclear Japan” for Solar Today Magazine. He lived in Japan for seven years and speaks fluent Japanese, both of which have allowed him to more intimately study and understand the issues Japan faces as it looks to rebuild and redefine its energy plan.

Dan’s most recent article in Solar Today Magazine is titled Home Energy Management: Japan’s Market Model for the Energy Future.  The article examines Japan’s energy future in light of the tragedies of March 11, 2011. Writes Dan:

“The massive 9.0 earthquake of March 11, 2011 (the day is now known in Japan as 3-11) moved the earth’s axis nearly a foot and triggered the calamitous tsunami that killed some 20,000 people.

It also swept away Japan’s post-war energy policy, clearing space for a new smarter, cleaner and safer policy. Today, Japan stands with one foot planted on 20th-century nuclear power. The other foot seeks purchase on a smart energy platform, powered by the sun and the wind.”

Read the full article here and look for future articles by Dan in Solar Today Magazine and Enerdynamics’ quarterly eNewsletter, Energy Insider. Learn more about Dan at www.danbihn.com.


Will New EPA Regulations Impact Gas and Electric Markets Across the U.S.?

by Bob Shively, Enerdynamics’ President and Lead Instructor

There is still time for things to change, but as of now the Environmental Protection Agency (EPA) is scheduled to implement significant new environmental rules that will require electric generators in 27 states to cut nitrogen oxide (NOx) and sulfur dioxide (SO2) emissions beginning in 2012. The rule, called the Cross-State Air Pollution Rule (abbreviated CSAPR and commonly pronounced “casper”), will implement multiple new cap-and-trade programs with significant reductions in allowed emissions.

These new regulations will increase the cost of coal generation by adding the cost of allowances to the variable fuel and operations/maintenance (O&M) costs.  The rule may also result in the closing of marginal coal units that are too expensive to upgrade with new emissions control equipment.

The result? An increase in the marginal cost of coal generation and additional use of gas-fired generating units. In a recent webinar sponsored by GDF Suez, Andy Weissman, publisher of Energy Business Watch, made a key point: implementation of the rule will increase natural gas and electric prices throughout the U.S., not just in the states covered by the regulations. Most observers expect that CSAPR will increase the cost of electricity in the regulated states, but they give little thought to the potential impacts on natural gas prices, which are currently at their lowest level in a decade.

According to Weissman, gas prices are being set by the market at a level just low enough to keep gas-fired generation competitive with certain low-efficiency coal units. So once the cost of running these coal units goes up, the price of natural gas will go up, too. And given the continental nature of the North American gas market, this will cause gas prices to rise across the U.S. This means that generators in California and the rest of the West will see higher costs of operation even though they won’t need to buy NOx or SO2 allowances. And if this does occur, the result will be higher electricity prices throughout the U.S.
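
Some rough arithmetic shows how the mechanism works. All of the numbers below (heat rates, fuel prices, O&M, allowance adders) are hypothetical round figures for illustration, not figures from the webinar.

```python
# Hypothetical sketch of the price-setting logic: as allowance costs raise a
# marginal coal unit's cost, the gas price that keeps a combined-cycle unit
# competitive rises with it. All inputs are invented round numbers.

def coal_marginal_cost(allowance_adder):
    heat_rate = 10.5   # MMBtu/MWh for a low-efficiency coal unit
    fuel = 2.50        # $/MMBtu delivered coal
    vom = 4.0          # $/MWh variable O&M
    return heat_rate * fuel + vom + allowance_adder

def breakeven_gas_price(coal_cost):
    # Gas price at which a combined-cycle unit just matches the coal unit
    heat_rate = 7.0    # MMBtu/MWh for a combined-cycle gas turbine
    vom = 3.0          # $/MWh variable O&M
    return (coal_cost - vom) / heat_rate

for adder in (0, 5, 10):  # $/MWh allowance cost added by CSAPR
    coal = coal_marginal_cost(adder)
    gas = breakeven_gas_price(coal)
    print(f"Allowance adder ${adder}/MWh -> coal ${coal:.2f}/MWh, "
          f"breakeven gas price ${gas:.2f}/MMBtu")
```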

We’ll be exploring the new CSAPR rules along with the coming carbon cap-and-trade regulations in California in our next Energy Insider issue available at http://www.enerdynamics.com/energy-insider-news.asp before the end of the year.


The Future of Electric Utilities, Part II

by Bob Shively, Enerdynamics’ President and Lead Instructor

As discussed in The Future of Electric Utilities, Part I, long-time electric industry expert Peter Fox-Penner describes in his recent book Smart Power two possible models for the future utility. His views are very similar to visions Enerdynamics’ instructors frequently discuss when questions of the future arise in our courses.

The first model is the Smart Integrator. As described by Fox-Penner, “The Smart Integrator (SI) is a utility that operates the power grid and its information and control systems but does not actually own or sell the power delivered by the grid” (Peter Fox-Penner, Smart Power, p. 175).

Under this model, the utility’s mission will be to effectively run two networks: 1) an electric T&D network that delivers electricity from both centralized and distributed sources, keeps everything in balance with loads, and allows consumers to shift usage in response to price signals, plus 2) an information network that communicates with generators, meters, and price-responsive appliances to send information and control signals, and to collect data tracking each source’s supply and/or consumption.  In this model, the utility company will be similar to an internet service provider that gets paid for building and running a reliable network but does not provide the “content” or services that consumers use the network to obtain.  In this model, services are provided by third-party competitive retailers separate and distinct from the utility.
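
The information network’s consumer-facing side can be sketched in a few lines of code. This is a toy illustration only: the posted prices and the appliance’s preprogrammed price ceiling are invented, and a real Smart Integrator would use standardized protocols rather than a Python dictionary.

```python
# Toy sketch of a price-responsive appliance on the utility's information
# network. Prices and the appliance's price ceiling are invented.

hourly_prices = {  # $/MWh real-time prices posted over the network
    2: 19, 3: 18, 13: 80, 14: 95, 15: 110, 16: 88, 22: 30,
}

def should_run(price_ceiling, hour):
    """Run only when the posted price is at or below the owner's ceiling."""
    return hourly_prices.get(hour, float("inf")) <= price_ceiling

dishwasher_ceiling = 35  # $/MWh, preprogrammed by the consumer
for hour in sorted(hourly_prices):
    action = "run" if should_run(dishwasher_ceiling, hour) else "wait"
    print(f"Hour {hour:>2}: ${hourly_prices[hour]:>3}/MWh -> {action}")
```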

The second model is the Energy Services Utility. Again, as described by Fox-Penner, “The mission of the Energy Services Utility (ESU) is to provide lowest-cost energy services to its customers – light, heat, cooling, computer-hours, and the dozens of other things we get from power each day” (ibid., p. 189). In essence the ESU provides the network described above plus the energy services that are delivered along the network.

So, will competing technologies completely replace the need for the utility? That seems highly unlikely, since it would be a huge stretch to think we’ll generate all our energy needs with self-contained equipment in our homes. We will need the network for reliability and to obtain supplies when someone else can produce power more cheaply than we can.

Then is the utility destined to go the way of the traditional phone company serving a smaller and smaller base of customers while trying to build revenue by selling new services in competition with non-regulated providers?  This is certainly a risk for utilities that try to implement the Energy Services Utility model.  Their management needs to think long and hard about whether their company culture can deliver services more effectively than non-regulated retailing entities.

The alternative is to focus on the network. The future of the electric utility is murky, but one thing is clear.  Someone will need to run the dual electric delivery/information highway network to enable the future “energy web.”  And the current electric utility appears uniquely suited to this task.


The Future of Electric Utilities, Part I


by Bob Shively, Enerdynamics’ President and Lead Instructor

At Enerdynamics, we often end our classroom sessions with a quote from business guru Peter Drucker:  “The corporation as we know it, which is now 120 years old, is not likely to survive the next 25 years.”

While Drucker wasn’t speaking specifically about the electric utility industry, we believe that his quote is directly applicable.  Over the next few years, utilities are going to be asked to effectively respond to ongoing penetration of disruptive technology, increasing environmental challenges, and significant change among the customer base.  So the question for utility management, customers, investors and regulators is: What is the future for the electric utility? Is it destined to go the way of the traditional phone company serving a smaller and smaller base of customers while trying to build revenue by selling new services in competition with non-regulated providers?  Or will competing technologies completely replace the need for the utility?

First let’s discuss the challenges.  Disruptive technology threatens to significantly change the way that utilities generate and deliver power.  We already are seeing a major shift in generation – natural gas combined-cycle units and renewable power are pushing older and less efficient coal units out of the market, and construction of new coal units has more or less been halted.  New environmental regulations taking effect in 2012 are likely to further this trend.  Meanwhile energy efficiency and load shifting are becoming a key resource, reducing the need for new generation. While not yet ready for commercial application, storage technologies may be on the cusp of changing the paradigm that electricity can’t be stored.

And while the supply side changes, transformation is also coming rapidly to the transmission and distribution (T&D) sector in the myriad technologies termed the Smart Grid. Initial implementations often consist of better sensors and controls in the T&D system hidden from customer view and/or smart meters that initially don’t do much except make it easier to collect usage data. However, the long-term potential for Smart Grid is vast.

A decade from now, we may well be using an electric network that communicates unique real-time energy prices to preprogrammed appliances that know when to run and when to shut down to save consumers money. This network may also communicate with thousands of distributed supply sources such as solar photovoltaic cells, fuel cells, and localized storage to determine when local supply is more cost effective than centralized generation, all while managing supply flowing from both centralized and distributed sources. A recent McKinsey paper even suggests that new homes in 2020 may consume 90% less energy than today’s homes.

From an environmental standpoint, despite current politics, it still seems likely that low-carbon generation will become a key issue for electric utilities. And even absent federal carbon legislation, regulation of other environmental concerns such as cooling water, sulfur dioxide, nitrogen oxides, and mercury is having significant impacts on utility resource planning. It is highly unlikely that these concerns will abate as time goes on. Future generation dispatch may be asked to take environmental considerations into account alongside the current factors of cost and reliability.

Lastly, electric consumers are undergoing a significant change. Baby boomers, brought up on a single black phone in the kitchen, just four television channels, and no iPods, are being replaced as consumers by individuals brought up with hyper change and vast choices. It is hard to imagine that the new consumer base will be content with one or two rate schedules and frequent rate increases.

And all three changes are likely to occur in a world where electric demand may be flat, meaning that new costs cannot be buried by simply spreading them out over an ever increasing customer base. So where does this leave utilities?

In next week’s blog post, we’ll look at long-time electric industry expert Peter Fox-Penner’s two proposed models for the future utility. His views, as explained in his recent book Smart Power, are very similar to the visions that Enerdynamics’ instructors frequently discuss when questions of the future arise in our courses.


EPA to Begin Regulation of Fracking


by Bob Shively, Enerdynamics’ President and Lead Instructor

As we’ve discussed in previous blog posts including The Natural Gas Fracking Debate: What Is Fracking and Why Does It Matter? Part II, the potential benefits of natural gas fracking are at risk of being overshadowed by the various hazards that the fracking process may introduce to public health/safety and the environment. These hazards include the disposal of fracking fluids, the migration of fracking fluids and/or natural gas into water supplies, and the leaking of greenhouse gases.

The exact threat each of these poses to the public and/or the environment has yet to be quantitatively proven. However, it is clear that the future of fracking greatly depends on the gas industry’s ability to work with regulators and the public to create and execute policies and procedures that make public health and environmental protection top priorities.

Until now, the federal Environmental Protection Agency (EPA) has left regulation of fracking up to each state. But on October 20, 2011, the EPA announced a proposal to develop, over the next two to three years, regulations governing the disposal of wastewater discharges produced by natural gas extraction from underground coalbed and shale formations.


The EPA is currently conducting a national study of whether fracking has polluted groundwater and drinking water. Anywhere from 15 to 80% of fracking fluids are returned to the surface and must be disposed of. The fluids include chemical additives that are used to improve the fracking process and may also include additional substances absorbed from the underground formation. While some of the wastewater is currently recycled or injected into deep underground wells, much of it is sent to treatment plants that may not be equipped to handle the volume or toxicity of wastewater that fracking produces.

The new EPA regulation will require drillers to meet certain standards before sending fracking wastewater to treatment plants.  The specific rules will be developed through consultation with various stakeholders including the gas industry and environmental experts. According to the EPA, the standards will “be based on demonstrated, economically achievable technologies.” The EPA expects to implement standards for coal-bed methane gas in 2013 and for shale gas in 2014.

The EPA’s announcement of such regulation has received cheers and jeers from those on each side of the fracking debate. Proponents of the federal standards feel it’s a long overdue first step in regulating the exploitation of shale gas, which the EPA projects will comprise over 20% of the total U.S. gas supply by 2020. Opponents of the federal regulation feel the EPA is overreaching and intruding on what they believe to be more-than-sufficient regulation on the state level.

What side of this debate are you on? Let us know your thoughts on natural gas fracking by leaving a comment below.


Are We On the Cusp of a Transformation in Electric Generation Technology?

by Bob Shively, Enerdynamics’ President and Lead Instructor

The U.S. electric power industry was born in 1882 when Thomas Edison’s Edison Electric Illuminating Company opened a central generating station at Manhattan’s Pearl Street. Soon thereafter a number of small distribution systems were created, and, by the early 1900s, electric utilities became widespread in cities.

The steam-powered reciprocating engine, which spun an electric generator, served as the original technology for Edison’s power generation. Early utilities used this technology plus hydropower to grow their systems. But reciprocating engines, which converted up-and-down motion to the rotary motion required by generators, proved to be noisy, bulky and hard to maintain.

Meanwhile in England, Charles Parsons invented the steam turbine, which directly produced rotary motion as steam passed through vanes on a long shaft:

(Video: how a steam turbine works)

The steam turbine proved to be smaller, mechanically simpler, and quieter than reciprocating engines, and it could be scaled up to produce large amounts of electricity. As steam turbines were implemented in the early 1900s, the cost of power plunged and society became dependent upon electricity.

Into the 1990s, the steam turbine remained the favored technology for utility power plants. It proved versatile, as it could be fueled by coal, natural gas, and later uranium in nuclear power plants. Even today, more than 70% of U.S. electricity is generated using steam turbines.

In recent years the steam turbine has lost much of its market share to a newer technology – the gas turbine. A gas turbine compresses air, mixes it with fuel, and ignites the mixture, and the resulting hot gases expand directly through a turbine:

(Video: how a gas combustion turbine works)

First used for electric generation in 1939 in Switzerland, gas turbines gained the notice of the utility industry when a gas turbine power plant on Long Island provided power to recover from the infamous New York blackout of 1965. Initially gas turbines were used mostly for peaking purposes, but another innovation led to more widespread use: using a gas turbine as the primary source of power, then using the turbine’s waste heat to create steam for use in a steam turbine.

This technology, called the combined-cycle gas turbine, became the new favored source of power in the mid-1990s:

(Video: how a combined-cycle gas turbine works)

Its benefits included operating flexibility, low up-front capital costs, and reduced environmental impacts relative to steam turbines powered by coal.  Today, the majority of new power plant installations in the U.S. utilize some form of gas turbine technology.
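
The efficiency advantage is easy to quantify. Using typical textbook-style figures (the percentages below are illustrative assumptions, not data for any specific plant), the bottoming steam cycle lifts overall efficiency well above what either turbine achieves alone:

```python
# Illustrative arithmetic for combined-cycle efficiency. The efficiency and
# heat-recovery figures are typical textbook-style assumptions.

gas_turbine_eff = 0.38  # fraction of fuel energy converted by the gas turbine
steam_cycle_eff = 0.33  # efficiency of the bottoming steam cycle
heat_recovery = 0.85    # fraction of exhaust heat captured for steam raising

# The steam cycle converts part of the gas turbine's waste heat to power
combined_eff = gas_turbine_eff + (1 - gas_turbine_eff) * heat_recovery * steam_cycle_eff

print(f"Simple-cycle gas turbine: {gas_turbine_eff:.0%}")
print(f"Combined-cycle estimate:  {combined_eff:.0%}")  # roughly 55%
```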

So for the first 100 years of the power industry, the steam turbine dominated.  In the last 20 years, the gas turbine has taken the lead in new installations.  But in today’s world of rapid advancement, could yet another technology or technologies be on the cusp of supremacy?

U.S. Energy Information Administration (EIA) data for new power plants in the first six months of 2011 indicate that 51% of new plants used gas turbines or combined-cycle turbines, 24% used steam turbines and 24% used wind turbines. Indeed it appears that wind will become an important source of power. But there’s another renewable technology worth keeping our eyes on — the photovoltaic cell (PV):

(Video: how photovoltaic cells work)

While PV currently accounts for a very small share of new installations, falling costs could rapidly change this picture.

As discussed in our recent blog “Can Solar Power on Our Rooftops Compete with Existing Generation on Price?” there is the distinct possibility that PVs could compete with coal-fired steam turbines directly on price within the next decade. If so, expect to see a third revolution in generation technology.
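
Such price comparisons typically rest on levelized cost of energy (LCOE) arithmetic. The sketch below shows the shape of the calculation; every input (capital costs, O&M, capacity factors, discount rate) is a hypothetical placeholder, not a forecast.

```python
# Hypothetical sketch of a levelized-cost comparison. All inputs are
# illustrative placeholders, not forecasts.

def crf(rate, years):
    # Capital recovery factor: spreads an upfront cost into equal annual payments
    return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

def simple_lcoe(capital_per_kw, fixed_om_per_kw_yr, fuel_per_mwh,
                capacity_factor, rate=0.08, life=25):
    """Rough $/MWh: annualized capital plus fixed O&M per MWh, plus fuel."""
    mwh_per_kw_yr = 8.760 * capacity_factor
    annual_capital = capital_per_kw * crf(rate, life)
    return (annual_capital + fixed_om_per_kw_yr) / mwh_per_kw_yr + fuel_per_mwh

coal = simple_lcoe(capital_per_kw=3000, fixed_om_per_kw_yr=40,
                   fuel_per_mwh=25, capacity_factor=0.80)
pv = simple_lcoe(capital_per_kw=2500, fixed_om_per_kw_yr=20,
                 fuel_per_mwh=0, capacity_factor=0.20)

print(f"Coal steam unit: ~${coal:.0f}/MWh")
print(f"Rooftop PV:      ~${pv:.0f}/MWh")  # PV falls as capital costs decline
```

On these placeholder inputs PV is still the more expensive option; the “third revolution” scenario is the one in which falling PV capital costs push its LCOE down to and below the coal figure.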
