Why Solving the Natural Gas Fracking Conundrum Matters – A LOT!

by Bob Shively, Enerdynamics President and Lead Instructor

In our earlier blog posts, The Natural Gas Fracking Debate: What Is Fracking and Why Does It Matter? (Parts I and II), we discussed the environmental issues involved with fracking. That discussion examined why it remains unclear whether these issues will be resolved or whether public opposition will lead to restrictions on the use of shale gas and other unconventional reserves. In this post, we'll discuss why it is so important to find a way to frack safely.

The ability to exploit reserves via fracking is critical from both an environmental protection standpoint and an economic development standpoint. Despite the currently fractured political arena in Washington, this is an issue that both sides of the debate should desperately want to solve, and then demonstrate to the public that it has been solved. Here's why.

Natural Gas is Necessary to Reduce Environmental Impacts of Power Generation
We need electricity to power our society. And while energy efficiency can help mute demand growth, it cannot replace the need for generation or eliminate electricity's rapid growth in the developing world. Scientists and engineers can envision a world that eventually generates power without fossil fuels, but even the true optimists talk about that happening in 20 years, not tomorrow. Others envision a world powered by a revitalized nuclear industry and/or by coal plants whose carbon emissions are captured and sequestered. Again, these visions are decades into the future.

In the meantime, natural gas is the only fuel that can bridge the gap while reducing environmental impacts. But the largest conventional reserves are located in Russia and the Middle East; in fact, considering conventional reserves alone, about 40% of the world's natural gas lies in just two countries, Russia and Iran. So for geopolitical reasons, dependence on international conventional reserves may not be the best strategy.

And the economics favor domestic supply: as of late September 2011, spot LNG trades for around $10/MMBtu in Europe and $16/MMBtu in Japan, compared with a U.S. spot price of less than $4/MMBtu. We have already seen industrial companies with large natural gas demand ramp up production in the U.S., which helps in a weak employment environment. Ongoing growth of unconventional gas supplies through fracking promises to extend this advantage for the U.S. economy.

So What Do We Do?
Where does this scenario leave us? Gas and oil production companies may have believed that fracking's potential to spur economic growth would outweigh the public's concerns about its environmental impacts. But that seems unlikely now, given public distrust in a post-Gulf oil spill world.

The only way to move forward is for oil company executives, environmentalists and government regulators to unite and implement reasonable, concrete ways to frack with limited environmental impact. They will then have to work collectively to convince the public that the issues have truly been resolved. If this can be accomplished, we'll have an opportunity to clean up our power production and transportation sectors while lifting ourselves out of the country's current economic malaise.


IT’s Growing Need for Energy Business Training

By John Ferrare, Enerdynamics’ CEO

There is little argument that the energy industry needs to learn more about information technology (IT). On the electric side, IT is critical to running power plants, operating transmission systems, scheduling power flows, and accurately communicating with and billing consumers. But the industry's current use of IT is just the tip of the iceberg. According to the Electric Power Research Institute (EPRI), the amount of data taken in annually by electric companies may increase by 3,000 to 8,000% as the Smart Grid is implemented!
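
To put percentages like that in perspective, here is a minimal back-of-envelope sketch in Python. The 10 TB/year starting point is purely a hypothetical assumption of ours; only the 3,000 to 8,000% range comes from the EPRI figure cited above.

```python
# A minimal sketch of what a 3,000-8,000% increase in data intake means in
# absolute terms. The 10 TB/year baseline is a hypothetical starting point
# chosen only for illustration; the percentage range is EPRI's.

baseline_tb_per_year = 10.0            # hypothetical current data intake
growth_percentages = (3_000, 8_000)    # EPRI's projected range of increase

for pct in growth_percentages:
    projected = baseline_tb_per_year * (1 + pct / 100)
    print(f"An increase of {pct:,}% takes {baseline_tb_per_year:.0f} TB/yr "
          f"to {projected:,.0f} TB/yr")

# Output:
#   An increase of 3,000% takes 10 TB/yr to 310 TB/yr
#   An increase of 8,000% takes 10 TB/yr to 810 TB/yr
```

However it is measured, growth of that order turns data management from an afterthought into a core planning problem.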

As shown in the graph below, the amount of data storage required has already grown significantly in the last five years. Data growth starts with automation of the distribution system and moves through deployment of new information-based systems all the way to customer-owned appliances that are expected ultimately to participate directly in electric markets. A similar progression, though less dramatic in terms of volume, will occur on the gas side of the business.

Annual Growth in Data Storage at a Utility Company

While the energy industry clearly needs to learn about IT so it can be used effectively, the IT industry also needs to learn about the energy business. It is one thing to design systems that move data from one place to another; it is far more challenging to design systems that use data in ways that support the business needs of the organization. Without IT professionals who understand both IT and the energy business, many IT projects are doomed to waste millions of dollars and drag on without ever achieving their goals. Even successful projects such as the California ISO's Market Redesign and Technology Upgrade (MRTU) and ERCOT's Texas Nodal Market Implementation took many years and significant amounts of re-programming to get right.

This might be why we are seeing increasing interest from IT departments in our energy business acumen curriculum. While we’ve always had IT people attend our classes, increasing numbers of IT groups are requesting training programs on their own. Several examples come to mind:

  • One of our clients in India writes software for energy companies worldwide. We are working with this client to provide a certification program for its programmers. The company's goal is to ensure that programmers understand the business aspects of gas and electricity, allowing them to work effectively with their clients' technical and business experts.
  • One of the largest electric utilities in the nation, with an IT department larger than some entire utilities, is offering several of our electric courses exclusively to employees within that department. This ensures access to the classes without going through the corporate learning program and allows us to customize the content to the specific needs of IT employees.
  • A client that writes gas scheduling software found that business acumen training greatly improved its programmers' ability to understand the business rules the pipeline wanted programmed into its systems.

Changes in the energy industry will only increase the value of business acumen training to IT professionals.  These employees simply cannot design the systems that will be required without a basic knowledge of how the business of energy actually works. So we won’t be surprised to see increasing numbers of IT organizations take training matters into their own hands.

Please contact Enerdynamics for more information on the types of training that we’ve developed for IT groups and how we can help your organization better understand the business of energy: jferrare@enerdynamics.com or 866-765-5432.


How to Understand Electric Market Structures

By Bob Shively, Enerdynamics’ President and Lead Instructor

Understanding how electric markets are structured used to be simple. Just figure out who the vertically integrated utility is, then who regulates it and how. But since the days of deregulation (or liberalization or restructuring depending on which term you prefer), things are a lot more complex. Today you might find a different market structure in each country. And in some countries such as Australia, Canada, and the U.S., you might find different structures in each province or state. 

With this in mind, we’ve put together the following guide to understanding electric market structures.

First you must understand the various sectors: generation, wholesale markets, system operations, transmission, distribution, and retail supply.  Let’s briefly discuss each one:

  • Generation is the sector in which power is created, usually by large centralized power plants but also by smaller decentralized plants located at or near customer facilities. Generation can be owned by vertically integrated utilities, power authorities, independent power producers (also called merchant generators or gencos) or by end users. And, in some limited cases, aggregated economic demand response can also participate in markets as a source of “generation.”
  • Wholesale markets are where power is bought and sold between generators and entities that resell power to end users. These markets can depend on bilateral contracts (private contracts between two parties), on organized markets run by a central authority such as a Power Exchange (PX) or an Independent System Operator (ISO), or on a mix of both.
  • System operations is the sector in which supply and demand are balanced and system reliability is maintained. This occurs through provision of ancillary services such as regulation, reserves, voltage support, and black start as well as real-time balancing of supply and demand. System operations may be provided by a vertically integrated utility, a power authority, a transmission owner (TO), or an ISO.
  • Transmission is the high-voltage network that moves power long distances from generators to distribution systems. Transmission may be owned by vertically integrated utilities, power authorities, or stand-alone transmission companies (transcos). The transmission owner (TO) may provide transmission services directly, or services may be provided by the ISO with payments flowing back to the TO.
  • Distribution is the low-voltage network that moves power from the transmission system to the consumer.  Distribution services may be provided by a vertically integrated utility or by a stand-alone utility distribution company (UDC).
  • Retail supply is the provision of electricity supply to an end-use customer. In many cases retail supply is provided by the distribution company as a service bundled into distribution services. In other cases, end-use customers have the option of buying supply directly from a non-utility retail marketer.

So now that you understand the sectors, how do you determine the market structure in a specific area? Simply answer the following questions and you will be able to map out the structure (a simple sketch of how the answers might be organized follows the list):

  1. What entities own generation and to whom are they allowed to sell it?
  2. Are there centralized markets run by a PX or an ISO, is all wholesale power traded in bilateral markets, or is there a mix of both?
  3. If there are centralized markets, which services trade in these markets?
  4. Who is responsible for system operations?
  5. What entities own transmission and do they provide services directly to the market? Or are transmission services provided by an ISO?
  6. What entity owns the distribution system?
  7. Are any end-use customers allowed to buy supply from non-utility retail marketers or is supply bundled with distribution services?
  8. If some end-use customers are allowed to shop for supply from non-utility retail marketers, which customers have that option and do they also have the option of buying bundled services from the distribution company?
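
To make the mapping concrete, here is a minimal sketch, in Python, of how the answers to these eight questions might be captured as a simple record for comparing regions side by side. The class, the enums, and the example values are our own illustration, not an industry-standard schema, and "Hypothetical State X" is invented.

```python
from dataclasses import dataclass, field
from enum import Enum

class WholesaleMarket(Enum):
    BILATERAL_ONLY = "bilateral contracts only"
    CENTRALIZED = "centralized markets (PX and/or ISO)"
    MIXED = "mix of bilateral and centralized"

class SystemOperator(Enum):
    UTILITY = "vertically integrated utility"
    POWER_AUTHORITY = "power authority"
    TRANSMISSION_OWNER = "transmission owner (TO)"
    ISO = "independent system operator (ISO)"

@dataclass
class MarketStructure:
    """One record per country, state, or province being mapped."""
    region: str
    generation_owners: list[str]            # Q1: who owns generation
    wholesale_market: WholesaleMarket       # Q2/Q3: how wholesale power trades
    system_operator: SystemOperator         # Q4: who balances the system
    transmission_owners: list[str]          # Q5: who owns the high-voltage wires
    transmission_via_iso: bool              # Q5: does an ISO provide transmission service?
    distribution_owner: str                 # Q6: who owns the distribution system
    retail_choice_customers: list[str] = field(default_factory=list)  # Q7/Q8

# Purely hypothetical example record:
example = MarketStructure(
    region="Hypothetical State X",
    generation_owners=["utility", "independent power producers"],
    wholesale_market=WholesaleMarket.MIXED,
    system_operator=SystemOperator.ISO,
    transmission_owners=["utility", "transco"],
    transmission_via_iso=True,
    distribution_owner="utility distribution company (UDC)",
    retail_choice_customers=["large commercial", "industrial"],
)
print(example.region, "-", example.wholesale_market.value)
```

Filling in one record like this per region makes it easy to line up market structures next to each other and spot exactly where they differ.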

Want to see an example of how these questions are applied to a real-world scenario? Read the full article in the Q3 issue of our Energy Insider newsletter to see how answering these questions can shed insight on Brazil’s electric market structure.


The Natural Gas Fracking Debate: What Is Fracking and Why Does It Matter? Part II

EPA Headquarters – Image via Wikipedia

By Bob Shively, Enerdynamics President and Lead Instructor

Considering the benefits outlined in The Natural Gas Fracking Debate: What Is Fracking and Why Does It Matter? Part I, fracking sounds great. It gives access to a clean domestic energy source that works well in conjunction with renewables. But there is a catch. As fracking has been used more widely, it has become apparent that there are potential environmental impacts. These include:

  • Disposal of fracking fluids: Anywhere from 15 to 80% of the fracking fluids return to the surface and must be disposed of (a rough sense of the volumes involved is sketched after this list). The fluids include chemical additives used to improve the fracking process and may also pick up additional substances from the underground formation. Until recently, the production industry was reluctant to reveal what chemicals the fluids contain, which complicates safe disposal. In a few limited cases, people appear to have had severe reactions after coming into contact with fracking fluids.
  • Migration of fracking fluids and/or natural gas into water supplies: The production industry claims that fracking fluids and produced natural gas cannot leak into water supplies because fracking is performed at depths far below the water table. But anecdotal evidence suggests that, whether through improper drilling techniques or other causes, limited cases of groundwater pollution have occurred.
  • Leaks of greenhouse gases: At least one study has suggested that the fracking process can result in substantial releases of methane, a potent greenhouse gas, during well development.
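
For a rough sense of the disposal volumes at stake, here is a minimal back-of-envelope sketch. The 4 million gallons of fluid per well is a hypothetical assumption of ours chosen only for illustration; the 15 to 80% flowback range is the figure cited in the list above.

```python
# Back-of-envelope estimate of flowback fluid requiring disposal.
# The per-well fluid volume is a hypothetical assumption; the 15-80%
# flowback range comes from the discussion above.

fluid_gallons_per_well = 4_000_000        # assumed fluid pumped into one well
flowback_fraction = (0.15, 0.80)          # share returning to the surface

low = fluid_gallons_per_well * flowback_fraction[0]
high = fluid_gallons_per_well * flowback_fraction[1]
print(f"Flowback to dispose of: {low:,.0f} to {high:,.0f} gallons per well")
# -> Flowback to dispose of: 600,000 to 3,200,000 gallons per well
```

Even at the low end of that range, each well produces a substantial volume of chemically treated water that must be trucked, treated, or injected somewhere.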

Can These Issues Be Resolved?
So, can these issues be resolved and safe production occur? It's the million-dollar question. Most within the gas industry believe the answer is yes, but the industry must convince the public and regulators. The U.S. Environmental Protection Agency (EPA) is currently performing a comprehensive study with results expected by 2012 (see http://water.epa.gov/type/groundwater/uic/class2/hydraulicfracturing/index.cfm). France has banned fracking. On the domestic front, New Jersey has halted all fracking until more research is done, and many other U.S. states are grappling with how to regulate the practice. It is also possible that the EPA study will lead to federal regulations on fracking.

Worst-case scenario? The public and policy makers decide that fracking is unsafe, natural gas supplies suddenly become very tight, and gas prices rise sharply. Best-case scenario? The production industry works closely with scientists, regulators and the public to develop and implement safe techniques that allow exploitation of huge gas resources. It remains to be seen whether the outcome is worst-case, best-case or somewhere in between.

Meanwhile, the U.S. Department of Energy (DOE) is also looking into safety issues related to developing shale gas. Secretary of Energy Steven Chu said he sees evidence that “bad things have happened,” including water pollution where fracking fluids and natural gas have appeared in drinking water supplies.

“The question is, what is the cause of that, and how can they be prevented and mitigated,” Chu said. “Science will give us better ways of monitoring what is going on.”

A sub-panel of scientists set up by the DOE to develop recommendations released a report on August 11. The report suggests that the impacts of fracking are manageable, but it also presents a number of recommendations on how fracking could be made safer.


The Natural Gas Fracking Debate: What Is Fracking and Why Does It Matter? Part I

By Bob Shively, Enerdynamics President and Lead Instructor

Those who follow environmental issues and/or are involved in the energy industry have no doubt heard about the fracking debate. One headline announces that fracking is a major advancement that has unlocked a huge new supply of domestic natural gas (see for instance this summary of a talk by U.S. Department of Energy Secretary Steven Chu). The next headline reads that fracking is one of the biggest current threats to the environment and human health (see Greenpeace’s take on the issue). This article examines what fracking is and the potential environmental issues at the heart of the fracking debate.

What Is Fracking?
Hydraulic fracturing, commonly called “fracking,” is a method of stimulating gas flow in underground formations. Under so-called conventional natural gas production, the underground formations that hold natural gas are permeable rock, meaning that small holes and fissures allow gas to flow within the formation. The gas is normally trapped by a layer of non-permeable rock that forms a cap. To produce natural gas, a well is drilled into the rock holding the gas, and the gas flows from the higher-pressure formation to the lower-pressure wellhead.

The need for fracking arises when the rock holding the gas is not permeable enough to flow adequate volumes to make a well economic. Producers have learned that by increasing the permeability of the rock, more gas can be recovered. This is the purpose of fracking: fracturing the underground rock to increase the flow.

The process of fracturing begins with drilling a well. The well is first drilled vertically; once the desired depth is reached, it is then drilled horizontally. After pipe has been inserted into the well and cemented in place, perforations are made in the pipe and cement in the sections where gas flow is desired, using a device called a perf gun. Next, a mixture of fluids including water and chemical additives is pumped into the well at high pressure. The fluids flow through the pipe and out the perforations and, under the high pressure, fracture the surrounding rock. When the fluids are pumped back out of the well, gas flows through the newly created fractures, into the pipe via the perforations, and on to the wellhead. (For a good video of the process, see the American Petroleum Institute’s website.)

source: http://www.propublica.org/special/hydraulic-fracturing-national

What Fracking Technology Has Done for Gas Reserves
The technique of fracking has resulted in a huge boost to U.S. gas reserves. It allows more gas to flow from some conventional wells, but more importantly, it allows production of gas from formations where the rock is otherwise not permeable enough to support economic production. The key unconventional resource is shale gas, which in 2009 made up 14% of U.S. gas supply and is projected to increase to 45% by 2035, according to the U.S. Energy Information Administration.
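
To get a feel for what that shift means in absolute volumes, here is a minimal back-of-envelope sketch. The total-supply figures below are rough assumptions of our own, used only for illustration; only the 14% and 45% shares come from the EIA projection cited above.

```python
# Rough illustration of the EIA shale-gas share projection in absolute terms.
# The total-supply volumes are approximate assumptions, not EIA figures.

supply_2009_tcf = 21.0    # assumed total U.S. gas supply in 2009, trillion cubic feet
supply_2035_tcf = 26.0    # assumed total U.S. gas supply in 2035, trillion cubic feet

shale_2009 = supply_2009_tcf * 0.14
shale_2035 = supply_2035_tcf * 0.45

print(f"Shale gas in 2009: ~{shale_2009:.1f} Tcf")
print(f"Shale gas in 2035: ~{shale_2035:.1f} Tcf "
      f"({shale_2035 / shale_2009:.1f}x the 2009 volume)")
# -> roughly 2.9 Tcf growing to about 11.7 Tcf, a roughly fourfold increase
```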

Production of these unconventional reserves is not possible without fracking. So using these techniques means we have access to a large domestic energy resource that is the cleanest of the fossil fuels and is useful for heating our homes, running our power plants, and fueling large industry such as chemical production. And gas power plants function well with renewable energy production since gas units have the flexibility to ramp up and down in response to variability of wind and solar supply.

This provides a good summary of what fracking is and what it can do for the natural gas industry. In next week’s post we’ll examine why fracking has put natural gas in the hot seat with environmental groups, as well as whether and how the issues can be resolved.


Are Renewables Vital to Our Nation’s Security?

by Bob Shively, Enerdynamics President and Lead Instructor

In the hallways, offices and hearing rooms of energy policy makers the debate goes on. Are renewables just a fashionable trend that we as a nation can’t afford to spend our time and money on, or are they critical to our society’s future? The traditional fuel industries tell regulators that they should require energy companies to focus on the lowest-cost sources available, which to industry executives means existing fuel sources (for instance, see comments by Peabody Chairman and CEO Gregory Boyce).

Others argue that the true cost of fossil fuels is not embodied in today’s prices and that a number of benefits would accrue from moving to more sustainable resources (see a study by the Union of Concerned Scientists arguing the benefits of clean energy for the Midwest U.S.).

Lately a surprising new player, the U.S. military, has weighed in on the value of renewable energy. According to the Sierra Club, the U.S. military uses more petroleum and more total energy than any other organization on the planet; it accounts for 80% of the U.S. federal government’s energy tab.* Given this statistic, one would think the U.S. military would focus on cheap energy sources rather than the “softer side” of energy. Interestingly, though, the U.S. military has become a big supporter of renewable energy for a variety of practical reasons. In fact, the Department of Defense has a goal of providing 26% of its energy from renewable sources by 2020. It is currently using 11.3% renewable energy and is on track to meet its 26% goal if efforts continue, according to the Office of Management and Budget.**

Why has the military suddenly joined the same side of the energy debate as the “tree huggers”? Because military planners and top leaders have concluded that it makes sense. One of the biggest requirements for military transport is moving fuel, and watch the news and you will continually hear reports of military convoys being attacked. Replace a diesel generator with solar cells and batteries and suddenly transport needs decline significantly.

A modern infantry soldier often carries five pounds of batteries just to power the communication and other electronic devices used in the field. Replace most of those batteries with a fold-up solar charger and suddenly everyone’s pack is lighter.

When running a military base, in the U.S. or elsewhere, one of the commander’s biggest concerns is energy security.  Each knows that if a base’s energy supply is cut, it is highly vulnerable. By developing localized smart grids that are not dependent on the larger utility grid outside the base, commanders can rest more easily knowing that someone blowing up a transmission line won’t impact the base’s electricity.

And by developing realistic alternatives to fossil fuels for powering airplanes and ships, the military reduces its risk that foreign oil producers will pull the plug on necessary fuel supply.***

Assuming the military does achieve its goal of 26% renewable energy in the current decade, what does this mean for the rest of us? We now take for granted a number of technologies that had a military beginning: the Internet, microwave ovens, GPS, jet engines, and even SUVs. These technologies were once too expensive for the common citizen, but today they are mainstream conveniences. A military push into renewable energy will likely accelerate the day when renewables stand side by side with conventional energy sources in purely economic terms.

*See http://www.sierraclub.org/sierra/201107/blood-and-oil.aspx
 
**See http://www.americanprogressaction.org/issues/2011/06/pdf/energy_security_memo.pdf
 
***See http://seattletimes.nwsource.com/html/nationworld/2015419818_milbiofuels26.html and http://www.defensenews.com/story.php?i=3885995

______________________________________________________________________________________________

Coming soon…Enerdynamics’ newest online course, Renewable Energy Overview. Contact John Ferrare at 866-765-5432 or jferrare@enerdynamics.com for details.


Why Bill Gates Still Believes in Nuclear Power

Nuclear power plant in Cattenom, France – Image via Wikipedia

by Bob Shively, Enerdynamics President and Lead Instructor

Nuclear power, once the bane of environmentalists, found growing acceptance as environmental concern shifted from radiation toward global warming and greenhouse gas emissions. Some environmentalists spoke in favor of nuclear power, while many others at least grudgingly accepted that keeping existing units running was better than replacing them with fossil-fueled units.

But the equation changed again when the earthquake and tsunami struck Japan and crippled the Fukushima Daiichi plant, resulting in the worst nuclear power incident since Chernobyl. Energy policy makers around the world were forced to grapple with whether low-probability but high-consequence accidents are worth risking to attain the benefits of clean baseload nuclear power. In some countries, such as France, South Korea and the U.S., the policy has for now remained pro-nuclear. But in others, such as Germany, plans are unfolding to block new nuclear units and to shut down existing generation.

Wired Magazine recently held a business conference where former Microsoft CEO Bill Gates spoke on the subject of future energy sources (see http://www.wired.com/magazine/2011/06/mf_qagates/).  According to Wired, Gates contends that nuclear power is significantly safer than coal or natural gas generation.

Stated Gates: “Coal and natural gas…tend to kill only a few at a time” while “nuclear mishaps tend to come in these big events.”

And, says Gates, if you compare deaths per kilowatt-hour generated, nuclear’s record is far better than that of coal or gas. Gates adds that nuclear power design has seen hardly any innovation in the last three decades. He believes that modern supercomputer simulation capabilities can enable a new wave of safer and cheaper reactors.

But what about renewables such as solar? Gates says the barriers to development are too high. He thinks that solar power, at least at the distributed level, is too expensive and will appeal only to the well-off who want the status symbol of solar panels on their rooftops. Gates also believes that the need to store power to balance out variability is a huge technology problem. While he believes battery development for electric cars is achievable, he thinks the volume of batteries required to balance out renewables is simply too large.

Others think differently. For instance, U.S. Energy Secretary Steven Chu, a co-winner of the Nobel Prize in Physics (1997), is pushing the SunShot Initiative, which he believes will bring the cost of solar power down to levels competitive with existing generation in the U.S. (see http://www1.eere.energy.gov/solar/sunshot/).

Needless to say, the battle between various generation technologies will be interesting to watch.  And the results will be one of the key factors that will define our future society.


How to Deal with Low Probability, High Impact Risks

By Bob Shively, Enerdynamics’ President and Lead Instructor

A pipeline explodes and bursts into flames in California.  An earthquake followed by a tsunami results in nuclear disaster in Japan.  Once the initial impacts have been addressed, serious questions follow: What happened and why?

The media and general public often question how or why the utility companies could operate such a dangerous system. Why didn’t they anticipate such a disaster and prevent it from happening? Meanwhile, engineers and technicians study the accident in greater depth, attempting to learn specifically what happened and how future systems can be designed and/or operated differently to avoid a similar disaster.

A good example of this scenario is the recent nuclear disaster in Japan. In the U.S., the Nuclear Regulatory Commission (NRC) quickly instituted a task force to assess whether U.S. nuclear power plants are prepared for a natural disaster on the scale of what happened in Japan in March 2011. On July 12, 2011, the task force released its findings (http://pbadupws.nrc.gov/docs/ML1118/ML111861807.pdf). In the report, the task force recommended numerous changes focused on strengthening units’ resistance to failure during disasters and improving disaster-response plans.

NRC Chairman Gregory Jaczko said in a July 18, 2011, speech that he believes the agency should act within 90 days to require nuclear power plants to bolster emergency preparedness. (It should also be noted that he said the units are currently safe.) Whether or not that occurs will depend on the Chairman and the votes of the four other commissioners. (View the current makeup of the NRC here: http://www.nrc.gov/about-nrc/organization/commfuncdesc.html.)

So why isn’t everyone in favor of enhanced safety? Quite simply, it costs time and money. Already, Marvin Fertel, President and CEO of the industry group Nuclear Energy Institute, has stated that as the NRC considers changes it “should expect the staff to justify the value of any new or revised requirements.” This says it all: change means increased costs, which in turn mean reduced energy company profits and/or increased electric rates for consumers.

According to the NRC, the chance of major damage at any single U.S. plant is less than 1 in 10,000. Are we as a society willing to spend millions, maybe even billions, of dollars to protect against this? That is a question we need to explore thoroughly given today’s energy infrastructure.
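
A quick back-of-envelope calculation shows why “low probability” at a single plant does not necessarily mean “low probability” across the whole fleet. The calculation below reads the 1-in-10,000 figure as a per-plant, per-year probability and assumes roughly 104 operating U.S. reactors and independence between plants; all three assumptions are ours, for illustration only.

```python
# Illustrative fleet-wide risk calculation. Assumptions (ours, not the NRC's):
#   * 1-in-10,000 is the chance of major damage per plant per year
#   * roughly 104 operating U.S. reactors
#   * events at different plants are statistically independent
p_per_plant_year = 1 / 10_000
reactors = 104
years = 40                     # a rough operating horizon, for illustration

p_fleet_one_year = 1 - (1 - p_per_plant_year) ** reactors
p_fleet_horizon = 1 - (1 - p_per_plant_year) ** (reactors * years)

print(f"Chance of at least one event in the fleet in one year: {p_fleet_one_year:.1%}")
print(f"Chance of at least one event over {years} years: {p_fleet_horizon:.1%}")
# -> roughly 1% in any given year and about 34% over 40 years under these assumptions
```

Numbers like these are exactly why the trade-off between prevention spending and low-probability, high-impact risk is so difficult to settle.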


Has the Need for Remote North American Natural Gas Been Supplanted by Shale Gas?

By Bob Shively, Enerdynamics’ President and Lead Instructor

Just a few years ago, Americans thought domestic natural gas supply could not keep up with growing demand. Forecasts indicated that the U.S. would need gas from Alaska, Arctic Canada, and other regions of the world (delivered as liquefied natural gas, or LNG) to meet consumer demand.

But that picture has changed. New capabilities to exploit shale gas in the lower 48 states and western Canada have led to a common belief that supplies are robust. And many owners of existing or proposed LNG terminals are now working to change their strategy from import to export.

What does this mean for the two long-discussed projects, the Mackenzie Gas Project in northern Canada and the Alaskan natural gas pipeline? Well, while their proponents may not be ready to admit it, it appears these projects have moved to the back burner. Shell announced July 15, 2011, that it plans to sell its share of the Mackenzie project as well as its other assets in the region.

This is a big blow to the project since Shell has long been a key partner.  And this comes mere months after the National Energy Board of Canada issued a certificate for construction of the project.  The remaining partners (Imperial Oil, ConocoPhillips and ExxonMobil) have said no decision will be made on whether to move forward until at least the end of 2013.

Meanwhile in Alaska, one of two competing pipeline projects – the Denali Pipeline (owned by subsidiaries of BP and ConocoPhillips) – announced on March 17, 2011, that the project was terminated due to lack of sufficient customer commitments.

The other project, the Alaska Pipeline Project (owned by TransCanada and ExxonMobil), states that it is “currently assessing open seasons bids submitted by multiple shippers and conducting ongoing negotiations to secure signed Precedent Agreements.” At the earliest, this project could begin the regulatory approval process in late 2012. More likely, it too will be delayed further.

The problem, in one word: cost. These projects would require $16 billion to $35 billion to construct. When producers add the cost of transporting gas to market to the cost of developing supplies in harsh arctic environments, they simply cannot see competing with shale gas that is being produced and sold at prices around $4.50/MMBtu.
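
A rough, hypothetical illustration of why the arithmetic does not work: even before operating costs and the cost of producing the gas, the capital charge alone on a project of this size can approach the entire market price of shale gas. The capacity, capital cost, and annual capital-charge rate below are assumptions of our own, chosen only to show the calculation.

```python
# Back-of-envelope transport cost for a hypothetical arctic gas pipeline.
# All inputs are illustrative assumptions, not figures from either project.

capital_cost = 30e9            # assumed construction cost, dollars
capital_charge_rate = 0.15     # assumed annual charge to recover capital plus a return
capacity_bcf_per_day = 4.0     # assumed throughput, billion cubic feet per day
btu_per_cf = 1_030             # approximate heat content of pipeline-quality gas

annual_capital_charge = capital_cost * capital_charge_rate            # dollars per year
annual_mmbtu = capacity_bcf_per_day * 1e9 * btu_per_cf / 1e6 * 365    # MMBtu per year

transport_cost = annual_capital_charge / annual_mmbtu
print(f"Capital charge alone: about ${transport_cost:.2f}/MMBtu")
# -> roughly $3/MMBtu before operating costs, against a ~$4.50/MMBtu shale gas price
```

With numbers anywhere in that neighborhood, there is little room left to pay for producing the gas itself.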

It is possible a smaller pipeline will be developed to deliver Alaskan supplies to markets within the State of Alaska, but most likely remote Canadian and Alaskan supplies will stay underground waiting for a future change in the supply/demand situation.


Solar Installations Up 66 Percent in Q1 2011 Compared to Q1 2010

by Enerdynamics staff

Based on reports of strong growth from Q1 2010 to Q1 2011, the U.S. solar industry’s future seems as bright as the sun that fuels it.

According to the U.S. Solar Market Insight™: Q1 2011 report, released in mid-June by the Solar Energy Industries Association® (SEIA®) and GTM Research, the U.S. installed 252 megawatts (MW) of grid-connected photovoltaics (PV) in Q1 2011 alone. This marks a 66 percent increase over Q1 2010.

SEIA® reports that the United States’ cumulative grid-connected PV installations have reached more than 2.85 gigawatts (GW), enough to power nearly 600,000 U.S. homes. And while no concentrating solar power (CSP) projects came online in Q1 2011, a total of 1.1 GW of CSP and concentrating photovoltaic (CPV, a technology in which sunlight is concentrated onto PV cells) projects, coupled with significant forecasted PV growth, has the U.S. on pace to become the world’s largest solar market within the next few years.
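
As a rough sanity check on the homes-powered figure, here is a minimal back-of-envelope sketch. The capacity factor and per-home consumption are assumptions of our own (household usage varies widely by region); only the 2.85 GW figure comes from the SEIA® report.

```python
# Rough check of "2.85 GW of PV powers nearly 600,000 homes." The capacity
# factor and household usage are illustrative assumptions, not SEIA figures.

installed_mw = 2_850            # cumulative grid-connected PV, from the report
capacity_factor = 0.18          # assumed average PV capacity factor
home_kwh_per_year = 8_000       # assumed annual consumption of one home, kWh

annual_kwh = installed_mw * 1_000 * 8_760 * capacity_factor   # kW x hours/yr x CF
homes = annual_kwh / home_kwh_per_year
print(f"~{homes:,.0f} homes served under these assumptions")
# -> about 560,000 homes, consistent with the report's "nearly 600,000"
```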

So in what geographic regions and market sectors is this growth most prevalent? Geographically, the top seven states in Q1 2010 accounted for 82 percent of U.S. installations. While the top-seven list has changed a bit since 2010, the seven top-ranked states of Q1 2011 (California, New Jersey, Arizona, Pennsylvania, Colorado, New York and Massachusetts) made up 88 percent of U.S. installations. In other words, the top-ranked states continue to gain traction and dominate market share. This is partly attributed to state-specific programs that give customers incentives to install solar. For example, as explained in the U.S. Solar Market Insight™: Q1 2011:

“In California, the CSI’s relatively new solar water heating incentive of up to $1,875 per installation for residential homes and $500,000 per installation for commercial and multi-family structure is helping to drive increased interest in solar water heating that we saw begin in 2010. Arizona’s market also remains quite strong, with most utilities offering production incentives that can cover up to half of a system’s costs. Look for Arizona to be a leading market by the end of 2011.”

From a market sector standpoint, non-residential (commercial, public sector and non-profit) installations ruled with 119 percent growth from Q1 2010 to Q1 2011. Residential installations continued a pattern of marginal yet steady quarter-over-quarter growth.

So to what specific factors are industry analysts attributing U.S. solar power’s strong showing in Q1 2011? The two primary factors identified in the SEIA® report are “market fundamentals” and “2010 overhang.”

Each factor is explained in more detail in the U.S. Solar Market Insight™ Q1 2011, but briefly, “market fundamentals” refers to improvements within the market that have made solar installations more viable for the masses. These improvements include a price decrease on modules, inverters and other components required for solar installation; expansion of new business models like the residential solar lease; and the aforementioned state-sponsored incentive programs.

The “2010 overhang” refers to the completion of installations that commenced in 2010 as part of the Section 1603 Treasury grant originally set to expire on Dec. 31, 2010. Though the deadline was ultimately extended to Dec. 31, 2011, many projects were started in 2010 to meet original grant guidelines, and many of those projects wrapped up in Q1 2011.

What does all this good news mean for the solar industry in the remaining months of 2011? According to the report’s authors, it means the bar has been set very high and the market must continue its rapid acceleration to meet expectations. Says the SEIA® report:

“Despite strong growth in the first quarter, the market will need to ramp up even faster in order to meet industry expectations, which generally anticipate at least another doubling of the total U.S. PV market in 2011. Given the pipeline of projects and recent module price declines, we believe this outcome remains likely.”

It appears that solar power, which has traditionally been a very small part of the overall electric marketplace, may finally be gaining the traction it needs to become a more significant part of our energy mix.
