CPUC penalizes PG&E $1.6 billion for pipeline violations

By California Public Utilities Commission


The California Public Utilities Commission (CPUC) imposed the largest penalty it has ever assessed by ordering Pacific Gas and Electric Company (PG&E) shareholders to pay $1.6 billion for the unsafe operation of its gas transmission system, including the pipeline rupture in San Bruno, Calif., in 2010.

In approving CPUC President Michael Picker's penalty proposal, the Commissioners increased the penalty amount by $200 million over what was proposed by the Administrative Law Judges.

Today's decision orders PG&E to pay:

  • $850 million in gas transmission pipeline safety infrastructure improvements, most of which will be spent on capital investments that PG&E will not add to its rate base and thus will not earn any profit on;
  • $300 million in a fine to the state's General Fund;
  • $400 million in a one-time bill credit spread across PG&E's gas customers; and
  • approximately $50 million towards other remedies to enhance pipeline safety.

When added to the disallowances already adopted in a prior CPUC Decision, the penalties and remedies exceed $2.2 billion.

"PG&E failed to uphold the public's trust. The CPUC failed to keep vigilant. Lives were lost. Numerous people were injured. Homes were destroyed. We must do everything we can to ensure that nothing like this happens again," said President Picker. "Our decision commits a significant portion of the shareholder-funded penalty - one of the biggest utility sanctions in U.S. history - to making PG&E's gas transmission system as safe as possible for the public, consumers, utility workers, and the environment."

"The Decisions we adopt today signal that we expect accountability and performance from utilities we regulate and from ourselves at the CPUC," said Commissioner Catherine J.K. Sandoval. "Californians should receive what the law says they have the right to expect: safe, reliable utility service through adequate facilities at just and reasonable rates. This landmark Decision provides redress for the systemic causes that led to the San Bruno tragedy and will improve gas pipeline safety for generations of Californians."

Said Commissioner Carla J. Peterman, "No decision can rectify the loss that the community of San Bruno suffered as a result of the gas transmission pipeline rupture, but I believe that our decision will enable us to focus going forward on making sure the system is the safest it can be."

Penalties and remedies assessed against PG&E must be paid by shareholders and are not recoverable from PG&E's customers.

Related News

Solar Becomes #3 Renewable Electricity Source In USA

U.S. Solar Generation 2017 surpassed biomass, delivering 77 million MWh versus 64 million MWh, trailing only hydro and wind; driven by PV expansion, capacity additions, and utility-scale and small-scale growth, per EIA.

 

Key Points

In 2017, U.S. solar electricity generation exceeded biomass for the first time, reaching 77 million MWh and trailing only hydro and wind among renewables.

✅ Solar: 77 million MWh; Biomass: 64 million MWh (2017, EIA)

✅ PV expansion; late-year capacity additions dampen annual generation

✅ Hydro: 300 million MWh; wind: 254 million MWh; solar thermal ~3 million MWh

 

Electricity generation from solar resources in the United States reached 77 million megawatthours (MWh) in 2017, surpassing for the first time annual generation from biomass resources, which generated 64 million MWh in 2017. Among renewable sources, only hydro and wind generated more electricity in 2017, at 300 million MWh and 254 million MWh, respectively. Biomass generating capacity has remained relatively unchanged in recent years, while solar generating capacity has consistently grown.

Annual growth in solar generation often lags annual capacity additions because generating capacity tends to be added late in the year. For example, in 2016, 29% of total utility-scale solar generating capacity additions occurred in December, leaving few days for an installed project to contribute to total annual generation despite being counted in annual generating capacity additions. In 2017, December solar additions accounted for 21% of the annual total. Overall, solar technologies operate at lower annual capacity factors and experience more seasonal variation than biomass technologies.
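The timing effect described above is simple arithmetic and can be sketched as follows. The 100 MW plant size and 25% annual capacity factor are assumed values for illustration, not EIA figures.

```python
# Illustrative sketch (not EIA methodology): why capacity added in
# December barely affects that calendar year's generation totals.
def first_year_generation_mwh(capacity_mw, capacity_factor, days_remaining):
    """Energy a new plant produces between its in-service date and Dec 31."""
    hours_online = days_remaining * 24
    return capacity_mw * capacity_factor * hours_online

# Assumed values: a 100 MW solar farm at a 25% annual capacity factor.
full_year = first_year_generation_mwh(100, 0.25, 365)
december_only = first_year_generation_mwh(100, 0.25, 15)  # online mid-December

print(full_year)       # 219000.0 MWh if online the whole year
print(december_only)   # 9000.0 MWh, roughly 4% of the full-year figure
```

The plant still counts fully toward annual capacity additions, which is why generation growth lags capacity growth in years with late additions.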

Biomass electricity generation comes from multiple fuel sources, such as wood solids (68% of total biomass electricity generation in 2017), landfill gas (17%), municipal solid waste (11%), and other biogenic and nonbiogenic materials (4%). These shares of biomass generation have remained relatively constant in recent years.

Solar can be divided into three types: solar thermal, which converts sunlight to steam to produce power; large-scale solar photovoltaic (PV), which uses PV cells to produce electricity directly from sunlight; and small-scale solar, which comprises PV installations of 1 megawatt or smaller. Generation from solar thermal sources has remained relatively flat in recent years, at about 3 million MWh. The most recent addition of solar thermal capacity was the Crescent Dunes Solar Energy plant installed in Nevada in 2015, and currently no solar thermal generators are under construction in the United States.

Solar photovoltaic systems, however, have grown consistently in recent years. In 2014, large-scale solar PV systems generated 15 million MWh, and small-scale PV systems generated 11 million MWh. By 2017, annual electricity from those sources had increased to 50 million MWh and 24 million MWh, respectively. By the end of 2018, EIA expects an additional 5,067 MW of large-scale PV to come online, according to EIA’s Preliminary Monthly Electric Generator Inventory. Information about planned small-scale PV systems (one megawatt and below) is not collected in that survey.

 

Related News

View more

Opinion: Now is the time for a western Canadian electricity grid

Western Canada Electric Grid could deliver interprovincial transmission, reliability, peak-load support, reserve sharing, and wind and solar integration, lowering costs versus new generation while respecting AESO markets and Crown utility structures.

 

Key Points

Interprovincial transmission to share reserves, boost reliability, integrate wind and solar, and cut peak capacity costs.

✅ Cuts reserve margins via diversity of peak loads

✅ Enables wind and solar balancing across provinces

✅ Saves ratepayers vs replacing retiring thermal plants

 

The 2017 Canadian Free Trade Agreement does little to encourage provinces to trade electric energy east and west. Would a western Canada electric grid help electricity consumers in the western provinces? Some Alberta officials note that their electric utilities are investor-owned, and they perceive the Crown corporations of BC Hydro, SaskPower and Manitoba Hydro to be subsidized by their provincial governments, so interprovincial electric energy trade would not take place on a level playing field.

Because of the limited trade of electric energy between the western provinces, each utility maintains an excessive reserve of thermal and hydroelectric generation, greater than its peak load, to provide a reliable supply on peak load days. This excess does not include variable wind and solar generation, which within a province can’t be guaranteed to be available when needed most.

This attitude must change. Transmission is cheaper than generation, and coordinated macrogrids can further improve reliability and cut costs. By constructing a substantial grid with low profile and aesthetically designed overhead transmission lines, the excess reserve of thermal and hydroelectric generation above the peak electric load can be reduced in each province over time. Detailed assessments will ensure each province retains its required reliability of electric supply.

As the provinces retire aging thermal and coal-fired generators, they need to replace them only to a much lower level, with just enough capacity to meet their future electric loads. Some of the money not spent on replacing retired generation can be profitably invested in the transmission grid across the four western provinces.

But what about Alberta, which does not want to trade electric energy with the other western provinces? It can carry on as usual within the Alberta Electric System Operator’s (AESO) market and will save money by keeping the installed reserve of thermal and hydroelectric generation to a minimum. When Alberta experiences a peak electric load day and some generators are out of service due to unplanned maintenance, it can obtain the needed power from the interprovincial electric grid. None of the other three western provinces will peak at the same time, because of different weather and time zones, so they will have spare capacity to help Alberta over its peak. The peak load in a province only lasts for a few hours, so Alberta will get by with a little help from its friends if needed.
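The reserve-sharing argument can be illustrated with a toy calculation. All peak loads, the 15% reserve margin, and the 93% diversity factor below are invented for illustration; only the principle of non-coincident peaks comes from the text.

```python
# Hypothetical illustration of reserve sharing across four provinces.
peaks = {"BC": 10000, "AB": 11500, "SK": 3800, "MB": 4700}  # MW, invented
reserve_margin = 0.15  # assumed: each utility holds 15% above its own peak

# Without interprovincial trade, every province carries its own reserve.
isolated_capacity = sum(p * (1 + reserve_margin) for p in peaks.values())

# With a shared grid, peaks are non-coincident (different weather and
# time zones), so the pooled system peak is below the sum of the peaks.
diversity_factor = 0.93  # assumed: coincident peak is 93% of the sum
pooled_peak = sum(peaks.values()) * diversity_factor
pooled_capacity = pooled_peak * (1 + reserve_margin)

print(round(isolated_capacity - pooled_capacity))  # MW of generation avoided
```

Even a small diversity factor translates into thousands of megawatts of generation that never needs to be built, which is the cost case for transmission over generation.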

The grid will have no energy flowing on it for this purpose except to assist a province from time to time when it’s unable to meet its peak load. The grid may only carry load five per cent of the time in a year for this purpose. Under such circumstances, the empty grid can then be used for other profitable markets in electric energy. This includes more effective use of variable wind and solar energy, by enabling a province to better balance such intermittent power as well as allowing increased installation of it in every province. This is a challenge for AESO which the grid would substantially ease.

Natural Resources Canada promoted the “Regional Electricity Co-Operative and Strategic Infrastructure” initiative, contracted through AESO, for completion this year. This is a first step, but more is needed to achieve the full benefit of a western grid.

In 1970 a study was undertaken to electrically interconnect Britain with France, justified by the ability to reduce reserve generation in both countries. Initially Britain rejected it, while France was partially supportive. In time a substantial interconnection was built, and because it proved a profitable venture, the two countries are contemplating increasing the grid connections between them.

For the sake of western consumers of electricity, to keep electricity rates from rising too quickly, and to allow productive expansion of wind and solar energy, an electric grid is essential across western Canada.

Dennis Woodford is president of Electranix Corporation in Winnipeg, which studies electric transmission problems, particularly involving renewable energy generators requiring firm connection to the grid.

 

Related News

View more

New Power Grid “Report Card” Reveals Dangerous Vulnerabilities

U.S. Power Grid D+ Rating underscores aging infrastructure, rising outages, cyber threats, EMP and solar flare risks, strained transmission lines, vulnerable transformers, and slow permitting, amplifying reliability concerns and resilience needs across national energy systems.

 

Key Points

ASCE's D+ grade flags aging infrastructure, rising outages, and cyber, EMP, and weather risks needing investment.

✅ Major outages rising; weather remains top disruption driver.

✅ Aging transformers, transmission lines, limited maintenance.

✅ Cybersecurity gaps via smart grid, EV charging, SCADA.

 

The U.S. power grid just received its “grade card” from the American Society of Civil Engineers (ASCE) and it barely passed.

The overall rating of our antiquated electrical system was a D+. Major power outages in the United States, including widespread blackouts, have grown from 76 in 2007 to 307 in 2011, according to the latest available statistics. The major outage figures do not take into account all of the smaller outages which routinely occur due to seasonal storms.

The American Society of Civil Engineers power grid grade card rating means the energy infrastructure is in “poor to fair condition and mostly below standard, with many elements approaching the end of their service life.” It further means a “large portion of the system exhibits significant deterioration” with a “strong risk of failure.”

Such a designation is not reassuring and validates those who purchased solar generators over the past several years.


The vulnerable state of the power grid gets very little play by mainstream media outlets. Concerns about a solar flare or an electromagnetic pulse (EMP) attack instantly sending us back to an 1800s existence are legitimate, but it may not take such an extreme act to render the power grid a useless tangle of wires. The majority of the United States’ infrastructure and public systems evaluated by the ASCE earned a “D” rating. A “C” ranking (public parks, rail and bridges) was the highest grade earned. It would take a total of $3.6 trillion in investments by 2020 to fix everything, the report card stated. To put that number in perspective, the federal government’s budget for all of 2012 was slightly more, $3.7 trillion.

“America relies on an aging electrical grid and pipeline distribution systems, some of which originated in the 1880s,” the report read. “Investment in power transmission has increased since 2005, but ongoing permitting issues, weather events, including summer blackouts that strain local systems, and limited maintenance have contributed to an increasing number of failures and power interruptions. While demand for electricity has remained level, the availability of energy in the form of electricity, natural gas, and oil will become a greater challenge after 2020 as the population increases. Although about 17,000 miles of additional high-voltage transmission lines and significant oil and gas pipelines are planned over the next five years, permitting and siting issues threaten their completion. The electric grid in the United States consists of a system of interconnected power generation, transmission facilities, and distribution facilities.”

 


There are approximately 400,000 miles of electrical transmission lines throughout the United States, and thousands of power generating plants dot the landscape. The ASCE report card also stated that new gas-fired and renewable generation issues increase the need to add new transmission lines. Antiquated power grid equipment has reportedly prompted even more “intermittent” power outages in recent years.

The American Society of Civil Engineers accurately notes that the power grid is more vulnerable to cyber attacks than ever before, including Russian intrusions documented in recent years, and it cites the aging electrical system as the primary culprit. Although the decades-old transformers and other equipment necessary to keep power flowing around America are a major factor in the enhanced vulnerability of the power grid, moving towards a “smart grid” system is not the answer. As previously reported by Off The Grid News, smart grid systems and even electric car charging stations make the power grid more accessible to cyber hackers. During the Hack in the Box Conference in Amsterdam, HP ArcSight Product Manager Ofer Sheaf stated that electric car charging stations are in essence a computer on the street. The roadway fueling stations are linked to the power grid electrical system. If cyber hackers garner access to the power grid via the charging stations, they could stop the flow of power to a specific area or alter energy distribution levels and overload the system.

While a relatively small number of electric car charging stations exist in America now, that soon will change. Ongoing efforts by both federal and state governments to reduce our reliance on fossil fuels have resulted in grants and privately funded vehicle charging station projects. New York Governor Andrew Cuomo in April announced plans to build 360 such electrical stations in his state. A total of 3,000 car charging stations are in the works statewide and are slated for completion over the next five years.

SHIELD Act

Weather-related events were the primary cause of power outages from 2007 to 2012, according to the infrastructure report card. Power grid reliability issues are emerging as the greatest threat to the electrical system, with rising attacks on substations compounding the risks. The ASCE grade card also notes that retiring and rotating in “new energy sources” is a “complex” process. Like most items we routinely purchase in our daily lives, many of the components needed to make the power grid functional are not manufactured in the United States.

The SHIELD Act is the first real piece of federal legislation in years drafted to address power grid vulnerabilities. While the single bill will not fix all of the electrical system issues, it is a big step in the right direction – if it ever makes it out of committee. Replacing aging transformers, encasing them in a high-tech version of a Faraday cage, and stockpiling extra units so instant repairs are possible would help preserve one of the nation’s most critical and life-saving pieces of infrastructure after a weather-related incident or man-made disaster.

“Geomagnetic storm environments can develop instantaneously over large geographic footprints,” solar geomagnetic researcher John Kappenman said about the fragile state of the power grid. He was quoted in an Oak Ridge National Laboratory report. “They have the ability to essentially blanket the continent with an intense threat environment and … produce significant collateral damage to critical infrastructures. In contrast to well-conceived design standards that have been successfully applied for more conventional threats, no comprehensive design criteria have ever been considered to check the impact of the geomagnetic storm environments. The design actions that have occurred over many decades have greatly escalated the dangers posed by these storm threats for this critical infrastructure.”

The power grid has grown tenfold in size during the past 50 years. While solar flares, cyber attacks, and an EMP are perhaps the most extensive and frightening threats to the electrical system, the infrastructure could just as easily fail in large portions due to weather-related events. The power grid is basically a ticking time bomb which, if it does go down, will spawn civil unrest, shortages of food and clean water, and a multitude of fires.

 

Related News

View more

Was there another reason for electricity shutdowns in California?

PG&E Wind Shutdown and Renewable Reliability examines PSPS strategy, wildfire risk, transmission line exposure, wind turbine cut-out speeds, grid stability, and California's energy mix amid historic high-wind events and supply constraints across service areas.

 

Key Points

An overview of PG&E's PSPS decisions, wildfire mitigation, and how wind cut-out limits influence grid reliability.

✅ Wind turbines reach cut-out near 55 mph, reducing generation.

✅ PSPS mitigates ignition from damaged transmission infrastructure.

✅ Baseload diversity improves resilience during high-wind events.

 

According to the official, widely reported story, Pacific Gas & Electric (PG&E) initiated power shutoffs across substantial portions of its electric transmission system in northern California as a precautionary measure.

Citing high wind speeds they described as “historic,” the utility claims that if it didn’t turn off the grid, wind-caused damage to its infrastructure could start more wildfires.

Perhaps that’s true. Perhaps. This tale presumes that the folks who designed and maintain PG&E’s transmission system are unaware of or ignored the need to design it to withstand severe weather events, and that the Federal Energy Regulatory Commission (FERC) and North American Electric Reliability Corp. (NERC) allowed the utility to do so.

Ignorance and incompetence happens, to be sure, but there’s much about this story that doesn’t smell right—and it’s disappointing that most journalists and elected officials are apparently accepting it without question.

Take, for example, this statement from a Fox News story about the Kincade Fires: “A PG&E meteorologist said it’s ‘likely that many trees will fall, branches will break,’ which could damage utility infrastructure and start a fire.”

Did you ever notice how utilities cut wide swaths of trees away when transmission lines pass through forests? There’s a reason for that: when trees fall and branches break, the grid can still function.

So, if badly designed and poorly maintained infrastructure isn’t the reason PG&E cut power to millions of Californians, what might have prompted them to do so? Could it be that PG&E’s heavy reliance on renewable energy means they don’t have the power to send when a “historic” weather event occurs?

 

Wind Speed Limits

The two most popular forms of renewable energy come with operating limitations. With solar power, the constraint is obvious: the availability of sunlight. One doesn’t generate solar power at night, and generation drops off with increasing cloud cover during the day.

The main operating constraint of wind power is, of course, wind speed. At the low end of the scale, you need about a 6 or 7 mile-per-hour wind to get a turbine moving; this is called the “cut-in speed.” To generate maximum power, about a 30 mph wind is typically required. But if the wind speed is too high, the wind turbine will shut down; this is called the “cut-out speed,” and it’s about 55 miles per hour for most modern wind turbines.

It may seem odd that wind turbines have a cut-out speed, but there’s a very good reason for it. Each wind turbine rotor is connected to an electric generator housed in the turbine nacelle. The connection is made through a gearbox that is sized to turn the generator at the precise speed required to produce 60 Hertz AC power.
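The gearbox sizing follows from the standard synchronous-speed relation for AC generators, n (rpm) = 120f / p, where f is the grid frequency and p the number of generator poles. A minimal sketch; the 4-pole generator and rotor speed figures are assumed values for illustration:

```python
# Synchronous speed of an AC generator: n (rpm) = 120 * f / poles.
def synchronous_rpm(frequency_hz, poles):
    return 120 * frequency_hz / poles

# An assumed 4-pole generator must spin at 1,800 rpm to produce 60 Hz
# power; the gearbox steps the slow rotor (roughly 10-20 rpm on a large
# turbine) up to that speed.
print(synchronous_rpm(60, 4))  # 1800.0
```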

The blades of the wind turbine are airfoils, just like the wings of an airplane. Adjusting the pitch (angle) of the blades allows the rotor to maintain constant speed, which, in turn, allows the generator to maintain the constant speed it needs to safely deliver power to the grid. However, there’s a limit to blade pitch adjustment. When the wind is blowing so hard that pitch adjustment is no longer possible, the turbine shuts down. That’s the cut-out speed.
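Putting the cut-in, rated, and cut-out speeds together gives a simplified power curve. This is only a sketch: real curves are manufacturer-specific, and the cubic rise below rated speed is an idealization of aerodynamic power capture.

```python
# Simplified wind turbine power curve using the speeds from the text:
# cut-in ~7 mph, rated ~30 mph, cut-out ~55 mph.
def turbine_output_fraction(wind_mph, cut_in=7.0, rated=30.0, cut_out=55.0):
    """Fraction of rated power produced at a given wind speed."""
    if wind_mph < cut_in or wind_mph >= cut_out:
        return 0.0          # too little wind, or shut down for safety
    if wind_mph >= rated:
        return 1.0          # blade pitch holds output at rated power
    # Below rated speed, power rises roughly with the cube of wind speed.
    return (wind_mph / rated) ** 3

print(turbine_output_fraction(5))    # 0.0 (below cut-in)
print(turbine_output_fraction(40))   # 1.0 (pitch-regulated at rated power)
print(turbine_output_fraction(60))   # 0.0 (above cut-out: turbine shut down)
```

The last case is the article's point: winds strong enough to be called "historic" are also strong enough to idle an entire wind fleet at once.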

Now consider how California’s power generation profile has changed. According to Energy Information Administration data, the state generated 74.3 percent of its electricity from traditional sources—fossil fuels and nuclear—in 2001. Hydroelectric, geothermal, and biomass-generated power accounted for most of the remaining 25.7 percent, with wind and solar providing only 1.98 percent of the total.

By 2018, the state’s renewable portfolio had jumped to 43.8 percent of total generation, with wind and solar now accounting for 17.9 percent of the total. That’s a lot of power to depend on from inherently unreliable sources. Thus, it wouldn’t be at all surprising to learn that PG&E shut off power not out of fear of starting fires, but because it knew it wouldn’t have power to deliver once high winds shut down all those wind turbines.

 

Related News

View more

Data Show Clean Power Increasing, Fossil Fuel Decreasing in California

California clean electricity accelerates with renewables as solar and wind surge, battery storage strengthens grid resilience, natural gas declines, and coal fades, advancing SB 100 targets, carbon neutrality goals, and affordable, reliable power statewide.

 

Key Points

California clean electricity is the state's transition to renewable, zero-carbon power, scaling solar, wind and storage.

✅ Solar generation up nearly 20x since 2012

✅ Natural gas power down 20%; coal nearly phased out

✅ Battery storage shifts daytime surplus to evening demand

 

Data from the California Energy Commission (CEC) highlight California’s continued progress toward building a more resilient grid, achieving 100 percent clean electricity and meeting the state’s carbon neutrality goals.

Analysis of the state’s Total System Electric Generation report shows how California’s power mix has changed over the last decade. Since 2012:

  • Solar generation increased nearly twentyfold, from 2,609 gigawatt-hours (GWh) to 48,950 GWh.
  • Wind generation grew by 63 percent.
  • Natural gas generation decreased 20 percent.
  • Coal has been nearly phased out of the power mix.

In addition to total utility generation, rooftop solar increased tenfold, generating 24,309 GWh of clean power in 2022. The state’s expanding fleet of battery storage resources also helps support the grid by charging during the day with excess renewable power for use in the evening.
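The day-to-evening shift works like simple energy arbitrage. A toy sketch; all quantities below are invented, and the 90% round-trip efficiency is an assumption, not a CEC figure:

```python
# Toy illustration of battery storage shifting midday solar surplus to
# the evening peak.
surplus_midday_gwh = 4.0      # excess solar available to charge (assumed)
battery_capacity_gwh = 3.0    # fleet energy capacity (assumed)
round_trip_efficiency = 0.90  # assumed charge/discharge losses

# The fleet can absorb no more than its capacity; the rest is curtailed.
energy_stored = min(surplus_midday_gwh, battery_capacity_gwh)
evening_delivery = energy_stored * round_trip_efficiency

print(round(evening_delivery, 2))  # 2.7 GWh served after sunset
```

Energy that would otherwise be curtailed at midday is instead delivered in the evening, minus conversion losses.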

“This latest report card showing how solar energy boomed as natural gas powered electricity experienced a steady 20 percent decline over the last decade is encouraging,” said CEC Vice Chair Siva Gunda. “Even as climate impacts become increasingly severe, California remains committed to transitioning away from polluting fossil fuels and delivering on the promise to build a future power grid that is clean, reliable and affordable.”

Senate Bill 100 (2018) requires 100 percent of California’s electric retail sales be supplied by renewable and zero-carbon energy sources by 2045. To keep the state on track, last year Governor Gavin Newsom signed SB 1020, establishing interim targets of 90 percent clean electricity by 2035 and 95 percent by 2040.

The state monitors progress through the Renewables Portfolio Standard (RPS), which tracks the power mix of retail sales. The latest data show that in 2021 more than 37 percent of the state’s electricity came from RPS-eligible sources such as solar and wind, an increase of 2.7 percentage points compared to 2020. When combined with other sources of zero-carbon energy such as large hydroelectric generation and nuclear, nearly 59 percent of the state’s retail electricity sales came from nonfossil fuel sources.

The total system electric generation report is based on electric generation from all in-state power plants rated 1 megawatt (MW) or larger and imported utility-scale power generation. It reflects the percentage of a specific resource compared to all power generation, not just retail sales. The total system electric generation report accounts for energy used for water conveyance and pumping, transmission and distribution losses and other uses not captured under RPS.

 

Related News

View more

How to Get Solar Power on a Rainy Day? Beam It From Space

Space solar power promises wireless energy from orbital solar satellites via microwave or laser power beaming, using photovoltaics and rectennas. NRL and AFRL advances hint at 24-7 renewable power delivery to Earth and airborne drones.

 

Key Points

Space solar power beams orbital solar energy to Earth via microwaves or lasers, enabling continuous wireless electricity.

✅ Harvests sunlight in orbit and transmits via microwaves or lasers

✅ Provides 24-7 renewable power, independent of weather or night

✅ Enables wireless power for remote sites, grids, and drones

 

Earlier this year, a small group of spectators gathered in David Taylor Model Basin, the Navy’s cavernous indoor wave pool in Maryland, to watch something they couldn’t see. At each end of the facility there was a 13-foot pole with a small cube perched on top. A powerful infrared laser beam shot out of one of the cubes, striking an array of photovoltaic cells inside the opposite cube. To the naked eye, however, it looked like a whole lot of nothing. The only evidence that anything was happening came from a small coffee maker nearby, which was churning out “laser lattes” using only the power generated by the system.

The laser setup managed to transmit 400 watts of power—enough for several small household appliances—through hundreds of meters of air without moving any mass. The Naval Research Lab, which ran the project, hopes to use the system to send power to drones during flight. But NRL electronics engineer Paul Jaffe has his sights set on an even more ambitious problem: beaming solar power to Earth from space. For decades the idea had been reserved for The Future, but a series of technological breakthroughs and a massive new government research program suggest that faraway day may have finally arrived.

Since the idea for space solar power first cropped up in Isaac Asimov’s science fiction in the early 1940s, scientists and engineers have floated dozens of proposals to bring the concept to life, including inflatable solar arrays and robotic self-assembly. But the basic idea is always the same: A giant satellite in orbit harvests energy from the sun and converts it to microwaves or lasers for transmission to Earth, where it is converted into electricity. The sun never sets in space, so a space solar power system could supply renewable power to anywhere on the planet, day or night, rain or shine.

Like fusion energy, space-based solar power seemed doomed to become a technology that was always 30 years away. Technical problems kept cropping up, cost estimates remained stratospheric, and as solar cells became cheaper and more efficient, and storage improved with cheap batteries, the case for space-based solar seemed to be shrinking.

That didn’t stop government research agencies from trying. In 1975, after partnering with the Department of Energy on a series of space solar power feasibility studies, NASA beamed 30 kilowatts of power over a mile using a giant microwave dish. Beamed energy is a crucial aspect of space solar power, but this test remains the most powerful demonstration of the technology to date. “The fact that it’s been almost 45 years since NASA’s demonstration, and it remains the high-water mark, speaks for itself,” Jaffe says. “Space solar wasn’t a national imperative, and so a lot of this technology didn’t meaningfully progress.”

John Mankins, a former physicist at NASA and director of Solar Space Technologies, witnessed how government bureaucracy killed space solar power development firsthand. In the late 1990s, Mankins authored a report for NASA that concluded it was again time to take space solar power seriously and led a project to do design studies on a satellite system. Despite some promising results, the agency ended up abandoning it.

In 2005, Mankins left NASA to work as a consultant, but he couldn’t shake the idea of space solar power. He did some modest space solar power experiments himself and even got a grant from NASA’s Innovative Advanced Concepts program in 2011. The result was SPS-ALPHA, which Mankins called “the first practical solar power satellite.” The idea, says Mankins, was “to build a large solar-powered satellite out of thousands of small pieces.” His modular design brought the cost of hardware down significantly, at least in principle.

Jaffe, who was just starting to work on hardware for space solar power at the Naval Research Lab, got excited about Mankins’ concept. At the time he was developing a “sandwich module” consisting of a small solar panel on one side and a microwave transmitter on the other. His electronic sandwich demonstrated all the elements of an actual space solar power system and, perhaps most important, it was modular. It could work beautifully with something like Mankins' concept, he figured. All they were missing was the financial support to bring the idea from the laboratory into space.

Jaffe invited Mankins to join a small team of researchers entering a Defense Department competition, in which they were planning to pitch a space solar power concept based on SPS-ALPHA. In 2016, the team presented the idea to top Defense officials and ended up winning four out of the seven award categories. Both Jaffe and Mankins described it as a crucial moment for reviving the US government’s interest in space solar power.

They might be right. In October, the Air Force Research Lab announced a $100 million program to develop hardware for a solar power satellite. It’s an important first step toward the first demonstration of space solar power in orbit, and Mankins says it could help solve what he sees as space solar power’s biggest problem: public perception. The technology has always seemed like a pie-in-the-sky idea, and the cost of setting up a solar array on Earth is plummeting; but space solar power has unique benefits, chief among them the availability of solar energy around the clock, regardless of the weather or time of day.

It can also provide renewable energy to remote locations, such as forward operating bases for the military. And at a time when wildfires have forced the utility PG&E to cut power to thousands of California residents on multiple occasions, having a way to deliver renewable energy through clouds and smoke doesn’t seem like such a bad idea. (Ironically enough, PG&E entered a first-of-its-kind agreement to buy space solar power from a company called Solaren back in 2009; the system was supposed to start operating in 2016 but never came to fruition.)

“If space solar power does work, it is hard to overstate what the geopolitical implications would be,” Jaffe says. “With GPS, we sort of take it for granted that no matter where we are on this planet, we can get precise navigation information. If the same thing could be done for energy, it would be revolutionary.”

Indeed, there seems to be an emerging race to become the first to harness this technology. Earlier this year China announced its intention to become the first country to build a solar power station in space, and for more than a decade Japan has considered the development of a space solar power station to be a national priority. Now that the US military has joined in with a $100 million hardware development program, it may only be a matter of time before there’s a solar farm in the solar system.

 
