Former EPA chief says nuclear is needed

By Associated Press


Former Environmental Protection Agency administrator Christine Todd Whitman said that Congress should include nuclear power in any requirement for utilities to tap more clean energy sources.

Lawmakers are weighing a renewable electricity standard, which would require utilities to get a minimum percentage of their power from sources such as wind, solar and geothermal.

Whitman said in a telephone interview that she'd like to see it broadened to a "green" standard that includes nuclear power. She argued that renewable sources alone won't be able to meet the country's growing energy needs.

Whitman co-chairs the Clean and Safe Energy Coalition, an advocacy group financed by the nuclear industry, and is paid by the industry for the role.

Backers tout nuclear plants as a clean source of energy because they don't emit greenhouse gases blamed for global warming while generating electricity. But critics note that carbon is emitted during the mining and enrichment of uranium, and point to the government's lack of a long-term plan to store commercial radioactive waste.

"What we want is clean, green energy," Whitman said. "And you should let the market decide which form is going to work the best. If you say renewables, you can't include nuclear, because it does rely on uranium, and that's a finite resource."

"I get very leery when Congress picks the winners within any band of energy source," she added.

Still, Whitman said she was fine with a policy that required a minimum percentage of clean energy be used.

"There's a place for that kind of standard, because we do care about our air quality" and climate change, said Whitman, a moderate Republican who was governor of New Jersey and then served in the administration of President George W. Bush.

The idea of a broader clean energy mandate has been kicked around Capitol Hill, but it doesn't appear to have much of a chance in this session of Congress. Last month, the chairman of the Senate energy committee, New Mexico Democrat Jeff Bingaman, introduced legislation that would set the renewable electricity standard at 15 percent by 2021, and his spokesman stressed that the point of the mandate is to boost renewable forms of electricity.

"Sen. Bingaman has a solid record of support for nuclear power, which is an important part of our nation's energy mix," said spokesman Bill Wicker. "But he also knows that nuclear is a mature industry which already has benefited from a tremendous amount of federal support for more than half a century."

Bingaman hopes to get a vote on his bill in a lame-duck session of Congress after the elections, but it won't be an easy sell. Opponents of a mandate in the Southeast argue that the region lacks abundant renewable resources such as wind.

Richard Caperton, a policy analyst who works on clean energy issues at the Center for American Progress, said that renewable sources have benefits that go beyond low carbon emissions, such as not generating waste. He said he'd prefer that the focus stay on renewables.

"That said, if people are serious about clean energy standards, there's potentially a way to involve nuclear power," Caperton said. One option, he said, would be a national standard with regional variations. Under that scenario, the Southeast could use nuclear toward its mandate — but renewable sources would count more toward the minimum requirement.

Rob Gramlich, a lobbyist for the American Wind Energy Association, said he opposed any tinkering with the mandate pending in the Senate.

"The renewable electricity standard is ready to be passed right now," he said in a statement. "It's not the time to throw it out and start all over."

Was there another reason for electricity shutdowns in California?


According to the official, widely reported story, Pacific Gas & Electric (PG&E) initiated power shutoffs across substantial portions of its electric transmission system in northern California as a precautionary measure.

Citing high wind speeds they described as “historic,” the utility claims that if it didn’t turn off the grid, wind-caused damage to its infrastructure could start more wildfires.

Perhaps that’s true. Perhaps. This tale presumes that the folks who designed and maintain PG&E’s transmission system are unaware of or ignored the need to design it to withstand severe weather events, and that the Federal Energy Regulatory Commission (FERC) and North American Electric Reliability Corp. (NERC) allowed the utility to do so.

Ignorance and incompetence happen, to be sure, but there's much about this story that doesn't smell right, and it's disappointing that most journalists and elected officials are apparently accepting it without question.

Take, for example, this statement from a Fox News story about the Kincade Fires: “A PG&E meteorologist said it’s ‘likely that many trees will fall, branches will break,’ which could damage utility infrastructure and start a fire.”

Did you ever notice how utilities cut wide swaths of trees away when transmission lines pass through forests? There's a reason for that: when trees fall and branches break, the grid can still function.

So, if badly designed and poorly maintained infrastructure isn't the reason PG&E cut power to millions of Californians, what might have prompted them to do so? Could it be that PG&E's heavy reliance on renewable energy means they don't have power to send when a “historic” weather event occurs?

 

Wind Speed Limits

The two most popular forms of renewable energy come with inherent operating limitations. With solar power, the constraint is obvious: the availability of sunlight. Solar panels generate nothing at night, and output drops off as cloud cover increases during the day.

The main operating constraint of wind power is, of course, wind speed. At the low end of the scale, a wind of about 6 or 7 miles per hour is needed to get a turbine moving; this is called the “cut-in speed.” Generating maximum power typically requires a wind of about 30 mph. But if the wind speed is too high, the turbine shuts down. That threshold, the “cut-out speed,” is about 55 miles per hour for most modern wind turbines.

It may seem odd that wind turbines have a cut-out speed, but there’s a very good reason for it. Each wind turbine rotor is connected to an electric generator housed in the turbine nacelle. The connection is made through a gearbox that is sized to turn the generator at the precise speed required to produce 60 Hertz AC power.

The blades of the wind turbine are airfoils, just like the wings of an airplane. Adjusting the pitch (angle) of the blades allows the rotor to maintain constant speed, which, in turn, allows the generator to maintain the constant speed it needs to safely deliver power to the grid. However, there’s a limit to blade pitch adjustment. When the wind is blowing so hard that pitch adjustment is no longer possible, the turbine shuts down. That’s the cut-out speed.
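The cut-in/rated/cut-out behavior described above can be sketched as a simple power curve. The speeds below are the ones quoted in the article (cut-in around 7 mph, rated output near 30 mph, cut-out at 55 mph); the 2 MW rating and the cubic ramp between cut-in and rated speed are illustrative assumptions, not data for any particular turbine.

```python
# Illustrative wind-turbine power curve. Cut-in, rated, and cut-out speeds
# come from the article; the 2 MW rating and the cubic ramp below rated
# speed are assumptions for the sketch, not PG&E or manufacturer data.

CUT_IN_MPH = 7.0
RATED_MPH = 30.0
CUT_OUT_MPH = 55.0
RATED_KW = 2000.0  # hypothetical 2 MW turbine


def turbine_output_kw(wind_mph: float) -> float:
    """Approximate electrical output for a given wind speed."""
    if wind_mph < CUT_IN_MPH or wind_mph >= CUT_OUT_MPH:
        return 0.0  # too slow to turn the rotor, or shut down for safety
    if wind_mph >= RATED_MPH:
        return RATED_KW  # blade-pitch control holds output at rated power
    # Below rated speed, power grows roughly with the cube of wind speed.
    frac = (wind_mph**3 - CUT_IN_MPH**3) / (RATED_MPH**3 - CUT_IN_MPH**3)
    return RATED_KW * frac


for mph in (5, 15, 30, 54, 56):
    print(f"{mph:>3} mph -> {turbine_output_kw(mph):7.1f} kW")
```

The last two rows make the article's point: at 54 mph the turbine is still at full output, and one mph past cut-out it produces nothing at all, so a “historic” wind event can remove that generation almost instantly.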

Now consider how California's power generation profile has changed. According to Energy Information Administration data, in 2001 the state generated 74.3 percent of its electricity from traditional sources: fossil fuels and nuclear. Hydroelectric, geothermal, and biomass power accounted for most of the remaining 25.7 percent, with wind and solar providing only 1.98 percent of the total.

By 2018, the state's renewable portfolio had jumped to 43.8 percent of total generation, with wind and solar alone accounting for 17.9 percent. That's a lot of power to depend on from inherently unreliable sources. Thus, it wouldn't be at all surprising to learn that PG&E stopped delivering power not out of fear of starting fires, but because it knew it wouldn't have power to deliver once high winds shut down all those wind turbines.

 


Climate change: Greenhouse gas concentrations again break records


The World Meteorological Organization (WMO) says the increase in CO2 was just above the average rise recorded over the last decade.

Levels of other warming gases, such as methane and nitrous oxide, have also surged by above average amounts.

Since 1990, the warming effect of long-lived greenhouse gases on the climate has increased by 43%.

The WMO report looks at concentrations of warming gases in the atmosphere rather than just emissions.

The difference between the two is that emissions refer to the amount of gas that goes up into the atmosphere from human activities, such as burning coal for electricity generation, and from deforestation.

Concentrations are what's left in the air after a complex series of interactions between the atmosphere, the oceans, the forests and the land. About a quarter of all carbon emissions are absorbed by the seas, and a similar amount by land and trees.

Using data from monitoring stations in the Arctic and all over the world, researchers say that in 2018 concentrations of CO2 reached 407.8 parts per million (ppm), up from 405.5ppm a year previously.

This increase was above the average for the last 10 years and is 147% of the "pre-industrial" level in 1750.
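The WMO figures can be sanity-checked with quick arithmetic. This sketch assumes a pre-industrial baseline of about 278 ppm, 2018 global fossil CO2 emissions of roughly 37 Gt, and a conversion of about 7.8 Gt CO2 per ppm of atmospheric concentration; none of these values appear in the article, so treat them as rough assumptions.

```python
# Back-of-envelope checks on the WMO numbers quoted in the article.
# Assumed values (not from the article): pre-industrial CO2 ~278 ppm,
# 2018 global fossil CO2 emissions ~37 Gt, and 1 ppm ~= 7.8 Gt CO2.

CO2_2018_PPM = 407.8
CO2_2017_PPM = 405.5
PRE_INDUSTRIAL_PPM = 278.0   # assumed 1750 baseline
EMISSIONS_2018_GT = 37.0     # assumed global fossil CO2 emissions
GT_PER_PPM = 7.8             # assumed mass-to-concentration conversion

# The "147% of pre-industrial" figure is just the ratio of concentrations.
ratio_pct = 100 * CO2_2018_PPM / PRE_INDUSTRIAL_PPM
print(f"2018 level vs 1750: {ratio_pct:.0f}%")

# If sinks absorb about half of emissions (a quarter each by sea and land),
# the expected annual rise in concentration is:
expected_rise = (EMISSIONS_2018_GT * 0.5) / GT_PER_PPM
observed_rise = CO2_2018_PPM - CO2_2017_PPM
print(f"expected rise ~{expected_rise:.1f} ppm, observed {observed_rise:.1f} ppm")
```

Under these assumptions the expected rise of roughly 2.4 ppm lines up with the observed 2.3 ppm, which is consistent with the article's point that about half of emitted CO2 stays in the atmosphere.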

The WMO also records concentrations of other warming gases, including methane and nitrous oxide. About 40% of the methane emitted into the air comes from natural sources, such as wetlands, with 60% from human activities, including cattle farming, rice cultivation and landfill dumps.

Methane is now at 259% of the pre-industrial level and the increase seen over the past year was higher than both the previous annual rate and the average over the past 10 years.

Nitrous oxide is emitted from natural and human sources, including from the oceans and from fertiliser-use in farming. According to the WMO, it is now at 123% of the levels that existed in 1750.

Last year's increase in concentrations of the gas, which can also harm the ozone layer, was bigger than the previous 12 months and higher than the average of the past decade.

What concerns scientists is the overall warming impact of all these increasing concentrations. Known as total radiative forcing, this effect has increased by 43% since 1990, and is not showing any indication of stopping.

"There is no sign of a slowdown, let alone a decline, in greenhouse gas concentrations in the atmosphere despite all the commitments under the Paris Agreement on climate change," said WMO Secretary-General Petteri Taalas.

"We need to translate the commitments into action and increase the level of ambition for the sake of the future welfare of mankind," he added.

"It is worth recalling that the last time the Earth experienced a comparable concentration of CO2 was three to five million years ago. Back then, the temperature was 2-3C warmer, sea level was 10-20m higher than now," said Mr Taalas.

The UN Environment Programme will report shortly on the gap between the actions countries are taking to cut carbon and what needs to be done to keep under the temperature targets agreed in the Paris climate pact.

Preliminary findings from this study, published during the UN Secretary General's special climate summit last September, indicated that emissions continued to rise during 2018.

Both reports will help inform delegates from almost 200 countries who will meet in Madrid next week for COP25, the annual round of international climate talks.

 


If B.C. wants to electrify all road vehicles by 2055, it will need to at least double its power output: study


Researchers at the University of Victoria say that if B.C. were to shift to electric power for all road vehicles by 2055, the province would require more than double the electricity now being generated.

The findings are included in a study to be published in the November issue of the Applied Energy journal.

According to co-author and UVic professor Curran Crawford, the team at the university's Pacific Institute for Climate Solutions took B.C.'s 2015 electrical capacity of 15.6 gigawatts as a baseline, added projected demand from population and economic growth, and then added the increase that shifting to electric vehicles would require.

They calculated that demand in 2055 would amount to 37 gigawatts, more than double the 2015 baseline.
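The headline numbers can be reproduced from figures given in the article (the 15.6 GW baseline, the 37 GW projection, and Site C's 1.1 GW); only the arithmetic below is added here.

```python
# Rough decomposition of the UVic projection, using only figures cited
# in the article. The split between growth-driven and EV-driven demand
# is not given, so this just checks the headline ratio and the gap that
# remains after the planned Site C dam.

BASELINE_2015_GW = 15.6
PROJECTED_2055_GW = 37.0
SITE_C_GW = 1.1  # planned dam capacity cited in the article

growth_factor = PROJECTED_2055_GW / BASELINE_2015_GW
new_capacity_needed = PROJECTED_2055_GW - BASELINE_2015_GW
gap_after_site_c = new_capacity_needed - SITE_C_GW

print(f"growth factor: {growth_factor:.2f}x")
print(f"new capacity needed: {new_capacity_needed:.1f} GW")
print(f"left for wind, solar, and gas after Site C: {gap_after_site_c:.1f} GW")
```

The ratio works out to about 2.4x, which is why Site C's 1.1 GW barely dents the roughly 21 GW of new capacity the projection implies.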

"We wanted to understand what the electricity requirements are if you want to do that," he said. "It's possible — it would take some policy direction."

The team took the planned Site C dam project into account, but that would only add 1.1 gigawatts of power. So assuming no other hydroelectric dams are planned, the remainder would likely have to come from wind and solar projects and some natural gas.

"Geothermal and biomass were also in the model," said Crawford, adding that they are more expensive electricity sources. "The model we were using, essentially, we're looking for the cheapest options."
Wind turbines on the Tantramar Marsh between Nova Scotia and New Brunswick tower over the Trans-Canada Highway. If British Columbia were to shift to 100 per cent electric-powered ground transportation by 2055, the province would have to significantly increase its wind and solar power generation. (Eric Woolliscroft/CBC)
The price of electricity, per kilowatt hour, would increase by nine per cent, according to the team's research, but Crawford said getting rid of the gasoline and diesel now used to fuel vehicles could amount to an overall cost saving.

The province introduced a law this year requiring that all new light-duty vehicles sold in B.C. be zero emission by 2040, so the researchers figured 2055 was a reasonable date by which to imagine all vehicles on the road being electric.

Crawford said hydrogen-powered vehicles weren't considered in the study, as the model used was already complicated enough, but noted that hydrogen fuel would actually require more electricity, because of losses in electrolysis, than storing the same energy in batteries.

The study also found that shifting to all-electric ground transportation in B.C. would mean a significant decrease in greenhouse gas emissions, assuming the Clean Energy Act, which mandates that 93 per cent of grid electricity come from renewable resources, remains in place.

"Doing the electrification makes some sense — If you're thinking of spending some money to reduce carbon emissions, this is a pretty cost effective way of doing that," said Crawford.

 


Sustaining U.S. Nuclear Power And Decarbonization


Nuclear power is the single largest source of carbon-free energy in the United States, currently meeting nearly 20 percent of the nation's electrical demand. As a result, many analyses have investigated the potential contribution of future nuclear energy to addressing climate change. However, few assess the value of existing nuclear power reactors.

Research led by Pacific Northwest National Laboratory (PNNL) Earth scientist Son H. Kim, with the Joint Global Change Research Institute (JGCRI), a partnership between PNNL and the University of Maryland, has added insight to the scarce literature and is the first to evaluate nuclear energy for meeting deep decarbonization goals. Kim sought to answer the question: How much do our existing nuclear reactors contribute to the mission of meeting the country's climate goals, both now and if their operating licenses were extended?

As the world races to find solutions for reaching net zero, Kim's report quantifies the economic value of bringing the existing nuclear fleet into the year 2100 and outlines its significant contributions to limiting global warming.

Plants slated to close by 2050 could be among the most important players in a challenge that requires every available carbon-free technology, emerging and existing alike, the report finds. New nuclear technology also has a part to play, and its contributions could be boosted by driving down construction costs.

“Even modest reductions in capital costs could bring big climate benefits,” said Kim. “Significant effort has been incorporated into the design of advanced reactors to reduce the use of all materials in general, such as concrete and steel because that directly translates into reduced costs and carbon emissions.”

Nuclear power reactors face an uncertain future.
The nuclear power fleet in the United States consists of 93 operating reactors across 28 states, most of them constructed and deployed between 1970 and 1990. Half of the fleet has outlived its original 40-year operating license. While most reactors have had their licenses renewed for an additional 20 years, and some for another 20 beyond that, the total number of reactors that will receive a lifetime extension to operate a full 80 years from deployment is uncertain.

Other countries also rely on nuclear energy. In France, for example, nuclear energy provides 70 percent of the country's power supply. France and other countries must likewise weigh extending plant lifetimes, retiring reactors, or building new, modern ones. However, the U.S. faces the potential retirement of many reactors within a short period, which could have a far stronger impact than the staggered closures other countries may experience.

“Our existing nuclear power plants are aging, and with their current 60-year lifetimes, nearly all of them will be gone by 2050. It's ironic. We have a net zero goal to reach by 2050, yet our single largest source of carbon-free electricity is at risk of closure,” said Kim.

 


Are Net-Zero Energy Buildings Really Coming Soon to Mass?


Massachusetts will soon see significant updates to the energy codes that govern the construction and alteration of buildings throughout the Commonwealth.

As required by the 2021 climate bill, the Massachusetts Department of Energy Resources (DOER) has recently finalized regulations updating the current Stretch Energy Code, previously promulgated by the state's Board of Building Regulations and Standards (BBRS), and establishing a new Specialized Code geared toward achieving net-zero building energy performance.

The final code has been submitted to the Joint Committee on Telecommunications, Utilities, and Energy for review, as required under state law.

Under the new regulations, each municipality must apply one of the following:

Base Energy Code - The current Base Energy Code is being updated by the BBRS as part of its routine updates to the full set of building codes. This base code is the default if a municipality has not opted in to an alternative energy code.

Stretch Code - The updated Stretch Code sets stricter energy-efficiency requirements for almost all new construction and alterations in municipalities that adopted the previous Stretch Code. The updated Stretch Code will automatically become the applicable code in any municipality that previously opted in to the Stretch Code.

Specialized Code - The newly created Specialized Code includes additional requirements above and beyond the Stretch Code, designed to ensure that new construction is consistent with a net-zero economy by 2050. Municipalities must opt in to the Specialized Code by vote of City Council or Town Meeting.

The new codes are much too detailed to summarize in a blog post, but it is worth noting a few significant policy implications of the new regulations:

With roughly 90% of Massachusetts municipalities having already adopted the prior version of the Stretch Code, the Commonwealth will effectively soon have a new base code that, even if it does not mandate zero-energy buildings, is nonetheless very aggressive in pushing new construction to be as energy-efficient as possible.

Although some concerns have been raised about the cost of compliance, particularly in a period of high inflation, our understanding is that many developers have indicated they can work with the new regulations without significant adverse impacts.

Of course, the success of the new codes depends on the success of the Commonwealth's efforts to transition quickly to a zero-carbon electrical grid. If the cost of doing so is higher than expected, there could well be public resistance. And if new transmission doesn't get built out sufficiently quickly, or other problems leave the power unavailable to electrify all new construction, that would be a much more significant problem - for many reasons!

In short, the new regulations unquestionably set the Commonwealth on a course to electrify new construction and squeeze carbon emissions out of new buildings. However, as with the rest of our climate goals, there are a lot of moving pieces that are going to have to come together to make the zero-carbon economy a reality.

 


Longer, more frequent outages afflict the U.S. power grid as states fail to prepare for climate change


Every time a storm lashes the Carolina coast, the power lines on Tonye Gray’s street go down, cutting her lights and air conditioning. After Hurricane Florence in 2018, Gray went three days with no way to refrigerate medicine for her multiple sclerosis or pump the floodwater out of her basement.

“Florence was hell,” said Gray, 61, a marketing account manager and Wilmington native who finds herself increasingly frustrated by the city’s vulnerability.

“We’ve had storms long enough in Wilmington and this particular area that all power lines should have been underground by now. We know we’re going to get hit.”

Across the nation, severe weather fueled by climate change is pushing aging electrical systems past their limits, often with deadly results. Last year, the average American home endured more than eight hours without power, according to the U.S. Energy Information Administration, more than double the outage time of five years ago.

This year alone, a wave of abnormally severe winter storms caused a disastrous power failure in Texas, leaving millions of homes in the dark, sometimes for days, and at least 200 dead. Power outages caused by Hurricane Ida contributed to at least 14 deaths in Louisiana, as some of the poorest parts of the state suffered through weeks of 90-degree heat without air conditioning.

As storms grow fiercer and more frequent, environmental groups are pushing states to completely reimagine the electrical grid, incorporating more grid-scale batteries, renewable energy sources and localized systems known as “microgrids,” which they say could reduce the incidence of wide-scale outages. Utility companies have proposed their own storm-proofing measures, including burying power lines underground.

But state regulators largely have rejected these ideas, citing pressure to keep energy rates affordable. Of $15.7 billion in grid improvements under consideration last year, regulators approved only $3.4 billion, about one-fifth, according to a national survey by the NC Clean Energy Technology Center.

After a weather disaster, “everybody’s standing around saying, ‘Why didn’t you spend more to keep the lights on?’ ” Ted Thomas, chairman of the Arkansas Public Service Commission, said in an interview with The Washington Post. “But when you try to spend more when the system is working, it’s a tough sell.”

A major impediment is the failure by state regulators and the utility industry to consider the consequences of a more volatile climate — and to come up with better tools to prepare for it. For example, a Berkeley Lab study last year of outages caused by major weather events in six states found that neither state officials nor utility executives attempted to calculate the social and economic costs of longer and more frequent outages, such as food spoilage, business closures, supply chain disruptions and medical problems.

“There is no question that climatic changes are happening that directly affect the operation of the power grid,” said Justin Gundlach, a senior attorney at the Institute for Policy Integrity, a think tank at New York University Law School. “What you still haven’t seen … is a [state] commission saying: 'Isn’t climate the through line in all of this? Let’s examine it in an open-ended way. Let’s figure out where the information takes us and make some decisions.’ ”

In interviews, several state commissioners acknowledged that failure.

“Our electric grid was not built to handle the storms that are coming this next century,” said Tremaine L. Phillips, a commissioner on the Michigan Public Service Commission, which in August held an emergency meeting to discuss the problem of power outages. “We need to come up with a broader set of metrics in order to better understand the success of future improvements.”

Five disasters in four years
The need is especially urgent in North Carolina, where a hurricane or tropical storm has triggered a federal disaster declaration five times in the past four years. Among them was Hurricane Florence, which brought torrential rain, catastrophic flooding and the state's worst outage in over a decade in September 2018.

More than 1 million residents were left disconnected from refrigerators, air conditioners, ventilators and other essential machines, some for up to two weeks. Elderly residents dependent on oxygen were evacuated from nursing homes. Relief teams flew medical supplies to hospitals cut off by flooded roads. Desperate people facing closed stores and rotting food looted a Wilmington Family Dollar.

“I have PTSD from Hurricane Florence, not because of the actual storm but the aftermath,” said Evelyn Bryant, a community organizer who took part in the Wilmington response.

The storm reignited debate over a $13 billion proposal by Duke Energy, one of the largest power companies in the nation, to reinforce the state’s power grid. A few months earlier, the state had rejected Duke’s request for full repayment of those costs, determining that protecting the grid against weather is a normal part of doing business and not eligible for the type of reimbursement the company had sought.

After Florence, Duke offered a smaller, $2.5 billion plan, along with the argument that severe weather events are one of seven “megatrends” (including cyberthreats and population growth) that require greater investment, according to a PowerPoint presentation included in testimony to the state. The company owns the two largest utilities in North Carolina, Duke Energy Carolinas and Duke Energy Progress.

Vote Solar, a nonprofit climate advocacy group, objected to Duke’s plan, saying the utility had failed to study the risks of climate impacts. Duke’s flood maps, for example, had not been updated to reflect the latest projections for sea level rise, the group said. In testimony, Vote Solar claimed Duke was using environmental trends to justify investments “it had already decided to pursue.”

The United States is one of the few countries where regulated utilities are usually guaranteed a rate of return on capital investments, even as studies show the U.S. experiences more blackouts than much of the developed world. That business model incentivizes spending regardless of how well it solves customers’ problems, and it has inspired skepticism. Ric O’Connell, executive director of GridLab, a nonprofit group that assists state and regional policymakers on electrical grid issues, said utilities in many states “are waving their hands and saying hurricanes” to justify spending that would do little to improve climate resilience.


Duke Energy spokesman Jeff Brooks acknowledged that the company had not conducted a climate risk study but pointed out that this type of analysis is still relatively new for the industry. He said Duke’s grid improvement plan “inherently was designed to think about future needs,” including reinforced substations with walls that rise several feet above the previous high-water mark for flooding, and relied in part on federal flood maps to determine which stations are most at risk.

Brooks said Duke is not using weather events to justify routine projects, noting that the company had spent more than a year meeting with community stakeholders and using their feedback to make significant changes to its grid improvement plan.

This year, the North Carolina Utilities Commission finally approved a set of grid improvements that will cost customers $1.2 billion. But the commission reserved the right to deny Duke reimbursement of those costs if it cannot prove they are prudent and reasonable. The commission’s general counsel, Sam Watson, declined to discuss the decision, saying the commission can comment on specific cases only in public orders.

The utility is now burying power lines in “several neighborhoods across the state” that are most vulnerable to wide-scale outages, Brooks said. It is also fitting aboveground power lines with “self-healing” technology, a network of sensors that diverts electricity away from equipment failures to minimize the number of customers affected by an outage.

As part of a settlement with Vote Solar, Duke Energy last year agreed to work with state officials and local leaders to further evaluate the potential impacts of climate change, a process that Brooks said is expected to take two to three years.

High costs create hurdles
The debate in North Carolina is being echoed in states across the nation, where burying power lines has emerged as one of the most common proposals for insulating the grid from high winds, fires and flooding. But opponents have balked at the cost, which can run into the millions of dollars per mile.

In California, for example, Pacific Gas & Electric wants to bury 10,000 miles of power lines, both to make the grid more resilient and to reduce the risk of sparking wildfires. Its power equipment has contributed to multiple deadly wildfires in the past decade, including the 2018 Camp Fire that killed at least 85 people.

PG&E’s proposal has drawn scorn from critics, including San Jose Mayor Sam Liccardo, who say it would be too slow and expensive. But Patricia Poppe, the company’s CEO, told reporters that doing nothing would cost California even more in lost lives and property while struggling to keep the lights on during wildfires. The plan has yet to be submitted to the state, but Terrie Prosper, a spokeswoman for the California Public Utilities Commission, said the commission has supported underground lines as a wildfire mitigation strategy.

Another oft-floated solution is microgrids, small electrical systems that provide power to a single neighborhood, university or medical center. Most of the time, they are connected to a larger utility system. But in the event of an outage, microgrids can operate on their own, with the aid of solar energy stored in batteries.

In Florida, regulators recently approved a four-year microgrid pilot project, but the technology remains expensive and unproven. In Maryland, regulators in 2016 rejected a plan to spend about $16 million for two microgrids in Baltimore, in part because the local utility made no attempt to quantify “the tangible benefits to its customer base.”


In Texas, where officials have largely abandoned state regulation in favor of the free market, the results have been no more encouraging. Without the requirements that exist elsewhere for building extra capacity for times of high demand or stress, the state was ill-equipped to handle an abnormal deep freeze in February that knocked out power to 4 million customers for days.

Since then, Berkshire Hathaway Energy and Starwood Energy Group have each proposed spending $8 billion to build new power plants to provide backup capacity, with guaranteed returns on the investment of 9 percent, but the Texas Legislature has not acted on either plan.

New York is one of the few states where regulators have assessed the risks of climate change and pushed utilities to invest in solutions. After 800,000 New Yorkers lost power for 10 days in 2012 in the wake of Hurricane Sandy, state regulators ordered utility giant Con Edison to evaluate the state’s vulnerability to weather events.

The resulting report, which estimated climate risks could cost the company as much as $5.2 billion by 2050, gave Con Edison data to inform its investments in storm hardening measures, including new storm walls and submersible equipment in areas at risk of flooding.

Meanwhile, the New York Public Service Commission has aggressively enforced requirements that utility companies keep the lights on during big storms, fining utility providers nearly $190 million for violations including inadequate staffing during Tropical Storm Isaias in 2020.

“At the end of the day, we do not want New Yorkers to be at the mercy of outdated infrastructure,” said Rory M. Christian, who last month was appointed chair of the New York commission.

The price of inaction
In North Carolina, as Duke Energy slowly works to harden the grid, some are pursuing other means of fostering climate-resilient communities.

Beth Schrader, the recovery and resilience director for New Hanover County, which includes Wilmington, said some of the people who went the longest without power after Florence had no vehicles, no access to nearby grocery stores and no means of getting to relief centers set up around the city.

For example, Quanesha Mullins, a 37-year-old mother of three, went eight days without power in her housing project on Wilmington’s east side. Her family got by on food from the Red Cross and walked a mile to charge their phones at McDonald’s. With no air conditioning, they slept with the windows open in a neighborhood with a history of violent crime.

Schrader is working with researchers at the University of North Carolina at Charlotte to estimate the cost of helping people like Mullins. The researchers estimate that it would have cost about $572,000 to provide shelter, meals and emergency food stamp benefits to 100 families for two weeks, said Robert Cox, an engineering professor who researches power systems at UNC Charlotte.

Such calculations could help spur local governments to do more to help vulnerable communities, for example by providing “resilience outposts” with backup power generators, heating or cooling rooms, Internet access and other resources, Schrader said. But they also are intended to show the costs of failing to shore up the grid.

“The regulators need to be moved along,” Cox said.

In the meantime, Tonye Gray finds herself worrying about what happens when the next storm hits. While Duke Energy says it is burying power lines in the most outage-prone areas, she has yet to see its yellow-vested crews turn up in her neighborhood.

“We feel,” she said, “that we’re at the end of the line.”
