Carbon capture Canada's best hope to meet Kyoto targets

By Vancouver Sun


The basic principle - when you make a mess you clean it up - applies to countries as well as children. And that's the idea behind carbon capture and storage.

A just-released federal-Alberta committee report says urgent action is required to ensure that Canada starts mopping up after itself on the carbon emissions front.

It notes that carbon capture, or sequestration, could become Canada's ticket to meeting its greenhouse gas reduction targets under the Kyoto Protocol. Specifically, by 2050, carbon capture could address about 40 per cent of Canada's projected emissions.

Realistically, this may be our best hope of meeting targets that will force emissions reductions of between 60 and 70 per cent by mid-century. Canada has promised to reduce emissions by 20 per cent from current levels by 2020.

Put those reduction promises against the fact that our emissions are up by more than 25 per cent since 1990, and the challenge facing Canadian politicians is apparent.
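To see how daunting that arithmetic is, index 1990 emissions at 100 and apply the percentages quoted above. This is a minimal sketch of our own, not part of the editorial; it assumes the 60-to-70-per-cent reductions are measured against current levels, which the piece does not specify.

    # Back-of-envelope check using only the figures quoted above.
    # Assumption (not stated in the editorial): the 60-70% cuts are
    # measured from current levels. Index 1990 emissions at 100.
    emissions_1990 = 100.0
    emissions_now = emissions_1990 * 1.25          # up 25%+ since 1990
    target_2020 = emissions_now * (1 - 0.20)       # promised 20% cut by 2020
    target_2050 = (emissions_now * (1 - 0.70),     # 60-70% cut by mid-century
                   emissions_now * (1 - 0.60))

    print(f"2020 target: {target_2020:.0f} vs 1990 level of {emissions_1990:.0f}")
    print(f"2050 target range: {target_2050[0]:.1f} to {target_2050[1]:.0f}")

On those assumptions, even full delivery on the 2020 promise would merely return emissions to roughly their 1990 level, still far above the deeper cuts required by mid-century.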

There's no getting away from the fact that this is a country rich in oil, natural gas and coal resources, all of which spew the stuff that's creating climate havoc around the world.

The oilsands north of Edmonton in particular are problematic in that they've turned Alberta into the pollution champion of Canada. That province alone accounts for a third of the country's greenhouse gas emissions.

The report, nearly a year in the making, asserts that creating carbon capture and storage technology is as vital a national infrastructure project for Canada as was the national railway, and as worthy of funding from taxpayers as the Hibernia oil field off Newfoundland, electricity transmission grids, and natural gas and oil pipelines.

It proposes three to five carbon-capture projects by 2015, at a projected cost of $2 billion.

Carbon capture, endorsed by the Intergovernmental Panel on Climate Change, is already in use on a limited basis around the globe.

It works by way of trapping carbon dioxide at the point of pollution, compressing it into pipelines and shipping it to disposal sites where it's injected into underground caverns.

The 55-page report, titled Canada's Fossil Energy Future, asserts carbon capture is "a natural fit for Canada," endowed with underground stable sedimentary rock formations ideal for carbon dioxide storage.

It's "an opportunity for the country and its industrial sectors to become world leaders." Sounds good. But of course there's what the report terms "a financial gap."

It goes on to argue that, as a national building initiative, carbon capture technology is the responsibility of all Canadians.

That viewpoint is not surprising since the committee itself was jointly sponsored by the federal and Alberta governments.

But an excellent case can be made that Alberta should pick up the lion's share of the tab to create this tidbit of technology.

After all, Wild Rose Country is in danger of growing out of sync economically with other provinces, developing a fatcat reputation as it continues to be the prime beneficiary of Canada's oil industry as well as the largest contributor among provinces to the greenhouse gas emissions problem.

Alberta is the only jurisdiction in Canada to be debt free and running huge surpluses.

Other provinces, appropriately, don't expect Alberta to pass along any of its oil wealth beyond contributing its share of equalization money.

Carbon capture presents an opportunity for Alberta to create some goodwill for itself across the country by providing funding for an innovation that's badly needed in its own backyard.

Edmonton has been lagging on the goodwill front. Last week, Premier Ed Stelmach up and left a provincial meeting of premiers aimed at discussing greenhouse gas emissions.

At the meeting, four of his fellow premiers pledged to develop a cap-and-trade carbon market scheme to help Canada meet its greenhouse gas reduction targets under Kyoto.

Interestingly, environmental groups are turning thumbs down on national funding for carbon capture. They fear the new technology would provide an outlet that will do nothing to encourage energy conservation.

But there's another issue. As John Bennett, director of climateforchange.ca, has stated, the highly profitable industries doing the polluting should take their share of the responsibility.

"The concept of polluter pay is apparently too complicated for the oil industry."

Related News

Trump's Oil Policies Spark Shift in Wall Street's Energy Strategy

Wall Street's fossil fuel pivot signals banks are reassessing ESG, net-zero, and decarbonization goals, reviving oil, gas, and coal financing while recalibrating clean energy exposure amid policy shifts, regulatory rollbacks, and investment risk realignment.

 

Key Points

A shift as major U.S. banks ease ESG limits to fund oil, gas, coal while rebalancing alongside renewables.

✅ Banks revisit lending to oil, gas, and coal after policy shifts.

✅ ESG and net-zero commitments face reassessment amid returns.

✅ Renewables compete for capital as risk models are updated.

 

The global energy finance sector, worth a staggering $1.4 trillion, is undergoing a significant transformation, largely due to former President Donald Trump's renewed support for the oil, gas, and coal industries. Wall Street, which had previously aligned itself with global climate initiatives, the energy transition, and net-zero goals, is now reassessing its strategy and pivoting toward a more fossil-fuel-friendly stance.

This represents a major change from just a few years ago, when many of the largest U.S. banks and financial institutions took a firm stance on decarbonization, including limiting their exposure to fossil-fuel projects. These institutions were vocal supporters of the global push for a sustainable future, with many committing to support clean energy solutions and abandon investments in high-carbon energy sources.

However, with the change in administration and the resurgence of support for traditional energy sectors under Trump's policies, these same banks are now rethinking their strategies. Financial institutions are increasingly discussing the possibility of lifting long-standing restrictions that limited their investments in controversial fossil-fuel projects, including coal mining and offshore drilling. The change reflects a broader realignment within the energy finance sector, with Wall Street reexamining its role in shaping the future of energy.

One of the most significant developments is the reversal of the Biden administration's policies, which had emphasized reducing the U.S. carbon footprint in favor of carbon-free electricity strategies. Under Trump, however, there has been a renewed focus on supporting the traditional energy sectors. His administration has pushed to reduce regulatory burdens on fossil-fuel companies, particularly oil and gas, while reintroducing favorable tax incentives for the coal and gas industries. This is a stark contrast to the Biden administration's efforts to incentivize the transition toward renewable energy and zero-emissions goals.

Trump's policies have, in effect, sent a strong signal to financial markets that the fossil-fuel industry could see a resurgence. U.S. banks, which had previously distanced themselves from financing oil and gas ventures due to the pressure from environmental activists and ESG (Environmental, Social, and Governance) investors, as seen in investor pressure on Duke Energy, are now reconsidering their positions. Major players like JPMorgan Chase and Goldman Sachs are reportedly having internal discussions about revisiting financing for energy projects that involve high carbon emissions, including controversial oil extraction and gas drilling initiatives.

The implications of this shift are far-reaching. In the past, a growing number of institutional investors had embraced ESG principles, with the goal of supporting the transition to renewable energy sources. However, Trump’s pro-fossil fuel stance appears to be emboldening Wall Street’s biggest players to rethink their commitment to green investing. Some are now advocating for a “balanced approach” that would allow for continued investment in traditional energy sectors, while also acknowledging the growing importance of renewable energy investments, a trend echoed by European oil majors going electric in recent years.

This reversal has led to confusion among investors and analysts, who are now grappling with how to navigate a rapidly changing landscape. Wall Street's newfound support for the fossil-fuel industry comes amid a backdrop of global concerns about climate change. Many investors, who had previously embraced policies aimed at curbing the effects of global warming, are now finding it harder to reconcile their environmental commitments with the shift toward fossil-fuel-heavy portfolios. The reemergence of fossil-fuel-friendly policies is forcing institutional investors to rethink their long-term strategies.

The consequences of this policy shift are also being felt by renewable energy companies, which now face increased competition for investment dollars from traditional energy sectors. The shift towards oil and gas projects has made it more challenging for renewable energy companies to attract the same level of financial backing, even as demand for clean energy continues to rise and as doubling electricity investment becomes a key policy call. This could result in a deceleration of renewable energy projects, potentially delaying the progress needed to meet the world’s climate targets.

Despite this, some analysts remain optimistic that the long-term shift toward green energy is inevitable, even if fossil-fuel investments gain a temporary boost. As the world continues to grapple with the effects of climate change, and as technological advancements in clean energy continue to reduce costs, the transition to renewables is likely to persist, regardless of the political climate.

The shift in Wall Street’s approach to energy investments, spurred by Trump’s pro-fossil fuel policies, is reshaping the $1.4 trillion global energy finance market. While the pivot towards fossil fuels may offer short-term gains, the long-term trajectory for energy markets remains firmly in the direction of renewables. The next few years will be crucial in determining whether financial institutions can balance the demand for short-term profitability with their long-term environmental responsibilities.

 

Related News

Cleaning up Canada's electricity is critical to meeting climate pledges

Canada's proposed Clean Electricity Standard targets a net-zero grid by 2035, using carbon pricing, CO2 caps, and carbon capture while expanding renewables and interprovincial trade to decarbonize power in Alberta, Saskatchewan, and Ontario.

 

Key Points

A federal plan to reach a net-zero grid by 2035 using CO2 caps, carbon pricing, carbon capture, renewables, and trade.

✅ CO2 caps and rising carbon prices through 2050

✅ Carbon capture required on gas plants in high-emitting provinces

✅ Renewables build-out and interprovincial trade to balance supply

 

A new tool has been proposed in the federal election campaign as a way of eliminating carbon emissions from Canada's patchwork electricity system.

As the country’s need for power grows through the decarbonization of transportation, industry and space heating, the Liberal Party climate plan is proposing a clean energy standard to help Canada achieve a 100% net-zero-electricity system by 2035, aligning with Canada’s net-zero by 2050 target overall. 

The proposal echoes a report released August 19 by the David Suzuki Foundation and a group of environmental NGOs that also calls for a clean electricity standard, caps on power-sector emissions, and tighter carbon-pricing regulations. The report, written by Simon Fraser University climate economist Mark Jaccard and data analyst Brad Griffin, asserts that these policies would effectively decarbonize Canada's electricity system by 2035.

“Fuel switching from dirty fossil fuels to clean electricity is an essential part of any serious pathway to transition to a net-zero energy system by 2050,” writes Tom Green, climate policy advisor to the Suzuki Foundation, in a foreword to the report. The pathway to a net-zero grid is even more important as Canada switches from fossil fuels to electric vehicles, space heating and industrial processes, even as the Canadian Gas Association warns of high transition costs.

Under Jaccard and Griffin’s proposal, a clean electricity standard would be established to regulate CO2 emissions specifically from power plants across Canada. In addition, the plan includes an increase in the carbon price imposed on electricity system releases, combined with tighter regulation to ensure that 100% of the carbon price set by the federal government is charged to electricity producers. The authors propose that the current scheduled carbon price of $170 per tonne of CO2 in 2030 should rise to at least $300 per tonne by 2050.
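For scale, that escalation implies a compound annual increase of just under 3 per cent. A quick calculation (ours, not the report's):

    # Implied annual escalation of the proposed carbon price:
    # $170/tonne in 2030 rising to at least $300/tonne by 2050.
    price_2030, price_2050, years = 170.0, 300.0, 20
    growth = (price_2050 / price_2030) ** (1 / years) - 1
    print(f"Implied escalation: {growth:.1%} per year")   # ~2.9% per year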

In Alberta, Saskatchewan, Ontario, New Brunswick and Nova Scotia, the 2030 standard would mean that all fossil-fuel-powered electricity plants would require carbon capture in order to comply with the standard. The provinces would be given until 2035 to drop to zero grams CO2 per kilowatt hour, matching the 2030 standard for low-carbon provinces (Quebec, British Columbia, Manitoba, Newfoundland and Labrador and Prince Edward Island). 

Alberta and Saskatchewan targeted 
Canada has a relatively clean electricity system, with about 80% of the country's power generated from low- or zero-emission sources. So the biggest impacts of the proposal will be felt in the higher-carbon provinces of Alberta and Saskatchewan. Alberta has a plan to switch from coal-based electric power to natural gas generation by 2023. But Saskatchewan is still working on its plan. Under the Jaccard-Griffin proposal, these provinces would need to install carbon capture on their gas-fired plants by 2030 and carbon-negative technology (biomass with carbon capture, for instance) by 2035. Saskatchewan has been operating carbon capture and storage technology at its Boundary Dam power station since 2014, but large-scale rollout at power plants has not yet been achieved in Canada.

With its heavy reliance on nuclear and hydro generation, Ontario’s electricity supply is already low carbon. Natural gas now accounts for about 7% of the province’s grid, but the clean electricity standard could pose a big challenge for the province as it ramps up natural-gas-generated power to replace electricity from its aging Pickering station, scheduled to go out of service in 2025, even as a fully renewable grid by 2030 remains a debated goal. Pickering currently supplies about 14% of Ontario’s power. 

Ontario doesn’t have large geological basins for underground CO2 storage, as Alberta and Saskatchewan do, so the report says Ontario will have to build up its solar and wind generation significantly as part of Canada’s renewable energy race, or find a solution to capture CO2 from its gas plants. The Ontario Clean Air Alliance has kicked off a campaign to encourage the Ontario government to phase out gas-fired generation by purchasing power from Quebec or installing new solar or wind power.

As the report points out, the federal government has Supreme Court–sanctioned authority to impose carbon regulations, such as a clean electricity standard, and carbon pricing on the provinces, with significant policy implications for electricity grids nationwide.

The federal government can also mandate a national approach to CO2 reduction regardless of fuel source, encouraging higher-carbon provinces to work with their lower-carbon neighbours. The Atlantic provinces would be encouraged to buy power from hydro-heavy Newfoundland, for example, while Ontario would be encouraged to buy power from Quebec, Saskatchewan from Manitoba, and Alberta from British Columbia.

The Canadian Electricity Association, the umbrella organization for Canada’s power sector, did not respond to a request for comment on the Jaccard-Griffin report or the Liberal net-zero grid proposal.

Just how much more clean power will Canada need? 
The proposal has also kicked off a debate about exactly how much additional electricity Canada will need in coming decades, with an IEA report underscoring rising demand.

In his 2015 report, Pathways to Deep Decarbonization in Canada, energy and climate analyst Chris Bataille estimated that to reach net-zero by 2050, the country will need to double its electricity use by that year.

Jaccard and Griffin agree with this estimate, saying that Canada will need more than 1,200 terawatt hours of electricity per year in 2050, up from about 640 terawatt hours currently.

But energy and climate consultant Ralph Torrie (also director of research at Corporate Knights) disputes this analysis.

He says large-scale programs to make the economy more energy efficient could substantially reduce electricity demand. A major program to install heat pumps and replace inefficient electric heating in homes and businesses could save 50 terawatt hours of consumption on its own, according to a recent report from Torrie and colleague Brendan Haley. 

Put in context, 50 terawatt hours would require generation from 7,500 large wind turbines. Applied to electric vehicle charging, 50 terawatt hours could power 10 million electric vehicles.
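Those equivalences are straightforward to sanity-check. The sketch below roughly reproduces them under assumptions of our own (a 3 MW turbine at a 25 per cent capacity factor, and an EV drawing about 0.3 kWh per mile; neither figure appears in the article):

    # Sanity-checking the 50 TWh equivalences quoted above.
    # Assumed (not from the article): 3 MW turbines at a 25%
    # capacity factor; EVs consuming about 0.3 kWh per mile.
    savings_twh = 50.0
    gwh_per_turbine = 3e-3 * 8760 * 0.25             # ~6.6 GWh/year per turbine
    turbines = savings_twh * 1000 / gwh_per_turbine
    print(f"Turbines needed: {turbines:,.0f}")       # ~7,600, near the 7,500 cited

    kwh_per_ev = savings_twh * 1e9 / 10_000_000      # annual kWh budget per EV
    print(f"Per EV: {kwh_per_ev:,.0f} kWh/yr, ~{kwh_per_ev / 0.3:,.0f} miles")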

While Torrie doesn’t dispute the need to bring the power system to net-zero, he also doesn’t believe the “arm-waving argument that the demand for electricity is necessarily going to double because of the electrification associated with decarbonization.” 

 

Related News

Longer, more frequent outages afflict the U.S. power grid as states fail to prepare for climate change

Power grid climate resilience demands storm hardening, underground power lines, microgrids, batteries, and renewable energy as regulators and utilities confront climate change, sea-level rise, and extreme weather to reduce outages and protect vulnerable communities.

 

Key Points

It is the grid capacity to resist and recover from climate hazards using buried lines, microgrids, and batteries.

✅ Underground lines reduce wind outages and wildfire ignition risk.

✅ Microgrids with solar and batteries sustain critical services.

✅ Regulators balance cost, resilience, equity, and reliability.

 

Every time a storm lashes the Carolina coast, the power lines on Tonye Gray’s street go down, cutting her lights and air conditioning. After Hurricane Florence in 2018, Gray went three days with no way to refrigerate medicine for her multiple sclerosis or pump the floodwater out of her basement.

“Florence was hell,” said Gray, 61, a marketing account manager and Wilmington native who finds herself increasingly frustrated by the city’s vulnerability.

“We’ve had storms long enough in Wilmington and this particular area that all power lines should have been underground by now. We know we’re going to get hit.”

Across the nation, severe weather fueled by climate change is pushing aging electrical systems past their limits, often with deadly results. Last year, amid increasing nationwide blackouts, the average American home endured more than eight hours without power, according to the U.S. Energy Information Administration — more than double the outage time five years ago.
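That eight-hour figure is essentially the industry's SAIDI reliability index: total customer-minutes of interruption divided by total customers served. A minimal sketch of the calculation, using hypothetical outage records rather than EIA data:

    # SAIDI (System Average Interruption Duration Index):
    # sum of customer-minutes interrupted / customers served.
    # The outage records below are hypothetical, for illustration.
    outages = [
        {"customers": 120_000, "minutes": 95},    # storm-damaged feeder
        {"customers": 4_500,   "minutes": 310},   # substation failure
        {"customers": 60_000,  "minutes": 45},    # vegetation contact
    ]
    customers_served = 1_500_000
    saidi = sum(o["customers"] * o["minutes"] for o in outages) / customers_served
    print(f"SAIDI: {saidi:.1f} minutes per customer")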

This year alone, a wave of abnormally severe winter storms caused a disastrous power failure in Texas, leaving millions of homes in the dark, sometimes for days, and at least 200 dead. Power outages caused by Hurricane Ida contributed to at least 14 deaths in Louisiana, as some of the poorest parts of the state suffered through weeks of 90-degree heat without air conditioning.

As storms grow fiercer and more frequent, environmental groups are pushing states to completely reimagine the electrical grid, incorporating more grid-scale batteries, renewable energy sources and localized systems known as “microgrids,” which they say could reduce the incidence of wide-scale outages. Utility companies have proposed their own storm-proofing measures, including burying power lines underground.

But state regulators largely have rejected these ideas, citing pressure to keep energy rates affordable. Of $15.7 billion in grid improvements under consideration last year, regulators approved only $3.4 billion, according to a national survey by the NC Clean Energy Technology Center — about one-fifth, highlighting persistent vulnerabilities in the grid nationwide.

After a weather disaster, “everybody’s standing around saying, ‘Why didn’t you spend more to keep the lights on?’ ” Ted Thomas, chairman of the Arkansas Public Service Commission, said in an interview with The Washington Post. “But when you try to spend more when the system is working, it’s a tough sell.”

A major impediment is the failure by state regulators and the utility industry to consider the consequences of a more volatile climate — and to come up with better tools to prepare for it. For example, a Berkeley Lab study last year of outages caused by major weather events in six states found that neither state officials nor utility executives attempted to calculate the social and economic costs of longer and more frequent outages, such as food spoilage, business closures, supply chain disruptions and medical problems.

“There is no question that climatic changes are happening that directly affect the operation of the power grid,” said Justin Gundlach, a senior attorney at the Institute for Policy Integrity, a think tank at New York University Law School. “What you still haven’t seen … is a [state] commission saying: 'Isn’t climate the through line in all of this? Let’s examine it in an open-ended way. Let’s figure out where the information takes us and make some decisions.’ ”

In interviews, several state commissioners acknowledged that failure.

“Our electric grid was not built to handle the storms that are coming this next century,” said Tremaine L. Phillips, a commissioner on the Michigan Public Service Commission, which in August held an emergency meeting to discuss the problem of power outages. “We need to come up with a broader set of metrics in order to better understand the success of future improvements.”

Five disasters in four years
The need is especially urgent in North Carolina, where experts warn Atlantic grids and coastlines need a rethink: the state has been under a federal disaster declaration from a hurricane or tropical storm five times in the past four years. Among them was Hurricane Florence, which brought torrential rain, catastrophic flooding and the state's worst outage in over a decade in September 2018.

More than 1 million residents were left disconnected from refrigerators, air conditioners, ventilators and other essential machines, some for up to two weeks. Elderly residents dependent on oxygen were evacuated from nursing homes. Relief teams flew medical supplies to hospitals cut off by flooded roads. Desperate people facing closed stores and rotting food looted a Wilmington Family Dollar.

“I have PTSD from Hurricane Florence, not because of the actual storm but the aftermath,” said Evelyn Bryant, a community organizer who took part in the Wilmington response.

The storm reignited debate over a $13 billion proposal by Duke Energy, one of the largest power companies in the nation, to reinforce the state’s power grid. A few months earlier, the state had rejected Duke’s request for full repayment of those costs, determining that protecting the grid against weather is a normal part of doing business and not eligible for the type of reimbursement the company had sought.

After Florence, Duke offered a smaller, $2.5 billion plan, along with the argument that severe weather events are one of seven “megatrends” (including cyberthreats and population growth) that require greater investment, according to a PowerPoint presentation included in testimony to the state. The company owns the two largest utilities in North Carolina, Duke Energy Carolinas and Duke Energy Progress.

Vote Solar, a nonprofit climate advocacy group, objected to Duke’s plan, saying the utility had failed to study the risks of climate impacts. Duke’s flood maps, for example, had not been updated to reflect the latest projections for sea level rise, they said. In testimony, Vote Solar claimed Duke was using environmental trends to justify investments “it had already decided to pursue.”

The United States is one of the few countries where regulated utilities are usually guaranteed a rate of return on capital investments, even as studies show the U.S. experiences more blackouts than much of the developed world. That business model incentivizes spending regardless of how well it solves problems for customers, and it inspires skepticism. Ric O'Connell, executive director of GridLab, a nonprofit group that assists state and regional policymakers on electrical grid issues, said utilities in many states "are waving their hands and saying hurricanes" to justify spending that would do little to improve climate resilience.

Duke Energy spokesman Jeff Brooks acknowledged that the company had not conducted a climate risk study but pointed out that this type of analysis is still relatively new for the industry. He said Duke’s grid improvement plan “inherently was designed to think about future needs,” including reinforced substations with walls that rise several feet above the previous high watermark for flooding, and partly relied on federal flood maps to determine which stations are at most risk.

Brooks said Duke is not using weather events to justify routine projects, noting that the company had spent more than a year meeting with community stakeholders and using their feedback to make significant changes to its grid improvement plan.

This year, the North Carolina Utilities Commission finally approved a set of grid improvements that will cost customers $1.2 billion. But the commission reserved the right to deny Duke reimbursement of those costs if it cannot prove they are prudent and reasonable. The commission’s general counsel, Sam Watson, declined to discuss the decision, saying the commission can comment on specific cases only in public orders.

The utility is now burying power lines in “several neighborhoods across the state” that are most vulnerable to wide-scale outages, Brooks said. It is also fitting aboveground power lines with “self-healing” technology, a network of sensors that diverts electricity away from equipment failures to minimize the number of customers affected by an outage.

As part of a settlement with Vote Solar, Duke Energy last year agreed to work with state officials and local leaders to further evaluate the potential impacts of climate change, a process that Brooks said is expected to take two to three years.

High costs create hurdles
The debate in North Carolina is being echoed in states across the nation, where burying power lines has emerged as one of the most common proposals for insulating the grid from high winds, fires and flooding. But opponents have balked at the cost, which can run in the millions of dollars per mile.

In California, for example, Pacific Gas & Electric wants to bury 10,000 miles of power lines, both to make the grid more resilient and to reduce the risk of sparking wildfires. Its power equipment has contributed to multiple deadly wildfires in the past decade, including the 2018 Camp Fire that killed at least 85 people.

PG&E's proposal has drawn scorn from critics, including San Jose Mayor Sam Liccardo, who say it would be too slow and expensive. But Patricia Poppe, the company's CEO, told reporters that doing nothing would cost California even more in lost lives and property as the state struggles to keep the lights on during wildfires. The plan has yet to be submitted to the state, but Terrie Prosper, a spokeswoman for the California Public Utilities Commission, said the commission has supported underground lines as a wildfire mitigation strategy.

Another oft-floated solution is microgrids, small electrical systems that provide power to a single neighborhood, university or medical center. Most of the time, they are connected to a larger utility system. But in the event of an outage, microgrids can operate on their own, with the aid of solar energy stored in batteries.
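The control idea is simple even if the hardware is not: watch the utility connection, and island onto local solar and batteries when it fails. A toy dispatch loop (illustrative only; this is not any vendor's actual control logic) captures it:

    # Toy microgrid islanding logic (illustrative only).
    def dispatch(grid_up, solar_kw, load_kw, battery_kwh, step_h=1.0):
        if grid_up:
            return "grid-connected", battery_kwh      # utility serves the load
        shortfall_kw = max(load_kw - solar_kw, 0.0)   # load solar can't cover
        battery_kwh = max(battery_kwh - shortfall_kw * step_h, 0.0)
        return ("islanded" if battery_kwh > 0 else "blackout"), battery_kwh

    battery = 500.0                                   # kWh of storage
    hours = [(True, 0, 300), (False, 200, 300), (False, 0, 300)]
    for hour, (grid, sun, load) in enumerate(hours):
        mode, battery = dispatch(grid, sun, load, battery)
        print(f"hour {hour}: {mode}, battery at {battery:.0f} kWh")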

In Florida, regulators recently approved a four-year microgrid pilot project, but the technology remains expensive and unproven. In Maryland, regulators in 2016 rejected a plan to spend about $16 million for two microgrids in Baltimore, in part because the local utility made no attempt to quantify “the tangible benefits to its customer base.”

In Texas, where officials have largely abandoned state regulation in favor of the free market, the results have been no more encouraging. Without requirements, as exist elsewhere, for building extra capacity for times of high demand or stress, the state was ill-equipped to handle an abnormal deep freeze in February that knocked out power to 4 million customers for days.

Since then, Berkshire Hathaway Energy and Starwood Energy Group each proposed spending $8 billion to build new power plants to provide backup capacity, with guaranteed returns on the investment of 9 percent, but the Texas legislature has not acted on either plan.

New York is one of the few states where regulators have assessed the risks of climate change and pushed utilities to invest in solutions. After 800,000 New Yorkers lost power for 10 days in 2012 in the wake of Hurricane Sandy, state regulators ordered utility giant Con Edison to evaluate the state’s vulnerability to weather events.

The resulting report, which estimated climate risks could cost the company as much as $5.2 billion by 2050, gave ConEd data to inform its investments in storm hardening measures, including new storm walls and submersible equipment in areas at risk of flooding.

Meanwhile, the New York Public Service Commission has aggressively enforced requirements that utility companies keep the lights on during big storms, fining utility providers nearly $190 million for violations including inadequate staffing during Tropical Storm Isaias in 2020.

“At the end of the day, we do not want New Yorkers to be at the mercy of outdated infrastructure,” said Rory M. Christian, who last month was appointed chair of the New York commission.

The price of inaction
In North Carolina, as Duke Energy slowly works to harden the grid, some are pursuing other means of fostering climate-resilient communities.

Beth Schrader, the recovery and resilience director for New Hanover County, which includes Wilmington, said some of the people who went the longest without power after Florence had no vehicles, no access to nearby grocery stores and no means of getting to relief centers set up around the city.

For example, Quanesha Mullins, a 37-year-old mother of three, went eight days without power in her housing project on Wilmington’s east side. Her family got by on food from the Red Cross and walked a mile to charge their phones at McDonald’s. With no air conditioning, they slept with the windows open in a neighborhood with a history of violent crime.

Schrader is working with researchers at the University of North Carolina in Charlotte to estimate the cost of helping people like Mullins. The researchers estimate that it would have cost about $572,000 to provide shelter, meals and emergency food stamp benefits to 100 families for two weeks, said Robert Cox, an engineering professor who researches power systems at UNC-Charlotte.

Such calculations could help spur local governments to do more to help vulnerable communities, for example by providing “resilience outposts” with backup power generators, heating or cooling rooms, Internet access and other resources, Schrader said. But they also are intended to show the costs of failing to shore up the grid.

“The regulators need to be moved along,” Cox said.

In the meantime, Tonye Gray finds herself worrying about what happens when the next storm hits. While Duke Energy says it is burying power lines in the most outage-prone areas, she has yet to see its yellow-vested crews turn up in her neighborhood.

“We feel,” she said, “that we’re at the end of the line.”

 

Related News

Nord Stream: Norway and Denmark tighten energy infrastructure security after gas pipeline 'attack'

The suspected Nord Stream pipeline sabotage has triggered Baltic Sea gas leaks, prompting Norway and Denmark to tighten energy infrastructure security, offshore surveillance, and exclusion zones after drone sightings near platforms and explosions reported by experts.

 

Key Points

An alleged attack causing Baltic gas leaks and heightened energy security measures in Norway and Denmark.

✅ Norway boosts offshore and onshore site security

✅ Denmark enforces 5 nm exclusion zone near leaks

✅ Drones spotted; police probe sabotage and safety breaches

 

Norway and Denmark will increase security and surveillance around their energy infrastructure sites after the alleged sabotage of Russia's Nord Stream gas pipeline in the Baltic Sea, as the EU pursues a plan to phase out Russian energy to safeguard supplies.

Major leaks struck two underwater natural gas pipelines running from Russia to Germany, which has moved to a 200 billion-euro energy shield amid surging prices, with experts reporting that explosions rattled the Baltic Sea beforehand.

Norway -- an oil-rich nation and Europe's biggest supplier of gas -- will strengthen security at its land and offshore installations, even as it weighs curbing electricity exports to avoid shortages, the country's energy minister said.

The Scandinavian country's Petroleum Safety Authority also urged vigilance on Monday after unidentified drones were seen flying near Norway's offshore oil and gas platforms.

"The PSA has received a number of warnings/notifications from operator companies on the Norwegian Continental Shelf concerning the observation of unidentified drones/aircraft close to offshore facilities" the agency said in a statement.

"Cases where drones have infringed the safety zone around facilities are now being investigated by the Norwegian police."

Meanwhile Denmark will increase security across its energy sector after the Nord Stream incident, as wider market strains, including Germany's struggling local utilities, ripple across Europe, a spokesperson for gas transmission operator Energinet told Upstream.

The Danish Maritime Agency has also imposed an exclusion zone of five nautical miles around the leaks, warning ships that they could lose buoyancy and that there is a risk of the escaping gas igniting "above the water and in the air," even as Europe weighs emergency electricity measures to limit prices.

Denmark's defence minister said there was no cause for security concerns in the Baltic Sea region.

"Russia has a significant military presence in the Baltic Sea region and we expect them to continue their sabre-rattling," Morten Bodskov said in a statement.

Video taken by a Danish military plane on Tuesday afternoon showed the extent of one of the gas pipeline leaks, with the surface of the Baltic bubbling up as gas escaped, highlighting Europe's energy crisis for global audiences.

Meanwhile police in Sweden have opened a criminal investigation into "gross sabotage" of the Nord Stream 1 and Nord Stream 2 pipelines, and Sweden's crisis management unit was activated to monitor the situation. The unit brings together representatives from different government agencies. 

Swedish Foreign Minister Ann Linde had a call with her Danish counterpart Jeppe Kofod on Tuesday evening, and the pair also spoke with Norwegian Foreign Minister Anniken Huitfeldt on Wednesday, as the bloc debates gas price cap strategies to address the crisis, with Kofod saying there should be a "clear and unambiguous EU statement about the explosions in the Baltic Sea." 

"Focus now on uncovering exactly what has happened - and why. Any sabotage against European energy infrastructure will be met with a robust and coordinated response," said Kofod. 

 

Related News

Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do—including how targeted stimulation can improve short-term memory in older adults—to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health's massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for use in patients with, for example, Parkinson's disease or epilepsy help only a minority of patients, and expectations can outpace evidence. "If it was the Bible, it would be the first chapter of Genesis," said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person's brain data, and how to best involve patients in the study of the human brain's far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as "almost a renaissance period" for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the "torpedo fish," were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those "lesions" worked, somehow, but nobody could explain why they alleviated some patients' symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
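The open-loop/closed-loop distinction is easy to see in code. The sketch below is purely conceptual and reflects no real device's interface: open-loop stimulates on every sample, while closed-loop stimulates only when a monitored biomarker crosses a threshold.

    # Conceptual contrast of open-loop vs. closed-loop stimulation.
    # Purely illustrative; real devices use validated biomarkers and
    # safety-constrained stimulation parameters.
    def open_loop(biomarker):
        return [True for _ in biomarker]              # constant stimulation

    def closed_loop(biomarker, threshold=0.8):
        return [b > threshold for b in biomarker]     # stimulate on trigger

    trace = [0.2, 0.3, 0.9, 0.4, 0.85, 0.1]           # hypothetical biomarker trace
    print(sum(open_loop(trace)), "pulses open-loop")      # 6
    print(sum(closed_loop(trace)), "pulses closed-loop")  # 2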

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH's Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. "A critical mass" of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients' symptoms while failing to help many others, but psychopharmacology clearly showed "there's definitely a biology to this problem," Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out whether the “stop” and “go” circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.
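To make that “stop versus go” framing concrete, here is a deliberately toy simulation in Python. It is not drawn from Hanlon’s work; every signal, gain, and number in it is an invented assumption. It simply shows how strengthening the control signal or weakening the craving signal both raise the odds that “stop” wins the competition:

# Toy model of the "top-down vs. bottom-up" competition described above.
# Purely illustrative: the distributions and gains are invented, not taken
# from Hanlon's research.
import random

def abstain_rate(control_gain=1.0, craving_gain=1.0, trials=10_000):
    """Fraction of trials in which top-down control beats the craving."""
    wins = 0
    for _ in range(trials):
        control = control_gain * random.gauss(1.0, 0.5)  # executive control
        craving = craving_gain * random.gauss(1.2, 0.5)  # bottom-up urge
        if control > craving:
            wins += 1
    return wins / trials

baseline = abstain_rate()
stronger_control = abstain_rate(control_gain=1.3)  # boost top-down control
weaker_craving = abstain_rate(craving_gain=0.7)    # dampen bottom-up craving

print(f"abstain rate: baseline {baseline:.2f}, "
      f"stronger control {stronger_control:.2f}, "
      f"weaker craving {weaker_craving:.2f}")

In this toy framing, the two neuromodulation strategies Hanlon describes correspond to raising control_gain or lowering craving_gain; either change shifts the balance toward abstaining.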

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
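As an illustration of that analysis pattern only (the team’s actual pipeline is not described in this article), one could regress the 0-to-10 ratings on features extracted from the recordings and check, out of sample, whether brain activity predicts the subjective reports. A minimal sketch in Python, with the synthetic data and feature choice invented for illustration:

# Minimal sketch of decoding self-reported pain from neural recordings.
# All data here are synthetic stand-ins, not the researchers' pipeline.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical dataset: one row per pain report, columns are neural
# features (e.g., band power) recorded around the time of each report.
n_reports, n_features = 200, 16
X = rng.normal(size=(n_reports, n_features))
# Synthetic 0-to-10 ratings loosely tied to the first feature.
pain = np.clip(2.0 * X[:, 0] + 5.0 + rng.normal(size=n_reports), 0, 10)

# Fit a simple decoder and cross-validate: a positive out-of-sample R^2
# would indicate the recordings carry information about reported pain.
scores = cross_val_score(LinearRegression(), X, pain, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")

The advantage of chronic implants described above shows up here as sample size: weeks or months of recordings simply mean many more rows for the decoder to learn from.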


Can the Electricity Industry Seize Its Resilience Moment?

Hurricane Grid Resilience examines how utilities manage outages with renewables, microgrids, and robust transmission and distribution systems, balancing solar, wind, and batteries to restore service, harden infrastructure, and improve storm response and recovery.

Key Points

Hurricane grid resilience is a utility approach to withstanding storms, reducing outages, and speeding safe power restoration.

✅ Focus on T&D hardening, vegetation management, and remote switching

✅ Balance the generation mix; integrate solar, wind, batteries, and microgrids

✅ Plan 12-hour shifts; automate forecasting and outage restoration

When operators in Duke Energy's control room in Raleigh, North Carolina, are waiting for a hurricane, the mood is often calm in the hours leading up to the storm.

“Things are usually fairly quiet before the activity starts,” said Mark Goettsch, the systems operations manager at Duke. “We’re anxiously awaiting the first operation and the first event. Once that begins, you get into storm mode.”

Then begins a “frenzied pace” that can last for days — like when Hurricane Florence parked over Duke’s service territory in September.

When an event like Florence hits, all eyes are on transmission and distribution. Where it’s available, Duke uses remote switching to reconnect customers quickly. As outages mount, the utility forecasts and balances its generation with electricity demand.
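The forecast-and-balance step described here amounts to continuously matching dispatchable generation to whatever load remains connected as outages mount. A highly simplified sketch of that bookkeeping in Python, with every figure and unit name invented for illustration:

# Storm-mode balancing, heavily simplified: trim generation to match the
# load still connected as outages mount. All numbers are invented.
FORECAST_LOAD_MW = 18_000
CAPACITY_MW = {"nuclear": 6_000, "gas": 9_000, "solar": 3_000}

def balance(outage_load_mw):
    """Return a dispatch plan (MW per unit) covering the connected load."""
    remaining = FORECAST_LOAD_MW - outage_load_mw
    plan = {}
    for unit, cap in CAPACITY_MW.items():
        plan[unit] = min(cap, remaining)  # dispatch up to unit capacity
        remaining -= plan[unit]
    if remaining > 0:
        raise RuntimeError(f"shortfall of {remaining} MW")
    return plan

# As the storm knocks 4,000 MW of load offline, dispatch is trimmed to match.
print(balance(outage_load_mw=4_000))

A real control room solves this with economic dispatch across many units and constraints, but the core loop is the same: as outages shed load, generation must be backed down in step.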

The control center’s four to six operators work 12-hour shifts, while nearby staff members field thousands of calls and alarms on the system. After it’s over, “we still hold our breath a little bit to make sure we’ve operated everything correctly,” said Goettsch. Damage assessment and rebuilding can only begin once a storm passes.

That cycle is becoming increasingly common in utility service areas like Duke's.

A slate of natural disasters that reads like a roll call — Willa, Michael, Harvey, Irma, Maria, Florence and Thomas — has forced a serious conversation about resiliency. And though Goettsch has heard a lot about resiliency as a “hot topic” at industry events and meetings, those conversations are only now entering Duke’s control room.

Resilience discussions come and go in the energy industry. Storms like Hurricanes Sandy and Matthew can spur a nationwide focus on resiliency, but change is largely concentrated in the local areas that experienced the disaster. After a few news cycles, the topic fades into the background.

However, experts agree that resilience is becoming much more important to year-round utility planning and operations. It's not a fad.

“If you look at the whole ecosystem of utilities and vendors, there’s a sense that there needs to be a more resilient grid,” said Miki Deric, Accenture’s managing director of utilities, transmission and distribution for North America. “Even if they don’t necessarily agree on everything, they are all working with the same objective.”

Can renewables meet the challenge?

After Hurricane Florence, The Intercept reported on coal ash basins washed out by the storm’s floodwaters. In advance of that storm, Duke shut down one nuclear plant to protect it from high winds. The Washington Post also recently reported on a slow-moving oil leak, caused by Hurricane Ivan in 2004, that could eventually surpass Deepwater Horizon in size.

Clean energy boosters have seized on those vulnerabilities. They say solar and wind, which don’t rely on access to fuel and can often generate power immediately after a storm, provide resilience that other electricity sources do not.

“Clearly, logistics becomes a big issue on fossil plants, much more than renewable,” said Bruce Levy, CEO and president at BMR Energy, which owns and operates clean energy projects in the Caribbean and Latin America. “The ancillaries around it — the fuel delivery, fuel storage, water in, water out — are all as susceptible to damage as a renewable plant.”

Duke, however, dismissed the notion that one generation type could beat out another in a serious storm.

“I don’t think any generation source is immune,” said Duke spokesperson Randy Wheeless. “We’ve always been a big supporter of a balanced energy mix. That’s going to include nuclear and natural gas and solar and renewables as well. We do that because not every day is a good day for each generation source.”

In regard to performance, Wade Schauer, director of Americas Power & Renewables Research at Wood Mackenzie, said the situation is “complex.” According to him, output of solar and wind during a storm depends heavily on the event and its location.

While comprehensive data on generation performance is sparse, Schauer said coal and gas generators could see outage rates of around 25 percent in a storm, while stormy weather might cut 95 percent of output from renewables. Ahead of last year’s “bomb cyclone” in New England, WoodMac data shows that wind dropped to less than 1 percent of the supply mix.

“When it comes to resiliency, ‘average performance’ doesn't cut it,” said Schauer.

In the future, he said high winds could impact all U.S. offshore wind farms, since projects are slated for a small geographic area in the Northeast. He also pointed to anecdotal instances of solar arrays in New England taken out by feet of snow. During Florence, North Carolina’s wind farms escaped the highest winds and continued producing electricity throughout. Cloud cover, on the other hand, pushed solar production below average levels.

After Florence passed, Duke reported that most of its solar capacity came back online quickly, although four of its utility-owned facilities remained offline for weeks afterward. Only one of those outages was due to storm damage; the other three stemmed from substation interconnection issues.

“Solar performed pretty well,” said Wheeless. “But did it come out unscathed? No.”

According to installer reports, solar systems fared relatively well in recent storms. But the industry has also highlighted potential improvements. Following Hurricanes Maria and Irma, the Federal Emergency Management Agency published guidelines for installing and maintaining storm-resistant solar arrays. The document recommended steps such as annual checks for bolt tightness and using microinverters rather than string inverters.

Rocky Mountain Institute (RMI) also assembled a guide for retrofitting and constructing new installations. It described attributes of solar systems that survived storms, like lateral racking supports, and those that failed, like undersized and under-torqued bolts.

“The hurricanes, as much as no one liked them, [were] a real learning experience for folks in our industry,” said BMR’s Levy. “We saw what worked, and what didn’t.”          

Facing the "800-pound gorilla" on the grid

Advocates believe wind, solar, batteries and microgrids offer the most promise because they often rely less on transmitting electricity over long distances.

Most extreme weather outages arise from transmission and distribution problems, not generation issues. Schauer at WoodMac called storm damage to T&D the “800-pound gorilla.”

“I'd be surprised if a single customer power outage was due to generators being offline, especially since loads were so low due to mild temperatures and people leaving the area ahead of the storm,” he said of Hurricane Florence. “Instead, it was wind [and] tree damage to power lines and blown transformers.”
