Ohio Edison to repower coal plant with biomass

By Environmental Expert


Ohio Edison Company, a subsidiary of FirstEnergy Corp., has agreed in a consent decree to repower one of its coal-fired power plants using primarily renewable biomass fuels, the Justice Department and U.S. Environmental Protection Agency announced.

In the agreement, filed in federal court in Columbus, Ohio and joined by the states of New York, New Jersey and Connecticut, Ohio Edison will repower the R.E. Burger Units 4 and 5 near Shadyside, Ohio with biomass fuel. The consent decree modifies a 2005 consent decree requiring Ohio Edison to reduce emissions of sulfur dioxide (SO2) and nitrogen oxide (NOx) at several of its coal-fired plants.

The modified consent decree will substantially reduce emissions of SO2 and NOx from Burger's current levels and also reduce carbon dioxide (CO2) emissions from current levels by more than 1.3 million tons a year. Burger will be the largest coal-fired electric utility plant in the country to repower with renewable biomass fuels and the first such plant at which greenhouse gas emissions will be reduced under a Clean Air Act consent decree.

The original 2005 consent decree resolved a lawsuit filed in 1999 under the New Source Review provisions of the Clean Air Act regarding Ohio Edison's W. H. Sammis plant and required that the company reduce SO2 emissions not only at Sammis but also at several of its smaller plants, including Burger. That agreement gave Ohio Edison three options to reduce Burger's SO2 emissions: shut down the plant, install a scrubber, or repower with natural gas.

Under the modified agreement, Ohio Edison will repower Burger beginning in 2012 with mostly biomass fuels, such as natural wood from waste tree trimmings and dedicated sustainable nurseries, agricultural crops, grasses and vegetation waste or products, co-firing with no more than 20 percent low-sulfur coal.

Following a year of initial operation and optimization, the Burger plant will be subject to enforceable emissions rates for SO2, NOx and particulate matter (PM). Reductions from current levels of SO2 emissions are expected to be as much as 14,000 tons a year; for NOx, as much as 1,300 tons a year; and for PM, as much as 700 tons a year.

As a result of this agreement, conversion to biomass fuel combustion is expected to approach “carbon neutrality,” meaning that CO2 emissions released by burning biomass fuel will be offset by the amount of CO2 absorbed from the atmosphere by the wood and vegetation grown to produce the fuel. After offset, Burger is expected to emit approximately 400,000 tons of CO2 emissions a year, based on 20 percent coal co-firing, versus more than 1.7 million tons from coal-fired combustion prior to repowering with biomass fuel.
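As a quick back-of-the-envelope check (a sketch based only on the figures quoted in this article, not on the consent decree itself), the cited numbers are internally consistent:

    # Rough consistency check of the CO2 figures cited above (tons per year).
    pre_repowering_co2 = 1_700_000    # approximate emissions from coal-fired combustion before repowering
    post_repowering_co2 = 400_000     # expected emissions with ~20 percent coal co-firing, after the biomass offset
    reduction = pre_repowering_co2 - post_repowering_co2
    print(reduction)                  # 1300000, matching the "more than 1.3 million tons a year" cited earlier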

The adverse effects on the environment of CO2 emissions, particularly from coal-fired power plants, are well-documented. Last April, EPA issued the “Proposed Endangerment and Cause or Contribute Findings for Greenhouse Gases under the Clean Air Act,” which identified the dangers of the current and projected concentrations of the six key greenhouse gases, the most significant being carbon dioxide. In addition, sulfur dioxide, nitrogen oxides and particulate matter cause severe respiratory problems and contribute to childhood asthma. They are also significant contributors to acid rain, smog and haze, which impair visibility in national parks.

“This is a great result for the health and the environment of the nation,” said John C. Cruden, Acting Assistant Attorney General for the Justice Department's Environment and Natural Resources Division. “We are pleased that Ohio Edison has chosen to significantly reduce greenhouse gases and other pollutants from the Burger plant and hope that Ohio Edison will become the standard-bearer for other companies considering conversion to renewable biomass fuels under the auspices of the EPA and state environmental agencies.”

“Today's settlement improves air quality for the local community and reduces greenhouse gas emissions by requiring the use of a renewable, carbon-neutral fuel to generate electricity,” said Cynthia Giles, Assistant Administrator for EPA's Office of Enforcement and Compliance Assurance. “EPA will seek similar commitments from companies to replace coal-fired electric generation with cleaner, renewable energy in future Clean Air Act settlements.”

The consent decree, lodged in the U.S. District Court for the Southern District of Ohio, is subject to a 30-day public comment period and approval by the federal court. A copy of the consent decree is available on the Department of Justice Web site.

Related News

For Hydro-Québec, selling to the United States means reinventing itself

Hydro-Québec hydropower exports deliver low-carbon electricity to New England, sparking debate on greenhouse gas accounting, grid attributes, and REC-style certificates as Quebec modernizes monitoring to verify emissions, integrate renewables, and meet ambitious climate targets.

 

Key Points

Low-carbon electricity to New England, with improved emissions tracking and verifiable grid attributes.

✅ Deep, narrow reservoirs cut lifecycle GHGs in cold boreal waters

✅ Attribute certificates trace source, type, and carbon intensity

✅ Contracts require facility-level tagging for compliance

 

For 40 years, through the most vicious interprovincial battles, Canadians could agree on one way Quebec is undeniably superior to the rest of the country.

It’s hydropower, and specifically the mammoth dam system in Northern Quebec that has been paying dividends since it was first built in the 1970s. “Quebec continues to boast North America’s lowest electricity prices,” was last year’s business-as-usual update in one trade publication, even as Newfoundland's rate strategy seeks relief for consumers.

With the climate crisis looming, that long-ago decision earns even more envy and reflects Canada's broader progress on clean electricity. Not only do they pay less, but Quebeckers also emit the least carbon per capita of any province.

It may surprise most Canadians, then, to hear how most of New England has reacted to the idea of being able to buy permanently into Quebec’s power grid.

Hydro-Québec’s efforts to strike major export deals have been rebuffed in the U.S., by environmentalists more than anyone. They question everything about Quebec hydropower, including asking “is it really low-carbon?”

These doubts may sound nonsensical to regular Quebeckers. But airing them has, in fact, pushed Hydro-Québec to learn more about itself and adopt new technology.

We know far more about hydropower than we knew 40 years ago, including whether it’s really zero-emission (it’s not), how to make it as close to zero-emission as possible, and how to account for it as precisely as new clean energies like solar and wind, underscoring how cleaning up Canada's electricity is vital to meeting climate pledges.

The export deals haven’t gone through yet, but they’ve already helped drag Hydro-Québec—roughly the fourth-biggest hydropower system on the planet—into the climate era.

Fighting to export
One of the first signs of trouble for Quebec hydro was in New Hampshire, almost 10 years ago. People there began pasting protest signs on their barns and buildings. One citizens’ group accused Hydro of planning a “monstrous extension cord” across the state.

Similar accusations have since come from Maine, Massachusetts and New York.

The criticism isn’t coming from state governments, which mostly want a more permanent relationship with Hydro-Québec. They already rely on Quebec power, but in a piecemeal way, topping up their own power grid when needed (with the exception of Vermont, which has a small permanent contract for Quebec hydropower).

Last year, Quebec provided about 15 percent of New England’s total power, plus another substantial amount to New York, which is officially not considered to be part of New England, and has its own energy market separate from the New England grid.

Now, northeastern states need an energy linchpin, rather than a top-up, with existing power plants nearing the end of their lifespans. In Massachusetts, for example, one major nuclear plant shut down this year and another will be retired in 2021. State authorities want a hydro-based energy plan that would send $10 billion to Hydro-Québec over 20 years.

New England has some of North America’s most ambitious climate goals, with every state in the region pledging to cut emissions by at least 80 percent over the next 30 years.

What’s the downside? Ask the citizens’ groups and nonprofits that have written countless op-eds, organized petitions and staged protests. They argue that hydropower isn’t as clean as cutting-edge clean energy such as solar and wind power, and that Hydro-Québec isn’t trying hard enough to integrate itself into the most innovative carbon-counting energy system. Right as these other energy sources finally become viable, they say, it’s a step backwards to commit to hydro.

As Hydro-Québec will point out, many of these critics are legitimate nonprofits, but others may have questionable connections. The Portland Press Herald in Maine reported in September 2018 that a supposedly grassroots citizens’ group called “Stand Up For Maine” was actually funded by the New England Power Generators Association, which is based in Boston and represents such power plant owners as Calpine Corp., Vistra Energy and NextEra Energy.

But in the end, that may not matter. Arguably the biggest motivator to strike these deals comes not from New England’s needs, but from within Quebec. The province has spent more than $10 billion in the last 15 years to expand its dam and reservoir system, and in order to stay financially healthy, it needs to double its revenue in the next 10 years—a plan that relies largely on exports.

With so much at stake, Hydro-Québec has spent the last decade trying to prove it can be an energy source of the future.

“Learning as you go”
American critics, justified or not, have been forcing advances at Hydro for a long time.

When the famously huge northern Quebec hydro dams were built at James Bay—construction began in the early 1970s—the logic was purely economic. The term “climate change” didn’t exist. The province didn’t even have an environment department.

The only reason Quebec scientists started trying to measure carbon emissions from hydro reservoirs was “basically because of the U.S.,” said Alain Tremblay, a senior environmental advisor at Hydro-Québec.


Alain Tremblay, senior environmental advisor at Hydro-Québec. Photograph courtesy of Hydro-Québec
In the early 1990s, Hydro began to export power to the U.S., and “because we were a good company in terms of cost and efficiency, some Americans didn't like that,” he said—mainly competitors, though he couldn’t say specifically who. “They said our reservoirs were emitting a lot of greenhouse gases.”

The detractors had no research to back up that claim, but Hydro-Québec had none to refute it, either, said Tremblay. “At that time we didn’t have any information, but from back-of-the-envelope calculations, it was impossible to have the emissions the Americans were expecting we have.”

So research began, first to design methods to take the measurements, and then to carry them out. Hydro began a five-year project with a Quebec university.

It took about 10 years to develop a solid methodology, Tremblay said, with “a lot of error and learning-as-you-go.” There have been major strides since then.

“Twenty years ago we were taking a sample of water, bringing it back to the lab and analyzing that with what we call a gas chromatograph,” said Tremblay. “Now, we have an automated system that can measure directly in the water,” reading concentrations of CO2 and methane every three hours and sending its data to a processing centre.

The tools Hydro-Québec uses are built in California. Researchers around the world now follow the same standard methods.

At this point, it’s common knowledge that hydropower does emit greenhouse gases. Experts know these emissions are much higher than previously thought.

Workers on the Eastmain-1 project environmental monitoring program. Photography courtesy of Alain Tremblay.
But Hydro-Québec now has the evidence, also, to rebut the original accusations from the early 1990s and many similar ones today.

“All our research from Université Laval [found] that it’s about a thousand years before trees decompose in cold Canadian waters,” said Tremblay.

Hydro reservoirs emit greenhouse gases because vegetation and sometimes other biological materials, like soil runoff, decay under the surface.

But that decay depends partly on the warmth of the water. In tropical regions, including the southern U.S., hydro dams can have very high emissions. But in boreal zones like northern Quebec (or Manitoba, Labrador and most other Canadian locations with massive hydro dams), the cold, well-oxygenated water vastly slows the process.

Hydro emissions have “a huge range,” said Laura Scherer, an industrial ecology professor at Leiden University in the Netherlands who led a study of almost 1,500 hydro dams around the world.

“It can be as low as other renewable energy sources, but it can also be as high as fossil fuel energy,” in rare cases, she said.

While her study found that climate was important, the single biggest factor was “sizing and design” of each dam, and specifically its shape, she said. Ideally, hydro dams should be deep and narrow to minimize surface area, perhaps using a natural valley.

Hydro-Québec’s first generation of dams, the ones around James Bay, were built the opposite way—they’re wide and shallow, infamously flooding giant tracts of land.


Alain Tremblay, senior environmental advisor at Hydro-Québec testing emission levels. Photography courtesy of Alain Tremblay
Newly built dams take that new information into account, said Tremblay. Hydro-Québec’s most recent project is the Romaine River complex, which will eventually include four reservoirs near Quebec’s northeastern border with Labrador. Construction began in 2016.

The site was picked partly for its topography, said Tremblay.

“It’s a valley-type reservoir, so large volume, small surface area, and because of that there’s a pretty limited amount of vegetation that’s going to be flooded,” he said.

There’s a dramatic emissions difference with the project built just before that, commissioned in 2006. Called Eastmain, it’s built near James Bay.

“The preliminary results indicate with the same amount of energy generated [by Romaine] as with Eastmain, you’re going to have about 10 times less emissions,” said Tremblay.

Tracing energy to its source
These signs of progress likely won’t satisfy the critics, who have publicly argued back and forth with Hydro about exactly how emissions should be tallied up.

But Hydro-Québec also faces a different kind of growing gap when it comes to accounting publicly for its product. In the New England energy market, a sophisticated system “tags” all the energy in order to delineate exactly how much comes from which source—nuclear, wind, solar, and others—and allows buyers to single out clean power, or at least the bragging rights to say they bought only clean power.

Really, of course, it’s all the same mix of energy—you can’t pick what you consume. But creating certificates prevents energy producers from, in worst-case scenarios, being able to launder regular power through their clean-power facilities. Wind farms, for example, can’t oversell what their own turbines have produced.

What started out as a fraud prevention tool has “evolved to make it possible to also track carbon emissions,” said Deborah Donovan, Massachusetts director at the Acadia Center, a climate-focused nonprofit.

But Hydro-Québec isn’t doing enough to integrate itself into this system, she says.

It’s “the tool that all of our regulators in New England rely on when we are confirming to ourselves that we’ve met our clean energy and our carbon goals. And…New York has a tool just like that,” said Donovan. “There isn’t a tracking system in Canada that’s comparable, though provinces like Nova Scotia are tapping the Western Climate Initiative for technical support.”

Hydro Quebec Chénier-Vignan transmission line crossing the Outaouais river. Photography courtesy of Hydro-Québec
Developing this system is more a question of Canadian climate policy than technology.

Energy companies have long had the same basic tracking device—a meter, said Tanya Bodell, a consultant and expert in New England’s energy market. But in New England, on top of measuring “every time there’s a physical flow of electricity” from a given source, said Bodell, a meter “generates an attribute or a GIS certificate,” which certifies exactly where it’s from. The certificate can show the owner, the location, type of power and its average emissions.
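To make the idea concrete, here is a minimal sketch, in Python, of the kind of record such a certificate might carry. The field names and values are hypothetical illustrations, not the actual schema of New England's generation information system.

    from dataclasses import dataclass

    # Hypothetical sketch of a generation attribute certificate record.
    # Field names are invented for illustration; the real GIS schema differs.
    @dataclass
    class AttributeCertificate:
        owner: str                        # who generated the power
        facility_location: str            # where it was generated
        fuel_type: str                    # e.g. "hydro", "wind", "solar"
        energy_mwh: float                 # metered energy the certificate represents
        avg_emissions_kg_per_mwh: float   # average emissions intensity of the source

    cert = AttributeCertificate(
        owner="Example Generating Co.",
        facility_location="Northern Quebec (hypothetical facility)",
        fuel_type="hydro",
        energy_mwh=1.0,
        avg_emissions_kg_per_mwh=17.0,    # illustrative figure only
    )
    print(cert.fuel_type, cert.avg_emissions_kg_per_mwh)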

Since 2006, Hydro-Québec has had the ability to attach the same certificates to its exports, and it sometimes does.

“It could be wind farm generation, even large hydro these days—we can do it,” said Louis Guilbault, who works in regulatory affairs at Hydro-Québec. For Quebec-produced wind energy, for example, “I can trade those to whoever’s willing to buy it,” he said.

But, despite having the ability, he also has the choice not to attach a detailed code—which Hydro doesn’t do for most of its hydropower—and to have it counted instead under the generic term of “system mix.”

Once that hydropower hits the New England market, the administrators there have their own way of packaging it. The market perhaps “tries to determine emissions, GHG content,” Guilbault said. “They have their own rules; they do their own calculations.”

This is the crux of what bothers people like Donovan and Bodell. Hydro-Québec is fully meeting its contractual obligations, since it’s not required to attach a code to every export. But the critics wish it would, whether by future obligation or of its own volition.

Quebec wants it both ways, Donovan argued; it wants the benefits of selling low-emission energy without joining the New England system of checks and balances.

“We could just buy undifferentiated power and be done with it, but we want carbon-free power,” Donovan said. “We’re buying it because of its carbon content—that’s the reason.”

Still, the requirements are slowly increasing. Under Hydro-Québec’s future contract with Massachusetts (which still has several regulatory steps to go through before it’s approved) it’s asked to sell the power’s attributes, not just the power itself. That means that, at least on paper, Massachusetts wants to be able to trace the energy back to a single location in Quebec.

“It’s part of the contract we just signed with them,” said Guilbault. “We’re going to deliver those attributes. I’m going to select a specific hydro facility, put the number in...and transfer that to the buyers.”

Hydro-Québec says it’s voluntarily increasing its accounting in other ways. “Even though this is not strictly required,” said spokeswoman Lynn St. Laurent, Hydro is tracking its entire output with a continent-wide registry, the North American Renewables Registry.

That registry is separate from New England’s, so as far as Bodell is concerned, the measure doesn’t really help. But she and others also expect the entire tracking system to grow and mature, perhaps integrating into one. If it had been created today, in fact, rather than in the 1990s, maybe it would use blockchain technology rather than a varied set of administrators, she said.

Counting emissions through tracking still has a long way to go, as well, said Donovan, and it will increasingly matter in Canada's race to net-zero as standards tighten. For example, natural gas is assigned an emissions number that’s meant to reflect the emissions when it’s consumed. But “we do not take into account what the upstream carbon emissions are through the pipeline leakage, methane releases during fracking, any of that,” she said.

Now that the search for exactitude has begun, Hydro-Québec won’t be exempt, whether or not Quebeckers share that curiosity. “We don’t know what Hydro-Québec is doing on the other side of the border,” said Donovan.

 

Related News

Michigan Public Service Commission grants Consumers Energy request for more wind generation

Consumers Energy Wind Expansion gains MPSC approval in Michigan, adding up to 525 MW of wind power, including Gratiot Farms, while solar capacity requests face delays over cost projections under the renewable portfolio standard targets.

 

Key Points

A regulatory-approved plan enabling Consumers Energy to add 525 MW of wind while solar additions await cost review.

✅ MPSC approves up to 525 MW in new wind projects

✅ Gratiot Farms purchase allowed before May 1

✅ Solar request delayed over high cost projections

 

Consumers Energy Co.’s efforts to expand its renewable offerings gained some traction this week when the Michigan Public Service Commission (MPSC) approved a request for additional wind generation capacity.

Consumers had argued that both more wind and solar facilities are needed to meet the state’s renewable portfolio standard, which was expanded in 2016 to encompass 12.5 percent of the retail power of each Michigan electric provider. Those figures will continue to rise under the law through 2021, when the figure reaches 15 percent, alongside ongoing electricity market reform discussions. However, Consumers’ request for additional solar facilities was delayed at this time due to what the Commission labeled unrealistically high cost projections.

Consumers will be able to add as much as 525 megawatts of new wind projects amid a shifting wind market, including two proposed 175-megawatt wind projects slated to begin operation this year and next. Consumers has also been allowed to purchase the Gratiot Farms Wind Project before May 1.

The MPSC said a final determination would be made on Consumers’ solar requests during a decision in April. Consumers had sought an additional 100 megawatts of solar facilities, hoping to get them online sometime in 2024 and 2025.

 

Related News

Canadian Electricity Grids Increasingly Exposed to Harsh Weather

North American Grid Reliability faces extreme weather, climate change, demand spikes, and renewable variability; utilities, AESO, and NERC stress resilience, dispatchable capacity, interconnections, and grid alerts to prevent blackouts during heatwaves and cold snaps.

 

Key Points

North American grid reliability is the ability to meet demand during extreme weather while maintaining stability.

✅ Extreme heat and cold drive record demand and resource strain.

✅ Balance dispatchable and intermittent generation for resilience.

✅ Expand interconnections, capacity, and demand response to avert outages.

 

The recent alerts on Alberta's electricity grid during extreme cold have highlighted a broader North American issue: power systems are increasingly susceptible to being overwhelmed by extreme weather.

Francis Bradley, Electricity Canada's chief executive, emphasized that no part of the grid is safe from the escalating intensity and frequency of weather extremes linked to climate change.

“In recent years, during these extreme weather events, we’ve observed record highs in electricity demand,” he stated.

“It’s a nationwide phenomenon. For instance, last summer in Ontario and last winter in Quebec, we experienced unprecedented demand levels. This pattern of extremes is becoming more pronounced across the country.”

The U.S. has also experienced strain on its electricity grids due to extreme weather, with studies documenting more blackouts there than in peer countries. Texas faced power outages in 2021 due to winter storms, and California has had to issue several emergency grid alerts during heat waves.

In Canada, Albertans received a government emergency alert two weeks ago, urging an immediate reduction in electricity use to prevent potential rotating blackouts as temperatures neared -40°C. No blackouts occurred, with a notable decrease in electricity use following the alert, according to the Alberta Electric System Operator (AESO).

AESO's data indicates an increase in grid alerts in Alberta for both heatwaves and cold spells, reflecting dangerous vulnerabilities noted nationwide. The period between 2017 and 2020 saw only four alerts, in contrast to 17 since 2021.

Alberta's grid reliability has sparked political debate, including proposals for a western Canadian grid, particularly with the transition from coal-fired plants to increased reliance on intermittent wind and solar power. The AESO noted that the crisis eased when wind and solar generation resumed, even as two idled gas plants posed challenges.

Bradley pointed out that Alberta's grid issues are not isolated. Every Canadian region is experiencing growing electricity demand, partly due to the surge in electric vehicles and clean energy technologies. No province has a complete solution yet.

“Ontario has had to request reduced consumption during heatwaves,” he noted. “Similar concerns about energy mix are present in British Columbia or Manitoba, especially now with drought affecting their hydro-dependent systems.”

The North American Electric Reliability Corporation (NERC) released a report in November warning of elevated risks across North America this winter for insufficient energy supplies, particularly under extreme conditions like prolonged cold snaps.

While the U.S. is generally more susceptible to winter grid disruptions, and summer blackout warnings remain a concern, the report also highlights risks in parts of Canada. Saskatchewan faces a “high” risk due to increased demand, power plant retirements, and maintenance, whereas Quebec and the Maritimes are at “elevated risk.”

Mark Olson, NERC’s manager of reliability assessments, mentioned that Alberta wasn't initially considered at risk, illustrating the challenges in predicting electricity demand amid intensifying extreme weather.

Rob Thornton, president and CEO of the International District Energy Association, acknowledged public concerns about grid alerts but reassured that the risk of a catastrophic grid failure remains very low.

“The North American grid is exceptionally reliable. It’s a remarkably efficient system,” he said.

However, Thornton emphasized the importance of policies for a resilient and reliable electricity system through 2050 and beyond. This involves balancing dispatchable and intermittent electricity sources, investing in extra capacity, enhancing macrogrids and inter-jurisdictional connections, and more.

“These grid alerts raise awareness, if not anxiety, about our energy future,” Thornton concluded.

 

Related News

BNEF Report: Wind and Solar Will Provide 50% of Electricity in 2050

BNEF 2019 New Energy Outlook projects surging renewable energy demand, aggressive decarbonization, wind and solar cost declines, battery storage growth, coal phase-out, and power market reform to meet Paris Agreement targets through 2050.

 

Key Points

Bloomberg's NEO 2019 forecasts power demand, renewables growth, and decarbonization pathways through 2050.

✅ Predicts wind/solar to ~50% of global electricity by 2050

✅ Foresees coal decline; Asia transitions slower than Europe

✅ Calls for power market reform and battery integration

 

In a report that examines the ways in which renewable energy demand is expected to increase, Bloomberg New Energy Finance (BNEF) finds that “aggressive decarbonization” will be required beyond 2030 to meet the temperature goals of the Paris Agreement on climate change.

Focusing on electricity, BNEF’s 2019 New Energy Outlook (NEO) predicts a 62% increase in global power demand, leading to global generating capacity tripling between now and 2050, when wind and solar are expected to make up almost 50% of world electricity due to decreasing costs.

The report concludes that coal will collapse everywhere except Asia, and, by 2032, there will be more wind and solar electricity than coal-fired electricity. It forecasts that coal’s role in the global power mix will decrease from 37% today to 12% by 2050, with the virtual elimination of oil as a power-generating source.

Highlighting regional differences, the report finds that:

Western European economies are already on a strong decarbonization path due to carbon pricing and strong policy support, with dropping offshore wind costs bolstering progress;

by 2040, renewables will comprise 90% of the electricity mix in Europe, with wind and solar accounting for 80%;

the US, with low-priced natural gas, and China, with its coal-fired plants, will transition more slowly, even as generating 30% of electricity from wind and solar becomes feasible; and

China’s power sector emissions will peak in 2026 and then fall by more than half over the next 20 years, as solar PV growth accelerates, with wind and solar increasing from 8% to 48% of total electricity generation by 2050.

The 2019 report finds that wind and solar now represent the cheapest option for adding new power-generating capacity in much of the world, a transition expected to attract USD 13.3 trillion in new investment. While solar, wind, batteries and other renewables are expected to attract USD 10 trillion in investment by 2050, the report warns that curbing emissions will require other technologies as well.

Speaking about the report, Matthias Kimmel, NEO 2019 lead analyst, said solar photovoltaic modules, wind turbines and lithium-ion batteries are set to continue on aggressive cost reduction curves of 28%, 14% and 18%, respectively, for every doubling in global installed capacity. He explained that by 2030, energy generated or stored and dispatched by these technologies will undercut electricity generated by existing coal and gas plants.
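Those percentages describe experience (learning) curves: cost falls by a roughly fixed fraction each time cumulative installed capacity doubles. A minimal sketch of that relationship, using the quoted rates purely as illustrative inputs:

    # Experience-curve sketch: projected cost after n doublings of cumulative capacity.
    # learning_rate is the fractional cost decline per doubling (0.28, 0.14, 0.18 above).
    def cost_after_doublings(initial_cost, learning_rate, doublings):
        return initial_cost * (1 - learning_rate) ** doublings

    # Illustrative only: each technology normalized to a starting cost index of 100.
    for name, rate in [("solar PV modules", 0.28), ("wind turbines", 0.14), ("lithium-ion batteries", 0.18)]:
        print(name, round(cost_after_doublings(100, rate, 3), 1))   # cost index after three doublings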

To achieve this level of transition and decarbonization, the report stresses, power markets must be reformed to ensure wind, solar and batteries are “properly remunerated for their contributions to the grid.”

Additionally, the 2019 NEO includes a number of updates such as:

  • new scenarios on global warming of 2°C above preindustrial levels, electrified heat and road transport, and an updated coal phase-out scenario;
  • new sections on coal and gas power technology, the future grid, energy access, and costs related to decarbonization technology such as carbon capture and storage (CCS), biogas, hydrogen fuel cells, nuclear and solar thermal;
  • sub-national results for China;
  • the addition of commercial electric vehicles;
  • an expanded air-conditioning analysis; and
  • modeling of Brazil, Mexico, Chile, Turkey and Southeast Asia in greater detail.

Every year, the NEO compares the costs of competing energy technologies. The 2019 report brought together 65 market and technology experts from 12 countries to provide their views on how the market might evolve.

 

Related News

Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do—including how targeted stimulation can improve short-term memory in older adults—to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy. Even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Related: New study revives a Mozart sonata as a potential epilepsy therapy
Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

Related: A new index measures the extent and depth of addiction stigma
More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
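As a conceptual illustration only (hypothetical threshold and device calls, not any vendor's actual API), the open-loop versus closed-loop distinction can be sketched like this:

    # Conceptual sketch: closed-loop stimulation fires only when a symptom
    # biomarker crosses a threshold, rather than streaming constant current.
    # read_biomarker() and deliver_pulse() are hypothetical placeholders.
    def closed_loop_step(read_biomarker, deliver_pulse, threshold):
        signal = read_biomarker()     # e.g. band power measured by sensing electrodes
        if signal > threshold:        # symptom signature detected
            deliver_pulse()           # brief, targeted jolt
            return True
        return False                  # otherwise keep listening without stimulating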

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

Related: Largest psilocybin trial finds the psychedelic is effective in treating serious depression
The exact mechanics of what happens between cells when brain circuits … well, short-circuit, is unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
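A heavily simplified sketch of that kind of analysis, pairing a recorded brain-signal feature with 0-to-10 pain ratings and checking how strongly they track each other (synthetic data stands in for weeks of recordings):

    import numpy as np

    # Toy example: correlate a recorded brain-activity feature with self-reported pain.
    rng = np.random.default_rng(0)
    brain_feature = rng.normal(size=200)   # stand-in for, e.g., band power sampled over time
    pain_ratings = np.clip(5 + 2 * brain_feature + rng.normal(size=200), 0, 10)   # 0-10 survey scores

    r = np.corrcoef(brain_feature, pain_ratings)[0, 1]
    print(f"correlation between feature and reported pain: {r:.2f}")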

 

Related News

Why Canada should invest in "macrogrids" for greener, more reliable electricity

Canadian electricity transmission enables grid resilience, long-distance power trade, and decarbonization by integrating renewables, hydroelectric storage, and HVDC links, providing backup during extreme weather and lowering costs to reach net-zero, clean energy targets.

 

Key Points

An interprovincial high-voltage grid that shares clean power to deliver reliable, low-cost decarbonization.

✅ Enables resilience by sharing power across weather zones

✅ Integrates renewables with hydro storage via HVDC links

✅ Lowers decarbonization costs through interprovincial trade

 

As the recent disaster in Texas showed, climate change requires electricity utilities to prepare for extreme events. This “global weirding” is leaving Canadian electricity grids increasingly exposed to harsh weather: more intense storms, higher wind speeds, heatwaves and droughts that can threaten the performance of electricity systems.

The electricity sector must adapt to this changing climate while also playing a central role in mitigating climate change. Greenhouse gas emissions can be reduced a number of ways, but the electricity sector is expected to play a central role in decarbonization, including powering a net-zero grid by 2050 across Canada. Zero-emissions electricity can be used to electrify transportation, heating and industry and help achieve emissions reduction in these sectors.

Enhancing long-distance transmission is viewed as a cost-effective way to enable a clean and reliable power grid, and to lower the cost of meeting our climate targets. Now is the time to strengthen transmission links in Canada, with concepts like a western Canadian electricity grid gaining traction.


Insurance for climate extremes
An early lesson from the Texas power outages is that extreme conditions can lead to failures across all forms of power supply. The state lost the capacity to generate electricity from natural gas, coal, nuclear and wind simultaneously. But it also lacked cross-border transmission to other electricity systems that could have bolstered supply.

Long-distance transmission offers the opportunity to escape the correlative clutch of extreme weather, by accessing energy and spare capacity in areas not beset by the same weather patterns. For example, while Texas was in its deep freeze, relatively balmy conditions in California meant there was a surplus of electricity generation capability in that region — but no means to get it to Texas. Building new transmission lines and connections across broader regions, including projects like a hydropower line to New York, can act as an insurance policy, providing a back-up for regions hit by the crippling effects of climate change.

A transmission tower crumpled under the weight of ice.
The 1998 Quebec ice storm left 3.5 million Quebecers and a million Ontarians, as well as thousands in New Brunswick, without power. CP Photo/Robert Galbraith
Transmission is also vulnerable to climate disruptions, such as crippling ice storms that leave wires temporarily inoperable. This may mean using stronger poles when building transmission, or burying major high-voltage transmission links, or deploying superconducting cables to reduce losses.

In any event, more transmission links between regions can improve resilience by co-ordinating supply across larger regions. Well-connected grids that are larger than the areas disrupted by weather systems can be more resilient to climate extremes.


Lowering the cost of clean power
Adding more transmission can also play a role in mitigating climate change. Numerous studies have found that building a larger transmission grid allows for greater shares of renewables onto the grid, ultimately lowering the overall cost of electricity.

In a recent study, two of us looked at the role transmission could play in lowering greenhouse gas emissions in Canada’s electricity sector. We found the cost of reducing greenhouse gas emissions is lower when new or enhanced transmission links can be built between provinces.

Average cost increase to electricity in Canada at different levels of decarbonization, with new transmission (black) and without new transmission (red). New transmission lowers the cost of reducing greenhouse gas emissions. (Authors), Author provided
Much of the value of transmission in these scenarios comes from linking high-quality wind and solar resources with flexible zero-emission generation that can produce electricity on demand. In Canada, our system is dominated by hydroelectricity, but most of this hydro capacity is located in five provinces: British Columbia, Manitoba, Ontario, Québec and Newfoundland and Labrador.

In the west, Alberta and Saskatchewan are great locations for building low-cost wind and solar farms. Enhanced interprovincial transmission would allow Alberta and Saskatchewan to build more variable wind and solar, with the assurance that they could receive backup power from B.C. and Manitoba when the wind isn’t blowing and the sun isn’t shining.

When wind and solar are plentiful, the flow of low cost energy can reverse to allow B.C. and Manitoba the opportunity to better manage their hydro reservoir levels. Provinces can only benefit from trading with each other if we have the infrastructure to make that trade possible.
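A toy sketch of the balancing act described here, with invented numbers, just to show which way power flows under the two conditions:

    # Toy illustration of interprovincial balancing: a wind/solar-heavy province
    # imports hydro backup when output is low, and exports surplus (letting the
    # hydro province hold water in its reservoirs) when output is high.
    def interprovincial_transfer_mw(wind_solar_output_mw, demand_mw):
        surplus = wind_solar_output_mw - demand_mw
        if surplus < 0:
            return {"hydro_backup_to_wind_province_mw": -surplus}
        return {"wind_surplus_to_hydro_province_mw": surplus}

    print(interprovincial_transfer_mw(wind_solar_output_mw=800, demand_mw=1200))    # calm day: import 400 MW
    print(interprovincial_transfer_mw(wind_solar_output_mw=1500, demand_mw=1200))   # windy day: export 300 MW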

A recent working paper examined the role that new transmission links could play in decarbonizing the B.C. and Alberta electricity systems. We again found that enabling greater electricity trade between B.C. and Alberta can reduce the cost of deep cuts to greenhouse gas emissions by billions of dollars a year. Although we focused on the value of the Site C project, in the context of B.C.'s clean energy shift, the analysis showed that new transmission would offer benefits of much greater value than a single hydroelectric project.

The value of enabling new transmission links between Alberta and B.C. as greenhouse gas emissions reductions are pursued. (Authors), Author provided
Getting transmission built
With the benefits that enhanced electricity transmission links can provide, one might think new projects would be a slam dunk. But there are barriers to getting projects built.

First, electricity grids in Canada are managed at the provincial level, most often by Crown corporations. Decisions by the Crowns are influenced not simply by economics, but also by political considerations. If a transmission project enables greater imports of electricity to Saskatchewan from Manitoba, it raises a flag about lost economic development opportunity within Saskatchewan. Successful transmission agreements need to ensure a two-way flow of benefits.

Second, transmission can be expensive. On this front, the Canadian government could open up the purse strings to fund new transmission links between provinces. It has already shown a willingness to do so.

Lastly, transmission lines are long linear projects, not unlike pipelines. Siting transmission lines can be contentious, even when they are delivering zero-emissions electricity. Using infrastructure corridors, such as existing railway rights-of-way or the proposed Canadian Northern Corridor, could help facilitate co-operation between regions and reduce the risks of siting transmission lines.

If Canada can address these barriers to transmission, we should find ourselves in an advantageous position, where we are more resilient to climate extremes and have achieved a lower-cost, zero-emissions electricity grid.

 
