Areva, New Brunswick reveal nuclear plans

By CBC News


New Brunswick has signed a letter of intent to look at building a second reactor at NB Power's Point Lepreau nuclear plant.

Premier Shawn Graham, Energy Minister Jack Keir and officials from French nuclear giant Areva made the announcement in Saint John.

"The New Brunswick government recognizes the integral role the energy sector has in growing our economy," said Graham in a media release.

"Although this announcement is just a first step, a project of this magnitude would create 8,500 direct and indirect jobs for New Brunswickers in all regions of our province."

The proposed light water reactor could be paired with a clean energy park that might also include offshore wind, biomass and solar power generation.

Graham emphasized during the news conference that the proposal will carry no financial risk for the provincial government.

Areva must find the financing if the project is to proceed, according to the premier.

Jacques Besnainou, the chief executive officer of Areva, said he expects the province to help build a business case for the reactor.

But Areva also said that it would be entirely responsible for managing the design, construction and financing.

Besnainou said he believes there is demand both in Atlantic Canada and the northeastern United States for the power that would come from the reactor in 2020.

The letter of intent sets out that NB Power and the provincial government will work "intensively" with the French company to hammer out a detailed agreement by the end of 2010.

NB Power would be responsible for operating the unit and for providing technical and regulatory support during the licensing phase.

NB Power operates the Point Lepreau station, which is Atlantic Canada's only nuclear reactor. When the station was licensed in the 1970s it was permitted to hold more than one reactor.

Among the terms of the letter of intent is a clause that states it does not form a legal partnership, joint venture, or any legal entity between the two parties.

The French company has 50,000 employees worldwide and 5,000 workers in North America. The company has operated in Canada for 40 years.

This is the second time that Areva has tried to get a foothold in New Brunswick's nuclear industry. Areva lost its previous attempt to Atomic Energy of Canada Ltd. and its consortium of partners in 2007.

The New Brunswick government committed in its 2006 election platform to study the feasibility of a second reactor. AECL and its partners produced a feasibility study showing the reactor project would work and had started lining up potential investors.

The AECL proposal was abandoned recently as the federal Crown nuclear agency was running into significant delays with the $1.4-billion refurbishment project at Point Lepreau.

New Democratic Party Leader Roger Duguay dismissed the announcement as political grandstanding and said calling it "clean energy" was a complete misnomer.

"Those companies like to talk about nuclear, but they don't like to talk about what they're going to do after with the waste," he said.

Duguay also said the track record of Areva should be taken into account, citing a Finnish project that has doubled in cost and has been delayed by more than two years.

"The people of Finland are waiting for four years and already the cost for the construction of that plant is doubled and I don't think we should repeat that experience here in New Brunswick," he said.

The Progressive Conservatives have also blasted the discussions as political opportunism, noting the provincial election is scheduled for Sept. 27. Before losing the 2006 election, the Tories had also committed to studying the feasibility of a second reactor.

Related News

Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do—including how targeted stimulation can improve short-term memory in older adults—to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health's massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy. Even therapies that have been approved for use in patients with, for example, Parkinson's disease or epilepsy help only a minority of patients, and expectations can outpace evidence. "If it was the Bible, it would be the first chapter of Genesis," said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person's brain data, and how to best involve patients in the study of the human brain's far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.


Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson's manifests slightly differently, a lesson that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field's progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as "almost a renaissance period" for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the "torpedo fish," were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those "lesions" worked, somehow, but nobody could explain why they alleviated some patients' symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks, a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation, a constant stream of electricity, to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
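The open-loop versus closed-loop distinction can be sketched in a few lines of code. This is a toy illustration only, not any real device's API: the biomarker, threshold, and window size below are invented for the example.

```python
# Toy sketch of closed-loop stimulation logic (hypothetical values, not a
# real device API): instead of delivering a constant open-loop current,
# monitor a biomarker and trigger a brief pulse only when it crosses a
# threshold.

THRESHOLD = 0.8   # hypothetical biomarker trigger level
WINDOW = 16       # samples per analysis window

def biomarker(samples):
    """Toy biomarker: mean absolute amplitude over the window."""
    return sum(abs(s) for s in samples) / len(samples)

def closed_loop(signal):
    """Return sample indices where a stimulation pulse would be triggered."""
    pulses = []
    for start in range(0, len(signal) - WINDOW + 1, WINDOW):
        if biomarker(signal[start:start + WINDOW]) > THRESHOLD:
            pulses.append(start)  # deliver a targeted, brief jolt here
    return pulses

# Simulated recording: quiet baseline with one high-amplitude burst.
signal = [0.1] * 64
for i in range(32, 48):
    signal[i] = 1.5 if i % 2 == 0 else -1.5

print(closed_loop(signal))  # only the burst window triggers
```

Real systems detect far subtler signatures than raw amplitude, but the control structure is the same: sense, classify, and stimulate only when needed.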

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH's Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. "A critical mass" of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients' symptoms while failing to help many others, but psychopharmacology clearly showed "there's definitely a biology to this problem," Williams said, a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
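The core of that analysis, pairing subjective pain ratings with recorded brain activity and asking which signals track them, can be illustrated with a simple correlation score. All numbers below are invented for illustration, not patient data, and real pipelines apply machine-learning models to weeks of recordings rather than one Pearson coefficient.

```python
import math

# Hypothetical sketch: given a recorded brain-activity feature and a
# patient's 0-10 pain ratings over the same sessions, score how closely
# the feature tracks the subjective pain reports.

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

pain_ratings = [2, 3, 5, 7, 8, 6, 4, 2]                # subjective 0-10 reports
feature_a = [0.9, 1.1, 2.2, 3.1, 3.4, 2.6, 1.8, 1.0]   # rises and falls with pain
feature_b = [1.0, 2.5, 0.7, 1.9, 0.8, 2.2, 1.1, 2.0]   # mostly noise

print(f"feature_a r = {pearson(pain_ratings, feature_a):.2f}")
print(f"feature_b r = {pearson(pain_ratings, feature_b):.2f}")
```

A feature like `feature_a`, whose correlation with the ratings is high, would be a candidate biomarker; one like `feature_b` would be discarded.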

 

Related News


For Hydro-Québec, selling to the United States means reinventing itself

Hydro-Quebec hydropower exports deliver low-carbon electricity to New England, sparking debate on greenhouse gas accounting, grid attributes, and REC-style certificates as Quebec modernizes monitoring to verify emissions, integrate renewables, and meet ambitious climate targets.

 

Key Points

Low-carbon electricity to New England, with improved emissions tracking and verifiable grid attributes.

✅ Deep, narrow reservoirs cut lifecycle GHGs in cold boreal waters

✅ Attribute certificates trace source, type, and carbon intensity

✅ Contracts require facility-level tagging for compliance

 

For 40 years, through the most vicious interprovincial battles, Canadians could agree on one way Quebec is undeniably superior to the rest of the country.

It's hydropower, and specifically the mammoth dam system in Northern Quebec that has been paying dividends since it was first built in the 1970s. "Quebec continues to boast North America's lowest electricity prices," was last year's business-as-usual update in one trade publication.

With the climate crisis looming, that long-ago decision earns even more envy. Not only do they pay less, but Quebeckers also emit the least carbon per capita of any province.

It may surprise most Canadians, then, to hear how most of New England has reacted to the idea of being able to buy permanently into Quebec’s power grid.

Hydro-Québec's efforts to strike major export deals have been rebuffed in the U.S., by environmentalists more than anyone. They question everything about Quebec hydropower, including asking "is it really low-carbon?"

These doubts may sound nonsensical to regular Quebeckers. But airing them has, in fact, pushed Hydro-Québec to learn more about itself and adopt new technology.

We know far more about hydropower than we knew 40 years ago, including whether it's really zero-emission (it's not), how to make it as close to zero-emission as possible, and how to account for it as precisely as new clean energies like solar and wind.
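The attribute-certificate accounting mentioned here can be pictured as a small record type: each block of energy is tagged with its source facility, generation type, and claimed carbon intensity, so a buyer can audit what it bought. The field names and figures below are hypothetical, invented purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical sketch of a REC-style energy attribute certificate: each MWh
# carries its facility, generation type, and lifecycle carbon intensity.
# Facility names and carbon-intensity numbers are invented for illustration.

@dataclass(frozen=True)
class AttributeCertificate:
    facility: str          # generating facility the energy came from
    generation_type: str   # e.g. "hydro", "wind", "solar"
    mwh: float             # energy covered by this certificate
    g_co2_per_kwh: float   # claimed lifecycle carbon intensity

def total_emissions_tonnes(certs):
    """Total lifecycle emissions (tonnes CO2) across a batch of certificates."""
    # 1 MWh = 1,000 kWh; 1 tonne = 1,000,000 g
    return sum(c.mwh * 1000 * c.g_co2_per_kwh for c in certs) / 1_000_000

purchase = [
    AttributeCertificate("Example Hydro Station", "hydro", 500.0, 17.0),
    AttributeCertificate("Example Wind Farm", "wind", 100.0, 12.0),
]
print(f"{total_emissions_tonnes(purchase):.2f} t CO2")
```

With facility-level tagging like this, a contract can require that every delivered MWh trace back to a named source with a verifiable carbon figure, rather than a grid-wide average.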

The export deals haven’t gone through yet, but they’ve already helped drag Hydro-Québec—roughly the fourth-biggest hydropower system on the planet—into the climate era.

Fighting to export
One of the first signs of trouble for Quebec hydro was in New Hampshire, almost 10 years ago. People there began pasting protest signs on their barns and buildings. One citizens’ group accused Hydro of planning a “monstrous extension cord” across the state.

Similar accusations have since come from Maine, Massachusetts and New York.

The criticism isn’t coming from state governments, which mostly want a more permanent relationship with Hydro-Québec. They already rely on Quebec power, but in a piecemeal way, topping up their own power grid when needed (with the exception of Vermont, which has a small permanent contract for Quebec hydropower).

Last year, Quebec provided about 15 percent of New England’s total power, plus another substantial amount to New York, which is officially not considered to be part of New England, and has its own energy market separate from the New England grid.

Now, northeastern states need an energy linchpin, rather than a top-up, with existing power plants nearing the end of their lifespans. In Massachusetts, for example, one major nuclear plant shut down this year and another will be retired in 2021. State authorities want a hydro-based energy plan that would send $10 billion to Hydro-Québec over 20 years.

New England has some of North America’s most ambitious climate goals, with every state in the region pledging to cut emissions by at least 80 percent over the next 30 years.

What’s the downside? Ask the citizens’ groups and nonprofits that have written countless op-eds, organized petitions and staged protests. They argue that hydropower isn’t as clean as cutting-edge clean energy such as solar and wind power, and that Hydro-Québec isn’t trying hard enough to integrate itself into the most innovative carbon-counting energy system. Right as these other energy sources finally become viable, they say, it’s a step backwards to commit to hydro.

As Hydro-Québec will point out, many of these critics are legitimate nonprofits, but others may have questionable connections. The Portland Press Herald in Maine reported in September 2018 that a supposedly grassroots citizens' group called "Stand Up For Maine" was actually funded by the New England Power Generators Association, which is based in Boston and represents such power plant owners as Calpine Corp., Vistra Energy and NextEra Energy.

But in the end, that may not matter. Arguably the biggest motivator to strike these deals comes not from New England’s needs, but from within Quebec. The province has spent more than $10 billion in the last 15 years to expand its dam and reservoir system, and in order to stay financially healthy, it needs to double its revenue in the next 10 years—a plan that relies largely on exports.

With so much at stake, Hydro-Québec has spent the last decade trying to prove it can be an energy company of the future.

“Learning as you go”
American critics, justified or not, have been forcing advances at Hydro for a long time.

When the famously huge northern Quebec hydro dams were built at James Bay—construction began in the early 1970s—the logic was purely economic. The term “climate change” didn’t exist. The province didn’t even have an environment department.

The only reason Quebec scientists started trying to measure carbon emissions from hydro reservoirs was “basically because of the U.S.,” said Alain Tremblay, a senior environmental advisor at Hydro-Québec.


Alain Tremblay, senior environmental advisor at Hydro-Québec. Photograph courtesy of Hydro-Québec
In the early 1990s, Hydro began to export power to the U.S., and “because we were a good company in terms of cost and efficiency, some Americans didn't like that,” he said—mainly competitors, though he couldn’t say specifically who. “They said our reservoirs were emitting a lot of greenhouse gases.”

The detractors had no research to back up that claim, but Hydro-Québec had none to refute it, either, said Tremblay. “At that time we didn’t have any information, but from back-of-the-envelope calculations, it was impossible to have the emissions the Americans were expecting we have.”

So research began, first to design methods to take the measurements, and then to carry them out. Hydro began a five-year project with a Quebec university.

It took about 10 years to develop a solid methodology, Tremblay said, with “a lot of error and learning-as-you-go.” There have been major strides since then.

“Twenty years ago we were taking a sample of water, bringing it back to the lab and analyzing that with what we call a gas chromatograph,” said Tremblay. “Now, we have an automated system that can measure directly in the water,” reading concentrations of CO2 and methane every three hours and sending its data to a processing centre.

The tools Hydro-Québec uses are built in California. Researchers around the world now follow the same standard methods.

At this point, it’s common knowledge that hydropower does emit greenhouse gases. The real question, experts say, is how much, and under what conditions.

Workers on the Eastmain-1 project environmental monitoring program. Photography courtesy of Alain Tremblay.
But Hydro-Québec now also has the evidence to rebut the original accusations from the early 1990s, and many similar ones today.

“All our research from Université Laval [found] that it’s about a thousand years before trees decompose in cold Canadian waters,” said Tremblay.

Hydro reservoirs emit greenhouse gases because vegetation and sometimes other biological materials, like soil runoff, decay under the surface.

But that decay depends partly on the warmth of the water. In tropical regions, including the southern U.S., hydro dams can have very high emissions. But in boreal zones like northern Quebec (or Manitoba, Labrador and most other Canadian locations with massive hydro dams), the cold, well-oxygenated water vastly slows the process.

Hydro emissions have “a huge range,” said Laura Scherer, an industrial ecology professor at Leiden University in the Netherlands who led a study of almost 1,500 hydro dams around the world.

“It can be as low as other renewable energy sources, but it can also be as high as fossil fuel energy,” in rare cases, she said.

While her study found that climate was important, the single biggest factor was “sizing and design” of each dam, and specifically its shape, she said. Ideally, hydro dams should be deep and narrow to minimize surface area, perhaps using a natural valley.
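Scherer’s point about sizing and design can be made concrete with a back-of-the-envelope calculation. The Python sketch below uses invented numbers (the storage volume and mean depths are illustrative assumptions, not figures from her study): for a fixed storage volume, the flooded area shrinks in proportion to mean depth, and reservoir emissions scale roughly with the area of land flooded.

```python
# Illustrative only: reservoir GHG emissions scale roughly with flooded
# surface area, so for a fixed storage volume a deeper reservoir floods
# less land. The volume and depths below are made-up assumptions.

def flooded_area_km2(volume_km3, mean_depth_m):
    """Surface area implied by a storage volume and a mean depth."""
    return volume_km3 / (mean_depth_m / 1000.0)  # convert depth to km

VOLUME_KM3 = 10.0  # same storage volume for both designs

shallow = flooded_area_km2(VOLUME_KM3, mean_depth_m=5)   # wide, shallow plain
deep = flooded_area_km2(VOLUME_KM3, mean_depth_m=50)     # narrow, deep valley

print(f"shallow design floods {shallow:.0f} km^2")   # 2000 km^2
print(f"deep design floods {deep:.0f} km^2")         # 200 km^2
print(f"flooded-area ratio: {shallow / deep:.0f}x")  # 10x
```

Under these assumptions, the valley design floods a tenth of the land for the same storage; actual emissions also depend on water temperature and how much vegetation is submerged, as the article notes.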

Hydro-Québec’s first generation of dams, the ones around James Bay, were built the opposite way—they’re wide and shallow, infamously flooding giant tracts of land.


Alain Tremblay, senior environmental advisor at Hydro-Québec testing emission levels. Photography courtesy of Alain Tremblay
Newly built dams take that information into account, said Tremblay. Hydro-Québec’s most recent project is the Romaine River complex, which will eventually include four reservoirs near Quebec’s northeastern border with Labrador. Construction began in 2016.

The site was picked partly for its topography, said Tremblay.

“It’s a valley-type reservoir, so large volume, small surface area, and because of that there’s a pretty limited amount of vegetation that’s going to be flooded,” he said.

There’s a dramatic emissions difference from the project built just before Romaine: Eastmain, commissioned in 2006 near James Bay.

“The preliminary results indicate with the same amount of energy generated [by Romaine] as with Eastmain, you’re going to have about 10 times less emissions,” said Tremblay.

Tracing energy to its source
These signs of progress likely won’t satisfy the critics, who have publicly argued back and forth with Hydro about exactly how emissions should be tallied up.

But Hydro-Québec also faces a different kind of growing gap when it comes to accounting publicly for its product. In the New England energy market, a sophisticated system “tags” all the energy in order to delineate exactly how much comes from which source—nuclear, wind, solar, and others—and allows buyers to single out clean power, or at least the bragging rights to say they bought only clean power.

Really, of course, it’s all the same mix of energy—you can’t pick what you consume. But creating certificates prevents energy producers from, in worst-case scenarios, being able to launder regular power through their clean-power facilities. Wind farms, for example, can’t oversell what their own turbines have produced.

What started out as a fraud prevention tool has “evolved to make it possible to also track carbon emissions,” said Deborah Donovan, Massachusetts director at the Acadia Center, a climate-focused nonprofit.

But Hydro-Québec isn’t doing enough to integrate itself into this system, she says.

It’s “the tool that all of our regulators in New England rely on when we are confirming to ourselves that we’ve met our clean energy and our carbon goals. And…New York has a tool just like that,” said Donovan. “There isn’t a tracking system in Canada that’s comparable.”

Hydro Quebec Chénier-Vignan transmission line crossing the Outaouais river. Photography courtesy of Hydro-Québec
Developing this system is more a question of Canadian climate policy than technology.

Energy companies have long had the same basic tracking device—a meter, said Tanya Bodell, a consultant and expert in New England’s energy market. But in New England, on top of measuring “every time there’s a physical flow of electricity” from a given source, said Bodell, a meter “generates an attribute or a GIS certificate,” which certifies exactly where it’s from. The certificate can show the owner, the location, type of power and its average emissions.
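As a rough illustration of the fraud-prevention idea described earlier, here is a hypothetical registry sketch: certificates can be minted only against metered generation, so a producer cannot certify (and sell) more clean power than its dams or turbines actually produced. The class names, fields, and numbers are all invented for illustration and do not reflect NEPOOL’s or any real registry’s schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of a generation-attribute registry. Certificates
# are issued only against metered output, so a producer cannot oversell
# clean power. All names and numbers are illustrative assumptions.

@dataclass(frozen=True)
class Certificate:
    owner: str
    location: str
    fuel_type: str        # e.g. "hydro", "wind"
    mwh: float
    g_co2_per_kwh: float  # average emissions attribute

class Registry:
    def __init__(self):
        self.metered = {}  # producer -> metered generation (MWh)
        self.issued = {}   # producer -> MWh already certified

    def record_meter(self, producer, mwh):
        self.metered[producer] = self.metered.get(producer, 0.0) + mwh

    def issue(self, producer, location, fuel_type, mwh, g_co2_per_kwh):
        available = self.metered.get(producer, 0.0) - self.issued.get(producer, 0.0)
        if mwh > available:
            raise ValueError("cannot certify more energy than was metered")
        self.issued[producer] = self.issued.get(producer, 0.0) + mwh
        return Certificate(producer, location, fuel_type, mwh, g_co2_per_kwh)

registry = Registry()
registry.record_meter("HQ", 100.0)  # meter reports 100 MWh generated
cert = registry.issue("HQ", "Romaine-2", "hydro", 80.0, 17.0)
try:
    # only 20 MWh remain uncertified, so this request is rejected
    registry.issue("HQ", "Romaine-2", "hydro", 30.0, 17.0)
except ValueError as exc:
    print(exc)  # cannot certify more energy than was metered
```

The key design point is that issuance is capped by the meter reading, which is what lets buyers trust that a certificate corresponds to real generation.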

Since 2006, Hydro-Québec has had the ability to attach the same certificates to its exports, and it sometimes does.

“It could be wind farm generation, even large hydro these days—we can do it,” said Louis Guilbault, who works in regulatory affairs at Hydro-Québec. For Quebec-produced wind energy, for example, “I can trade those to whoever’s willing to buy it,” he said.

But, despite having the ability, Hydro can also choose not to attach a detailed code—which it doesn’t for most of its hydropower—and have the power counted instead under the generic label of “system mix.”

Once that hydropower hits the New England market, the administrators there have their own way of packaging it. The market “tries to determine emissions, GHG content,” Guilbault said. “They have their own rules; they do their own calculations.”

This is the crux of what bothers people like Donovan and Bodell. Hydro-Québec is fully meeting its contractual obligations, since it’s not required to attach a code to every export. But the critics wish it would, whether by future obligation or of its own volition.

Quebec wants it both ways, Donovan argued; it wants the benefits of selling low-emission energy without joining the New England system of checks and balances.

“We could just buy undifferentiated power and be done with it, but we want carbon-free power,” Donovan said. “We’re buying it because of its carbon content—that’s the reason.”

Still, the requirements are slowly increasing. Under Hydro-Québec’s future contract with Massachusetts (which still has several regulatory steps to go through before it’s approved) it’s asked to sell the power’s attributes, not just the power itself. That means that, at least on paper, Massachusetts wants to be able to trace the energy back to a single location in Quebec.

“It’s part of the contract we just signed with them,” said Guilbault. “We’re going to deliver those attributes. I’m going to select a specific hydro facility, put the number in...and transfer that to the buyers.”

Hydro-Québec says it’s voluntarily increasing its accounting in other ways. “Even though this is not strictly required,” said spokeswoman Lynn St. Laurent, Hydro is tracking its entire output with a continent-wide registry, the North American Renewables Registry.

That registry is separate from New England’s, so as far as Bodell is concerned, the measure doesn’t really help. But she and others also expect the entire tracking system to grow and mature, perhaps integrating into one. If it had been created today, in fact, rather than in the 1990s, maybe it would use blockchain technology rather than a varied set of administrators, she said.

Counting emissions through tracking still has a long way to go as well, said Donovan. For example, natural gas is assigned an emissions number that’s meant to reflect the emissions when it’s consumed. But “we do not take into account what the upstream carbon emissions are through the pipeline leakage, methane releases during fracking, any of that,” she said.

Now that the search for exactitude has begun, Hydro-Québec won’t be exempt, whether or not Quebeckers share that curiosity. “We don’t know what Hydro-Québec is doing on the other side of the border,” said Donovan.

 

Related News


Energy crisis: EU outlines possible gas price cap strategies

EU Gas Price Cap Strategies aim to curb inflation during an energy crisis by capping wholesale gas and electricity generation costs, balancing supply and demand, mitigating subsidies, and safeguarding supply security amid Russia-Ukraine shocks.

 

Key Points

Temporary EU measures to cap gas and power prices, curb inflation, manage demand, and protect supply security.

✅ Flexible temporary price limits to secure gas supplies

✅ Framework cap on gas for electricity generation with demand checks

✅ Risk: subsidies, higher demand, and market distortions

 

The European Commission has outlined possible strategies to cap gas prices as the bloc faces a looming energy crisis this winter. 

Member states are divided over the emergency measures designed to pull down soaring inflation amid Russia's war in Ukraine. 

One proposal is a temporary "flexible" limit on gas prices to ensure that Europe can continue to secure enough gas, EU energy commissioner Kadri Simson said on Tuesday. 

Another option could be an EU-wide "framework" for a price cap on gas used to generate electricity, which would be combined with measures to ensure gas demand does not rise as a result, she said.

EU leaders are meeting on Friday to debate gas price cap strategies amid warnings that Europe's energy nightmare could worsen this winter.

Last week, France, Italy, Poland and 12 other EU countries urged the Commission to propose a broader price cap targeting all wholesale gas trade. 

But Germany -- Europe's biggest gas buyer -- and the Netherlands are among those opposing electricity market reforms within the bloc.

Russia has slashed gas deliveries to Europe since its February invasion of Ukraine, with Moscow blaming the cuts on Western sanctions imposed in response to the invasion.

Since then, the EU has agreed on emergency laws to fill gas storage and windfall profit levies to raise money to help consumers with bills. 

Price cap critics
One energy analyst told Euronews that an energy price cap was “uncharted territory” for the European Union. 

The EU's energy sector is largely liberalised and operates under the fundamental rules of supply and demand, making rolling back electricity prices complex in practice.

"My impression is that member states are looking at prices and quantities in isolation and that's difficult because of economics," said Elisabetta Cornago, a senior energy researcher at the Centre for European Reform.

"It's hard to picture such a level of market intervention. This is uncharted territory."

The energy price cap would "quickly start costing billions" because it would force governments to continually subsidise the difference between the real market price and the artificially capped price, another expert said. 

"If you are successful and prices are low and you still get gas, consumers will increase their demand: low price means high demand. Especially now that winter is coming," said Bram Claeys, a senior advisor at the Regulatory Assistance Project. 
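Claeys’s warning can be put into rough numbers. The sketch below is purely illustrative (the prices, baseline demand, and demand response are assumptions, not EU figures): governments pay the gap between the market price and the cap, and a lower consumer price tends to raise the quantity demanded, compounding the cost.

```python
# Illustrative subsidy arithmetic for a gas price cap. All figures are
# made-up assumptions, not actual EU market data.

market_price = 200.0      # EUR per MWh, uncapped wholesale price
capped_price = 120.0      # EUR per MWh, price consumers see
baseline_demand_twh = 100.0
demand_response = 0.10    # assumed 10% extra demand at the lower price

subsidized_twh = baseline_demand_twh * (1 + demand_response)
subsidy_per_mwh = market_price - capped_price
cost_eur = subsidy_per_mwh * subsidized_twh * 1_000_000  # TWh -> MWh

print(f"annual subsidy: {cost_eur / 1e9:.1f} billion EUR")  # 8.8 billion EUR
```

Even with these modest assumptions, the bill runs to billions per year, and it grows with every increment of demand the lower price induces.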

 

Related News


Energy Department Announces 20 New Competitors for the American-Made Solar Prize

American-Made Solar Prize Round 3 accelerates DOE-backed solar innovation, empowering entrepreneurs and domestic manufacturing with photovoltaics and grid integration support via National Laboratories, incubators, and investors to validate products, secure funding, and deploy backup power.

 

Key Points

A DOE challenge fast-tracking solar innovation to market readiness, boosting US manufacturing and grid integration.

✅ $50,000 awards to 20 teams for prototype validation

✅ Access to National Labs, incubators, investors, and mentors

✅ Focus on PV advances and grid integration solutions

 

The U.S. Department of Energy (DOE) announced the 20 competitors who have been invited to advance to the next phase of the American-Made Solar Prize Round 3, a competition designed to incentivize the nation’s entrepreneurs to strengthen American leadership in solar energy innovation and domestic manufacturing.

The American-Made Solar Prize is designed to help more American entrepreneurs thrive in the competitive global energy market. Each round of the prize brings new technologies to pre-commercial readiness in less than a year, ensuring new ideas enter the marketplace. As part of the competition, teams will have access to a network of DOE National Laboratories, technology incubators and accelerators, venture capital firms, angel investors, and industry. This American-Made Network will help these competitors raise private funding, validate early-stage products, or test technologies in the field.

Each team will receive a $50,000 cash prize and become eligible to compete in the next phase of the competition. Through a rigorous evaluation process, teams were chosen based on the novelty of their ideas and how their solutions address a critical need of the solar industry. The teams were selected from 120 submissions and represent 11 states. These projects will tackle challenges related to new solar applications, like farming, as well as show how solar can be used to provide backup power when the grid goes down. Nine teams will advance solar photovoltaic technologies, and 11 will address challenges related to how solar integrates with the grid. The projects are as follows:

Photovoltaics:

  • Durable Antireflective and Self-Cleaning Glass (Pittsburgh, PA)
  • Pursuit Solar - More Power, Less Hassle (Denver, NC)
  • PV WaRD (San Diego, CA)
  • Remotely Deployed Solar Arrays (Charlottesville, VA)
  • Robotics Changing the Landscape for Solar Farms (San Antonio, TX)
  • TrackerSled (Chicago, IL)
  • Transparent Polymer Barrier Films for PV (Bristol, PA)
  • Solar for Snow (Duluth, MN)
  • SolarWall Power Tower (Buffalo, NY)


Systems Integration:

  • Affordable Local Solar Storage via Utility Virtual Power Plants (Parker, TX)
  • Allbrand Solar Monitor (Detroit, MI)
  • Beyond Monitoring – Next Gen Software and Hardware (Atlanta, GA)
  • Democratizing Solar with Artificial Intelligence Energy Management (Houston, TX)
  • Embedded, Multi-Function Maximum Power Point Tracker for Smart Modules (Las Vegas, NV)
  • Evergrid: Keep Solar Flowing When the Grid Is Down (Livermore, CA)
  • Inverter Health Scan (San Jose, CA)
  • JuiceBox: Integrated Solar Electricity for Americans Transitioning out of Homelessness and Recovering from Natural Disasters (Claremont, CA)
  • Low-Cost Parallel-Connected DC Power Optimizer (Blacksburg, VA)
  • Powerfly: A Plug-and-Play Solar Monitoring Device (Berkeley, CA)
  • Simple-Assembly Storage Kit (San Antonio, TX)

Read the descriptions of the projects to see how they address these challenges.

Over the next six months, these teams will fast-track their efforts to identify, develop, and test disruptive solutions. During a national demonstration day at Solar Power International in September 2020, a panel of judges will select two final winners who will receive a $500,000 prize. Learn more at the American-Made Solar Prize webpage.

The American-Made Challenges incentivize the nation's entrepreneurs to strengthen American leadership in energy innovation and domestic manufacturing. These new challenges seek to lower the barriers U.S.-based innovators face in reaching manufacturing scale by accelerating the cycles of learning from years to weeks while helping to create partnerships that connect entrepreneurs to the private sector and the network of DOE’s National Laboratories across the nation.


https://www.energy.gov/eere/solar/solar-energy-technologies-office

 

Related News


California lawmakers plan to overturn income-based utility charges

California income-based utility charges face bipartisan pushback as the PUC weighs fixed fees for PG&E, SDG&E, and Southern California Edison, reshaping rate design, electricity affordability, energy equity, and privacy amid proposed per-kWh reductions.

 

Key Points

PUC-approved fixed fees tied to household income for PG&E, SDG&E, and SCE, offset by lower per-kWh rates.

✅ Proposed fixed fees: $51 SCE, $73.31 SDG&E, $50.92 PG&E

✅ Critics warn admin, privacy, legal risks and higher bills for savers

✅ Backers say lower-income pay less; kWh rates cut ~33% in PG&E area

 

Efforts are being made across California's political landscape to derail a legislative initiative that introduced income-based utility charges for customers of Southern California Edison and other major utilities.

Legislators from both the Democratic and Republican parties have proposed bills aimed at nullifying the 2022 legislation that established a sliding scale for utility charges based on customer income, a decision made in a late-hour session and subsequently endorsed by Governor Gavin Newsom.

The plan, pending final approval from the state Public Utilities Commission (PUC) — all of whose current members were appointed by Governor Newsom — would enable utilities like Southern California Edison, San Diego Gas & Electric, and PG&E to apply new income-based charges as early as this July.

Among the state legislators pushing back against the income-based charge scheme are Democrats Jacqui Irwin and Marc Berman, along with Republicans Janet Nguyen, Kelly Seyarto, Rosilicie Ochoa Bogh, Scott Wilk, Brian Dahle, Shannon Grove, and Roger Niello.

A cadre of specialists, including economist Ahmad Faruqui, who has advised all three utilities implicated in the fee proposal, has outlined several concerns regarding the PUC's pending decision.

Faruqui and his colleagues argue that the proposed charges are excessively high in comparison to national standards, potentially leading to administrative challenges, legal disputes, and negative unintended outcomes, such as penalizing energy-conservative consumers.

Advocates for the income-based fee model, including The Utility Reform Network (TURN) and the Natural Resources Defense Council, argue it would result in higher charges for wealthier consumers and reduced fees for those with lower incomes. They also believe that the utilities plan to decrease per kilowatt-hour rates to balance out the new fees.

However, even supporters like TURN and the Natural Resources Defense Council acknowledge that the income-based fee model is not a comprehensive solution to making soaring electricity bills more affordable.

If implemented, California would have the highest income-based utility fees in the country, with averages far surpassing the national average of $11.15, as reported by EQ Research:

  • Southern California Edison would charge $51.
  • San Diego Gas & Electric would levy $73.31.
  • PG&E would set fees at $50.92.

The proposal has raised concerns among state legislators about the additional financial burden on Californians already struggling with high electricity costs.

Critics highlight several practical challenges, including the PUC's task of assessing customers' income levels, a process fraught with privacy concerns, potential errors, and constitutional questions regarding access to tax information.

Economists have pointed out further complications, such as the difficulty in accurately assessing incomes for out-of-state property owners and the variability of customers' incomes over time.

The proposed income-based charges would differ by income bracket within the PG&E service area, for example, with lower-income households facing lower fixed charges and higher-income households facing higher charges, alongside a proposed 33% reduction in electricity rates to help mitigate the fixed charge impact.

Yet, the economists warn that most customers, particularly low-usage customers, could end up paying more, essentially rewarding higher consumption and penalizing efficiency.
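The economists’ warning about low-usage customers follows from simple arithmetic. In the sketch below, the $50.92 fixed fee is the proposed PG&E figure cited in this article, but the per-kWh rates are assumptions chosen only to show the mechanics of the trade-off; real tariffs vary by tier and time of use.

```python
# Illustrative bill comparison under a fixed monthly charge plus a ~33%
# lower per-kWh rate. OLD_RATE is an assumed flat rate, not a real tariff.

OLD_RATE = 0.45                   # USD per kWh, assumed current flat rate
NEW_RATE = OLD_RATE * (1 - 0.33)  # proposed ~33% reduction
FIXED_CHARGE = 50.92              # proposed PG&E monthly fixed fee

def old_bill(kwh):
    return OLD_RATE * kwh

def new_bill(kwh):
    return FIXED_CHARGE + NEW_RATE * kwh

for kwh in (200, 400, 600):
    print(f"{kwh} kWh: old ${old_bill(kwh):.2f} vs new ${new_bill(kwh):.2f}")

# below this usage, the fixed charge outweighs the rate cut
breakeven_kwh = FIXED_CHARGE / (OLD_RATE * 0.33)
print(f"bills rise for households using under ~{breakeven_kwh:.0f} kWh/month")
```

Under these assumed rates, a 200 kWh household pays about $111 instead of $90, while a 600 kWh household pays about $232 instead of $270, which is the “rewarding higher consumption” effect the critics describe.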

This legislative approach, they caution, could inadvertently increase costs for moderate users across all income brackets and promote energy inefficiency, undermining the very goals it aims to achieve.

 

Related News


Competition in Electricity Has Been Good for Consumers and Good for the Environment

Electricity Market Competition drives lower wholesale prices, stable retail rates, better grid reliability, and faster emissions cuts as deregulation and renewables adoption pressure utilities, improve efficiency, and enhance consumer choice in power markets.

 

Key Points

Electricity market competition opens supply to rivals, lowering prices, improving reliability, and reducing emissions.

✅ Wholesale prices fell faster in competitive markets

✅ Retail rates rose less than in monopoly states

✅ Fewer outages, shorter durations, improved reliability

 

By Bernard L. Weinstein

Electricity used to be boring.  Public utilities that provided power to homes and businesses were regulated monopolies and, by law, guaranteed a fixed rate-of-return on their generation, transmission, and distribution assets. Prices per kilowatt-hour were set by utility commissions after lengthy testimony from power companies, wanting higher rates, and consumer groups, wanting lower rates.

About 25 years ago, the electricity landscape started to change as economists and others argued that competition could lead to lower prices and stronger grid reliability. Opponents of competition argued that consumers weren’t knowledgeable enough about power markets to make intelligent choices in a competitive pricing environment. Nonetheless, today 20 states have total or partial competition for electricity, allowing independent power generators to compete in wholesale markets and retail electric providers (REPs) to compete for end-use customers. (Transmission, in all states, remains a regulated natural monopoly.)

A recent study by the non-partisan Pacific Research Institute (PRI) provides compelling evidence that competition in power markets has been a boon for consumers. Using data from the U.S. Energy Information Administration (EIA), PRI’s researchers found that wholesale electricity prices in competitive markets have been generally declining or flat over the last five years. For example, compared to 2015, wholesale power prices in New England have dropped more than 44 percent, those in most Mid-Atlantic States have fallen nearly 42 percent, and in New York City they’ve declined by nearly 45 percent. Wholesale power costs have also declined in monopoly states, but at a considerably slower rate.

As for end-users, states with competitive retail electricity markets have seen smaller price increases than monopoly states. Again, using EIA data, PRI found that in 14 competitive jurisdictions, retail prices essentially remained flat between 2008 and 2020. By contrast, retail prices jumped an average of 21 percent in monopoly states. The ten states with the largest retail price increases all had monopoly frameworks. A 2017 report from the Retail Energy Supply Association found customers in states that still have monopoly utilities saw their average energy prices increase nearly 19 percent from 2008 to 2017, while prices fell 7 percent in competitive markets over the same period.

The PRI study also observed that competition has improved grid reliability, the recent power disruptions in California and Texas notwithstanding. Looking at two common measures of grid resiliency, PRI’s analysis found that power interruptions were 10.4 percent lower in competitive states while the duration of outages was 6.5 percent lower.

Citing data from the EIA between 2008 and 2018, PRI reports that greenhouse gas emissions in competitive states declined on average 12.1 percent compared to 7.3 percent in monopoly states. This result is not surprising. In a competitive wholesale market, independent power producers have an incentive to seek out lower-cost options, including subsidized renewables like wind and solar. By contrast, generators in monopoly markets have no such incentive as they can pass on higher costs to end-users. Perhaps the most telling case is in the monopoly state of Georgia, where the cost to build nuclear Plant Vogtle has doubled from its original estimate of $14 billion 12 years ago. Overruns are estimated to cost Georgia ratepayers an average of $854, and there is no definite date for this facility to come online. This type of mismanagement doesn’t occur in competitive markets.

Unfortunately, some critics are attempting to halt the momentum for electricity competition and have pointed to last winter’s “deep freeze” in Texas that left several million customers without power for up to a week. But this example is misplaced. Power outages in February were the result of unprecedented and severe weather conditions affecting electricity generation and fuel supply; the state simply did not have enough access to natural gas and wind generation to meet demand. Competitive power markets were not a factor.

The benefits of wholesale and retail competition in power markets are incontrovertible. Evidence shows that households and businesses in competitive states are paying less for electricity while grid reliability has improved. The facts also suggest that wholesale and retail competition can lead to faster reductions in greenhouse gas emissions. In short, competition in power markets is good for consumers and good for the environment.

Bernard L. Weinstein is emeritus professor of applied economics at the University of North Texas, former associate director of the Maguire Energy Institute at Southern Methodist University, and a fellow of Goodenough College, London. He wrote this for InsideSources.com.

 
