27,000 solar panels yield enough energy for 500 homes

By Knight Ridder Tribune


The Army recruited the sun for active duty.

Basic training is on a former landfill site at Fort Carson. There, on a 12-acre site, are 27,000 solar panels doing more than just soaking up rays. The solar array can generate 3,200 megawatt-hours annually, enough to power about 500 homes.
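The reported figures imply a plausible per-household consumption, which can be sanity-checked with simple arithmetic. A rough back-of-the-envelope sketch; only the 3,200 MWh and 500-home figures come from the article, the per-home breakdown is derived:

```python
# Back-of-the-envelope check of the article's figures. The annual output
# and homes-powered numbers are from the article; the per-home breakdown
# is derived arithmetic, not official Fort Carson data.
ANNUAL_OUTPUT_MWH = 3_200   # reported annual generation of the array
HOMES_POWERED = 500         # reported equivalent number of homes

mwh_per_home = ANNUAL_OUTPUT_MWH / HOMES_POWERED
kwh_per_month = mwh_per_home * 1_000 / 12

print(f"{mwh_per_home:.1f} MWh per home per year")     # 6.4 MWh
print(f"~{kwh_per_month:.0f} kWh per home per month")  # ~533 kWh
```

Roughly 533 kWh a month per home is in line with typical U.S. household consumption, so the 500-home claim checks out.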

A red ribbon was cut to celebrate the Army going green, with the solar panels providing the juice for the acoustics.

"This project is the largest solar project on an Army base," said Erik Rothenberg, spokesman for 3 Phases Energy Services, part of the private/public collaboration. "It is the sixth-largest solar project in the United States, the 70th-largest in the world."

Fort Carson did not foot the project's $13 million bill.

The post leased out the land and locked in a flat utility rate to buy the power. Major players in the project include the Army, investment bankers, solar energy developers and power companies, such as Colorado Springs Utilities. The government kicked in tax credits.

"It's like a Rubik's Cube puzzle of projects," Rothenberg said. Gov. Bill Ritter gave the project a state renewable energy award.

"This is about a partnership," Ritter told the gathering of supporters who outnumbered the folding chairs. "That's the way we have to think about how we move the agenda forward as it relates to renewable energy." It's about security, he said -- economic, environmental and energy security.

The looming aboveground grid of black panels looks stunningly simple. The solar array is row after row of dark shiny panels, tilted like bleachers, on a dirt field where decades of construction debris is buried.

"The sun shines, then it produces electricity," Fort Carson utility programs manager Vince Guthrie said.

And when it's cloudy? "The power output is less." In terms of power on post, "It is approximately 2.3 percent of Fort Carson's total load," Guthrie said. It is expected to save a half-million dollars over the next 20 years, he said, and will reduce greenhouse gases and reliance on other resources.
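Guthrie's figures also let us estimate the post's total consumption and the average annual savings. The derived totals below are our own arithmetic from the article's numbers, not official Fort Carson data:

```python
# Derived estimates from the figures quoted in the article; the implied
# totals are our own arithmetic, not official Fort Carson data.
array_output_mwh = 3_200   # annual output of the solar array
share_of_load = 0.023      # "approximately 2.3 percent" of total load
savings_dollars = 500_000  # expected savings over 20 years
savings_years = 20

implied_load = array_output_mwh / share_of_load
avg_savings = savings_dollars / savings_years

print(f"Implied total post load: ~{implied_load:,.0f} MWh/yr")  # ~139,130
print(f"Average savings: ${avg_savings:,.0f}/yr")               # $25,000
```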

"It's not just about climate change, it's about culture change," Guthrie said. "It starts with a project like this."

Related News

Ontario Energy minister downplays dispute between auditor, electricity regulator


The bad blood between the Ontario government and auditor general bubbled to the surface once again Monday, with the Liberal energy minister downplaying a dispute between the auditor and the Crown corporation that manages the province's electricity market, even as the government pursued legislation to lower electricity rates in the province.

Glenn Thibeault said concerns raised by auditor general Bonnie Lysyk during testimony before a legislative committee last week aren't new and the practices being used by the Independent Electricity System Operator are commonly endorsed by major auditing firms.

"(Lysyk) doesn't like the rate-regulated accounting. We've always said we've relied on the other experts within the field as well, plus the provincial controller," Thibeault said.


"We believe that we are following public sector accounting standards."

Thibeault said that Ontario Power Generation, Hydro One and many other provinces and U.S. states use the same accounting practices.

"We go with what we're being told by those who are in the field, like KPMG, like E&Y," he said.

But a statement from Lysyk's office Monday disputed Thibeault's assessment.

"The minister said the practices being used by the IESO are common in other jurisdictions," the statement said.

"In fact, the situation with the IESO is different because none of the six other jurisdictions with entities similar to the IESO use Canadian Public Sector Accounting Standards. Five of them are in the United States and use U.S. accounting standards."

Lysyk said last week that the IESO is using "bogus" accounting practices. Her office launched a special audit of the agency late last year after it changed its accounting to be more in line with U.S. standards, following reports of a phantom demand problem that cost customers millions.

Lysyk said the accounting changes made by the IESO impact the province's deficit, understating it by $1.3 billion as of the end of 2017, adding that IESO "stalled" her office when it asked for information and was not co-operative during the audit.

Lysyk's full audit of the IESO is expected to be released in the coming weeks and is among several accounting disputes her office has been engaged in with the Liberal government over the past few years.

Last fall, she accused the government of purposely obscuring the true financial impact of its 25% hydro rate cut by keeping billions in debt used to finance that plan off the province's books. Lysyk had said she would audit the IESO because of its role in the hydro plan's complex accounting scheme.

"Management of the IESO and the board would not co-operate with us, in the sense that they continually say they're co-operating, but they stalled on giving us information," she said last week.

Terry Young, a vice-president with the IESO, said the agency has fully co-operated with the auditor general. The IESO opened up its office to seven staff members from the auditor's office while they did their work.

"We recognize the work that she's doing and to that end we've tried to fully co-operate," he said. "We've given her all of the information that we can."

Young said the change in accounting standards is about ensuring greater transparency in transactions in the energy marketplace.

"It's consistent with what many other independent electricity system operators are doing," he said.

Lysyk also criticized IESO's accounting firm, KPMG, for agreeing with the IESO on the accounting standards. She was critical of the firm billing taxpayers nearly $600,000 for work with the IESO in 2017, compared with its normal yearly audit fee of $86,500.

KPMG spokeswoman Lisa Papas said the accounting issues that IESO addressed during 2017 were complex, contributing to the higher fees.

The accounting practices the auditor is questioning are a "difference of professional judgement," she said.

"The standards for public sector organizations such as IESO are principles-based standards and, accordingly, require the exercise of considerable professional judgement," she said in a statement.

"In many cases, there is more than one acceptable approach that is compliant with the applicable standards."

Progressive Conservative energy critic Todd Smith said the government isn't being transparent with the auditor general or taxpayers.

"Obviously, they have some kind of dispute but the auditor's office is saying that the numbers that the government is putting out there are bogus. Those are her words," he said. "We've always said that we believe the auditor general's are the true numbers for the province of Ontario."

NDP energy critic Peter Tabuns said the Liberal government has decided to "play with accounting rules" to make its books look better ahead of the spring election, despite warnings that electricity prices could soar if costs are pushed into the future.

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy


In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain's circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health's massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy. Even therapies that have been approved for use in patients with, for example, Parkinson's disease or epilepsy help only a minority of patients, and expectations can outpace evidence. "If it was the Bible, it would be the first chapter of Genesis," said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person's brain data, and how to best involve patients in the study of the human brain's far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.


Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Related: New study revives a Mozart sonata as a potential epilepsy therapy
Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as "almost a renaissance period" for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the "torpedo fish," were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those "lesions" worked, somehow, but nobody could explain why they alleviated some patients' symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

Related: A new index measures the extent and depth of addiction stigma
More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
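The open-loop versus closed-loop distinction can be sketched in a few lines: instead of delivering a constant current, a closed-loop device stimulates only when a monitored biomarker crosses a trigger threshold. The signal values and threshold below are invented for illustration, not taken from any real device:

```python
# Minimal closed-loop sketch: decide, sample by sample, whether to deliver
# a stimulation pulse based on a biomarker threshold. An open-loop device
# would instead stimulate continuously regardless of the signal.
def closed_loop_decisions(biomarker_samples, threshold):
    """Return True (stimulate) for each sample that crosses the threshold."""
    return [sample > threshold for sample in biomarker_samples]

signal = [0.1, 0.2, 0.9, 1.4, 0.3, 1.1]  # simulated biomarker trace
decisions = closed_loop_decisions(signal, threshold=0.8)
print(decisions)  # [False, False, True, True, False, True]
```

Real systems go further, adapting thresholds and pulse parameters per patient, which is where the data-science tooling the researchers describe comes in.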

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH's Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. "A critical mass" of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients' symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed "there's definitely a biology to this problem," Williams said — a biology that in some cases may be more amenable to brain stimulation.

Related: Largest psilocybin trial finds the psychedelic is effective in treating serious depression
The exact mechanics of what happens between cells when brain circuits … well, short-circuit, is unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.

 

Related News


Ottawa won't oppose halt to Site C work pending treaty rights challenge


The federal government is not going to argue against halting construction of the controversial Site C hydroelectric dam in British Columbia while a B.C. court decides if the project violates constitutionally protected treaty rights.

 


However, a spokeswoman for Environment Minister Catherine McKenna said Monday the government will continue to defend the federal approval given for the project in December 2014, even though that approval was given using an environmental review process McKenna herself has said is fundamentally flawed.

The Site C project is an 1,100-megawatt dam and generating station on the Peace River in northern B.C. that will flood parts of the traditional territory of the West Moberly and Prophet River First Nations.


In January, they filed a civil court case against the provincial government, B.C. Hydro and the federal government, asking a judge to decide if their rights were being violated by the dam. A few weeks later, West Moberly asked the court for an injunction to halt construction pending the outcome of the rights case.

On May 11, lawyers for Attorney General Jody Wilson-Raybould filed a notice that Canada would remain neutral on the question of the injunction, meaning Canada won't argue against the idea of postponing construction for months, if not years, while the rights case winds through the court.

Wilson-Raybould has been silent on Site C since being named Canada's minister of justice in 2015, but in 2012, when she was the B.C. regional chief for the Assembly of First Nations, she said the project was "running roughshod" over treaty rights. The Justice Department on Monday directed questions to Environment and Climate Change Canada.

 

Defence of environmental assessment

McKenna's spokeswoman, Caroline Theriault, said the injunction request is a procedural step regarding construction, a matter that falls under B.C. jurisdiction rather than federal jurisdiction.

However, she said Canada will defend the environmental assessment and Crown consultation processes and the federally issued permits required for construction.

 


McKenna has legislation before the House of Commons to overhaul the process for environmental assessment of major projects like hydro dams and pipelines, arguing the former government's procedures had skewed too far towards proponents. The overhaul includes a requirement that traditional Indigenous knowledge be taken into account.

However, Theriault said the commitment to overhaul the process also included a promise not to revisit projects that had already been approved, such as Site C.

"The federal environmental assessment process for the Site C project has already been upheld in other court actions," said Theriault.

 

'It feels kind of odd'

West Moberly Chief Roland Wilson said he was both excited and concerned by Canada's decision last week not to oppose the injunction.

"It feels kind of odd and makes me wonder what they're up to," Wilson said.

However, he said all he has ever wanted was for the project to be stopped until the question of rights can be answered. Wilson said two previous dams on the Peace River already flooded 80 per cent of the functional land within West Moberly's territory, and that Site C will flood half of what's left. That land is used for fishing and hunting, and there is also concern the dam will allow mercury to leach into Moberly Lake, he said.

 


Construction began in 2015 and more than $2.4 billion has already been spent on a project that will not be completed until 2024 at the earliest and will cost an estimated $10 billion in total.

The province continues to argue against the injunction and will also fight the rights case. Premier John Horgan campaigned on a promise to review the Site C approval, and a B.C. Utilities Commission report in November found there are alternatives to building it and that it will go over budget. Nevertheless, Horgan said in December he had to let construction continue because cancelling the project would be too costly for both the province and its electricity consumers.

 

Related News


As New Zealand gets serious about climate change, can electricity replace fossil fuels in time?

New Zealand Energy Transition will electrify transport and industry with renewables, grid-scale solar, wind farms, geothermal, batteries, demand response, pumped hydro, and transmission upgrades to manage dry-year risk and winter peak loads.

 

Key Points

A shift to renewables and smart demand to decarbonise transport and industry while ensuring reliable, affordable power.

✅ Electrifies transport and industrial heat with renewables

✅ Uses demand response, batteries, and pumped hydro for resilience

✅ Targets 99%+ renewable supply, managing dry-year and peak loads

 

As fossil fuels are phased out over the coming decades, the Climate Change Commission (CCC) suggests electricity will take up much of the slack, powering our vehicle fleet and replacing coal and gas in industrial processes.

But can the electricity system really provide for this increased load where and when it is needed? The answer is “yes”, with some caveats.

Our research examines climate change impacts on the New Zealand energy system. It shows we’ll need to pay close attention to demand as well as supply. And we’ll have to factor in the impacts of climate change when we plan for growth in the energy sector.

 

Demand for electricity to grow
While electricity use has not increased in NZ in the past decade, many agencies project steeply rising demand in coming years. This is partly due to increasing population and gross domestic product, but mostly due to the anticipated electrification of transport and industry, which could result in a doubling of demand by mid-century.

It’s hard to get a sense of the scale of the new generation required, but if wind was the sole technology employed to meet demand by 2050, between 10 and 60 new wind farms would be needed nationwide.

Of course, we won’t only build wind farms. Grid-scale solar, rooftop solar, new geothermal, some new small hydro plants and possibly tidal and wave power will all have a part to play.

 

Managing the demand
As well as providing more electricity supply, demand management and batteries will also be important. Our modelling shows peak demand (which usually occurs when everyone turns on their heaters and ovens at 6pm in winter) could be up to 40% higher by 2050 than it is now.

But meeting this daily period of high demand could see expensive plant sitting idle for much of the time (with the last 25% of generation capacity only used about 10% of the time).
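The idle-plant problem can be seen with a toy load-duration calculation. The numbers below are invented for illustration, not drawn from the authors' modelling:

```python
import numpy as np

# Toy hourly demand profile for one year (hypothetical numbers, not NZ data):
# a roughly flat base load plus one sharp evening peak hour each day.
rng = np.random.default_rng(1)
hours = 8760
demand = 5000 + rng.normal(0, 200, hours)   # MW of steady base demand
evening_peak = np.zeros(hours)
evening_peak[18::24] = 3000                 # one evening peak hour (6pm) per day
demand += evening_peak

# Fraction of hours in which the top 25% of capacity is actually needed --
# the "expensive plant sitting idle" problem described above.
capacity = demand.max()
threshold = 0.75 * capacity
share_of_hours = float((demand > threshold).mean())
print(f"top tranche used {share_of_hours:.0%} of hours")
```

With a profile like this, the top quarter of capacity is needed for only a few percent of the year's hours, which is why meeting the peak with supply alone is so expensive.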

This is particularly a problem in a renewable electricity system when the hydro lakes are dry, as hydro is one of the few renewable electricity sources that can be stored during the day (as water behind the dam) and used over the evening peak (by generating with that stored water).

Demand response will therefore be needed. For example, this might involve an industrial plant turning off when there is too much load on the electricity grid.

 

But by 2050, a significant number of households will also need smart appliances and meters that automatically use cheaper electricity at non-peak times. For example, washing machines and electric car chargers could run automatically at 2am, rather than 6pm when demand is high.
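The appliance-side logic can be as simple as picking the cheapest window in a day-ahead price forecast. A sketch of that idea, with made-up hourly prices (all numbers hypothetical):

```python
# Hypothetical hourly prices (cents/kWh) for one day, lowest overnight
# and highest around the 6pm peak.
prices = [10, 9, 8, 8, 9, 11, 18, 22, 20, 16, 14, 13,
          13, 14, 15, 17, 21, 25, 26, 24, 20, 16, 13, 11]

need = 3  # hours of charging required

# Slide a 3-hour window across the day and keep the cheapest start hour.
best_start = min(range(24 - need + 1),
                 key=lambda h: sum(prices[h:h + need]))
print(best_start)  # an overnight hour, not the evening peak
```

A smart charger or washing machine doing this automatically is what shifts the 6pm load to 2am without any effort from the household.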

Our modelling shows a well-designed demand response system could mitigate dry-year risk (when hydro lakes are low on water) in coming decades, a role currently often filled by gas and coal generation.

Instead of (or as well as) having demand response and battery systems to combat dry-year risk, a pumped storage system could be built. This is where water is pumped uphill when hydro lake inflows are plentiful, and used to generate electricity during dry periods.

The NZ Battery project is currently considering the potential for this in New Zealand.

 

Almost (but not quite) 100% renewable
Dry-year risk would be greatly reduced and there would be “greater greenhouse gas emissions savings” if the Interim Climate Change Committee’s (ICCC) 2019 recommendation to aim for 99% renewable electricity was adopted, rather than aiming for 100%.

A small amount of gas-peaking plant would therefore be retained. The ICCC said going from 99% to 100% renewable electricity by overbuilding would only avoid a very small amount of carbon emissions, at a very high cost.

Our modelling supports this view. The CCC’s draft advice on the issue also makes the point that, although 100% renewable electricity is the “desired end point”, timing is important to enable a smooth transition.

Despite these views, Energy Minister Megan Woods has said the government will be keeping the target of a 100% renewable electricity sector by 2030.

 

Impacts of climate change
In future, the electricity system will have to respond to changing climate patterns as well, becoming resilient to climate risks over time.

The National Institute of Water and Atmospheric Research predicts winds will increase in the South Island and decrease in the far north in coming decades.

Inflows to the biggest hydro lakes will get wetter (more rain in their headwaters), and their seasonality will change due to changes in the amount of snow in these catchments.

Our modelling shows the electricity system can adapt to those changing conditions. One good news story (unless you’re a skier) is that warmer temperatures will mean less snow storage at lower elevations, and therefore higher lake inflows in the big hydro catchments in winter, leading to a better match between times of high electricity demand and higher inflows.

 

The price is right
The modelling also shows the cost of generating electricity is not likely to increase, because the price of building new sources of renewable energy continues to fall globally.

Because the cost of building new renewables is now cheaper than non-renewables (such as coal-fired plants), investing in carbon-free electricity is increasingly compelling, and renewables are more likely to be built to meet new demand in the near term.

While New Zealand’s electricity system can enable the rapid decarbonisation of (at least) our transport and industrial heat sectors, the industry needs certainty so it can start building to meet that demand.

Bipartisan cooperation at government level will be important to encourage significant investment in generation and transmission projects with long lead times and life expectancies.

Infrastructure and markets are needed to support demand response uptake, as well as certainty around the Tiwai exit in 2024 and whether pumped storage is likely to be built.

Our electricity system can support the rapid decarbonisation needed if New Zealand is to do its fair share globally to tackle climate change.

But sound planning, firm decisions and a supportive and relatively stable regulatory framework are all required before shovels can hit the ground.

 

Related News


Pacific Northwest's Renewable Energy Goals Hindered

Pacific Northwest Transmission Bottleneck slows clean energy progress as BPA's aging grid constrains renewable interconnections, delaying wind, solar, and data center growth; decarbonization targets depend on transmission upgrades, new substations, and policy reform.

 

Key Points

An interconnection and capacity shortfall on BPA's aging grid that delays renewables and impedes clean energy goals.

✅ BPA approvals lag: 1 of 469 projects since 2015.

✅ Yakama solar waits for substation upgrades until 2027.

✅ Data centers and decarbonization targets face grid constraints.

 

Oregon and Washington have set ambitious targets to decarbonize their power sectors, aiming for 100% clean electricity in the coming decades. However, a significant obstacle stands in the way: the region's aging and overburdened transmission grid.

The Grid Bottleneck

The Bonneville Power Administration (BPA) operates a transmission system that is nearly a century old in some areas, and its capacity has not expanded sufficiently to accommodate the influx of renewable energy projects. Since 2015, 469 large renewable projects have applied to connect to the BPA's grid; only one has been approved, a stark contrast to other regions of the country. This bottleneck has left numerous wind and solar projects in limbo, unable to deliver power to the grid.

One notable example is the Yakama Nation's solar project. Despite receiving a $32 million federal grant under the bipartisan infrastructure law, the tribe faces significant delays. The BPA estimates that it will take until 2027 to complete the necessary upgrades to the transmission system, including a new substation, before the solar array can be connected. This timeline poses a risk of losing federal funding if the project isn't operational by 2031.

Economic and Environmental Implications

The slow pace of grid expansion has broader implications for the region's economy and environmental goals. Data centers and other energy-intensive industries are increasingly drawn to the Pacific Northwest by its clean energy potential. However, without adequate infrastructure, these industries may seek alternatives elsewhere. Additionally, the inability to integrate renewable energy efficiently hampers efforts to reduce greenhouse gas emissions and combat climate change.

Policy Challenges and Legislative Efforts

Efforts to address the grid limitations through state-level initiatives have faced challenges. In 2025, both Oregon and Washington considered legislation to establish state bonding authorities aimed at financing transmission upgrades. However, these bills failed to pass, leaving the BPA as the primary entity responsible for grid expansion. The BPA's unique structure, operating as a self-funded federal agency without direct state oversight, has made it difficult for regional leaders to influence its decision-making processes.

Looking Ahead

The Pacific Northwest's renewable energy aspirations hinge on modernizing its transmission infrastructure. While the BPA has proposed several projects to enhance grid capacity, the timeline for completion remains uncertain. Without significant investment and policy reforms, the region risks falling behind in the transition to a clean energy future. Stakeholders across Oregon and Washington must collaborate to advocate for necessary changes and ensure that the grid can support the growing demand for renewable energy.

The Pacific Northwest's commitment to clean energy is commendable, but achieving these goals requires overcoming substantial infrastructure challenges. Addressing the limitations of the BPA's transmission system is critical to unlocking the full potential of renewable energy in the region. Only through concerted efforts at the federal, state, and local levels can Oregon and Washington hope to realize their green energy ambitions.

 

Related News


Consumer choice has suddenly revolutionized the electricity business in California. But utilities are striking back

California Community Choice Aggregators are reshaping electricity markets with renewable energy, solar and wind sourcing, competitive rates, and customer choice, challenging PG&E, SDG&E, and Southern California Edison while advancing California's clean power goals.

 

Key Points

Local governments that buy power, often cleaner and cheaper, while utilities handle delivery and billing.

✅ Offer higher renewable mix than utilities at competitive rates

✅ Utilities retain transmission and billing responsibilities

✅ Rapid expansion threatens IOU market share across California

 

Nearly 2 million electricity customers in California may not know it, but they’re part of a revolution. That many residents and businesses are getting their power not from traditional utilities, but via new government-affiliated entities known as community choice aggregators. The CCAs promise to deliver electricity more from renewable sources, such as solar and wind, and for a lower price than the big utilities charge.

The customers may not be fully aware they’re served by a CCA because they’re still billed by their local utility. But with more than 1.8 million accounts now served by the new system and more being added every month, the changes in the state’s energy system already are massive.

Faced for the first time with real competition, the state’s big three utilities have suddenly become havens of innovation. They’re offering customers flexible options on the portion of their power coming from renewable energy, and they’re on pace to increase the share of power they get from solar and wind to the point of meeting a state mandate 10 years ahead of deadline.


But that may not stem the flight of customers. Some estimates project that by late this year, more than 3 million customers will be served by 20 CCAs, and that over a longer period, Pacific Gas & Electric, Southern California Edison, and San Diego Gas & Electric could lose 80% of their customers to the new providers.

Two big customer bases are currently in play: In Los Angeles and Ventura counties, a recently launched CCA called the Clean Power Alliance is hoping by the end of 2019 to serve nearly 1 million customers. Unincorporated portions of both counties and 29 municipalities have agreed in principle to join up.

Meanwhile, the city of San Diego is weighing two options to meet its goal of 100% clean power by 2035: a plan to be submitted by SDG&E, or the creation of a CCA. A vote by the City Council is expected by the end of this year. A city CCA would cover 1.4 million San Diegans, accounting for half of SDG&E’s customer demand, according to Cody Hooven, the city’s chief sustainability officer.

Don’t expect the big companies to give up their customers without a fight. Indeed, battle lines already are being drawn at the state Public Utilities Commission between the utilities and local communities.

“SDG&E is in an all-out campaign to prevent choice from happening, so that they maintain their monopoly,” says Nicole Capretz, who wrote San Diego’s climate action plan as a city employee and now serves as executive director of the Climate Action Campaign, which supports creation of the CCA.

California is one of seven states that have legalized the CCA concept. (The others are New York, New Jersey, Massachusetts, Ohio, Illinois and Rhode Island.) But the scale of its experiment is likely to be the largest in the country, because of the state’s size and the ambition of its clean-power goal, which is for 50% of its electricity to be generated from renewable sources by 2030.

California created its system via legislative action in 2002. Assembly Bill 117 enabled municipalities and regional governments to establish CCAs anywhere that municipal power agencies weren’t already operating. Electric customers in the CCA zones were automatically signed up, though they could opt out and stay with their existing power provider. The big utilities would retain responsibility for transmission and distribution lines.

The first CCA, Marin Clean Energy, began operating in 2010 and now serves 470,000 customers in Marin and three nearby counties.

The new entities were destined to come into conflict with the state’s three big investor-owned utilities. Their market share already has fallen to about 70%, from 78% as recently as 2010, and it seems destined to keep falling. In part that’s because the CCAs have so far held their promise: They’ve been delivering relatively clean power and charging less.

The high point of the utilities’ hostility to CCAs was the Proposition 16 campaign in 2010. The ballot measure was dubbed the “Taxpayers Right to Vote Act,” but was transparently an effort to smother CCAs in the cradle. PG&E drafted the measure, got it on the ballot, and contributed all of the $46.5 million spent in the unsuccessful campaign to pass it.

As recently as last year, PG&E and SDG&E were lobbying in the legislature for a bill that would place a moratorium on CCAs. The effort failed, and hasn’t been revived this year.

Rhetoric similar to that used by PG&E against Marin’s venture has surfaced in San Diego, where a local group dubbed “Clear the Air” is fighting the CCA concept by suggesting that it could be financially risky for local taxpayers and questioning whether it will be successful in providing cleaner electricity. Whether Clear the Air is truly independent of SDG&E’s parent, Sempra Energy, is questionable, as at least two of its co-chairs are veteran lobbyists for the company.

SDG&E spokeswoman Helen Gao says the utility supports “customers’ right to choose an energy provider that best meets their needs” and expects to maintain a “cooperative relationship” with any provider chosen by the city.

 
