Finavera, GE to partner on wind farm

By Reuters


A unit of General Electric Co plans to invest $40 million in a wind energy project being developed in the Canadian province of British Columbia by Finavera Wind Energy Inc, the small developer said.

GE Energy Financial Services has agreed to the indicative terms of an equity investment in a 77-megawatt Wildmare project that Finavera plans to build in northeastern British Columbia, Finavera said.

The long-awaited news propelled Finavera's stock 20 percent higher on the TSX Venture Exchange.

Finavera, a $40 million company, surprised Canada's renewable energy industry last March when it won more 25-year power purchase contracts than any other bidder in British Columbia's clean power call.

The Wildmare project is one of four wind developments by Finavera in the province. They are expected to cost $800 million to develop.

Under the proposed deal, Finavera will retain a 30 percent stake in the Wildmare project even though GE will provide all $40 million of the equity. The partners will jointly raise a further $160 million in debt.

"We have spent a whole bunch of development dollars and taken the risk.... Now it's being rewarded with this 30 percent stake," Finavera Chief Executive Jason Bak told Reuters.

Bak expects to reach a final agreement with GE in the next two months. The announcement of a partner had been expected since last fall, but talks dragged on longer than anticipated.

Finavera is also in talks with GE on partnering on its other wind farms, the 47 MW Tumbler Ridge, 117 MW Meikle and 60 MW Bullmoose wind energy projects, Bak said.

GE, which is making a strong global push into the wind turbine industry, has other investments in Canada's green energy sector. It owns a stake in British Columbia's biggest wind farm, the 144 MW Dokie project.

Related News

Alberta carbon tax is gone, but consumer price cap on electricity will remain

Alberta Electricity Rate Cap stays despite carbon tax repeal, keeping the Regulated Rate Option at 6.8 cents/kWh. Levy funds cover market gaps as the UCP reviews NDP policies to maintain affordable utility bills.

 

Key Points

Program capping RRO power at 6.8 cents/kWh, using levy funds to offset market prices while the UCP reviews policy.

✅ RRO cap fixed at 6.8 cents/kWh for eligible customers

✅ Levy funds pay generators when market prices exceed the cap

✅ UCP reviewing NDP policies to ensure affordable rates

 

Alberta's carbon tax has been cancelled, but a consumer price cap on electricity — which the levy pays for — is staying in place for now.

June electricity rates are due out on Monday, about four days after the new UCP government did away with the carbon charge on natural gas and vehicle fuel.

Part of the levy's revenue was earmarked by the previous NDP government to keep power prices at or below 6.8 cents per kilowatt hour under new electricity rules set by the province.

"The Regulated Rate Option cap of 6.8 cents/kWh was implemented by the previous government and currently remains in effect. We are reviewing all policies put in place by the former government and will make decisions that ensure more affordable electricity rates for job-creators and Albertans," said a spokesperson for Alberta's energy ministry in an emailed statement.

Albertans with regulated rate contracts and all City of Medicine Hat utility customers pay that amount or less.

If the actual market price rises above the cap, the difference is paid to generators directly from levy funds, a buffer that matters as experts warn prices are set to soar later this year.

The government paid more than $55 million to utilities in the year ending March 2019 because of the electricity price cap.

Alberta Energy says the price cap program will continue, at least for the time being, amid the electricity policy changes being considered.

 


Ontario to Rely on Battery Storage to Meet Rising Energy Demand

Ontario Battery Energy Storage anchors IESO strategy, easing peak demand and boosting grid reliability. Projects like the 250MW Oneida BESS and nearly 3GW of procurements integrate wind and solar, enabling flexible, decarbonized power.

 

Key Points

Provincewide grid batteries help IESO manage peaks, integrate renewables, and strengthen reliability across Ontario.

✅ IESO forecasts 1,000MW peak growth by 2026

✅ Oneida BESS adds 250MW with 20-year contract

✅ Nearly 3GW storage procured via LT1 and other RFPs

 

Ontario’s electricity grid is facing increasing demand amid a looming supply crunch, prompting the province to invest heavily in battery energy storage systems (BESS) as a key solution. The Ontario Independent Electricity System Operator (IESO) has highlighted that these storage technologies will be crucial for managing peak demand in the coming years.

Ontario's energy demands have been on the rise, driven by factors such as population growth, electric vehicle manufacturing, data center expansions, and heavy industrial activity. The IESO's latest assessment, covering the period from April 2025 to September 2026, indicates that peak demand will increase by approximately 1,000MW between the summers of 2025 and 2026. This forecasted rise is attributed to the acceleration of various sectors within the province, underscoring the need for reliable, scalable energy solutions.

A significant portion of this solution will be met by large-scale energy storage projects. Among the most prominent is the Oneida BESS, a flagship project that will contribute 250MW of storage capacity. This project, developed by a consortium including Northland Power and NRStor, will be located on land owned by the Six Nations of the Grand River. Expected to be operational soon, it will play a pivotal role in ensuring grid stability during high-demand periods. The project benefits from a 20-year contract with the IESO, guaranteeing payments that will support its financial viability, alongside additional revenue from participating in the wholesale energy market.

In addition to Oneida, Ontario has committed to acquiring nearly 3GW of energy storage capacity through various procurement programs. The 2023 Expedited Long-Term 1 (LT1) request for proposals (RfP) alone secured 881MW of storage, with additional projects in the pipeline. A notable example is the Hagersville Battery Energy Storage Park, which, upon completion, will be the largest such project in Canada. The success of these procurement efforts highlights the growing importance of BESS in Ontario's energy strategy.

The IESO’s proactive approach to energy storage is not only a response to rising demand but also a step toward decarbonizing the province’s energy system. As Ontario transitions away from fossil fuels, BESS will provide the flexibility needed to accommodate increasing renewable energy generation, particularly from intermittent sources like wind and solar. By storing excess energy during periods of low demand and dispatching it when needed, these systems will help maintain grid stability and reduce reliance on fossil fuel-based power plants.

Looking ahead, Ontario's energy storage capacity is expected to grow significantly, with projects from the 2023 LT1 RfP expected to come online by 2027. As more storage resources are integrated into the grid, the province is positioning itself to meet its rising energy needs while also advancing its environmental goals.

Ontario’s increasing reliance on battery energy storage is a clear indication of the province’s commitment to a sustainable and resilient energy future. With substantial investments in storage technology, Ontario is not only addressing current energy challenges but also paving the way for a cleaner, more reliable energy system in the years to come.

 


A Texas-Sized Gas-for-Electricity Swap

Texas Heat Pump Electrification replaces natural gas furnaces with electric heating across ERCOT, cutting carbon emissions, lowering utility bills, shifting summer peaks to winter, and aligning higher loads with strong seasonal wind power generation.

 

Key Points

Statewide shift from gas furnaces to heat pumps in Texas, reducing emissions and bills while moving grid peak to winter.

✅ Up to $452 annual utility savings per household

✅ CO2 cuts up to 13.8 million metric tons in scenarios

✅ Winter peak rises, summer peak falls; wind aligns with load

 

What would happen if you converted all the single-family homes in Texas from natural gas to electric heating?

According to a paper from Pecan Street, an Austin-based energy research organization, the transition would reduce climate-warming pollution, save Texas households up to $452 annually on their utility bills, and flip the state from a summer-peaking to a winter-peaking system. And that winter peak would be “nothing the grid couldn’t evolve to handle,” according to co-author Joshua Rhodes.

The report stems from the reality that buildings must be part of any comprehensive climate action plan.

“If we do want to decarbonize, eventually we do have to move into that space. It may not be the lowest-hanging fruit, but eventually we will have to get there,” said Rhodes.

Rhodes is a founding partner of the consultancy IdeaSmiths and an analyst at Vibrant Clean Energy. Pecan Street commissioned the study, which is distilled from a larger original analysis by IdeaSmiths, at the request of the nonprofit Environmental Defense Fund.

In an interview, Rhodes said, “The goal and motivation were to put bounding on some of the claims that have been made about electrification: that if we electrify a lot of different end uses or sectors of the economy...power demand of the grid would double.”

Rhodes and co-author Philip R. White used an analysis tool from the National Renewable Energy Laboratory called ResStock to determine the impact of replacing natural-gas furnaces with electric heat pumps in homes across the ERCOT service territory, which encompasses 90 percent of Texas’ electricity load.

Rhodes and White ran 80,000 simulations in order to determine how heat pumps would perform in Texas homes and how the pumps would impact the ERCOT grid.

The researchers modeled the use of “standard efficiency” (ducted, SEER 14, 8.2 HSPF air-source heat pump) and “superior efficiency” (ductless, SEER 29.3, 14 HSPF mini-split heat pump) heat pumps against two weather data sets — a typical meteorological year, and 2011, which brought extreme weather in both winter and summer.

Emissions were calculated using Texas’ power sector data from 2017. For energy cost calculations, IdeaSmiths used 10.93 cents per kilowatt-hour for electricity and 8.4 cents per therm for natural gas.

Nothing the grid can't handle
Rhodes and White modeled six scenarios. All the scenarios resulted in annual household utility bill savings — including the two in which annual electricity demand increased — ranging from $57.82 for the standard efficiency heat pump and typical meteorological year to $451.90 for the high-efficiency heat pump and 2011 extreme weather year.

“For the average home, it was cheaper to switch. It made economic sense today to switch to a relatively high-efficiency heat pump,” said Rhodes. “Electricity bills would go up, but gas bills can go down.”

All the scenarios found carbon savings too, with CO2 reductions ranging from 2.6 million metric tons with a standard efficiency heat pump and typical meteorological year to 13.8 million metric tons with the high-efficiency heat pump in 2011-year weather.

Peak electricity demand in Texas would shift from summer to winter. Because heat pumps provide both high-efficiency space heating and cooling, in the scenario with “superior efficiency” heat pumps the summer peak drops by nearly 24 percent to 54 gigawatts, compared with ERCOT’s 71-gigawatt 2016 summer peak.

The winter peak would increase compared to ERCOT’s 66-gigawatt 2018 winter peak, up by 22.73 percent to 81 gigawatts with standard efficiency heat pumps and up by 10.6 percent to 73 gigawatts with high-efficiency heat pumps.

“The grid could evolve to handle this. This is not a wholesale rethinking of how the grid would have to operate,” said Rhodes.

He added, “There would be some operational changes if we went to a winter-peaking grid. There would be implications for when power plants and transmission lines schedule their downtime for maintenance. But this is not beyond the realm of reality.”

And because Texas’ wind power generation is higher in winter, a winter peak would better match the expected higher load from all-electric heating to the availability of zero-carbon electricity.

 

A conservative estimate
The study presented what are likely conservative estimates of the potential for heat pumps to reduce carbon pollution and lower peak electricity demand, especially when paired with efficiency and demand response strategies that can flatten demand.

Electric heat pumps will become cleaner as more zero-carbon wind and solar power is added to the ERCOT grid. By the end of 2018, 30 percent of the energy used on the ERCOT grid was from carbon-free sources.

According to the U.S. Energy Information Administration, three in five Texas households already use electricity as their primary source of heat, much of it electric-resistance heating. Rhodes and White did not model the energy use and peak demand impacts of replacing that electric-resistance heating with much more energy efficient heat pumps.

“Most of the electric-resistance heating in Texas is located in the very far south, where they don’t have much heating at all,” Rhodes said. “You would see savings in terms of the bills there because these heat pumps definitely operate more efficiently than electric-resistance heating for most of the time.”

Rhodes and White also highlighted areas for future research. For one, their study did not factor in the upfront cost to homeowners of installing heat pumps.

“More study is needed,” they write in the Pecan Street paper, “to determine the feasibility of various ‘replacement’ scenarios and how and to what degree the upgrade costs would be shared by others.”

Research from the Rocky Mountain Institute has found that electrification of both space and water heating is cheaper for homeowners over the life of the appliances in most new construction, when transitioning from propane or heating oil, when a gas furnace and air conditioner are replaced at the same time, and when rooftop solar is coupled with electrification.

More work is also needed to assess the best way to jump-start the market for high-efficiency all-electric heating. Rhodes believes getting installers on board is key.

“Whenever a homeowner’s making a decision, if their system goes out, they lean heavily on what the HVAC company suggests or tells them because the average homeowner doesn’t know much about their systems,” he said.


 


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy. Even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.


Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Related: New study revives a Mozart sonata as a potential epilepsy therapy
Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

Related: A new index measures the extent and depth of addiction stigma
More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what happens after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of this technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms but fail to help many others; still, psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
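The core of that analysis is finding neural features whose ups and downs track the patient's subjective ratings. A minimal sketch of the idea, using a plain Pearson correlation on made-up data (the variable names, ratings, and "band power" values are all hypothetical, not from the study, and the real pipeline uses far richer machine learning):

```python
# Hypothetical sketch: correlate self-reported 0-10 pain ratings with a
# recorded neural feature (e.g., band power) to see whether the feature
# tracks the patient's subjective pain. All data here is synthetic.
from statistics import mean, stdev

def pearson_r(xs, ys):
    # Sample Pearson correlation coefficient between two equal-length series.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

pain  = [2, 7, 4, 9, 1, 6]                 # self-reported pain ratings
power = [0.3, 0.9, 0.5, 1.1, 0.2, 0.8]     # synthetic band-power values
r = pearson_r(pain, power)
print(round(r, 2))   # strongly positive in this toy data
```

A feature with a consistently high correlation across weeks of recordings would be a candidate "pain signature"; the long recording windows implantable devices allow are what make that kind of per-patient analysis feasible.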

 

California Faces Power Outages and Landslides Amid Severe Storm

California Storm Outages and Landslides strain utilities, trigger flooding, road closures, and debris flows, causing widespread power cuts and infrastructure damage as emergency response teams race to restore service, clear slides, and support evacuations.

 

Key Points

California Storm Outages and Landslides are storm-driven power cuts and slope failures disrupting roads and utilities.

✅ Tens of thousands face prolonged power outages across regions

✅ Landslides block highways, damage property, hinder access

✅ Crews restore grids, clear debris, support shelters and evacuees

 

California is grappling with a dual crisis of power outages and landslides following a severe storm that has swept across the state. The latest reports indicate widespread disruptions affecting thousands of residents and significant infrastructure damage. This storm is not only a test of California's emergency response capabilities but also a stark reminder of the growing vulnerability of the state, and of the U.S. electric grid, to extreme weather and climate stressors.

Storm’s Impact on California

The recent storm, which hit California with unprecedented intensity, has unleashed torrential rain, strong winds, and widespread flooding. These severe weather conditions have overwhelmed the state’s infrastructure, leading to significant power outages that are affecting numerous communities. According to local utilities, tens of thousands of homes and businesses are currently without electricity. The outages have been exacerbated by the combination of heavy rain and gusty winds, which have downed power lines and damaged electrical equipment.

In addition to the power disruptions, the storm has triggered a series of landslides across various regions. The combination of saturated soil and intense rainfall has caused several hillside slopes to give way, leading to road closures and property damage. Emergency services are working around the clock to address the aftermath of these landslides, but access to affected areas remains challenging due to blocked roads and ongoing hazardous conditions.

Emergency Response and Challenges

California’s emergency response teams are on high alert as they coordinate efforts to manage the fallout from the storm. Utility companies are deploying repair crews to restore power as quickly as possible, but the extensive damage to infrastructure means that some areas may be without electricity for several days. The state’s Department of Transportation is also engaged in clearing debris from landslides and repairing damaged roads to ensure that emergency services can reach affected communities.

The response efforts are complicated by the scale of the storm’s impact. With many areas experiencing both power outages and landslides, the logistical challenges are immense. Emergency shelters have been set up to provide temporary refuge for those displaced by the storm, but the capacity is limited, and there are concerns about overcrowding and resource shortages.

Community and Environmental Implications

The storm’s impact on local communities has been profound. Residents are facing not only the immediate challenges of power outages and unsafe road conditions but also longer-term concerns about recovery and rebuilding. Many individuals have been forced to evacuate their homes, and local businesses are struggling to cope with the disruption.

Environmental implications are also significant. The landslides and flooding have caused considerable damage to natural habitats and have raised concerns about water contamination and soil erosion. The impact on the environment could have longer-term consequences for the state’s ecosystems and water supply.

Climate Change and Extreme Weather

This storm underscores a growing concern about the increasing frequency and intensity of extreme weather events linked to climate change. California has been experiencing a rise in severe weather patterns, including intense storms, prolonged droughts, and extreme heat waves that strain the grid. These changes are putting additional strain on the state’s infrastructure and emergency response systems.

Experts have pointed out that while individual storms cannot be directly attributed to climate change, the overall trend towards more extreme weather is consistent with scientific predictions. As such, there is a pressing need for California to invest in infrastructure improvements and resilience measures, and to consider accelerating its carbon-free electricity mandate to better withstand future events.

Looking Ahead

As California deals with the immediate aftermath of this storm, attention will turn to recovery and rebuilding efforts. The state will need to address the damage caused by power outages and landslides while also preparing for future challenges posed by climate change.

In the coming days, the focus will be on restoring power, clearing debris, and providing support to affected communities. Long-term efforts will likely involve reassessing infrastructure vulnerabilities, improving emergency response protocols, and investing in climate resilience measures across the grid.

 


Electricity turns garbage into graphene

Waste-to-Graphene uses flash joule heating to convert carbon-rich trash into turbostratic graphene for composites, asphalt, concrete, and flexible electronics, delivering scalable, low-cost, high-quality material from food scraps, plastics, and tires with minimal processing.

 

Key Points

A flash heating method converting waste carbon into turbostratic graphene for scalable, low-cost industrial uses.

✅ Converts food scraps, plastics, and tires into graphene

✅ Produces turbostratic flakes that disperse well in composites

✅ Scalable, low-cost process via flash joule heating

 

Science doesn’t usually take after fairy tales. But Rumpelstiltskin, the magical imp who spun straw into gold, would be impressed with the latest chemical wizardry. Researchers at Rice University report today in Nature that they can zap virtually any source of solid carbon, from food scraps to old car tires, and turn it into graphene—sheets of carbon atoms prized for applications ranging from high-strength plastic to flexible electronics. Current techniques yield tiny quantities of picture-perfect graphene or up to tons of less prized graphene chunks; the new method already produces grams per day of near-pristine graphene in the lab, and researchers are now scaling it up to kilograms per day.

“This work is pioneering from a scientific and practical standpoint” as it promises to make graphene cheap enough to use to strengthen asphalt or paint, says Ray Baughman, a chemist at the University of Texas, Dallas. “I wish I had thought of it.” The researchers have already founded a startup company, Universal Matter, to commercialize their waste-to-graphene process.

With atom-thin sheets of carbon atoms arranged like chicken wire, graphene is stronger than steel, conducts electricity and heat better than copper, and can serve as an impermeable barrier that keeps metals from rusting. But since its 2004 discovery, high-quality graphene—either single sheets or just a few stacked layers—has remained expensive to make and purify on an industrial scale. That’s not a problem for making diminutive devices such as high-speed transistors and efficient light-emitting diodes. But current techniques, which make graphene by depositing it from a vapor, are too costly for many high-volume applications. And higher throughput approaches, such as peeling graphene from chunks of the mineral graphite, produce flecks composed of up to 50 graphene layers that are not ideal for most applications.

Graphene comes in many forms. Single sheets, which are ideal for electronics and optics, can be grown using a method called chemical vapor deposition. But it produces only tiny amounts. For large volumes, companies commonly use a technique called liquid exfoliation. They start with chunks of graphite, which is just myriad stacked graphene layers. Then they use acids and solvents, as well as mechanical grinding, to shear off flakes. This approach typically produces tiny platelets each made up of 20 to 50 layers of graphene.

In 2014, James Tour, a chemist at Rice, and his colleagues found they could make a pure form of graphene—each piece just a few layers thick—by zapping a form of amorphous carbon called carbon black with a laser. Brief pulses heated the carbon to more than 3000 kelvins, snapping the bonds between carbon atoms. As the cloud of carbon cooled, it coalesced into the most stable structure possible: graphene. But the approach still produced only tiny quantities and required a lot of energy.

Two years ago, Luong Xuan Duy, one of Tour’s graduate students, read that other researchers had created metal nanoparticles by zapping a material with electricity, creating the same brief blast of heat behind the success of the laser graphene approach. “I wondered if I could use that to heat a carbon source and produce graphene,” Duy says. So, he put a dash of carbon black in a clear glass vial and zapped it with 400 volts for about 200 milliseconds. Initially he got junk. But after a bit of tweaking, he managed to create a bright yellowish white flash, indicating the temperature inside the vial was reaching about 3000 kelvins. Chemical tests revealed he had produced graphene.
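The physics behind the flash is Joule heating: a brief, large current dumps electrical energy into the resistive carbon, which has nowhere to go but heat. As a hedged back-of-envelope sketch — the capacitance below is an assumed illustrative value, not a figure from the paper — the energy available from discharging a capacitor bank charged to 400 volts follows E = ½CV²:

```python
# Back-of-envelope estimate of the energy in one flash discharge.
# The capacitance C is an ASSUMED illustrative value; only the 400 V
# figure comes from the article.
C = 0.06       # farads, assumed capacitor-bank capacitance
V = 400.0      # volts, as reported
E = 0.5 * C * V**2   # capacitor energy, E = 1/2 * C * V^2
print(f"{E:.0f} J per flash")   # 4800 J for these assumed values
```

Even a few kilojoules delivered in ~200 milliseconds amounts to tens of kilowatts of instantaneous power, which is how a benchtop vial can briefly reach roughly 3000 kelvins.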

It turned out to be a type of graphene that is ideal for bulk uses. As the carbon atoms condense to form graphene, they don’t have time to stack in a regular pattern, as they do in graphite. The result is a material known as turbostratic graphene, with graphene layers jumbled at all angles atop one another. “That’s a good thing,” Duy says. When added to water or other solvents, turbostratic graphene remains suspended instead of clumping up, allowing each fleck of the material to interact with whatever composite it’s added to.

“This will make it a very good material for applications,” says Monica Craciun, a materials physicist at the University of Exeter. In 2018, she and her colleagues reported that adding graphene to concrete more than doubled its compressive strength. Tour’s team saw much the same result. When they added just 0.05% by weight of their flash-produced graphene to concrete, the compressive strength rose 25%; graphene added to polydimethylsiloxane, a common plastic, boosted its strength by 250%.

Those results could reignite efforts to use graphene in a wide range of composites. Researchers in Italy reported recently that adding graphene to asphalt dramatically reduces its tendency to fracture and more than doubles its life span. Last year, Iterchimica, an Italian company, began to test a 250-meter stretch of road in Milan paved with graphene-spiked asphalt. Tests elsewhere have shown that adding graphene to paint dramatically improves corrosion resistance.

These applications would require high-quality graphene by the ton. Fortunately, the starting point for flash graphene could hardly be cheaper or more abundant: Virtually any organic matter, including coffee grounds, food scraps, old tires, and plastic bottles, can be vaporized to make the material. “We’re turning garbage into graphene,” Duy says.

 
