LANL officials see lab's mission evolving

By Santa Fe New Mexican


Complex transformation may have been the buzz-phrase of the year at Los Alamos National Laboratory, but actually, the lab has never been a stranger to mission shifts.

In the 1970s, the lab played a large role in energy initiatives for the Carter administration; in the '80s, it did a lot of work with the FBI. Now, while the lab's historic focus on nuclear nonproliferation and managing the stockpile remains, the push is on once again for change, said Terry Wallace, the lab's principal associate director for science, technology and engineering.

"We are seeing a tremendous pressure, and rightfully so, for a shift in mission space," he said. "But that's really not a bad thing. It's easy to get overly worried about mission swings."

Talk of complex transformation across the Department of Energy and all of its labs and facilities continued throughout 2008, and could well continue to evolve this year. The National Nuclear Security Administration's plan for Los Alamos in that process is to consolidate plutonium research from other DOE facilities to the lab, to add the capability of producing 50 to 80 nuclear weapons cores called pits each year, and overall — in what might seem like a contradiction to the other missions — to reduce the stockpile and the labs' nuclear weapons-based operations.

Key to that change is a building complex called the Chemistry and Metallurgy Research Replacement facility, estimated to cost between $745 million and $975 million when completed, according to the lab's Web site.

When completed, it will house more than just nuclear weapons activities. It will also be used to train nuclear inspectors, to investigate other areas of nuclear science such as reactors or batteries for NASA spacecraft and even to look at nuclear isotopes for medicine, said Joe Martz, nuclear weapons program director.

Scientists will need the facility no matter how much the lab's nuclear weapons operations shrink. And even though the goal is to have fewer nuclear weapons and fewer people monitoring them across the entire DOE complex, the basic ability of scientists to understand nuclear weapons is something that cannot change, Wallace said.

"Something like plutonium, even though we say we can handle it safely, we still need world-class facilities and scientists to maintain those abilities," he said. "Whether you have 20 or 2,000 warheads, you still have to maintain them."

The lab will likely always have some focus on nuclear weapons science, but the shift to a smaller program started before complex transformation entered the limelight at midyear. In late 2007 and early 2008, lab management shrank the staff by about 570 employees through attrition and voluntary reductions.

And most of those jobs came from nuclear weapons areas, Wallace said, adding he doesn't foresee any more staff reductions in the coming year.

That said, some employees who left nuclear weapons-related jobs also didn't actually leave the lab. They simply shifted to other spots that could also use their skill sets, said Mike Burns, acting associate director for Threat Reduction.

"Threat reduction, we refer to that as national security programs that do not involve our nuclear stockpile, and through 2008, our portfolio grew by 6 percent," Burns said. "I personally think there are a lot of opportunities to grow in our areas of the lab."

Threat reduction, especially in the modern era of terrorism, will probably continue to grow rapidly in coming years. In that area, the lab investigates potential threats using sensors and monitors, creating simulations on supercomputers and looking at ways to save energy, among other things, Burns said.

"One area that's really interesting is something called mobility energy," Burns said. "The Department of Defense is the nation's largest user of things like fuel, jet fuel, gasoline. So if the lab can help find ways to make small reductions or changes there, it can create huge benefits for the nation."

Threat reduction also works on surveillance gadgets to help improve situational awareness on battlefields, hopefully saving American lives in the process, he said.

Another big area where some activities are shifting is to energy and power issues and how to improve storage and power grids in the United States, Wallace said. "We're also looking at next generation nuclear power because it can be an important resource that doesn't produce greenhouse gases," he said.

Supercomputer activities, as well, have grown far beyond nuclear weapons functions at the lab.

Earlier in 2008 the lab started to install Roadrunner, a supercomputer that continues to be the fastest in the world. The speeds available on that computer have opened up entirely new areas of science, Martz said.

"These changes aren't just true at the lab — the nature of science is changing," he said. "We can do things now that seemed impossible 10 years ago."

Computer models of systems like ribosomes, the tiny cellular factories that transform instructions from DNA into biological material, could play a huge role in medical science in the not-so-distant future. And computer models of ocean systems and climate can help us better understand how human activities are changing those systems, Martz said.

"In some ways, through these computer systems, science is coalescing," Martz said. "Many disciplines are coming together."

Still, while activities at the lab are shifting and changing, nuclear weapons science remains a large chunk of the budget.

About $650 million of the lab's fiscal 2008 $2.074 billion budget is not tied to nuclear weapons-related activities, Wallace said.

But he thinks it's likely the budget distribution will continue to shift away from nuclear and into more of the emerging science and technology areas.

"Budgets will likely remain about the same, but the buckets that each dollar goes into may change as we transform," Wallace said.

And there's an advantage to keeping people from nuclear operations around and letting them switch to work in non-nuclear weapons-related areas of the lab, he said.

Should the nation need them to go back to nuclear work, those workers can shift back and be up to speed on the science fairly quickly, Wallace said.

Overall, it's hard to say anything definitive about exactly where the lab will grow and shrink and how funding will continue to change. The Obama administration could change many things when it takes over the White House later this month.

No matter what happens, though, the lab seems to be in good shape to handle it and continue to transform with the times, Wallace said.

"We do what the nation asks us to do," he said. "And of course we'll continue to do whatever is needed."

Related News

Grid coordination opens road for electric vehicle flexibility

Smart EV Charging orchestrates vehicle-to-grid (V2G), demand response, and fast charging to balance the power grid, integrating renewables, electrolyzers for hydrogen, and megawatt chargers for fleets with advanced control and co-optimization.

 

Key Points

Smart EV charging coordinates EV load to stabilize the grid, cut peaks, and integrate renewable energy efficiently.

✅ Reduces peak demand via coordinated, flexible load control

✅ Enables V2G services with renewables and battery storage

✅ Supports megawatt fast charging for heavy-duty fleets

 

As electric vehicle (EV) sales continue to rev up in the United States, the power grid is in parallel contending with the greatest transformation in its 100-year history: the large-scale integration of renewable energy and power electronic devices. The expected expansion of EVs will shift those challenges into high gear, with cities facing gigawatt growth in electricity demand and higher amounts of variable energy.

Coordinating large numbers of EVs with the power system presents a highly complex challenge. EVs introduce variable electrical loads that are highly dependent on customer behavior. Electrified transportation also involves co-optimization with other energy systems, like natural gas and bulk battery storage, and mobile energy storage adds flexibility and new operational options. It could involve fleets of automated ride-hailing EVs and lead to hybrid-energy truck stops that provide hydrogen and fast charging to heavy-duty vehicles.

Those changes will all test the limits of grid integration, but the National Renewable Energy Laboratory (NREL) sees opportunity at the intersection of energy systems and transportation. With powerful resources for simulating and evaluating complex systems, several NREL projects are determining the coordination required for fast charging, balancing electrical supply and demand, and efficient use of all energy assets.


Smart and Not-So-Smart Control
To appreciate the value of coordinated EV charging, it is helpful to imagine the opposite scenario.

"Our first question is how much benefit or burden the super simple, uncoordinated approach to electric vehicle charging offers the grid," said Andrew Meintz, the researcher leading NREL's Electric Vehicle Grid Integration team, as well as the RECHARGE project for smart EV charging. "Then we compare that to the 'whiz-bang,' everything-is-connected approach. We want to know the difference in value."

In the "super simple" approach, Meintz explained, battery-powered electric vehicles grow in market share without any evolution in vehicle charging coordination. Picture every employee at your workplace driving home at 5 p.m. and charging their vehicle. That is the grid's equivalent of going 0 to 100 mph, and if it does not wreck the system, it is at least very expensive. According to NREL's Electrification Futures Study, a comprehensive analysis of the impacts of widespread electrification across all U.S. economic sectors, in 2050 EVs could contribute to a 33% increase in energy use during peak electrical demand, intervals that are costly because utilities must procure extra energy reserves. In duck curve parlance, EVs will further strain the duck's neck.
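The gap between those two approaches can be sketched in a few lines of Python. The numbers below, fleet size, charger power, and the baseline demand curve, are invented for illustration and are not NREL's figures:

```python
# Toy comparison (not NREL's model): evening peak when every EV starts
# charging at 5 p.m. versus when charging is staggered overnight.
N_EVS = 10_000          # vehicles in a hypothetical service area
CHARGER_KW = 7.2        # typical Level 2 home charger
HOURS_NEEDED = 3        # hours of charging each vehicle needs

# Made-up baseline demand curve in MW for hours 0..23, peaking in the evening.
baseline = [300 if h < 6 else 350 if h < 16 else 450 if h < 21 else 330
            for h in range(24)]

def total_load(start_hours):
    """Add EV charging load to the baseline, given each vehicle's start hour."""
    load = baseline[:]
    for start in start_hours:
        for h in range(start, start + HOURS_NEEDED):
            load[h % 24] += CHARGER_KW / 1000  # kW -> MW per vehicle
    return load

# Uncoordinated: everyone plugs in at 17:00.
uncoordinated = total_load([17] * N_EVS)

# Coordinated: the same energy, with starts spread over 22:00-04:00.
off_peak = [22, 23, 0, 1, 2, 3, 4]
coordinated = total_load([off_peak[i % len(off_peak)] for i in range(N_EVS)])

print(f"peak load, uncoordinated: {max(uncoordinated):.0f} MW")
print(f"peak load, coordinated:   {max(coordinated):.0f} MW")
```

Even in this toy model, shifting the same charging energy into overnight hours leaves the system peak untouched, which is the whole point of coordination.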

The Optimization and Control Lab's Electric Vehicle Grid Integration bays allow researchers to determine how advanced high power chargers can be added safely and effectively to the grid, with the potential to explore how to combine buildings and EV charging. Credit: Dennis Schroeder, NREL
Meintz's "whiz-bang" approach instead imagines EV control strategies that are deliberate and serve to smooth, rather than intensify, the upcoming demand for electricity. It means managing both when and where vehicles charge to create flexible load on the grid.

At NREL, smart strategies to dispatch vehicles for optimal charging are being developed for both the grid edge, where consumers and energy users connect to the grid, as in RECHARGE, and the entire distribution system, as in the GEMINI-XFC project. Both projects, funded by the U.S. Department of Energy's (DOE's) Vehicle Technologies Office, lean on advanced capabilities at NREL's Energy Systems Integration Facility to simulate future energy systems.

At the grid edge, EVs can be co-optimized with distributed energy resources—small-scale generation or storage technologies—the subject of a partnership with Eaton that brought industry perspectives to bear on coordinated management of EV fleets.

At the larger-system level, the GEMINI-XFC project has extended EV optimization scenarios to the city scale—the San Francisco Bay Area, to be specific.

"GEMINI-XFC involves the highest-ever-fidelity modeling of transportation and the grid," said NREL Research Manager of Grid-Connected Energy Systems Bryan Palmintier.

"We're combining future transportation scenarios with a large metro area co-simulation—millions of simulated customers and a realistic distribution system model—to find the best approaches to vehicles helping the grid."

GEMINI-XFC and RECHARGE can foresee future electrification scenarios and then insert controls that reduce grid congestion or offset peak demand, for example. Charging EVs involves a sort of shell game, where loads are continually moved among charging stations to accommodate grid demand.
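A toy version of that shell game can be written as a greedy allocator that always sends the next charging session to the station with the most spare capacity. The station names and headroom figures below are hypothetical:

```python
# Toy "shell game": assign charging sessions to the station with the most
# spare capacity so no single feeder is overloaded. Names and limits are
# hypothetical.
import heapq

def assign_sessions(stations, requests_kw):
    """stations: {name: headroom_kw}. requests_kw: session loads in kW.
    Returns {name: [assigned loads]}, placing the largest sessions first."""
    # heapq is a min-heap, so store negated headroom to pop the max.
    heap = [(-headroom, name) for name, headroom in stations.items()]
    heapq.heapify(heap)
    placement = {name: [] for name in stations}
    for load in sorted(requests_kw, reverse=True):
        neg_headroom, name = heapq.heappop(heap)
        headroom = -neg_headroom
        if load > headroom:
            raise RuntimeError(f"no station can absorb a {load} kW session")
        placement[name].append(load)
        heapq.heappush(heap, (-(headroom - load), name))
    return placement

# Hypothetical feeders with different spare capacity at this moment.
limits = {"depot_a": 400, "depot_b": 250, "depot_c": 150}
plan = assign_sessions(limits, [150, 120, 100, 90, 75, 50, 50])
print(plan)
```

A real controller would re-run an allocation like this continually as grid conditions change; the greedy heap keeps each decision cheap.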

But for heavy-duty vehicles, the load is harder to hide. Electrified truck fleets will hit the road soon, with power needs that translate to megawatts of localized demand. No amount of rerouting can avoid the requirements of charging heavy-duty vehicles or other instances of extreme fast charging (XFC). To address this challenge, NREL is working with industry and other national laboratories to study and demonstrate the technological buildout necessary to achieve 1+ MW charging stations capable of fast charging medium- and heavy-duty vehicles at very high energy levels.

To reach such a scale, NREL is also considering new power conversion hardware based on advanced materials like wide-bandgap semiconductors, as well as new controllers and algorithms that are uniquely suited for fleets of charge-hungry vehicles. The challenge to integrate 1+ MW charging is also pushing NREL research to higher power: Upcoming capabilities will look at many-megawatt systems that tie in the support of other energy sectors.


Renewable In-Roads for Hydrogen

At NREL, the drive toward larger charging demands is being met with larger research capabilities. The announcement of ARIES opens the door to energy systems integration research at a scale 10 times greater than current capabilities: 20 MW, up from 2 MW. Critically, it presents an opportunity to understand how mobility with high energy demands can be co-optimized with other utility-scale assets to benefit grid stability.

"If you've got a grid humming along with a steady load, then a truck requires 500 kW or more of power, it could create a large disruption for the grid," said Keith Wipke, the laboratory program manager for fuel cells and hydrogen technologies at NREL.

Such a high power demand could be partially served by battery storage systems. Or it could be hidden entirely with hydrogen production. Wipke's program, with support from the DOE's Hydrogen and Fuel Cell Technologies Office, has been performing studies into how electrolyzers—devices that use electricity to break water into hydrogen and oxygen—could offset the grid impacts of XFC. These efforts are also closely aligned with DOE's H2@Scale vision for affordable and effective hydrogen use across multiple sectors, including heavy-duty transportation, power generation, and metals manufacturing, among others.

"We're simulating electrolyzers that can match the charging load of heavy-duty battery electric vehicles. When fast charging begins, the electrolyzers are ramped down. When fast charging ends, the electrolyzers are ramped back up," Wipke said. "If done smoothly, the utility doesn't even know it's happening."
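The ramping idea Wipke describes can be illustrated with a simple complementary setpoint: the electrolyzer backs off one-for-one as the charger ramps up, so the site's combined draw stays flat. The power levels and charging profile here are invented:

```python
# Sketch of the ramping idea (assumed power levels, invented charging
# profile): the electrolyzer sheds load one-for-one as fast charging ramps
# up, so the combined site load the utility sees stays flat.
SITE_BASE_KW = 1500  # electrolyzer's normal operating point

def electrolyzer_setpoint(charging_kw):
    # Make room for the charger, but never drive the electrolyzer below zero.
    return max(0.0, SITE_BASE_KW - charging_kw)

# Made-up fast-charging profile for one heavy-duty truck session, in kW.
charging_profile = [0, 200, 600, 1000, 1200, 1200, 900, 500, 100, 0]
combined_profile = [kw + electrolyzer_setpoint(kw) for kw in charging_profile]

for kw, total in zip(charging_profile, combined_profile):
    print(f"charger {kw:5.0f} kW + electrolyzer "
          f"{electrolyzer_setpoint(kw):5.0f} kW = site {total:5.0f} kW")
```

As long as the charging demand stays below the electrolyzer's operating point, the utility sees a constant load, which is Wipke's "the utility doesn't even know it's happening."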

NREL Researchers Rishabh Jain, Kazunori Nagasawa, and Jen Kurtz are working on how grid integration of electrolyzers—devices that use electricity to break water into hydrogen and oxygen—could offset the grid impacts of extreme fast-charging. Credit: National Renewable Energy Laboratory
As electrolyzers harness cheap electrons during off-peak periods, a significant amount of hydrogen can be produced on site. That creates a natural energy pathway from discount electricity into a fuel. It is no wonder, then, that several well-known transportation and fuel companies have recently initiated a multimillion-dollar partnership with NREL to advance heavy-duty hydrogen vehicle technologies.

"The logistics of expanding electric charging infrastructure from 50 kW for a single demonstration battery electric truck to 5,000 kW for a fleet of 100 could present challenges," Wipke said. "Hydrogen scales very nicely; you're basically bringing hydrogen to a fueling station or producing it on site, but either way the hydrogen fueling events are decoupled in time from hydrogen production, providing benefits to the grid."

The long driving range and fast refueling times, including a DOE target of a 10-minute refuel for a truck, have already made hydrogen the standout solution for applications such as warehouse forklifts. Further, NREL is finding that distributed electrolyzers can simultaneously produce hydrogen and improve voltage conditions, which can add much-needed stability to a grid that is accommodating more energy from variable resources.

Those examples that co-optimize mobility with the grid, using diverse technologies, are encouraging NREL and its partners to pursue a new scale of systems integration. Several forward-thinking projects are reimagining urban mobility as a mix of energy solutions that integrate the relative strengths of transportation technologies, which complement each other to fill important gaps in grid reliability.


The Future of Urban Mobility
What will electrified transportation look like at high penetrations? A few NREL projects offer some perspective. Among the most experimental, NREL is helping the city of Denver develop a smart community, integrated with electrified mobility and featuring automated charging and vehicle dispatch.

On another path to advanced mobility, Los Angeles has embarked on a plan to modernize its electricity system infrastructure, aiming for a 100% renewable energy supply by 2045 along with aggressive electrification targets for buildings and vehicles. Through the Los Angeles 100% Renewable Energy Study, the city is currently working with NREL to assess the full-scale impacts of the transition in a detailed analysis that integrates diverse capabilities across the laboratory.

The transition would include the Port of Long Beach, one of the busiest container ports in the United States.

At the port, NREL is applying the same sort of scenario forecasting and controls evaluation as other projects, in order to find the optimal mix of technologies that can be integrated for both grid stability and a reliable quality of service: a mix of hydrogen fuel-cell and battery EVs, battery storage systems, on-site renewable generation, and extreme coordination among everything.

"Hydrogen at ports makes sense for the same reason as trucks: Marine applications have big power and energy demands," Wipke said. "But it's really the synergies between diverse technologies—the existing infrastructure for EVs and the flexibility of bulk battery systems—that will truly make the transition to high renewable energy possible."

Like the Port of Long Beach, transportation hubs across the nation are adapting to a complex environment of new mobility solutions. Airports and public transit stations involve the movement of passengers, goods, and services at volumes unmatched anywhere else. With digitally connected electric mobility changing how airports plan for the future, NREL projects such as Athena are using the power of high-performance computing to demonstrate how these hubs can maximize the value of passenger and freight mobility per unit of energy, time, and cost.

The growth in complexity for transportation hubs has only begun, however. Looking ahead, fleets of ride-sharing EVs and automated vehicles, and ultimately automated ride-hailing EV fleets, could present the largest mobility-management challenge yet.


A Self-Driving Power Grid
To understand the full impact of future mobility-service providers, NREL developed the HIVE (Highly Integrated Vehicle Ecosystem) simulation framework. HIVE combines factors related to serving mobility needs and grid operations—such as a customer's willingness to carpool or delay travel, and potentially time-variable costs of recharging—and simulates the outcome in an integrated environment.

"Our question is, how do you optimize the management of a fleet whose primary purpose is to provide rides and improve that fleet's dispatch and charging?" said Eric Wood, an NREL vehicle systems engineer.

HIVE was developed as part of NREL's Autonomous Energy Systems research into optimizing the control of automated vehicle fleets, that is, the routing and dispatch of automated electric vehicles.

The project imagines how price signals could influence dispatch algorithms. Consider one customer booking a commute through a ride-hailing app. Out of the fleet of vehicles nearby—variously charged and continually changing locations—which one should pick up the customer?
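One hypothetical answer is a dispatch rule that scores each nearby vehicle on deadhead distance and remaining charge, then sends the cheapest eligible one. The weights, the charge threshold, and the fleet data below are invented for illustration; HIVE's actual objective is far richer:

```python
# Hypothetical dispatch rule (invented weights and fleet data; HIVE's actual
# objective is far richer): score each nearby vehicle on pickup distance and
# remaining charge, and send the cheapest eligible one.
def dispatch(fleet, trip_km, alpha=1.0, beta=50.0):
    """fleet: list of (vehicle_id, distance_to_rider_km, state_of_charge 0-1).
    alpha penalizes deadhead distance; beta penalizes low batteries."""
    def cost(vehicle):
        vid, dist_km, soc = vehicle
        if soc < 0.2 + trip_km / 300:  # not enough charge to finish safely
            return float("inf")         # ineligible; should go recharge
        return alpha * dist_km + beta * (1.0 - soc)
    best = min(fleet, key=cost)
    return best[0] if cost(best) != float("inf") else None

fleet = [("ev_1", 0.5, 0.15),  # closest, but nearly empty
         ("ev_2", 2.0, 0.90),
         ("ev_3", 1.2, 0.55)]
print(dispatch(fleet, trip_km=12))  # picks the well-charged car, not the closest
```

Changing the weights, or making the charging cost time-variable as the project imagines, changes which vehicle wins; that sensitivity is exactly what a simulation framework like HIVE is built to explore.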

Now consider the movements of thousands of passengers in a city and thousands of vehicles providing transportation services. Among the number of agents, the moment-to-moment change in energy supply and demand, and the broad diversity in vendor technologies, "we're playing with a lot of parameters," Wood said.

But cutting through all the complexity, and in the midst of massive simulations, the end goal for vehicle-to-grid integration is consistent:

"The motivation for our work is that there are forecasts for significant load on the grid from the electrification of transportation," Wood said. "We want to ensure that this load is safely and effectively integrated, while meeting the expectations and needs of passengers."

The Port of Long Beach could integrate a mix of hydrogen fuel-cell and battery EVs, battery storage systems, and on-site renewable generation. Credit: National Renewable Energy Laboratory
True Replacement without Caveats

Electric vehicles are not necessarily helpful to the grid, but they can be. As EVs become established in the transportation sector, NREL is studying how to even out any bumps that electrified mobility could cause on the grid and advance any benefits to commuters or industry.

"It all comes down to load flexibility," Meintz said. "We're trying to decide how to optimally dispatch vehicle charging to meet quality-of-service considerations, while also minimizing charging costs."

 

Related News


Vehicle-to-grid could be ‘capacity on wheels’ for electricity networks

Vehicle-to-Grid (V2G) enables EV batteries to provide grid balancing, flexibility, and demand response, integrating renewables with bidirectional charging, reducing peaker plant reliance, and unlocking distributed energy storage from millions of connected electric vehicles.

 

Key Points

Vehicle-to-Grid (V2G) lets EVs export power via bidirectional charging to balance grids and support renewables.

✅ Turns parked EVs into distributed energy storage assets

✅ Delivers balancing services and demand response to the grid

✅ Cuts peaker plant use and supports renewable integration

 

“There are already many gigawatt-hours of batteries on wheels,” which could be used to provide balance and flexibility to electrical grids, if the “ultimate potential” of vehicle-to-grid (V2G) technology can be harnessed.

That’s according to a panel of experts and stakeholders convened by our sister site Current±, which covers the business models and technologies inherent to the low carbon transition to decentralised and clean energy. Focusing mainly on the UK grid but opening up the conversation to other territories and the technologies themselves, representatives including distribution network operator (DNO) Northern Powergrid’s policy and markets director and Nissan Europe’s director of energy services debated the challenges, benefits and that aforementioned ultimate potential.

Decarbonisation of energy systems and of transport go hand in hand, with vehicle fuel currently responsible for more emissions than electricity, as Ian Cameron, head of innovation at DNO UK Power Networks, says in the Q&A article.

“Furthermore, V2G technology will further help decarbonisation by replacing polluting power plants that back up the electrical grid,” Marc Trahand from EV software company Nuvve Corporation added, pointing to California grid stability initiatives as a leading example.

While the panel said there will still be a place for standalone utility-scale energy storage systems, various speakers highlighted that there are over 20 GWh of so-called ‘batteries on wheels’ in the US, capable of powering buildings as needed, and up to 10 million EVs forecast for Britain’s roads by 2030.

“…it therefore doesn’t make sense to keep building expensive standalone battery farms when you have all this capacity on wheels that just needs to be plugged into bidirectional chargers,” Trahand said.
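The “capacity on wheels” argument comes down to simple arithmetic. Using the article's forecast of 10 million EVs on Britain's roads by 2030, and assumed figures for pack size and availability (not numbers from the panel), the aggregate storage is striking:

```python
# Back-of-envelope for "capacity on wheels". The fleet size is the article's
# 10 million forecast for Britain by 2030; pack size and the plugged-in and
# usable shares are assumptions, not figures from the panel.
fleet_size       = 10_000_000  # EVs forecast on Britain's roads by 2030
avg_pack_kwh     = 60          # assumed average battery capacity
plugged_in_share = 0.30        # assumed fraction connected at any moment
usable_share     = 0.50        # assumed share owners would let V2G dispatch

total_gwh = fleet_size * avg_pack_kwh / 1e6    # kWh -> GWh
usable_gwh = total_gwh * plugged_in_share * usable_share
print(f"total fleet storage:  {total_gwh:.0f} GWh")
print(f"usable at any moment: {usable_gwh:.0f} GWh")
```

Even with conservative availability assumptions, a fleet that size would hold far more energy than today's standalone battery farms, which is the heart of the panel's point about bidirectional chargers.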

 

Related News


Cal ISO Warns Rolling Blackouts Possible, Calls For Conservation As Power Grid Strains

Cal ISO Flex Alert urges Southern California energy conservation as a Stage 2 emergency strains the power grid, with potential rolling blackouts during peak hours from 3 to 10 p.m., if demand exceeds supply.

 

Key Points

A statewide call to conserve power during high demand, issued by the grid operator to prevent rolling blackouts.

✅ Stage 2 emergency signals severe grid strain

✅ Peak Flex Alert hours: 3 to 10 p.m. statewide

✅ Set thermostats to 78 and avoid major appliances

 

Residents and businesses across Southern California were urged to conserve power Tuesday afternoon as the manager of the state’s power grid warned rolling blackouts could be imminent for some power customers.

The California Independent System Operator (Cal ISO), which manages the state power grid, declared a Stage 2 emergency as of 2:30 p.m., indicating severe strain on the electrical system.


Rolling blackouts for some customers could occur in a Stage 3 emergency, distinct from the intentional shut-offs some utilities use to reduce wildfire risk.

Cal ISO issued a statewide Flex Alert in effect from 3 to 10 p.m. Tuesday and Wednesday, with conservation considered especially critical during those hours.

Officials told reporters rolling blackouts might be avoided Tuesday evening if residents repeat the level of conservation seen Monday.

“If we can get the same sort of response we got yesterday, we can minimize this, or perhaps avoid it altogether,” Cal ISO President and CEO Steve Berberich said.

Cal ISO controls roughly 80% of the state’s power grid, delivering power through Southern California Edison, Pacific Gas and Electric Co. and San Diego Gas & Electric.

Residents are urged to set thermostats at 78 degrees in the afternoon and evening hours and to avoid using air conditioning and major appliances during the Flex Alert hours.

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy. Even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Related: New study revives a Mozart sonata as a potential epilepsy therapy
Each case of Parkinson’s manifests slightly differently, a lesson that applies to many other diseases as well, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

Related: A new index measures the extent and depth of addiction stigma
More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.

“We’ve needed to learn how to be data scientists,” Morrell said.
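The closed-loop idea above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual device logic: the frequency band, sampling rate, and threshold below are invented for the sketch. The detector watches a running estimate of band power in the recorded signal and fires only when it crosses a threshold, whereas an open-loop device would stimulate on a fixed schedule regardless of what the recording shows.

```python
import numpy as np

def band_power(window, fs, lo=13.0, hi=30.0):
    """Estimate signal power in one frequency band (here 13-30 Hz) via FFT."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def closed_loop(signal, fs=250, win_sec=1.0, threshold=50.0):
    """Slide a one-second window over the recording and return the sample
    indices where band power exceeds the threshold -- i.e., the moments
    at which a closed-loop device would deliver a brief jolt."""
    win = int(fs * win_sec)
    triggers = []
    for start in range(0, len(signal) - win, win):
        if band_power(signal[start:start + win], fs) > threshold:
            triggers.append(start)
    return triggers
```

Real devices add considerable machinery on top of this (artifact rejection, per-patient calibration, safety limits), but the detect-then-stimulate loop is the core of the design.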

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

Related: Largest psilocybin trial finds the psychedelic is effective in treating serious depression
The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
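In spirit, the pairing of subjective ratings with recorded activity that these researchers describe resembles an ordinary supervised regression: brain features on one side, pain reports on the other, and a model in between. The sketch below uses synthetic data — the feature names, weights, and sample counts are invented for illustration, not taken from the UCSF study — to show the basic shape of such an analysis.

```python
import numpy as np

# Hypothetical data set: each row is one self-report moment; each column
# is a band-power feature recorded by the implant at that moment.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 3))            # e.g., theta/beta/gamma power
true_weights = np.array([2.0, -1.0, 0.5])       # ground truth for the toy data
pain_scores = features @ true_weights + rng.normal(scale=0.1, size=200)

# Fit a linear model relating brain activity to reported pain; the fitted
# weights indicate which features track the 0-to-10 ratings most strongly.
weights, *_ = np.linalg.lstsq(features, pain_scores, rcond=None)
```

The advantage of implantable recording that the article notes shows up here as sample size: weeks of paired observations give the model far more rows to fit than a brief in-clinic session would, which is what makes the richer analysis possible.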

 
