A Texas-Sized Gas-for-Electricity Swap



Texas Heat Pump Electrification replaces natural gas furnaces with electric heating across ERCOT, cutting carbon emissions, lowering utility bills, shifting summer peaks to winter, and aligning higher loads with strong seasonal wind power generation.

 

Key Points

Statewide shift from gas furnaces to heat pumps in Texas, reducing emissions and bills while moving grid peak to winter.

✅ Up to $452 annual utility savings per household

✅ CO2 cuts up to 13.8 million metric tons in scenarios

✅ Winter peak rises, summer peak falls; wind aligns with load

 

What would happen if you converted all the single-family homes in Texas from natural gas to electric heating?

According to a paper from Pecan Street, an Austin-based energy research organization, the transition would reduce climate-warming pollution, save Texas households up to $452 annually on their utility bills, and flip the state from a summer-peaking to a winter-peaking system. And that winter peak would be “nothing the grid couldn’t evolve to handle,” according to co-author Joshua Rhodes.

The report stems from the reality that buildings must be part of any comprehensive climate action plan.

“If we do want to decarbonize, eventually we do have to move into that space. It may not be the lowest-hanging fruit, but eventually we will have to get there,” said Rhodes.

Rhodes is a founding partner of the consultancy IdeaSmiths and an analyst at Vibrant Clean Energy. Pecan Street commissioned the study, which is distilled from a larger original analysis by IdeaSmiths, at the request of the nonprofit Environmental Defense Fund.

In an interview, Rhodes said, “The goal and motivation were to put bounding on some of the claims that have been made about electrification: that if we electrify a lot of different end uses or sectors of the economy...power demand of the grid would double.”

Rhodes and co-author Philip R. White used an analysis tool from the National Renewable Energy Laboratory called ResStock to determine the impact of replacing natural-gas furnaces with electric heat pumps in homes across the ERCOT service territory, which encompasses 90 percent of Texas’ electricity load.

Rhodes and White ran 80,000 simulations in order to determine how heat pumps would perform in Texas homes and how the pumps would impact the ERCOT grid.

The researchers modeled the use of “standard efficiency” (ducted, SEER 14, 8.2 HSPF air-source heat pump) and “superior efficiency” (ductless, SEER 29.3, 14 HSPF mini-split heat pump) heat pump models against two weather data sets — a typical meteorological year, and 2011, a year with extreme weather in both winter and summer.

Emissions were calculated using Texas’ power sector data from 2017. For energy cost calculations, IdeaSmiths used 10.93 cents per kilowatt-hour for electricity and 8.4 cents per therm for natural gas.
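To make that accounting concrete, the sketch below works through the per-home bookkeeping those inputs imply: therms of gas avoided versus kilowatt-hours of electricity added, priced at the rates cited above and converted to CO2 with generic emission factors. The heating load, furnace efficiency, and grid carbon intensity are assumed placeholders rather than figures from the Pecan Street/IdeaSmiths report, and the study’s bill-savings results also reflect cooling-season efficiency gains that this heating-only sketch ignores.

```python
# Minimal sketch of the per-home accounting behind the bill and CO2 deltas.
# Heating load, furnace efficiency, and grid carbon intensity are assumed
# placeholders, not figures from the Pecan Street / IdeaSmiths report.

BTU_PER_THERM = 100_000        # energy content of one therm of natural gas
GAS_KG_CO2_PER_THERM = 5.3     # typical combustion factor (assumption)

def electrify_heating(load_mmbtu=25.0,            # assumed annual space-heating load
                      furnace_afue=0.80,          # assumed efficiency of the replaced furnace
                      hspf=8.2,                   # "standard efficiency" heat pump cited above
                      elec_price=0.1093,          # $/kWh, as cited above
                      gas_price=0.084,            # $/therm, as cited above
                      grid_kg_co2_per_kwh=0.45):  # assumed average grid carbon intensity
    """Return added electricity cost, avoided gas cost, and net CO2 change (kg/yr)."""
    therms_avoided = load_mmbtu * 1e6 / BTU_PER_THERM / furnace_afue
    kwh_added = load_mmbtu * 1e6 / (hspf * 1000)   # HSPF = BTU delivered per Wh consumed
    return {
        "electricity_cost_added": round(kwh_added * elec_price, 2),
        "gas_cost_avoided": round(therms_avoided * gas_price, 2),
        "net_kg_co2": round(kwh_added * grid_kg_co2_per_kwh
                            - therms_avoided * GAS_KG_CO2_PER_THERM, 1),
    }

print(electrify_heating())           # standard-efficiency unit
print(electrify_heating(hspf=14.0))  # "superior efficiency" unit
```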

Nothing the grid can't handle
Rhodes and White modeled six scenarios. All the scenarios resulted in annual household utility bill savings — including the two in which annual electricity demand increased — ranging from $57.82 for the standard efficiency heat pump and typical meteorological year to $451.90 for the high-efficiency heat pump and 2011 extreme weather year.

“For the average home, it was cheaper to switch. It made economic sense today to switch to a relatively high-efficiency heat pump,” said Rhodes. “Electricity bills would go up, but gas bills can go down.”

All the scenarios found carbon savings too, with CO2 reductions ranging from 2.6 million metric tons with a standard efficiency heat pump and typical meteorological year to 13.8 million metric tons with the high-efficiency heat pump in 2011-year weather.

Peak electricity demand in Texas would shift from summer to winter. Because heat pumps provide both high-efficiency space heating and cooling, in the scenario with “superior efficiency” heat pumps the summer peak drops by nearly 24 percent to 54 gigawatts, compared with ERCOT’s 71-gigawatt 2016 summer peak.

The winter peak would increase compared to ERCOT’s 66-gigawatt 2018 winter peak, up by 22.73 percent to 81 gigawatts with standard efficiency heat pumps and up by 10.6 percent to 73 gigawatts with high-efficiency heat pumps.
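The percentage changes quoted above follow directly from the gigawatt figures; a quick check:

```python
# Quick arithmetic check of the cited peak-demand shifts (GW figures from the report).
summer_2016_peak = 71.0   # ERCOT summer peak, GW
winter_2018_peak = 66.0   # ERCOT winter peak, GW

scenarios = {
    "superior-efficiency summer peak": (54.0, summer_2016_peak),
    "standard-efficiency winter peak": (81.0, winter_2018_peak),
    "superior-efficiency winter peak": (73.0, winter_2018_peak),
}

for name, (gw, base) in scenarios.items():
    change = (gw - base) / base * 100
    print(f"{name}: {gw:.0f} GW ({change:+.1f}% vs {base:.0f} GW baseline)")
```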

“The grid could evolve to handle this. This is not a wholesale rethinking of how the grid would have to operate,” said Rhodes.

He added, “There would be some operational changes if we went to a winter-peaking grid. There would be implications for when power plants and transmission lines schedule their downtime for maintenance. But this is not beyond the realm of reality.”

And because Texas’ wind power generation is higher in winter, a winter peak would better match the expected higher load from all-electric heating to the availability of zero-carbon electricity.

 

A conservative estimate
The study presented what are likely conservative estimates of the potential for heat pumps to reduce carbon pollution and lower peak electricity demand, especially when paired with efficiency and demand response strategies that can flatten demand.

Electric heat pumps will become cleaner as more zero-carbon wind and solar power is added to the ERCOT grid. By the end of 2018, 30 percent of the energy used on the ERCOT grid was from carbon-free sources.

According to the U.S. Energy Information Administration, three in five Texas households already use electricity as their primary source of heat, much of it electric-resistance heating. Rhodes and White did not model the energy use and peak demand impacts of replacing that electric-resistance heating with much more energy efficient heat pumps.

“Most of the electric-resistance heating in Texas is located in the very far south, where they don’t have much heating at all,” Rhodes said. “You would see savings in terms of the bills there because these heat pumps definitely operate more efficiently than electric-resistance heating for most of the time.”

Rhodes and White also highlighted areas for future research. For one, their study did not factor in the upfront cost to homeowners of installing heat pumps.

“More study is needed,” they write in the Pecan Street paper, “to determine the feasibility of various ‘replacement’ scenarios and how and to what degree the upgrade costs would be shared by others.”

Research from the Rocky Mountain Institute has found that electrification of both space and water heating is cheaper for homeowners over the life of the appliances in most new construction, when transitioning from propane or heating oil, when a gas furnace and air conditioner are replaced at the same time, and when rooftop solar is coupled with electrification.

More work is also needed to assess the best way to jump-start the market for high-efficiency all-electric heating. Rhodes believes getting installers on board is key.

“Whenever a homeowner’s making a decision, if their system goes out, they lean heavily on what the HVAC company suggests or tells them because the average homeowner doesn’t know much about their systems,” he said.


 

Related News


Gov. Greg Abbott touts Texas power grid's readiness heading into fall, election season

ERCOT Texas Fall Grid Forecast outlines ample power supply, planned maintenance outages, and grid reliability, citing PUC oversight and Gov. Abbott's remarks, with seasonal assessment noting mild demand yet climate risks and conservation alerts.

 

Key Points

ERCOT's seasonal outlook for Texas on fall power supply, outages, and reliability expectations under PUC oversight.

✅ Projects sufficient supply in October and November

✅ Many plants scheduled offline for maintenance

✅ Notes PUC oversight and Abbott's confidence

 

Gov. Greg Abbott said Tuesday that the Texas power grid is prepared for the fall months, pointing to a new seasonal forecast by the state’s grid operator. Fall and spring grid assessments typically draw little attention because of the milder temperatures during those seasons.

Tuesday’s new forecast by the Electric Reliability Council of Texas showed that there should be plenty of power supply to meet demand in October and November. It also showed that many Texas power plants are scheduled to be offline this fall for maintenance work. Texas power plants usually plan to go down in the fall and spring for repairs to improve reliability ahead of the more extreme temperatures in winter and summer, when Texans crank up their heat and air conditioning and raise demand for power.

For at least a decade, ERCOT publicly announced its seasonal forecasts, but it did not do so on Tuesday; the grid operator stopped announcing the reports after the 2021 winter storm. A spokesperson for the grid operator, which posted the report to its website midday without notifying the public or power industry stakeholders, said there were no plans to discuss the latest forecast and referred questions about it to the Public Utility Commission, which oversees ERCOT. Abbott appoints the board of the PUC.

Abbott on Tuesday expressed his confidence about the grid in a news release, which included photos of the governor sitting at a table with incoming ERCOT CEO Pablo Vegas, outgoing interim CEO Brad Jones and Public Utility Commission Chair Peter Lake.

“The State of Texas continues to monitor the reliability of our electric grid, and I thank ERCOT and PUC for their hard work to implement bipartisan reforms we passed last year and for their proactive leadership to ensure our grid is stronger than ever before,” Abbott said in the release.

Abbott has not previously shared or called attention to ERCOT’s forecasts as he did on Tuesday.

Up for reelection this fall, Abbott has faced continued criticism, including from the Sierra Club over his handling of the 2021 deadly power grid disaster, when extended freezing temperatures shut down natural gas facilities and power plants, which rely on each other to keep electricity flowing. The resulting blackouts left millions of Texans without power for days in the cold, and hundreds of people died.

ERCOT’s forecasts for fall and spring are typically the least worrisome seasonal forecasts, energy experts said, because temperatures are usually milder between summer and winter, so demand for power does not skyrocket the way it does during extreme temperatures.

But they’ve warned that climate change could potentially lead to more extreme temperatures during times when Texas hasn’t experienced such weather in the past. For example, in early May, six power plants unexpectedly broke down when a spring heat wave drove power demand up. ERCOT asked Texans to conserve electricity at the time.

Abbott released the seasonal report at a time when he has asserted unprecedented control over ERCOT. Although he had no formal role in ERCOT’s search for a new permanent CEO, he put a stranglehold on the process, The Texas Tribune previously reported. Since the winter storm, Abbott’s office has also dictated what information about the power grid ERCOT has released to the public.

 

Related News

View more

More Managers Charged For Price Fixing At Ukraine Power Producer

DTEK Rotterdam+ price-fixing case scrutinizes alleged collusion over coal-based electricity tariffs in Ukraine, with NABU probing NERC regulators, market manipulation, consumer overpayment, and wholesale pricing tied to imported coal benchmarks.

 

Key Points

NABU probes alleged DTEK-NERC collusion to inflate coal power tariffs via Rotterdam+; all suspects deny wrongdoing.

✅ NABU alleges tariff manipulation tied to coal import benchmarks.

✅ Four DTEK execs and four NERC officials reportedly charged.

✅ Probe centers on 2016-2017 overpayments; defendants contest.

 

Two more executives of DTEK, Ukraine’s largest private power and coal producer, were charged on August 14 in a criminal case involving an alleged conspiracy to fix electricity prices with the state energy regulator, Interfax reported.

They are Ivan Helyukh, the CEO of subsidiary DTEK Grid, and Borys Lisoviy, a top manager of power generation company Skhidenergo, according to Kyiv-based Concorde Capital investment bank.

Ukraine’s Anti-Corruption Bureau (NABU) now alleges that four DTEK managers “pressured” and colluded with four regulators at the National Energy and Utilities Regulatory Commission to manipulate tariffs on electricity generated from coal, forcing consumers to overpay by $747 million in 2016-2017.

 

DTEK allegedly benefited $560 million in the scheme.

All eight suspects are charged with “abuse of office” and deny wrongdoing.

There is “no legitimate basis for suspicions set out in the investigation,” DTEK said in an August 8 statement.

Suspect Dmytro Vovk, the former head of NERC, dismissed the investigation as a “wild goose chase” on Facebook.

In separate statements over the past week, DTEK said the managers who are charged have prematurely returned from vacation to “fully cooperate” with authorities in order to “help establish the truth.”

A Kyiv court on August 14 set bail at $400,000 for one DTEK manager, who wasn’t named.

The so-called Rotterdam+ pricing formula that NABU has been investigating since March 2017 was in place from April 2016 until July of this year.

It based the wholesale price of electricity generated by Ukrainian thermal power plants on coal prices set in the port of Rotterdam plus delivery costs to Ukraine.
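The article does not give the regulator’s actual coefficients, but the general shape of such a “delivered coal cost plus” tariff can be sketched as follows. The heat rate, delivery adder, and non-fuel charge below are illustrative assumptions, not NERC’s formula.

```python
# Hedged sketch of a coal-indexed wholesale tariff of the "Rotterdam+" shape:
# electricity price = (benchmark coal price + delivery cost), converted through an
# assumed plant heat rate. All coefficients here are illustrative placeholders.

def coal_indexed_tariff(coal_usd_per_tonne,          # Rotterdam (API2) coal benchmark
                        delivery_usd_per_tonne,       # assumed freight + port + rail adder
                        mwh_per_tonne=2.4,            # assumed net electricity per tonne of coal
                        non_fuel_usd_per_mwh=15.0):   # assumed O&M, capacity, margin
    """Illustrative $/MWh wholesale price under a 'delivered coal cost plus' formula."""
    fuel_cost_per_mwh = (coal_usd_per_tonne + delivery_usd_per_tonne) / mwh_per_tonne
    return fuel_cost_per_mwh + non_fuel_usd_per_mwh

# A delivery adder raises the computed tariff whether or not the coal was actually
# shipped from Rotterdam -- the crux of NABU's allegation.
print(coal_indexed_tariff(85.0, 15.0))   # with the delivery adder
print(coal_indexed_tariff(85.0, 0.0))    # without it
```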

NABU alleges that at certain times it has seen no documented proof that the purchased coal originated in Rotterdam, insisting that there was no justification for the price hikes.

Ukraine started facing thermal-coal shortages after fighting between government forces and Russia-backed separatists in the eastern part of the country erupted in April 2014. A vast majority of the anthracite-coal mines on which many Ukrainian plants rely are located on territory controlled by the separatists.

Overnight, Ukraine went from being a net exporter of coal to a net importer and started purchasing coal from as far away as South Africa and Australia.

 

Related News

View more

Inside Copenhagen’s race to be the first carbon-neutral city

Hedonistic Sustainability turns Copenhagen's ARC waste-to-energy plant into a public playground, blending ski slope, climbing wall, and trails with carbon-neutral heating, renewables, circular economy design, and green growth for climate action and liveability.

 

Key Points

A design approach fusing public recreation with clean-energy infrastructure to drive carbon-neutral, livable urban growth.

✅ Waste-to-energy plant doubles as recreation hub

✅ Supports carbon-neutral heating and renewables

✅ Stakeholder-driven, scalable urban climate model

 

“We call it hedonistic sustainability,” says Jacob Simonsen of the decision to put an artificial ski slope on the roof of the £485m Amager Resource Centre (Arc), Copenhagen’s cutting-edge new waste-to-energy power plant that feeds the city’s district heating network as well. “It’s not just good for the environment, it’s good for life.”

Skiing is just one of the activities that Simonsen, Arc’s chief executive, and Bjarke Ingels, its lead architect, hope will enhance the latest jewel in Copenhagen’s sustainability crown. The incinerator building also incorporates hiking and running trails, a street fitness gym and the world’s highest outdoor climbing wall, an 85-metre “natural mountain” complete with overhangs that rises the full height of the main structure.

In Copenhagen, green transformation goes hand-in-hand with job creation, a growing economy and a better quality of life

Frank Jensen, lord mayor

It’s all part of Copenhagen’s plan to be net carbon-neutral by 2025. Even now, after a summer that saw wildfires ravage the Arctic Circle and ice sheets in Greenland suffer near-record levels of melt, the goal seems ambitious. In 2009, when the project was formulated, it was positively revolutionary.

“A green, smart, carbon-neutral city,” declared the cover of the climate action plan, before detailing the scale of the challenge: 100 new wind turbines; a 20% reduction in both heat and commercial electricity consumption; 75% of all journeys to be by bike, on foot, or by public transport; the biogas-ification of all organic waste; 60,000 sq metres of new solar panels; and 100% of the city’s heating requirements to be met by renewables.

Radical and far-reaching, the scheme dared to rethink the very infrastructure underpinning the city. There’s still not a climate project anywhere else in the world that comes close.

And, so far, it’s working. CO2 emissions have been reduced by 42% since 2005, and while challenges around mobility and energy consumption remain (new technologies such as better batteries and carbon capture are being implemented), the city says it is on track to achieve its ultimate goal.

More significant still is that Copenhagen has achieved this while continuing to grow in traditional economic terms. Even as some commentators insist that nothing short of a total rethink of free-market economics and corporate structures is required to stave off global catastrophe, the Danish capital’s carbon transformation has happened alongside a 25% growth in its economy over two decades. Copenhagen’s experience will be a model for other world cities as the global energy transition unfolds.

The sentiment that lies behind Arc’s conception as a multi-use public good – “hedonistic sustainability” – is echoed by Bo Asmus Kjeldgaard, former mayor of Copenhagen for the environment and the man originally tasked, back in 2010, with making the plan a reality.

“We combined life quality with sustainability and called it ‘liveability’,” says Kjeldgaard, now CEO of his own climate adaptation company, Greenovation. “We succeeded in building a good narrative around this, one that everybody could believe in.”

The idea was first floated in the late 1990s, when the newly elected Kjeldgaard had a vision of Copenhagen as the environmental capital of Europe. His enthusiasm ran into political intransigence, however, and despite some success, a lack of budget meant most of his work became “just another branding exercise – it was greenwashing”.

We’re such a rich country – change should be easy for us

Claus Nielsen, furniture maker and designer

But after stints as mayor of family and the labour market, and children and young people, he ended up back at environment in 2010 with renewed determination and, crucially, a broader mandate from the city council. “I said: ‘This time, we have to do it right,’” he recalls, “so we made detailed, concrete plans for every area, set the carbon target, and demanded the money and the manpower to make it a reality.”

He brought on board more than 200 stakeholders, from businesses to academia to citizen representatives, and helped them develop 22 specific business plans and 65 separate projects. So far the plan appears on track: there has been a 15% reduction in heat consumption, 66% of all trips in the city are now by bike, on foot or public transport, and 51% of heat and power comes from renewable electricity sources.

The onus placed on ordinary Copenhageners to walk and cycle more, pay higher taxes (especially on cars) and put up with the inconvenience of infrastructure construction has generally been met with understanding and good grace. And while some people remain critical of the fact that Copenhagen airport is not factored into the CO2 calculations – it lies beyond the city’s boundaries – and grumble about precise definitions and formulae, dissent has been rare.

This relative lack of nimbyism and carping about change can, says Frank Jensen, the city’s lord mayor, be traced to longstanding political traditions.

“Caring for the environment and taking responsibility for society in general has been an integral part of the upbringing of many Danes,” he says. “Moreover, there is a general awareness that climate change now calls for immediate, ambitious and collective action.” A 2018 survey by Concito, a thinktank, found that such action was a top priority for voters.

Jensen is keen to stress the cooperative nature of the plan and says “our visions have to be grounded in the everyday lives of people to be politically feasible”. Indeed, involving so many stakeholders, and allowing them to actively help shape both the ends and the means, has been key to the plan’s success so far and the continued goodwill it enjoys. “It’s so important to note that we [the authorities] cannot do this alone,” says Jørgen Abildgaard, Copenhagen’s executive climate programme director.

Many businesses around the world have typically been reluctant to embrace sustainability when a dip in profits or inconvenience might be the result, but not in Copenhagen. Martin Manthorpe, director of strategy, business development and public affairs at NCC, one of Scandinavia’s largest construction and industrial groups, was brought in early on by Abildgaard to represent industry on the municipality’s climate panel, and to facilitate discussions with the wider business community. He thinks there are several reasons why.

“The Danes have a trading mindset, meaning ‘What will I have to sell tomorrow?’ is just as important as ‘What am I producing today?’” he says. “Also, many big Danish companies are still ultimately family-owned, so the culture leans more towards long-term thinking.”

It is, he says, natural for business to be concerned with issues around sustainability and be willing to endure short-term pain: “To do responsible, long-term business, you need to see yourself as part of the larger puzzle that is called ‘society’.”

Furthermore, in Denmark climate change denial is given extremely short shrift. “We believe in the science,” says Anders Haugaard, a local entrepreneur. “Why wouldn’t you? We’re told sustainability brings only benefits and we’ve got no reason to be suspicious.”

“No one would dare argue against the environment,” says his friend Claus Nielsen, a furniture maker and designer. “We’re such a rich country – change should be easy for us.” Nielsen talks about how enlightened his kids are – “my 11-year-old daughter is now a flexitarian” – and says that nowadays he mainly buys organic; Haugaard doesn’t see a problem with getting rid of petrol cars (the whole country is aiming to be fossil fuel-free by 2050).

Above all, there’s a belief that sustainability need not make the city poorer: that innovation and “green growth” can be lucrative in and of themselves. “In Copenhagen, green transformation goes hand-in-hand with job creation, a growing economy and a better quality of life,” says Jensen. “We have also shown that it’s possible to combine this transition with economic growth and market opportunities for businesses, and I think that other countries can learn from our example.”

Besides, as Jensen notes, there is little alternative, and even less time: “National states have failed to take enough responsibility, but cities have the power and will to create concrete solutions. We need to start accelerating their implementation – we need to act now.”

 

Related News

View more

Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed loop systems adapt stimulation via real time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.


Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Related: New study revives a Mozart sonata as a potential epilepsy therapy
Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

Related: A new index measures the extent and depth of addiction stigma
More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
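As a purely illustrative sketch of the closed-loop idea (detect a biomarker crossing a threshold, then deliver a brief pulse rather than stimulating continuously), the toy controller below captures the control pattern. The threshold, smoothing window, and deliver_pulse hook are hypothetical placeholders, not any vendor's device API.

```python
# Toy illustration of closed-loop stimulation logic: stimulate only when a
# smoothed biomarker crosses a threshold, unlike an always-on open-loop device.
# Threshold, window, and deliver_pulse are hypothetical placeholders.
from collections import deque

class ClosedLoopController:
    def __init__(self, threshold, window=5):
        self.threshold = threshold           # biomarker level that triggers a pulse
        self.recent = deque(maxlen=window)   # short rolling window of biomarker samples

    def step(self, biomarker_sample, deliver_pulse):
        """Deliver a brief pulse only when the smoothed biomarker exceeds threshold."""
        self.recent.append(biomarker_sample)
        smoothed = sum(self.recent) / len(self.recent)
        if smoothed > self.threshold:
            deliver_pulse()          # targeted, brief jolt
            return True
        return False                 # otherwise stay quiet

controller = ClosedLoopController(threshold=0.5)
for sample in [0.2, 0.5, 0.9, 1.1, 0.3]:      # synthetic biomarker stream
    controller.step(sample, deliver_pulse=lambda: print("pulse delivered"))
```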

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to a brain stimulation.

Related: Largest psilocybin trial finds the psychedelic is effective in treating serious depression
The exact mechanics of what happens between cells when brain circuits … well, short-circuit, is unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
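A minimal sketch of that analysis pattern, assuming synthetic data and scikit-learn: pair self-reported 0-to-10 ratings with recorded activity features and ask how well a simple model can predict one from the other. The feature count, model choice, and scoring are illustrative assumptions, not the UCSF team's pipeline.

```python
# Minimal sketch: correlate recorded neural features with self-reported 0-10 pain
# ratings. Data are synthetic; features, model, and scoring are illustrative only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 200, 8          # e.g., weeks of samples x spectral band powers
X = rng.normal(size=(n_samples, n_features))
true_w = rng.normal(size=n_features)
pain = np.clip(X @ true_w + rng.normal(scale=1.0, size=n_samples), 0, 10)  # 0-10 scale

model = Ridge(alpha=1.0)
scores = cross_val_score(model, X, pain, cv=5, scoring="r2")
print("cross-validated R^2 of neural features vs. reported pain:", scores.mean())
```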

 

Related News

View more

COVID-19 Pandemic Puts $35 Billion in Wind Energy Investments at Risk, Says Industry Group

COVID-19 Impact on U.S. Wind Industry: disrupting wind power projects, tax credits, and construction timelines, risking rural revenues, jobs, and $35B investments; AWEA seeks Congressional flexibility as OEM shutdowns like Siemens Gamesa intensify delays.

 

Key Points

Pandemic disruptions threaten 25 GW of projects, $35B investment, rural revenues, jobs, and tax-credit timelines.

✅ 25 GW at risk; $35B investment jeopardized

✅ Rural taxes and land-lease payments may drop $8B

✅ AWEA seeks Congressional flexibility on tax-credit deadlines

 

In one of the latest examples of the havoc that the novel coronavirus is wreaking on the U.S. economy, the American Wind Energy Association (AWEA) -- the national trade association for the U.S. wind industry -- yesterday stated its concerns that COVID-19 will "pose significant challenges to the American wind power industry." According to AWEA's calculations, the disease is jeopardizing the development of approximately 25 gigawatts of wind projects, representing $35 billion in investments.

Rural communities, where about 99% of wind projects are located, face particular risk. AWEA estimates that they stand to lose about $8 billion in state and local tax payments and land-lease payments to private landowners. In addition, the pandemic threatens the loss of over 35,000 jobs, including those of wind turbine technicians, construction workers, and factory workers.

The development of wind projects is heavily reliant on earning tax credits. To qualify for the current credits, however, project developers must begin construction before Dec. 31, 2020. With local and state governments implementing various measures to stop the spread of the virus, whether developers can meet this deadline is doubtful. Addressing this and other challenges, AWEA is turning to the government for help. In the trade association's press release, it states that "to protect the industry and these workers, AWEA is asking Congress for flexibility in allowing existing policies to continue working for the industry through this period of uncertainty."

Illustrating one of the ways in which COVID-19 is affecting the industry, Siemens Gamesa, a global leader in the manufacturing of wind turbines, closed a second Spanish factory this week after learning that a second of its employees had tested positive for the novel coronavirus.

 

Related News

View more

Grid coordination opens road for electric vehicle flexibility

Smart EV Charging orchestrates vehicle-to-grid (V2G), demand response, and fast charging to balance the power grid, integrating renewables, electrolyzers for hydrogen, and megawatt chargers for fleets with advanced control and co-optimization.

 

Key Points

Smart EV charging coordinates EV load to stabilize the grid, cut peaks, and integrate renewable energy efficiently.

✅ Reduces peak demand via coordinated, flexible load control

✅ Enables V2G services with renewables and battery storage

✅ Supports megawatt fast charging for heavy-duty fleets

 

As electric vehicle (EV) sales continue to rev up in the United States, the power grid is in parallel contending with the greatest transformation in its 100-year history: the large-scale integration of renewable energy and power electronic devices. The expected expansion of EVs will shift those challenges into high gear, causing cities to face gigawatt-growth in electricity demand and higher amounts of variable energy.

Coordinating large numbers of EVs with the power system presents a highly complex challenge. EVs introduce variable electrical loads that are highly dependent on customer behavior. Electrified transportation involves co-optimization with other energy systems, like natural gas and bulk battery storage. It could involve fleets of automated ride-hailing EVs and lead to hybrid-energy truck stops that provide hydrogen and fast-charging to heavy-duty vehicles.

Those changes will all test the limits of grid integration, but the National Renewable Energy Laboratory (NREL) sees opportunity at the intersection of energy systems and transportation. With powerful resources for simulating and evaluating complex systems, several NREL projects are determining the coordination required for fast charging, balancing electrical supply and demand, and efficient use of all energy assets.


Smart and Not-So-Smart Control
To appreciate the value of coordinated EV charging, it is helpful to imagine the opposite scenario.

"Our first question is how much benefit or burden the super simple, uncoordinated approach to electric vehicle charging offers the grid," said Andrew Meintz, the researcher leading NREL's Electric Vehicle Grid Integration team, as well as the RECHARGE project for smart EV charging. "Then we compare that to the 'whiz-bang,' everything-is-connected approach. We want to know the difference in value."

In the "super simple" approach, Meintz explained that battery-powered electric vehicles grow in market share, exemplified by mass-market EVs, without any evolution in vehicle charging coordination. Picture every employee at your workplace driving home at 5 p.m. and charging their vehicle. That is the grid's equivalent of going 0 to 100 mph, and if it does not wreck the system, it is at least very expensive. According to NREL's Electrification Futures Study, a comprehensive analysis of the impacts of widespread electrification across all U.S. economic sectors, in 2050 EVs could contribute to a 33% increase in energy use during peak electrical demand, underscoring state grid challenges that make these intervals costly when energy reserves are procured. In duck curve parlance, EVs will further strain the duck's neck.

The Optimization and Control Lab's Electric Vehicle Grid Integration bays allow researchers to determine how advanced high power chargers can be added safely and effectively to the grid, with the potential to explore how to combine buildings and EV charging. Credit: Dennis Schroeder, NREL
Meintz's "whiz-bang" approach instead imagines EV control strategies that are deliberate and serve to smooth, rather than intensify, the upcoming demand for electricity. It means managing both when and where vehicles charge to create flexible load on the grid.

At NREL, smart strategies to dispatch vehicles for optimal charging are being developed for both the grid edge, where consumers and energy users connect to the grid, as in RECHARGE, and the entire distribution system, as in the GEMINI-XFC project. Both projects, funded by the U.S. Department of Energy's (DOE's) Vehicle Technologies Office, lean on advanced capabilities at NREL's Energy Systems Integration Facility to simulate future energy systems.

At the grid edge, EVs can be co-optimized with distributed energy resources—small-scale generation or storage technologies—the subject of a partnership with Eaton that brought industry perspectives to bear on coordinated management of EV fleets.

At the larger-system level, the GEMINI-XFC project has extended EV optimization scenarios to the city scale—the San Francisco Bay Area, to be specific.

"GEMINI-XFC involves the highest-ever-fidelity modeling of transportation and the grid," said NREL Research Manager of Grid-Connected Energy Systems Bryan Palmintier.

"We're combining future transportation scenarios with a large metro area co-simulationPDF—millions of simulated customers and a realistic distribution system model—to find the best approaches to vehicles helping the grid."

GEMINI-XFC and RECHARGE can foresee future electrification scenarios and then insert controls that reduce grid congestion or offset peak demand, for example. Charging EVs involves a sort of shell game, where loads are continually moved among charging stations to accommodate grid demand.
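A minimal sketch of that load-shifting idea, assuming hourly time steps and a single feeder: a greedy "valley-filling" heuristic that places each increment of charging energy in the hour with the lowest combined load. It illustrates the concept only and is not the RECHARGE or GEMINI-XFC control algorithm.

```python
# Simple valley-filling heuristic for coordinated charging: allocate the required
# EV energy to the hours with the lowest running total load. Illustrative only;
# not the RECHARGE or GEMINI-XFC control algorithms.

def schedule_charging(base_load_kw, ev_energy_kwh, max_rate_kw):
    """Return per-hour EV charging (kW) that fills valleys in base_load_kw."""
    ev_load = [0.0] * len(base_load_kw)
    remaining = ev_energy_kwh
    while remaining > 1e-9:
        # pick the hour with the lowest combined load that still has charger headroom
        candidates = [h for h in range(len(base_load_kw)) if ev_load[h] < max_rate_kw]
        h = min(candidates, key=lambda i: base_load_kw[i] + ev_load[i])
        add = min(max_rate_kw - ev_load[h], remaining)
        ev_load[h] += add
        remaining -= add
    return ev_load

# Example: a 24-hour feeder profile with an evening peak; 40 kWh of EV charging
# ends up in the overnight valley instead of at 5 p.m.
base = [300, 280, 270, 265, 270, 300, 350, 400, 420, 430, 440, 450,
        460, 455, 450, 470, 520, 600, 650, 620, 560, 480, 400, 340]
print(schedule_charging(base, ev_energy_kwh=40, max_rate_kw=7.2))
```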

But for heavy-duty vehicles, the load is harder to hide. Electrified truck fleets will hit the road soon, creating megawatts of localized power demand. No amount of rerouting can avoid the requirements of charging heavy-duty vehicles or other instances of extreme fast-charging (XFC). To address this challenge, NREL is working with industry and other national laboratories to study and demonstrate the technological buildout necessary to achieve 1+ MW charging stations that are capable of fast charging at very high energy levels for medium- and heavy-duty vehicles.

To reach such a scale, NREL is also considering new power conversion hardware based on advanced materials like wide-bandgap semiconductors, as well as new controllers and algorithms that are uniquely suited for fleets of charge-hungry vehicles. The challenge to integrate 1+ MW charging is also pushing NREL research to higher power: Upcoming capabilities will look at many-megawatt systems that tie in the support of other energy sectors.
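One small piece of that coordination problem is deciding how a station's limited capacity is split among the vehicles plugged in at any moment. The proportional-share sketch below is a hypothetical illustration with an assumed 1.2 MW station cap and per-port limits; it is not NREL's controller design.

```python
# Hypothetical sketch of station-level coordination for 1+ MW charging: split
# an assumed 1.2 MW station cap among connected trucks in proportion to the
# energy each still needs, respecting per-port limits. Leftover headroom from
# capped ports is not redistributed in this simple version.

STATION_CAP_KW = 1_200.0

trucks = [  # (truck_id, energy_still_needed_kwh, port_limit_kw) - assumed values
    ("T1", 300.0, 500.0),
    ("T2", 150.0, 500.0),
    ("T3", 450.0, 350.0),
]

total_need_kwh = sum(need for _, need, _ in trucks)
for truck_id, need_kwh, port_limit_kw in trucks:
    share_kw = STATION_CAP_KW * need_kwh / total_need_kwh
    allocation_kw = min(share_kw, port_limit_kw)
    print(f"{truck_id}: {allocation_kw:.0f} kW")
```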


Renewable In-Roads for Hydrogen

At NREL, the drive toward larger charging demands is being met with larger research capabilities. The announcement of ARIES opens the door to energy systems integration research at a scale 10 times greater than current capabilities: 20 MW, up from 2 MW. Critically, it presents an opportunity to understand how mobility with high energy demands can be co-optimized with other utility-scale assets to benefit grid stability.

"If you've got a grid humming along with a steady load, then a truck requires 500 kW or more of power, it could create a large disruption for the grid," said Keith Wipke, the laboratory program manager for fuel cells and hydrogen technologies at NREL.

Such a high power demand could be partially served by battery storage systems. Or it could be hidden entirely with hydrogen production. Wipke's program, with support from the DOE's Hydrogen and Fuel Cell Technologies Office, has been performing studies into how electrolyzers—devices that use electricity to break water into hydrogen and oxygen—could offset the grid impacts of XFC. These efforts are also closely aligned with DOE's H2@Scale vision for affordable and effective hydrogen use across multiple sectors, including heavy-duty transportation, power generation, and metals manufacturing, among others.

"We're simulating electrolyzers that can match the charging load of heavy-duty battery electric vehicles. When fast charging begins, the electrolyzers are ramped down. When fast charging ends, the electrolyzers are ramped back up," Wipke said. "If done smoothly, the utility doesn't even know it's happening."

NREL researchers Rishabh Jain, Kazunori Nagasawa, and Jen Kurtz are studying how grid integration of electrolyzers could offset the grid impacts of extreme fast-charging. Credit: National Renewable Energy Laboratory
As electrolyzers harness cheap electrons from off-peak periods, a significant amount of hydrogen can be produced on site. That creates a natural energy pathway from discount electricity into a fuel. It is no wonder, then, that several well-known transportation and fuel companies have recently initiated a multimillion-dollar partnership with NREL to advance heavy-duty hydrogen vehicle technologies.
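A back-of-envelope sketch shows the scale of that pathway. The electrolyzer size, its specific energy (roughly 55 kWh per kilogram of hydrogen), and the discount electricity price used here are rough assumptions for illustration, not figures from the NREL work.

```python
# Back-of-envelope sketch: hydrogen produced from discounted off-peak
# electricity. All figures are assumptions for illustration.

electrolyzer_kw = 1_000.0        # assumed 1 MW electrolyzer
off_peak_hours_per_day = 8.0
kwh_per_kg_h2 = 55.0             # assumed electrolyzer specific energy
off_peak_price_usd_per_kwh = 0.03

energy_kwh = electrolyzer_kw * off_peak_hours_per_day
kg_h2_per_day = energy_kwh / kwh_per_kg_h2
electricity_cost_per_kg = energy_kwh * off_peak_price_usd_per_kwh / kg_h2_per_day

print(f"~{kg_h2_per_day:.0f} kg H2/day at an electricity cost of about "
      f"${electricity_cost_per_kg:.2f}/kg")
```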

"The logistics of expanding electric charging infrastructure from 50 kW for a single demonstration battery electric truck to 5,000 kW for a fleet of 100 could present challenges," Wipke said. "Hydrogen scales very nicely; you're basically bringing hydrogen to a fueling station or producing it on site, but either way the hydrogen fueling events are decoupled in time from hydrogen production, providing benefits to the grid."

The long driving range and fast refuel times—including a DOE target of a 10-minute refuel for a truck—have already made hydrogen the standout solution for applications such as warehouse forklifts. Further, NREL is finding that distributed electrolyzers can simultaneously produce hydrogen and improve voltage conditions, which can add much-needed stability to a grid that is accommodating more energy from variable resources.

Those examples that co-optimize mobility with the grid, using diverse technologies, are encouraging NREL and its partners to pursue a new scale of systems integration. Several forward-thinking projects are reimagining urban mobility as a mix of energy solutions that integrate the relative strengths of transportation technologies, which complement each other to fill important gaps in grid reliability.


The Future of Urban Mobility
What will electrified transportation look like at high penetrations? A few NREL projects offer some perspective. Among the most experimental, NREL is helping the city of Denver develop a smart community, integrated with electrified mobility and featuring automated charging and vehicle dispatch.

On another path to advanced mobility, Los Angeles has embarked on a plan to modernize its electricity system infrastructure, reflecting California's EV and grid stability goals—aiming for a 100% renewable energy supply by 2045, along with aggressive electrification targets for buildings and vehicles. Through the Los Angeles 100% Renewable Energy Study, the city is currently working with NREL to assess the full-scale impacts of the transition in a detailed analysis that integrates diverse capabilities across the laboratory.

The transition would include the Port of Long Beach, the busiest container port in the United States.

At the port, NREL is applying the same sort of scenario forecasting and controls evaluation as in other projects, in order to find the optimal mix of technologies that can be integrated for both grid stability and a reliable quality of service: hydrogen fuel-cell and battery EVs, battery storage systems, on-site renewable generation, and tight coordination among them all.

"Hydrogen at ports makes sense for the same reason as trucks: Marine applications have big power and energy demands," Wipke said. "But it's really the synergies between diverse technologies—the existing infrastructure for EVs and the flexibility of bulk battery systems—that will truly make the transition to high renewable energy possible."

Like the Port of Long Beach, transportation hubs across the nation are adapting to a complex environment of new mobility solutions. Airports and public transit stations move passengers, goods, and services at volumes unmatched anywhere else. With the transition to digitally connected electric mobility changing how airports plan for the future, NREL projects such as Athena are using the power of high-performance computing to demonstrate how these hubs can maximize the value of passenger and freight mobility per unit of energy, time, and/or cost.

The growth in complexity for transportation hubs has just begun, however. Looking ahead, fleets of ride-sharing EVs, automated vehicles, and automated ride-sharing EV fleets could present the largest mobility-management challenge yet.


A Self-Driving Power Grid
To understand the full impact of future mobility-service providers, NREL developed the HIVE (Highly Integrated Vehicle Ecosystem) simulation framework. HIVE combines factors related to serving mobility needs and grid operations—such as a customer's willingness to carpool or delay travel, and potentially time-variable costs of recharging—and simulates the outcome in an integrated environment.

"Our question is, how do you optimize the management of a fleet whose primary purpose is to provide rides and improve that fleet's dispatch and charging?" said Eric Wood, an NREL vehicle systems engineer.

HIVE was developed as part of NREL's Autonomous Energy Systems research to optimize the control of automated vehicle fleets: the routing and dispatch of automated electric vehicles.

The project imagines how price signals could influence dispatch algorithms. Consider one customer booking a commute through a ride-hailing app. Out of the fleet of vehicles nearby—variously charged and continually changing locations—which one should pick up the customer?
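A toy dispatch heuristic in the spirit of that question might score each nearby vehicle on deadhead distance and state of charge, then pick the best feasible candidate. The weights, fleet, and feasibility check below are illustrative assumptions, not HIVE's actual logic.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical dispatch heuristic: favor short deadhead distance and healthy
# state of charge, skipping vehicles that cannot complete the trip.
# Weights and fleet data are assumptions for illustration.

@dataclass
class Vehicle:
    vid: str
    x_km: float
    y_km: float
    soc: float          # state of charge, 0..1
    range_km: float     # remaining driving range

def dispatch(vehicles, pickup, trip_km, w_dist=1.0, w_soc=5.0):
    best, best_score = None, float("inf")
    for v in vehicles:
        deadhead = hypot(v.x_km - pickup[0], v.y_km - pickup[1])
        if v.range_km < deadhead + trip_km:      # cannot complete the trip
            continue
        score = w_dist * deadhead + w_soc * (1.0 - v.soc)
        if score < best_score:
            best, best_score = v, score
    return best

fleet = [
    Vehicle("A", 0.5, 0.2, soc=0.9, range_km=180),
    Vehicle("B", 0.1, 0.1, soc=0.2, range_km=35),
    Vehicle("C", 2.0, 1.5, soc=0.6, range_km=110),
]
chosen = dispatch(fleet, pickup=(0.0, 0.0), trip_km=12.0)
print("Dispatch:", chosen.vid if chosen else "no feasible vehicle")
```

Price signals would enter a scheme like this as another term in the score, penalizing vehicles that would need to recharge during expensive hours.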

Now consider the movements of thousands of passengers in a city and thousands of vehicles providing transportation services. With so many agents, moment-to-moment changes in energy supply and demand, and broad diversity in vendor technologies, "we're playing with a lot of parameters," Wood said.

But cutting through all the complexity, and in the midst of massive simulations, the end goal for vehicle-to-grid integration is consistent:

"The motivation for our work is that there are forecasts for significant load on the grid from the electrification of transportation," Wood said. "We want to ensure that this load is safely and effectively integrated, while meeting the expectations and needs of passengers."

The envisioned mix at the Port of Long Beach includes hydrogen fuel-cell and battery EVs, battery storage systems, on-site renewable generation, and tight coordination among them all. Credit: National Renewable Energy Laboratory
True Replacement without Caveats

Electric vehicles are not necessarily helpful to the grid, but they can be. As EVs become established in the transportation sector, NREL is studying how to even out any bumps that electrified mobility could cause on the grid and advance any benefits to commuters or industry.

"It all comes down to load flexibility," Meintz said. "We're trying to decide how to optimally dispatch vehicle charging to meet quality-of-service considerations, while also minimizing charging costs."

 
