NV Energy makes pitch for digital meters

By PennEnergy


Hearings began on the integrated resource plan that power utility NV Energy has filed with the Public Utilities Commission of Nevada.

At issue during the hearings is the utility's $301 million Advanced Service Delivery initiative, which would replace 1.45 million electric meters across the state with digital meters that would help ratepayers track power consumption and enable NV Energy to charge flexible rates based on peak use.

NV Energy presented its case, with executives offering written testimony and with commission staff and intervening companies following up with questions.

If the cross-examinations were any indication, commissioners, agency staffers, consumer advocates and intervenors seemed most concerned about how Advanced Service Delivery will affect rates. They also asked several questions about a lower-cost alternative to the initiative and sought to establish that existing metering is reliable and effective.

NV Energy has obtained $138 million in federal stimulus funds to help finance Advanced Service Delivery. The rest of the funding might have to come from higher rates in a future filing.

Paul Stuhff, a senior deputy attorney general who works for the state Bureau of Consumer Protection, quizzed NV Energy's interim chief financial officer, Kevin Bethel, on whether the utility should be at "risk of recovery" if Advanced Service Delivery's costs exceed its benefits.

Bethel responded that the commission could address Advanced Service Delivery's cost-benefit equation in the utility's next general rate case, scheduled for filing in December 2010.

Stuhff also asked Bethel twice if NV Energy's current metering and distribution system is reliable.

Bethel said it was, and Stuhff answered that "regulatory risk" should come with replacing a system that works.

Stuhff asked Bethel about other major expenses the utility expects to include in its next general rate case.

Investments in NV Energy's $683 million Harry Allen plant in Apex will be among the significant projects included in the general-rate application, Bethel said. Some of the plant's construction costs have already been accounted for in existing NV Energy rates.

Staffers and officials, including Commissioner Alaina Burtenshaw, also pointed to a separate NV Energy contingency plan if the commission doesn't approve Advanced Service Delivery.

The alternative proposal calls for $23 million over three years to augment NV Energy's budget for energy-conservation programs such as Cool Share, a voluntary program through which NV Energy temporarily raises participating customers' thermostats during peak hours to reduce demand.

If the commission gives the go-ahead to Advanced Service Delivery, NV Energy would run a pilot program involving 10,000 ratepayers to test "dynamic," or variable, pricing based on high-use periods. Ratepayer participation in dynamic-pricing tests would be optional.

The company testified that 3,600 customers have signed up for its Time of Use program, through which customers can save money by voluntarily reducing power use from 1 to 7 p.m., June through September.
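The economics of a program like this come down to simple two-tier billing arithmetic: kilowatt-hours used in the peak window cost more than off-peak kilowatt-hours, so shifting usage out of the window lowers the bill. A minimal sketch, with illustrative rates that are not NV Energy's actual tariff:

```python
# Hypothetical two-tier time-of-use billing sketch.
# Rates and usage figures are illustrative, not NV Energy's actual tariff.
PEAK_RATE = 0.25      # $/kWh, 1-7 p.m. during the summer months
OFF_PEAK_RATE = 0.11  # $/kWh, all other hours

def tou_bill(peak_kwh: float, off_peak_kwh: float) -> float:
    """Monthly bill under a two-tier time-of-use tariff."""
    return peak_kwh * PEAK_RATE + off_peak_kwh * OFF_PEAK_RATE

# A household shifting 100 kWh of monthly use out of the peak window:
before = tou_bill(peak_kwh=250, off_peak_kwh=650)
after = tou_bill(peak_kwh=150, off_peak_kwh=750)
print(f"before: ${before:.2f}, after: ${after:.2f}, saved: ${before - after:.2f}")
```

Total consumption is unchanged in this example; the savings come entirely from when the energy is used, which is the behavior such tariffs are designed to reward.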

Also testifying was NV Energy President and Chief Executive Officer Michael Yackira.

Yackira said customers benefit from energy-conservation efforts both as individual ratepayers, because their power bills drop, and as a general group, because of peak-demand reduction.

NV Energy "does not receive direct benefits other than not having to raise capital" to build power plants, Yackira said. "It's a benefit, but an oblique benefit."

Yackira added that NV Energy has enough power-generation capability, through ownership or purchasing contracts, to meet peak demand without issue.

Commission staff members also asked Yackira whether NV Energy was positioned strategically to address potential federal regulations governing greenhouse-gas emissions.

NV Energy is in a "good" position thanks to investments in "highly efficient" plants that yield less carbon dioxide, as well as investments in renewable energy, Yackira said.

NV Energy's integrated-resource plan is a 20-year outline that details how NV Energy expects to obtain, finance and distribute electricity. Hearings related to another major plan component, a $510 million, 235-mile transmission line to link NV Energy's northern and southern power grids, are scheduled to start June 1.

Related News

What Will Drive Utility Revenue When Electricity Is Free?

AI-Powered Utility Customer Experience enables transparency, real-time pricing, smart thermostats, demand response, and billing optimization, helping utilities integrate distributed energy resources, battery storage, and microgrids while boosting customer satisfaction and reducing costs.

 

Key Points

An approach where utilities use AI and real-time data to personalize service, optimize billing, and cut energy costs.

✅ Real-time pricing aligns retail and wholesale market signals

✅ Device control via smart thermostats and home energy management

✅ Analytics reveal appliance-level usage and personalized savings

 

The latest electric utility customer satisfaction survey results from the American Customer Satisfaction Index (ACSI) Energy Utilities report reveal that nearly every investor-owned utility saw customer satisfaction go down from 2018 to 2019. Residential customers are sending a clear message in the report: They want more transparency and control over energy usage, billing and ways to reduce costs.

With both customer satisfaction and utility revenues on the decline, utilities are facing daunting challenges to their traditional business models amid flat electricity demand across many markets today. That said, it is the utilities that see these changing times as an opportunity to evolve that will become the energy leaders of tomorrow, where the customer relationship is no longer defined by sales volume but instead by a utility company's ability to optimize service and deliver meaningful customer solutions.

We have seen how the proliferation of centralized and distributed renewables on the grid has already dramatically changed the cost profile of traditional generation and variability of wholesale energy prices. This signals the real cost drivers in the future will come from optimizing energy service with things like batteries, microgrids and peer-to-peer trading networks. In the foreseeable future, flat electricity rates may be the norm, or electricity might even become entirely free as services become the primary source of utility revenue.

The key to this future is technological innovation that allows utilities to better understand a customer’s unique needs and priorities and then deliver personalized, well-timed solutions that make customers feel valued and appreciated as their utility helps them save and alleviates their greatest pain points.

I predict that utilities that adopt new technologies focused on customer experience and deliver continual service improvements and optimization will earn the most satisfied, most loyal customers.

To illustrate this, look at how fixed pricing is applied to most residential customers today. Unless you live in a state with deregulated utilities, where customers are free to choose a service provider in a competitive marketplace, fixed-rate or time-of-use tariffs might be the only rate structures you have ever known, though new rate designs are being tested nationwide. These tariffs are often market distortions, bearing little relation to the real-time price that the utility pays on the wholesale market.

It is easy enough to compare the rate you pay as a consumer with the market rate that utilities pay. The California ISO, like other grid operators, maintains a public dashboard showing the real-time marginal cost of energy. On a recent Friday, for example, a buyer in San Francisco could procure electricity on the real-time market at around 9.5 cents per kilowatt-hour (kWh), while a residential customer pays the utility PG&E between 22 and 49 cents per kWh, depending on usage.
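The gap between those two numbers is easy to quantify. The sketch below uses the figures cited above (9.5 cents wholesale, 22 to 49 cents retail) and a hypothetical monthly usage to show the implied markup:

```python
# Wholesale vs. retail cost comparison using the rates cited in the text.
# The monthly usage figure is a hypothetical household, for illustration only.
monthly_kwh = 500

wholesale = monthly_kwh * 0.095    # ~9.5 c/kWh real-time market rate
retail_low = monthly_kwh * 0.22    # low end of the quoted retail range
retail_high = monthly_kwh * 0.49   # high end of the quoted retail range

print(f"wholesale cost: ${wholesale:.2f}")
print(f"retail cost:    ${retail_low:.2f} to ${retail_high:.2f}")
print(f"implied markup: {retail_low / wholesale:.1f}x to {retail_high / wholesale:.1f}x")
```

The retail rate is, of course, covering transmission, distribution, and other costs beyond the energy itself; the point is that customers rarely see how the two prices relate.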

The problem is that utility customers do not usually see this data or know how to interpret it in a way that helps add value to their service or drive down the cost.

This is a scenario ripe for innovation. Artificial intelligence (AI) technologies are beginning to give customers the transparency and control over energy that they desire, and a new type of utility is emerging that uses real-time pricing signals from wholesale markets to give households hassle-free energy savings. Evolve Energy in Texas is developing a utility service model that delivers electricity to consumers at real-time market prices and connects to smart thermostats and other connected devices in the home for simple monitoring and control, all managed via an intuitive consumer app.

My company, Bidgely, partners with utilities and energy retailers all over the world to apply artificial intelligence and machine learning algorithms to customer data in order to bring transparency to their electricity bills, showing exactly where the customers’ money is going down to the appliance and offering personalized, actionable advice on how to save.

Another example is from energy management company Keewi. Its wireless outlet adaptors are revealing real-time energy usage information to Texas A&M dorm residents as well as providing students the ability to conserve energy through controlling items in their rooms from their smartphones.

These are but a few examples of innovations among many in play that answer the consumer demand for increased transparency and control over energy usage.

Electric service providers, including those in traditionally regulated markets, will be closely watching how consumers respond to AI-driven innovation. While regulated utilities have no reason to fear that their customers might sign up with a competitor, they understand that revenues from electricity sales are going down and the deployment of distributed energy resources is going up. Both trends were reflected in a March report from Bloomberg New Energy Finance (via ThinkProgress), which found that unsubsidized storage projects co-located with solar or wind are starting to compete with coal and gas for dispatchable power. Change is coming to regulated markets, and some of that change can be attributed to customer dissatisfaction with utility service.

Like so many industries before, the utility-customer relationship is on track to become less about measuring unit sales and more about driving revenue through services and delivering the best customer value. Loyal customers are most likely to listen and follow through on the utility’s advice and to trust the utility for a wide range of energy-related products and services. Utilities that make customer experience the highest priority today will emerge tomorrow as the leaders of a new energy service era.

 

Related News


Finland Investigates Russian Ship After Electricity Cable Damage

Finland Shadow Fleet Cable Investigation details suspected Russia-linked sabotage of Baltic Sea undersea cables, AIS dark activity, and false-flag tactics threatening critical infrastructure, prompting NATO and EU vigilance against hybrid warfare across Northern Europe.

 

Key Points

Finland probes suspected sabotage of undersea cables by a Russia-linked vessel operating under a flag of convenience with its AIS off.

✅ Undersea cable damage in Baltic Sea sparks security alerts

✅ Suspected shadow fleet ship ran AIS dark under false flag

✅ NATO and EU boost maritime surveillance, critical infrastructure

 

In December 2024, Finland launched an investigation into a ship allegedly linked to Russia’s “shadow fleet” following a series of incidents involving damage to undersea cables. The investigation has raised significant concerns in Finland and across Europe, as it suggests possible sabotage or other intentional acts related to the disruption of vital communication and energy infrastructure in the Baltic Sea region. This article explores the key details of the investigation, the role of Russia’s shadow fleet, and the broader geopolitical implications of this event.

The "Shadow Fleet" and Its Role

The term “shadow fleet” refers to a collection of ships, often disguised or operating under false flags, that are believed to be part of Russia's covert maritime operations. These vessels are typically used for activities such as smuggling, surveillance, and potentially military operations. In recent years, the “shadow fleet” has come under increasing scrutiny due to its involvement in various clandestine actions, especially in regions close to NATO member countries and areas with sensitive infrastructure.

Russia’s "shadow fleet" operates in the margins of regular international shipping and is hard to track because of deceptive practices such as turning off automatic identification system (AIS) transponders. This makes it difficult for authorities to monitor the vessels' movements and assess their true purpose, and it raises alarm when one of these ships is suspected of damaging vital infrastructure like undersea cables.
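Detecting "AIS dark" behavior is, at its simplest, a gap analysis over a vessel's position reports: any interval between consecutive transmissions longer than some threshold is a candidate dark period. A minimal sketch over hypothetical timestamps (real tracking systems also have to account for receiver coverage gaps and varying reporting rates):

```python
from datetime import datetime, timedelta

# Hypothetical AIS position-report timestamps for one vessel.
reports = [
    datetime(2024, 12, 24, 10, 0),
    datetime(2024, 12, 24, 10, 6),
    datetime(2024, 12, 24, 15, 30),  # long silence before this report
    datetime(2024, 12, 24, 15, 36),
]

def dark_periods(timestamps, max_gap=timedelta(minutes=30)):
    """Return (start, end) windows where the gap between consecutive
    AIS reports exceeds max_gap, i.e. the vessel may have gone dark."""
    ts = sorted(timestamps)
    return [(a, b) for a, b in zip(ts, ts[1:]) if b - a > max_gap]

for start, end in dark_periods(reports):
    print(f"AIS dark from {start} to {end} ({end - start})")
```

In this sketch the vessel goes silent for over five hours mid-day; investigators would then correlate such windows with the vessel's last known position and any nearby infrastructure.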

The Cable Damage Incident

The investigation was sparked after damage was discovered to an undersea cable in the Baltic Sea, a vital link for communication, data transmission, and energy supply between Finland and other parts of Europe. These undersea cables are crucial for everything from internet connections to energy-grid stability, and any disruption to them can have serious consequences.

Finnish authorities reported that the damage appeared to be deliberate, raising suspicions of sabotage. The damage coincided with a period of heightened tensions between Russia and the West, particularly following the escalation of the war in Ukraine and ongoing geopolitical instability. This has led many to speculate that the damage to the cables could be part of a broader strategy to undermine European security and disrupt critical infrastructure.

Upon further investigation, a vessel that had been in the vicinity at the time of the damage was identified as potentially being part of Russia’s "shadow fleet." The ship had been operating under a false flag and had disabled its AIS system, making it challenging for authorities to track its movements. The vessel’s activities raised red flags, and Finnish authorities are now working closely with international partners to ascertain its involvement in the incident.

Geopolitical Implications

The damage to undersea cables and the suspected involvement of Russia's "shadow fleet" have broader geopolitical implications, particularly for Europe's security landscape. Undersea cables are considered critical infrastructure, and any deliberate attack on them could be seen as an act of war or an attempt to destabilize regional security.

In the wake of the investigation, there has been increased concern about the vulnerability of Europe's energy and communication networks, which rely increasingly on these undersea connections, and policymakers are reassessing resilience measures. The European Union, alongside NATO, has expressed growing alarm over potential threats to this infrastructure, especially as tensions with Russia continue to escalate.

The incident also highlights the growing risks associated with hybrid warfare tactics, which combine conventional military actions with cyberattacks, sabotage, and disinformation campaigns. The targeting of undersea cables could be part of a broader strategy by Russia to disrupt Europe's ability to coordinate and respond effectively, particularly in the context of ongoing sanctions and diplomatic pressure.

Furthermore, the suspected involvement of a "shadow fleet" ship raises questions about the transparency and accountability of maritime activities in the region. The use of vessels operating under false flags or without identification systems complicates efforts to monitor and regulate shipping in international waters. This has led to calls for stronger maritime security measures and greater cooperation between European countries to ensure the safety and integrity of critical infrastructure.

Finland’s Response and Ongoing Investigation

In response to the cable damage incident, Finnish authorities have mobilized a comprehensive investigation to determine the extent of the damage and whether the actions were deliberate or accidental. The Finnish government has called for increased vigilance and cooperation with international partners to identify and address potential threats to undersea infrastructure.

Finland, which shares a border with Russia and has been increasingly concerned about its security in the wake of Russia's invasion of Ukraine, has ramped up its defense posture. The damage to undersea cables serves as a stark reminder of the vulnerabilities that come with an interconnected global infrastructure, and Finland’s security services are likely to scrutinize the incident as part of their broader defense strategy.

Additionally, the incident is being closely monitored by NATO and the European Union, both of which have emphasized the importance of safeguarding critical infrastructure. As an EU member and NATO partner, Finland’s response to this situation could influence how Europe addresses similar challenges in the future.

The investigation into the damage to undersea cables in the Baltic Sea, allegedly linked to Russia’s "shadow fleet," has significant implications for European security. The use of covert operations, including the deployment of ships under false flags, underscores the growing threats to vital infrastructure in the region. With tensions between Russia and the West continuing to rise, the potential for future incidents targeting critical communication and energy networks is a pressing concern.

As Finland continues its investigation, the incident highlights the need for greater international cooperation and vigilance in safeguarding undersea cables and other critical infrastructure. In a world where hybrid warfare tactics are becoming increasingly common, ensuring the security of these vital connections will be crucial for maintaining stability in Europe. The outcome of this investigation may serve as a crucial case study in the ongoing efforts to protect infrastructure from emerging and unconventional threats.

 

Related News


Net-Zero Emissions Might Not Be Possible Without Nuclear Power

Nuclear Power for Net-Zero Grids anchors reliable baseload, integrating renewables with grid stability as solar, wind, and battery storage scale. Advanced reactors complement hydropower, curb natural gas reliance, and accelerate deep decarbonization of electricity systems.

 

Key Points

Uses nuclear baseload and advanced reactors to stabilize power grids and integrate higher shares of variable renewables.

✅ Provides firm, zero-carbon baseload for renewable-heavy grids

✅ Reduces natural gas dependence and peaker emissions

✅ Advanced reactors enhance safety, flexibility, and cost

 

Declining solar, wind, and battery technology costs are helping to grow the share of renewables in the world’s power mix to the point that governments are pledging net-zero emission electricity generation in two to three decades to fight global warming.

Yet, electricity grids will continue to require stable baseload to incorporate growing shares of renewable energy sources and ensure lights are on even when the sun doesn’t shine, or the wind doesn’t blow. Until battery technology evolves enough—and costs fall far enough—to allow massive storage and deployment of net-zero electricity to the grid, the systems will continue to need power from sources other than solar and wind.

For now, those sources will be natural gas and nuclear power, despite concerns about emissions from natural gas and the potential for disasters at nuclear facilities such as those at Chernobyl and Fukushima.

As natural gas comes to be seen as just another fossil fuel, nuclear power generation provides carbon-free electricity to the countries that have it, and it could be the key to a stable power grid capable of absorbing growing shares of solar and wind generation.

The United States, where nuclear energy currently provides more than half of the carbon-free electricity, is supporting the development of advanced nuclear reactors as part of the clean energy strategy.

But Europe, which has set a goal of carbon neutrality by 2050, could see growing power-sector emissions within a decade, as many nuclear reactors are slated for decommissioning and questions remain over whether its aging fleet can bridge the gap. The gap left by lost nuclear capacity is most easily filled by natural gas-fired generation, which, if it happens, could undermine the European Union's net-zero goals and its ambition to be a world leader in the fight against climate change.

 

U.S. Power Grid Will Need Nuclear For Net-Zero Emissions

A 2020 report from the University of California, Berkeley, said that rapidly declining solar, wind, and storage prices make it entirely feasible for the U.S. to meet 90 percent of its power needs from zero-emission energy sources by 2035 with zero increases in customer costs from today’s levels.

Still, natural gas-fired generation will be needed for 10 percent of America’s power needs. According to the report, in 2035 it would be possible that “during normal periods of generation and demand, wind, solar, and batteries provide 70% of annual generation, while hydropower and nuclear provide 20%.” Even with an exponential rise in renewable power generation, the U.S. grid will need nuclear power and hydropower to be stable with such a large share of solar and wind.
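The scenario's shares can be sanity-checked with basic arithmetic: 70 percent from wind, solar, and batteries plus 20 percent from hydropower and nuclear leaves a 10 percent residual for natural gas, matching the report's figure:

```python
# Annual generation shares from the Berkeley report's 2035 scenario,
# as quoted above; the residual is the share left to natural gas.
shares = {
    "wind, solar, batteries": 0.70,
    "hydropower + nuclear": 0.20,
}
gas_share = 1.0 - sum(shares.values())
print(f"residual natural gas share: {gas_share:.0%}")
```

The arithmetic is trivial, but it makes the report's point concrete: even an aggressively renewable grid still assigns a fifth of annual generation to firm hydro and nuclear output.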

The U.S. Backs Advanced Nuclear Reactor Technology

The U.S. Department of Energy is funding programs of private companies under DOE’s new Advanced Reactor Demonstration Program (ARDP) to showcase next-gen nuclear designs for U.S. deployment.

“Taking leadership in advanced technology is so important to the country’s future because nuclear energy plays such a key role in our clean energy strategy,” U.S. Secretary of Energy Dan Brouillette said at the end of December when DOE announced it was financially backing five teams to develop and demonstrate advanced nuclear reactors in the United States.

“All of these projects will put the U.S. on an accelerated timeline to domestically and globally deploy advanced nuclear reactors that will enhance safety and be affordable to construct and operate,” Secretary Brouillette said.

According to the Washington, D.C.-based Nuclear Energy Institute (NEI), a policy organization of the nuclear technologies industry, nuclear energy provides nearly 55 percent of America's carbon-free electricity. That is more than 2.5 times the amount generated by hydropower, nearly 3 times the amount generated by wind, and more than 12 times the amount generated by solar. Nuclear energy can help the United States achieve the deep decarbonization needed to hit its climate goals.
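Those ratios can be sanity-checked against the 55 percent figure: dividing nuclear's share by each stated multiple yields implied shares for hydropower, wind, and solar that sum to roughly 100 percent of carbon-free generation:

```python
# Consistency check on the NEI figures cited above: nuclear supplies ~55%
# of U.S. carbon-free electricity, stated as ~2.5x hydro, ~3x wind, ~12x solar.
nuclear = 55.0
hydro = nuclear / 2.5   # implied hydropower share
wind = nuclear / 3      # implied wind share
solar = nuclear / 12    # implied solar share
total = nuclear + hydro + wind + solar
print(f"hydro {hydro:.1f}%, wind {wind:.1f}%, solar {solar:.1f}%, total {total:.1f}%")
```

The implied shares total just under 100 percent, which is consistent with these four sources making up nearly all U.S. carbon-free generation at the time.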

 

Europe Could See Rising Emissions Without Nuclear Power

While the United States is doubling down on efforts to develop advanced, cheaper nuclear reactors, including microreactors and other new reactor technologies, Europe could be headed for growing power-sector emissions as nuclear facilities are decommissioned over the next decade, losing nuclear power just when it most needs the energy, according to a Reuters analysis from last month.

In many cases, it will be natural gas that comes to the rescue of power grids, ensuring stability and sufficient capacity during peak demand, because solar and wind generation is variable and weather-dependent.

For example, Germany, the biggest economy in Europe, is boosting its renewables targets, but it is also phasing out nuclear power by next year, while its deadline for phasing out coal-fired generation is 2038, more than a decade later than the mid-2020s phase-out plans of the UK and Italy.

The UK, which left the EU last year, included support for nuclear power generation as one of the ten pillars in ‘The Ten Point Plan for a Green Industrial Revolution’ unveiled in November.

The UK’s National Grid has issued several warnings about tight supply since the fall of 2020, due to low renewable output amid high demand.

“National Grid’s announcement underscores the urgency of investing in new nuclear capacity, to secure reliable, always-on, emissions-free power, alongside other zero-carbon sources. Otherwise, we will continue to burn gas and coal as a fallback and fall short of our net zero ambitions,” Tom Greatrex, Chief Executive of the Nuclear Industry Association, said in response to one of those warnings.

But it’s in the UK that one major nuclear power plant project has notoriously seen a delay of nearly a decade—Hinkley Point C, originally planned in 2007 to help UK households to “cook their 2017 Christmas turkeys”, is now set for start-up in the middle of the 2020s.

Nuclear power development and plant construction is expensive, but it could save the plans for low-carbon emission power generation in many developed economies, including in the United States.

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed loop devices, targeting neural circuits for addiction, depression, Parkinsons, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinsons, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed loop systems adapt stimulation via real time biomarker detection

✅ Emerging uses: addiction, depression, Parkinsons, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain's circuitry, including how targeted stimulation can improve short-term memory in older adults. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health's massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for patients with, for example, Parkinson's disease or epilepsy help only a minority of them, and expectations can outpace evidence. "If it was the Bible, it would be the first chapter of Genesis," said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person's brain data, and how best to involve patients in the study of the human brain's far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson’s manifests slightly differently, a lesson that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what happens after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
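The open-loop/closed-loop distinction can be sketched in a few lines of Python. This is a toy illustration only, not any real device's API; the class name, threshold, and sensor readings are all invented:

```python
from collections import deque

class ClosedLoopStimulator:
    """Toy closed-loop controller: stimulate only when a sensed
    biomarker crosses a trigger threshold (open-loop stimulation,
    by contrast, would deliver current continuously)."""

    def __init__(self, threshold, window=5):
        self.threshold = threshold          # biomarker level that triggers a pulse
        self.recent = deque(maxlen=window)  # short history of sensed activity

    def step(self, biomarker_reading):
        """Return True (deliver a brief pulse) when the smoothed
        biomarker meets or exceeds the trigger threshold."""
        self.recent.append(biomarker_reading)
        smoothed = sum(self.recent) / len(self.recent)
        return smoothed >= self.threshold

stim = ClosedLoopStimulator(threshold=0.6)
readings = [0.2, 0.3, 0.9, 1.1, 1.2]   # rising pathological activity
pulses = [stim.step(r) for r in readings]
# pulses: no stimulation while activity is low, pulses once it climbs
```

A real implant's decision logic is, of course, learned from patient-specific data rather than a fixed smoothing rule; the sketch only shows why closed-loop devices need software in the loop at all.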

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
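The pairing-and-comparing step can be illustrated with a toy computation. The numbers below are invented, and a real analysis uses machine-learning models over weeks of recordings rather than a single correlation coefficient, but the idea is the same: pair each 0-to-10 self-report with a brain-activity feature from the same time window and ask how well the two streams track each other.

```python
from statistics import mean

ratings  = [2, 3, 5, 7, 8, 6, 4]  # patient's 0-to-10 pain reports (invented)
activity = [0.21, 0.30, 0.48, 0.71, 0.80, 0.57, 0.39]  # activity in a candidate region (invented)

def pearson(xs, ys):
    """Pearson correlation: +1 means the feature rises and falls
    exactly in step with reported pain, 0 means no linear relation."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(ratings, activity)  # near 1.0 here: the feature tracks reported pain
```

A feature that correlates this strongly across months of recordings would be a candidate objective marker of the patient's subjective pain.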

 

Related News


Is a Resurgence of Nuclear Energy Possible in Germany?

Germany's nuclear phase-out reflects a decisive energy policy shift: reactors retired and firms shunning new builds amid high costs, radioactive waste challenges, climate goals, insurance gaps, and debate over small modular reactors and subsidies.

 

Key Points

Germany's policy to end nuclear plants and block new builds, emphasizing safety, waste, climate goals, and viability.

✅ Driven by safety risks, waste storage limits, and insurance gaps

✅ High capital costs and subsidies make new reactors uneconomic

✅ Political debate persists; SMRs raise cost and proliferation concerns

 

A year has passed since Germany deactivated its last three nuclear power plants, marking a significant shift in its energy policy.

Nuclear fission, once heralded as the future of energy in Germany during the 1960s, was initially embraced with minimal concern for the potential risks of nuclear accidents. As Heinz Smital from Greenpeace recalls, the early optimism was partly driven by national interest in nuclear weapon technology rather than by energy companies' initiatives.

Jochen Flasbarth, State Secretary in the Ministry of Development, reflects on that era, noting Germany's strong, almost naive, belief in technology. Germany, particularly the Ruhr region, grappled with smog-filled skies due to heavy industrialization and coal-fired power plants. Nuclear energy presented a "clean" alternative at the time.

This sentiment was also prevalent in East Germany, where the first commercial nuclear power plant came online in 1961. In total, 37 nuclear reactors were activated across Germany, reflecting a widespread confidence in nuclear technology.

However, the 1970s saw a shift in attitudes. Environmental activists protested the construction of new power plants, symbolizing a generational rift. The 1979 Three Mile Island incident in the US, followed by the catastrophic Chornobyl disaster in 1986, further eroded public trust in nuclear energy.

The Chornobyl accident, in particular, significantly dampened Germany's nuclear ambitions, according to Smital. Post-Chornobyl, plans for additional nuclear power plants in Germany, once numbering 60, drastically declined.

The emergence of the Green Party in 1980, rooted in anti-nuclear sentiment, and its subsequent rise to political prominence further influenced Germany's energy policy. The Greens, joining forces with the Social Democrats in 1998, initiated a move away from nuclear energy, facing opposition from the Christian Democrats (CDU) and Christian Social Union (CSU).

However, the Fukushima disaster in 2011 prompted a policy reversal from CDU and CSU under Chancellor Angela Merkel, leading to Germany's eventual nuclear phase-out in March 2023, after briefly extending nuclear power amid the energy crisis.

Recently, the CDU and CSU have revised their stance once more, signaling a potential U-turn on the nuclear phaseout, advocating for new nuclear reactors and the reactivation of the last shut-down plants, citing climate protection and rising fossil fuel costs. CDU leader Friedrich Merz has lamented the shutdown as a "black day for Germany." However, these suggestions have garnered little enthusiasm from German energy companies.

Steffi Lemke, the Federal Environment Minister, isn't surprised by the companies' reluctance, noting their longstanding opposition to nuclear power, which she argues would do little to solve the gas issue in Germany, due to its high-risk nature and the long-term challenge of radioactive waste management.

Globally, 412 reactors are operational across 32 countries, with the total number remaining relatively stable over the years. While countries like China, France, and the UK plan new constructions, there is growing interest in small, modern reactors, which Smital of Greenpeace views with skepticism, noting their potential military applications.

In Germany, the unresolved issue of nuclear waste storage looms large. With temporary storage facilities near power plants proving inadequate for long-term needs, the search for permanent sites faces resistance from local communities and poses financial and logistical challenges.

Environment Minister Lemke underscores the economic impracticality of nuclear energy in Germany, citing prohibitive costs and the necessity of substantial subsidies and insurance exemptions.

As things stand, the resurgence of nuclear power in Germany appears unlikely, with economic factors playing a decisive role in its future.

 


More Polar Vortex 2021 Fallout (and Texas Two-Step): Monitor For ERCOT Identifies Improper Payments For Ancillary Services

This summary of ERCOT ancillary services clawbacks and VOLL pricing covers PUCT and IMM actions on load shed, real-time pricing adders, clawbacks, and settlement corrections after the 2021 winter storm in the Texas power market.

 

Key Points

Policies addressing clawbacks for unprovided AS and correcting VOLL-based price adders after load shed ended in ERCOT.

✅ PUCT ordered clawbacks for ancillary services not delivered.

✅ IMM urged price correction after firm load shed ceased.

✅ ERCOT's VOLL adder raised costs by $16B during 32 hours.

 

Potomac Economics, the Independent Market Monitor (IMM) for the Electric Reliability Council of Texas (ERCOT), filed a report with the Public Utility Commission of Texas (PUCT) stating that certain payments were made by ERCOT for Ancillary Services (AS) that were not provided.

According to the IMM (emphasis added):

There were a number of instances during the operating days outlined above in which AS was not provided in real time because of forced outages or derations. For market participants that are not able to meet their AS responsibility, typically the ERCOT operator marks the short amount in the software. This causes the AS responsibility to be effectively removed and the day-ahead AS payment to be clawed back in settlement. However, the ERCOT operators did not complete this task during the winter event, and therefore the "failure to provide" settlements were not invoked in real time.

Removing the operator intervention step and automating the "failure to provide" settlement was contemplated in NPRR947: Clarification to Ancillary Service Supply Responsibility Definition and Improvements to Determining and Charging for Ancillary Service Failed Quantities; however, the NPRR was withdrawn in August 2020 because of the system cost, some complexities related to AS trades, and the implementation of real-time co-optimization.

Invoking the "failure to provide" settlement for all AS that market participants failed to provide during the operating days outlined above will produce market outcomes and settlements consistent with underlying market principles. In this case, the principle is that market participants should not be paid for services that they do not provide. Whether ERCOT marked the short amount in real time or not should not affect the settlement of these ancillary services.
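The settlement principle at stake is simple arithmetic. Here is a minimal sketch with invented quantities and field names; ERCOT's actual settlement formulas are considerably more involved:

```python
def clawback(awarded_mw, delivered_mw, da_price_per_mw):
    """Claw back the day-ahead ancillary service payment for the
    portion of an award that was not delivered in real time."""
    shortfall = max(awarded_mw - delivered_mw, 0.0)
    return shortfall * da_price_per_mw

# A unit awarded 100 MW of reserves at $50/MW that delivered only
# 40 MW (e.g., due to a forced outage) returns payment on the 60 MW
# shortfall; a unit that delivered its full award returns nothing.
owed_back = clawback(awarded_mw=100.0, delivered_mw=40.0, da_price_per_mw=50.0)
# owed_back == 3000.0
```

The IMM's point is that this calculation should apply whether or not an operator manually marked the shortfall during the event.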

On March 3, 2021, the PUCT ordered that:

ERCOT shall claw back all payments for ancillary service that were made to an entity that did not provide its required ancillary service during real time on ERCOT operating days starting February 14, 2021 and ending on February 19, 2021.

On March 4, 2021, the IMM filed another report and recommended that:

the [PUCT] direct ERCOT to correct the real-time prices from 0:00 February 18, 2021, to 09:00 February 19, 2021, to remove the inappropriate pricing intervention that occurred during that time period.

The IMM approvingly noted the PUCT's February 15, 2021 order, which mandated that real-time energy prices reflect firm load shed by setting prices at the value of lost load (VOLL).

According to the IMM (emphasis added):

This is essential in an energy-only market, like ERCOT's, because it provides efficient economic signals to increase the electric generation needed to restore the load and service it reliably over the long term.

Conversely, it is equally important that prices not reflect VOLL when the system is not in shortage and load is being served. The Commission recognized this principle in its Order, expressly stating that it is only ERCOT's out-of-market shedding of firm load that is required to be reflected in prices. Unfortunately, ERCOT exceeded the mandate of the Commission by continuing to set prices at VOLL long after it ceased the firm load shed.

ERCOT recalled the last of the firm load shed instructions at 23:55 on February 17, 2021. Therefore, in order to comply with the Commission Order, the pricing intervention that raised prices to VOLL should have ended immediately at that time. However, ERCOT continued to hold prices at VOLL by inflating the Real-Time On-Line Reliability Deployment Price Adder for an additional 32 hours through the morning of February 19. This decision resulted in $16 billion in additional costs to ERCOT's market, of which roughly $1.5 billion was uplifted to load-serving entities to provide make-whole payments to generators for energy that was not needed or produced.
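The cost mechanism the IMM describes is easy to sketch. The VOLL figure below matches ERCOT's $9,000/MWh systemwide offer cap at the time, but the average load and counterfactual price are assumptions chosen only to show the arithmetic; they are not intended to reproduce the IMM's $16 billion estimate:

```python
# Back-of-the-envelope cost of holding real-time prices at VOLL after
# firm load shed ended. All inputs except VOLL and the 32-hour window
# are illustrative assumptions.
VOLL = 9_000            # $/MWh, ERCOT's systemwide offer cap at the time
hours = 32              # pricing held at VOLL after load shed ended
avg_load_mw = 45_000    # assumed average system load over the window
counterfactual = 1_200  # assumed clearing price without the adder ($/MWh)

energy_mwh = avg_load_mw * hours
extra_cost = energy_mwh * (VOLL - counterfactual)
# roughly $11.2 billion under these assumed inputs
```

Even with conservative assumed inputs, pricing an entire system's load at the cap for more than a day produces costs in the billions, which is why the 32-hour overrun mattered so much.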

However, at its March 5, 2021, open meeting, although the PUCT acknowledged the "good points" raised by the IMM, the PUCT was not willing to retroactively adjust its real-time pricing for this period, out of concern that some related transactions (ICE futures and others) may have already settled and about the unintended consequences of such retroactive adjustments.

 
