LANL officials see lab's mission evolving

By Santa Fe New Mexican


"Complex transformation may have been the buzz-phrase of the year at Los Alamos National Laboratory, but actually, the lab has never been a stranger to mission shifts.

In the 1970s, the lab played a large role in energy initiatives for the Carter administration; in the '80s, it did a lot of work with the FBI; and now, while the lab's historic focus on nuclear nonproliferation and managing the stockpile remains, the push is on once again for change, said Terry Wallace, the lab's principal associate director for science, technology and engineering.

"We are seeing a tremendous pressure, and rightfully so, for a shift in mission space," he said. "But that's really not a bad thing. It's easy to get overly worried about mission swings."

Talk of complex transformation across the Department of Energy and all of its labs and facilities continued throughout 2008, and could well continue to evolve this year. The National Nuclear Security Administration's plan for Los Alamos in that process is to consolidate plutonium research from other DOE facilities to the lab, to add the capability of producing 50 to 80 nuclear weapons cores called pits each year, and overall — in what might seem like a contradiction to the other missions — to reduce the stockpile and the labs' nuclear weapons-based operations.

Key to that change is a building complex called the Chemistry and Metallurgy Research Replacement facility, estimated to cost between $745 million and $975 million when completed, according to the lab's Web site.

Once built, it will house more than just nuclear weapons activities. It will also be used to train nuclear inspectors, to investigate other areas of nuclear science such as reactors or batteries for NASA spacecraft and even to look at nuclear isotopes for medicine, said Joe Martz, nuclear weapons program director.

Scientists will need the facility no matter how much the lab's nuclear weapons operations shrink. And even though the goal is to have fewer nuclear weapons and fewer people monitoring them across the entire DOE complex, the basic ability of scientists to understand nuclear weapons is something that cannot change, Wallace said.

"Something like plutonium, even though we say we can handle it safely, we still need world-class facilities and scientists to maintain those abilities," he said. "Whether you have 20 or 2,000 warheads, you still have to maintain them."

The lab will likely always have some sort of focus on nuclear weapons science, but the shift to a smaller program started before complex transformation moved into the limelight in the middle of the year. In late 2007 and early 2008, lab management shrank the staff by about 570 employees through attrition and voluntary reductions.

And most of those jobs came from nuclear weapons areas, Wallace said, adding he doesn't foresee any more staff reductions in the coming year.

That said, some employees who left nuclear weapons-related jobs also didn't actually leave the lab. They simply shifted to other spots that could also use their skill sets, said Mike Burns, acting associate director for Threat Reduction.

"Threat reduction, we refer to that as national security programs that do not involve our nuclear stockpile, and through 2008, our portfolio grew by 6 percent," Burns said. "I personally think there are a lot of opportunities to grow in our areas of the lab."

Threat reduction, especially in the modern era of terrorism, will probably continue to grow rapidly in coming years. In that area, the lab investigates potential threats using sensors and monitors, by creating simulations on supercomputers and by looking at ways to save energy, among other things, Burns said.

"One area that's really interesting is something called mobility energy," Burns said. "The Department of Defense is the nation's largest user of things like fuel, jet fuel, gasoline. So if the lab can help find ways to make small reductions or changes there, it can create huge benefits for the nation."

Threat reduction also works on surveillance gadgets to help improve situational awareness on battlefields, hopefully saving American lives in the process, he said.

Another big area where some activities are shifting is to energy and power issues and how to improve storage and power grids in the United States, Wallace said. "We're also looking at next-generation nuclear power because it can be an important resource that doesn't produce greenhouse gases," he said.

Supercomputer activities, as well, have grown far beyond nuclear weapons functions at the lab.

Earlier in 2008, the lab started to install Roadrunner, a supercomputer that remains the fastest in the world. The speeds available on that computer have opened up entirely new areas of science, Martz said.

"These changes aren't just true at the lab — the nature of science is changing," he said. "We can do things now that seemed impossible 10 years ago."

Computer models of systems like ribosomes, the tiny cellular factories that transform instructions from DNA into biological material, could play a huge role in medical science in the not-so-distant future. And computer models of ocean systems and climate can help us better understand how human activities are changing those systems, Martz said.

"In some ways, through these computer systems, science is coalescing," Martz said. "Many disciplines are coming together."

Still, while activities at the lab are shifting and changing, nuclear weapons science remains a large chunk of the budget.

About $650 million of the lab's $2.074 billion fiscal 2008 budget is not tied to nuclear weapons-related activities, Wallace said.

But he thinks it's likely the budget distribution will continue to shift away from nuclear and into more of the emerging science and technology areas.

"Budgets will likely remain about the same, but the buckets that each dollar goes into may change as we transform," Wallace said.

And there's an advantage to keeping people from nuclear operations around and letting them switch to work in non-nuclear weapons-related areas of the lab, he said.

Should the nation need them to go back to nuclear work, those workers can shift back and be up to speed on the science fairly quickly, Wallace said.

Overall, it's hard to say definitively where the lab will grow and shrink and how funding will continue to change. The Obama administration could change many things when it takes over the White House later this month.

No matter what happens, though, the lab seems to be in good shape to handle it and continue to transform with the times, Wallace said.

"We do what the nation asks us to do," he said. "And of course we'll continue to do whatever is needed."

Related News

Why Fort Frances wants to build an integrated microgrid to deliver its electricity

Fort Frances Microgrid aims to boost reliability in Ontario with grid-connected and island modes, Siemens feasibility study, renewable energy integration, EV charging expansion, and resilience modeled after First Nations projects and regional biomass initiatives.

 

Key Points

A community microgrid in Fort Frances enabling grid and island modes to improve reliability and integrate renewables.

✅ Siemens-led feasibility via FedNor funding

✅ Grid-connected or islanded for outage resilience

✅ Integrates renewables, EV charging, and industry growth

 

When the power goes out in Fort Frances, Ont., the community may be left in the dark for hours.

The hydro system's unreliability — caused by its location on the provincial power grid — has prompted the town to seek a creative solution: its own self-contained electricity grid with its own source of power, known as a microgrid. 

Located more than 340 kilometres west of Thunder Bay, Ont., on the border of Minnesota, near the Great Northern Transmission Line corridor, Fort Frances gets its power from a single supply point on Ontario's grid. 

"Sometimes, it's inevitable that we have to have like a six- to eight-hour power outage while equipment is being worked on, and that is no longer acceptable to many of our customers," said Joerg Ruppenstein, president and chief executive officer of Fort Frances Power Corporation.

While Ontario's electrical grid serves the entire province, a microgrid is contained within a community. Fort Frances hopes to develop an integrated, community-based electric microgrid system that can operate in two modes:

  • Grid-connected mode, which means it's connected to the provincial grid
  • Island mode, which means it's disconnected from the provincial grid and operates independently

The ability to switch between modes allows flexibility. If a storm knocks down a line, the community will still have power.
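For readers who want the mechanics, the mode switch amounts to a small state machine at the point where the town ties into the provincial grid: stay connected while the grid is healthy, island when it is not. A minimal sketch, with hypothetical class names and health checks rather than anything from the Siemens study:

```python
from enum import Enum

class Mode(Enum):
    GRID_CONNECTED = "grid-connected"  # fed from the provincial grid
    ISLAND = "island"                  # running on local generation

class MicrogridController:
    """Toy mode controller for a community microgrid (illustrative only)."""

    def __init__(self) -> None:
        self.mode = Mode.GRID_CONNECTED

    def update(self, voltage_ok: bool, frequency_ok: bool) -> Mode:
        grid_healthy = voltage_ok and frequency_ok
        if self.mode is Mode.GRID_CONNECTED and not grid_healthy:
            # Open the breaker at the point of common coupling;
            # local generation and storage carry the town.
            self.mode = Mode.ISLAND
        elif self.mode is Mode.ISLAND and grid_healthy:
            # Resynchronize and reconnect once the provincial grid recovers.
            self.mode = Mode.GRID_CONNECTED
        return self.mode

controller = MicrogridController()
print(controller.update(voltage_ok=False, frequency_ok=True))  # Mode.ISLAND
```

In a real installation this decision lives in protective relays and inverter controls that must also manage resynchronization, but the two-mode logic is the heart of it.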

The town has been given grant funding from the Federal Economic Development Agency for Northern Ontario (FedNor) for the project. On Monday night, council voted to award the request for proposal to Siemens Canada Limited to conduct a feasibility study into a microgrid system.

The study, anticipated to be completed by the end of 2023 or early 2024, will assess what an integrated community-based microgrid system could look like in the town of just over 7,000 people, said Faisal Anwar, chief administrative officer of Fort Frances. A timeline for construction will be determined after that. 

The community is still reeling from the closure of the Resolute Forest Products pulp and paper mill in 2014 and faces a declining population, said Ruppenstein. It's hoped the microgrid system will help attract new industry to replace those lost workers and jobs.

This gives the town a competitive advantage.

"If we were conceivably to attract a larger industrial player that would consume a considerable amount of energy, it would result in reduced rates for everyone…we're the only utility really in Ontario that can offer that model," Ruppenstein said.

The project can also incorporate renewable energy like solar or wind power into the microgrid system, and support the growth of electric vehicles, he said. Many residents fill their gas tanks in Minnesota because it's cheaper, but Fort Frances has the potential to become a hub for electric vehicle charging.

A few remote First Nations have recently switched to microgrid systems fuelled by green energy, including Gull Bay First Nation and Fort Severn First Nation. These are communities that have historically relied on diesel fuel either flown in, which is incredibly expensive, or transported via ice roads, which are seeing shorter seasons each year.

Natural Resources Minister Jonathan Wilkinson was in Thunder Bay, Ont., to announce $35 million for a biomass generation facility in Whitesand First Nation.

 

Related News


Ottawa making electricity more expensive for Albertans

Alberta Electricity Price Surge reflects soaring wholesale rates, natural gas spikes, carbon tax pressures, and grid decarbonization challenges amid cold-weather demand, constrained supply, and Europe-style energy crisis impacts across the province.

 

Key Points

An exceptional jump in Alberta's power costs driven by gas price spikes, high demand, policy costs, and tight supply.

✅ Wholesale prices averaged $123/MWh in December

✅ Gas costs surged; supply constraints and outages

✅ Carbon tax and decarbonization policies raised costs

 

Albertans just endured the highest electricity prices in 21 years. Wholesale prices averaged $123 per megawatt-hour in December, more than triple the level of the previous year and the highest for any December since 2000.

The situation in Alberta mirrors the energy crisis striking Europe where electricity prices are also surging, largely due to a shocking five-fold increase in natural gas prices in 2021 compared to the prior year.

The situation should give pause to Albertans when they consider aggressive plans to “decarbonize” the electric grid, including proposals for a fully renewable grid by 2030 from some policymakers.

The explanation for skyrocketing energy prices is simple: increased demand (because of frigid winter weather and a slowly reviving post-pandemic economy) coupled with constrained supply.

In the nitty-gritty details, there are always particular transitory causes, such as disputes with Russian gas companies (in the case of Europe) or plant outages (in the case of Alberta).

But beyond these fleeting factors, there are more permanent systemic constraints on natural gas (and even more so, coal-fired) power plants.

I refer of course to the climate change policies of the Trudeau government at the federal level and some of the more aggressive provincial governments, which have notable implications for electricity grids across Canada.

The most obvious example is the carbon tax, the repeal of which Premier Jason Kenney made a staple of his government.

Putting aside the constitutional issues (on which the Supreme Court ruled in March of last year that the federal government could impose a carbon tax on Alberta), the obvious economic impact will be to make carbon-sourced electricity more expensive.

This isn’t a bug or an undesired side effect; it’s the explicit purpose of a carbon tax.

Right now, the federal carbon tax is $40 per tonne, is scheduled to increase to $50 in April, and will ultimately max out at a whopping $170 per tonne in 2030.
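To see roughly how that schedule feeds into power prices, multiply the tax rate by a generator's emissions intensity. The sketch below is a back-of-the-envelope illustration; the intensity figures are typical published ballparks, not Alberta-specific data:

```python
# Approximate carbon-tax adder on wholesale power, in $/MWh.
# Emissions intensities in tonnes CO2 per MWh (rough ballparks):
INTENSITY = {"gas combined-cycle": 0.37, "coal": 1.0}

for tax in (40, 50, 170):  # $/tonne: current, April, and the 2030 maximum
    for plant, tonnes_per_mwh in INTENSITY.items():
        print(f"${tax}/t on {plant}: ~${tax * tonnes_per_mwh:.0f}/MWh added")
```

At $170 per tonne, the implied adder on coal-fired power alone exceeds December's entire $123/MWh average wholesale price.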

Again, the conscious rationale of the tax is to make coal, oil and natural gas more expensive to induce consumers and businesses to use alternative energy sources.

As Albertans experience sticker shock this winter, they should ask themselves — do we want the government intentionally making electricity and heating oil more expensive?

Of course, the proponent of a carbon tax (and other measures designed to shift Canadians away from carbon-based fuels) would respond that it’s a necessary measure in the fight against climate change.

Yet the reality is that Canada is a bit player on the world stage when it comes to carbon dioxide, responsible for only 1.5% of global emissions (as of 2018).

As reported at this “climate tracker” website, if we look at the actual policies put in place by governments around the world, they’re collectively on track for the Earth to warm 2.7 degrees Celsius by 2100, far above the official target codified in the Paris Agreement.

Canadians can’t do much to alter the global temperature, but federal and provincial governments can make energy more expensive if policymakers so choose.

As renewable technologies become more reliable and affordable, business and consumers will naturally adopt them; it didn’t take a “manure tax” to force people to use cars rather than horses.

As official policy continues to make electricity more expensive, Albertans should ask if this approach is really worth it.

Robert P. Murphy is a senior fellow at the Fraser Institute.

 

Related News


B.C. residents and businesses get break on electricity bills for three months

BC Hydro COVID-19 Bill Relief offers pandemic support with bill credits, rate cuts, and deferred payments for residential, small business, and industrial customers across B.C., easing utilities costs during COVID-19 economic hardship.

 

Key Points

COVID-19 bill credits, a rate cut, and deferred payments for eligible B.C. homes, small businesses, and industrial customers.

✅ Non-repayable credits equal to 3 months of average bills.

✅ Small businesses that are closed can skip bills for three months.

✅ Large industry may defer 50% of electricity costs.

 

B.C. residents who have lost their jobs or had their wages cut will get a three-month break on BC Hydro bills, while small businesses are also eligible to apply for similar relief.

Premier John Horgan said Wednesday the credit for residential customers will be three times a household’s average monthly bill over the past year and does not have to be repaid, as part of the government’s support package during the COVID-19 pandemic.
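The credit is straightforward arithmetic: three times the household's average monthly bill over the past year. A tiny illustration, with an invented bill history:

```python
# Hypothetical household: the last 12 monthly BC Hydro bills, in dollars.
bills = [92, 88, 110, 130, 95, 85, 80, 78, 84, 99, 120, 115]
credit = 3 * sum(bills) / len(bills)  # three times the monthly average
print(f"one-time credit: ${credit:.2f}")  # one-time credit: $294.00
```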

He said small businesses that are closed will not have to pay their power bills for three months, and large industrial customers, including those operating mines and pulp mills, can opt to have 50 per cent of their electricity costs deferred.

BC Hydro rates will also be cut for all customers by one per cent as of April 1, after the B.C. Utilities Commission provided interim approval of an application the utility submitted last August.

Eligible residential customers can apply for bill relief starting next week, and small business applications will be accepted as of April 14, with the deadline for both categories set at June 30.

 

Related News


Longer, more frequent outages afflict the U.S. power grid as states fail to prepare for climate change

Power Grid Climate Resilience demands storm hardening, underground power lines, microgrids, batteries, and renewable energy as regulators and utilities confront climate change, sea level rise, and extreme weather to reduce outages and protect vulnerable communities.

 

Key Points

The grid's capacity to resist and recover from climate hazards using buried lines, microgrids, and batteries.

✅ Underground lines reduce wind outages and wildfire ignition risk.

✅ Microgrids with solar and batteries sustain critical services.

✅ Regulators balance cost, resilience, equity, and reliability.

 

Every time a storm lashes the Carolina coast, the power lines on Tonye Gray’s street go down, cutting her lights and air conditioning. After Hurricane Florence in 2018, Gray went three days with no way to refrigerate medicine for her multiple sclerosis or pump the floodwater out of her basement.

“Florence was hell,” said Gray, 61, a marketing account manager and Wilmington native who finds herself increasingly frustrated by the city’s vulnerability.

“We’ve had storms long enough in Wilmington and this particular area that all power lines should have been underground by now. We know we’re going to get hit.”

Across the nation, severe weather fueled by climate change is pushing aging electrical systems past their limits, often with deadly results. Last year, the average American home endured more than eight hours without power, according to the U.S. Energy Information Administration — more than double the outage time five years ago.
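That EIA figure is essentially SAIDI, the industry's standard reliability index: total customer-hours of interruption divided by customers served. A toy calculation with invented outage records shows how it works:

```python
def saidi_hours(outages, customers_served):
    """System Average Interruption Duration Index, in hours per customer.

    outages: list of (customers_affected, duration_hours) tuples.
    """
    customer_hours = sum(n * hours for n, hours in outages)
    return customer_hours / customers_served

# Hypothetical utility: 100,000 customers, three sustained events in a year.
events = [(40_000, 12.0), (5_000, 72.0), (100_000, 1.5)]
print(f"SAIDI: {saidi_hours(events, 100_000):.1f} hours")  # SAIDI: 9.9 hours
```

Note how a long outage for a few customers (360,000 customer-hours here) can outweigh a brief system-wide event (150,000).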

This year alone, a wave of abnormally severe winter storms caused a disastrous power failure in Texas, leaving millions of homes in the dark, sometimes for days, and at least 200 dead. Power outages caused by Hurricane Ida contributed to at least 14 deaths in Louisiana, as some of the poorest parts of the state suffered through weeks of 90-degree heat without air conditioning.

As storms grow fiercer and more frequent, environmental groups are pushing states to completely reimagine the electrical grid, incorporating more grid-scale batteries, renewable energy sources and localized systems known as “microgrids,” which they say could reduce the incidence of wide-scale outages. Utility companies have proposed their own storm-proofing measures, including burying power lines underground.

But state regulators largely have rejected these ideas, citing pressure to keep energy rates affordable. Of $15.7 billion in grid improvements under consideration last year, regulators approved only $3.4 billion — about one-fifth — according to a national survey by the NC Clean Energy Technology Center.

After a weather disaster, “everybody’s standing around saying, ‘Why didn’t you spend more to keep the lights on?’ ” Ted Thomas, chairman of the Arkansas Public Service Commission, said in an interview with The Washington Post. “But when you try to spend more when the system is working, it’s a tough sell.”

A major impediment is the failure by state regulators and the utility industry to consider the consequences of a more volatile climate — and to come up with better tools to prepare for it. For example, a Berkeley Lab study last year of outages caused by major weather events in six states found that neither state officials nor utility executives attempted to calculate the social and economic costs of longer and more frequent outages, such as food spoilage, business closures, supply chain disruptions and medical problems.

“There is no question that climatic changes are happening that directly affect the operation of the power grid,” said Justin Gundlach, a senior attorney at the Institute for Policy Integrity, a think tank at New York University Law School. “What you still haven’t seen … is a [state] commission saying: 'Isn’t climate the through line in all of this? Let’s examine it in an open-ended way. Let’s figure out where the information takes us and make some decisions.’ ”

In interviews, several state commissioners acknowledged that failure.

“Our electric grid was not built to handle the storms that are coming this next century,” said Tremaine L. Phillips, a commissioner on the Michigan Public Service Commission, which in August held an emergency meeting to discuss the problem of power outages. “We need to come up with a broader set of metrics in order to better understand the success of future improvements.”

Five disasters in four years
The need is especially urgent in North Carolina, which has declared a federal disaster from a hurricane or tropical storm five times in the past four years. Among them was Hurricane Florence, which brought torrential rain, catastrophic flooding and the state’s worst outage in over a decade in September 2018.

More than 1 million residents were left disconnected from refrigerators, air conditioners, ventilators and other essential machines, some for up to two weeks. Elderly residents dependent on oxygen were evacuated from nursing homes. Relief teams flew medical supplies to hospitals cut off by flooded roads. Desperate people facing closed stores and rotting food looted a Wilmington Family Dollar.

“I have PTSD from Hurricane Florence, not because of the actual storm but the aftermath,” said Evelyn Bryant, a community organizer who took part in the Wilmington response.

The storm reignited debate over a $13 billion proposal by Duke Energy, one of the largest power companies in the nation, to reinforce the state’s power grid. A few months earlier, the state had rejected Duke’s request for full repayment of those costs, determining that protecting the grid against weather is a normal part of doing business and not eligible for the type of reimbursement the company had sought.

After Florence, Duke offered a smaller, $2.5 billion plan, along with the argument that severe weather events are one of seven “megatrends” (including cyberthreats and population growth) that require greater investment, according to a PowerPoint presentation included in testimony to the state. The company owns the two largest utilities in North Carolina, Duke Energy Carolinas and Duke Energy Progress.

Vote Solar, a nonprofit climate advocacy group, objected to Duke’s plan, saying the utility had failed to study the risks of climate impacts. Duke’s flood maps, for example, had not been updated to reflect the latest projections for sea level rise, they said. In testimony, Vote Solar claimed Duke was using environmental trends to justify investments “it had already decided to pursue.”

The United States is one of the few countries where regulated utilities are usually guaranteed a rate of return on capital investments, even as studies show the U.S. experiences more blackouts than much of the developed world. That business model incentivizes spending regardless of how well it solves problems for customers, and it inspires skepticism. Ric O’Connell, executive director of GridLab, a nonprofit group that assists state and regional policymakers on electrical grid issues, said utilities in many states “are waving their hands and saying hurricanes” to justify spending that would do little to improve climate resilience.


Duke Energy spokesman Jeff Brooks acknowledged that the company had not conducted a climate risk study but pointed out that this type of analysis is still relatively new for the industry. He said Duke’s grid improvement plan “inherently was designed to think about future needs,” including reinforced substations with walls that rise several feet above the previous high watermark for flooding, and partly relied on federal flood maps to determine which stations are at most risk.

Brooks said Duke is not using weather events to justify routine projects, noting that the company had spent more than a year meeting with community stakeholders and using their feedback to make significant changes to its grid improvement plan.

This year, the North Carolina Utilities Commission finally approved a set of grid improvements that will cost customers $1.2 billion. But the commission reserved the right to deny Duke reimbursement of those costs if it cannot prove they are prudent and reasonable. The commission’s general counsel, Sam Watson, declined to discuss the decision, saying the commission can comment on specific cases only in public orders.

The utility is now burying power lines in “several neighborhoods across the state” that are most vulnerable to wide-scale outages, Brooks said. It is also fitting aboveground power lines with “self-healing” technology, a network of sensors that diverts electricity away from equipment failures to minimize the number of customers affected by an outage.
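In industry terms, that "self-healing" capability is usually called FLISR: fault location, isolation, and service restoration. Sensors locate the faulted line section, switches isolate it, and a tie switch back-feeds the healthy downstream sections from a neighboring feeder. A deliberately simplified sketch; the four-section feeder and the function are invented for illustration, not Duke's actual scheme:

```python
# A radial feeder modeled as an ordered chain of line sections; a normally
# open tie switch at the far end can back-feed from a neighboring feeder.
SECTIONS = ["S1", "S2", "S3", "S4"]

def flisr(faulted: str) -> dict:
    i = SECTIONS.index(faulted)
    return {
        "isolate": faulted,                      # open switches around the fault
        "fed_from_substation": SECTIONS[:i],     # upstream stays energized
        "fed_via_tie_switch": SECTIONS[i + 1:],  # downstream is back-fed
    }

print(flisr("S2"))
# {'isolate': 'S2', 'fed_from_substation': ['S1'],
#  'fed_via_tie_switch': ['S3', 'S4']}
```

The payoff is that a fault that once blacked out an entire feeder now interrupts only the customers on the faulted section.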

As part of a settlement with Vote Solar, Duke Energy last year agreed to work with state officials and local leaders to further evaluate the potential impacts of climate change, a process that Brooks said is expected to take two to three years.

High costs create hurdles
The debate in North Carolina is being echoed in states across the nation, where burying power lines has emerged as one of the most common proposals for insulating the grid from high winds, fires and flooding. But opponents have balked at the cost, which can run into the millions of dollars per mile.

In California, for example, Pacific Gas & Electric wants to bury 10,000 miles of power lines, both to make the grid more resilient and to reduce the risk of sparking wildfires. Its power equipment has contributed to multiple deadly wildfires in the past decade, including the 2018 Camp Fire that killed at least 85 people.

PG&E’s proposal has drawn scorn from critics, including San Jose Mayor Sam Liccardo, who say it would be too slow and expensive. But Patricia Poppe, the company’s CEO, told reporters that doing nothing would cost California even more in lost lives and property while struggling to keep the lights on during wildfires. The plan has yet to be submitted to the state, but Terrie Prosper, a spokeswoman for the California Public Utilities Commission, said the commission has supported underground lines as a wildfire mitigation strategy.

Another oft-floated solution is microgrids, small electrical systems that provide power to a single neighborhood, university or medical center. Most of the time, they are connected to a larger utility system. But in the event of an outage, microgrids can operate on their own, with the aid of solar energy stored in batteries.

In Florida, regulators recently approved a four-year microgrid pilot project, but the technology remains expensive and unproven. In Maryland, regulators in 2016 rejected a plan to spend about $16 million for two microgrids in Baltimore, in part because the local utility made no attempt to quantify “the tangible benefits to its customer base.”


In Texas, where officials have largely abandoned state regulation in favor of the free market, the results have been no more encouraging. Without requirements, as exist elsewhere, for building extra capacity for times of high demand or stress, the state was ill-equipped to handle an abnormal deep freeze in February that knocked out power to 4 million customers for days.

Since then, Berkshire Hathaway Energy and Starwood Energy Group each proposed spending $8 billion to build new power plants to provide backup capacity, with guaranteed returns on the investment of 9 percent, but the Texas legislature has not acted on either plan.

New York is one of the few states where regulators have assessed the risks of climate change and pushed utilities to invest in solutions. After 800,000 New Yorkers lost power for 10 days in 2012 in the wake of Hurricane Sandy, state regulators ordered utility giant Con Edison to evaluate the state’s vulnerability to weather events.

The resulting report, which estimated climate risks could cost the company as much as $5.2 billion by 2050, gave ConEd data to inform its investments in storm hardening measures, including new storm walls and submersible equipment in areas at risk of flooding.

Meanwhile, the New York Public Service Commission has aggressively enforced requirements that utility companies keep the lights on during big storms, fining utility providers nearly $190 million for violations including inadequate staffing during Tropical Storm Isaias in 2020.

“At the end of the day, we do not want New Yorkers to be at the mercy of outdated infrastructure,” said Rory M. Christian, who last month was appointed chair of the New York commission.

The price of inaction
In North Carolina, as Duke Energy slowly works to harden the grid, some are pursuing other means of fostering climate-resilient communities.

Beth Schrader, the recovery and resilience director for New Hanover County, which includes Wilmington, said some of the people who went the longest without power after Florence had no vehicles, no access to nearby grocery stores and no means of getting to relief centers set up around the city.

For example, Quanesha Mullins, a 37-year-old mother of three, went eight days without power in her housing project on Wilmington’s east side. Her family got by on food from the Red Cross and walked a mile to charge their phones at McDonald’s. With no air conditioning, they slept with the windows open in a neighborhood with a history of violent crime.

Schrader is working with researchers at the University of North Carolina in Charlotte to estimate the cost of helping people like Mullins. The researchers estimate that it would have cost about $572,000 to provide shelter, meals and emergency food stamp benefits to 100 families for two weeks, said Robert Cox, an engineering professor who researches power systems at UNC-Charlotte.
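The article's figures imply a concrete per-family cost, which is easy to work out:

```python
total_cost = 572_000      # dollars: shelter, meals, emergency food benefits
families, days = 100, 14  # two weeks of support for 100 families
print(f"~${total_cost / families / days:.0f} per family per day")  # ~$409
```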

Such calculations could help spur local governments to do more to help vulnerable communities, for example by providing “resilience outposts” with backup power generators, heating or cooling rooms, Internet access and other resources, Schrader said. But they also are intended to show the costs of failing to shore up the grid.

“The regulators need to be moved along,” Cox said.

In the meantime, Tonye Gray finds herself worrying about what happens when the next storm hits. While Duke Energy says it is burying power lines in the most outage-prone areas, she has yet to see its yellow-vested crews turn up in her neighborhood.

“We feel,” she said, “that we’re at the end of the line.”

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy. Even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson’s manifests slightly differently, and that lesson applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
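In outline, a closed-loop device runs a detect-then-stimulate cycle: compute a biomarker from the recorded signal and fire a brief pulse only when it crosses a threshold. The sketch below is schematic; the beta-band biomarker, the threshold, and the return values are illustrative assumptions, not any vendor's device API:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power of `signal` in the [lo, hi] Hz band, via FFT (toy biomarker)."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

def closed_loop_step(window, fs, threshold):
    """One cycle: stimulate only when the biomarker crosses the threshold."""
    biomarker = band_power(window, fs, lo=13, hi=30)  # e.g., beta-band power
    if biomarker > threshold:
        return "STIMULATE"  # deliver a brief, targeted pulse
    return "IDLE"           # an open-loop device would stimulate constantly

fs = 250                      # sampling rate, Hz
window = np.random.randn(fs)  # one second of simulated recording
print(closed_loop_step(window, fs, threshold=5_000.0))
```

The hard parts in practice are exactly what the AI tooling targets: choosing the biomarker and the threshold for each patient from months of recorded data.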

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
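In spirit, that analysis is a regression problem: find a weighting of recorded brain features that tracks the reported 0-to-10 scores. A minimal sketch on simulated data, with ordinary least squares standing in for the team's actual machine-learning pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 500, 8  # e.g., weeks of recordings, a few band powers
X = rng.standard_normal((n_samples, n_features))           # neural features
true_w = np.array([2.0, -1.0, 0, 0, 0.5, 0, 0, 0])         # hidden relationship
pain = X @ true_w + rng.normal(scale=1.0, size=n_samples)  # noisy 0-10 reports

# Least-squares fit: which features track the subjective pain ratings?
w, *_ = np.linalg.lstsq(X, pain, rcond=None)
r = np.corrcoef(X @ w, pain)[0, 1]
print(f"correlation between decoded and reported pain: r = {r:.2f}")
```

Real pipelines face problems this toy hides: pain ratings are ordinal and sparse, brain features drift over weeks, and a model fit to one patient rarely generalizes to the next.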

 

Related News


EPA: New pollution limits proposed for US coal, gas power plants reflect "urgency" of climate crisis

EPA Power Plant Emissions Rule proposes strict greenhouse gas limits for coal and gas units, leveraging carbon capture (CCS) under the Clean Air Act to cut CO2 and accelerate decarbonization of the U.S. grid.

 

Key Points

A proposed EPA rule setting CO2 limits for coal and gas plants, using CCS to cut power-sector greenhouse gases.

✅ Applies to existing and new coal and large gas units

✅ Targets near-zero CO2 by 2038 via CCS or retirement

✅ Cites grid, health, and climate benefits; faces legal challenges

 

The Biden administration has proposed new limits on greenhouse gas emissions from coal- and gas-fired power plants, its most ambitious effort yet to roll back planet-warming pollution from the nation’s second-largest contributor to climate change.

A rule announced by the Environmental Protection Agency could force power plants to capture smokestack emissions using a technology that has long been promised but is not widely used in the United States.

“This administration is committed to meeting the urgency of the climate crisis and taking the necessary actions required,” said EPA Administrator Michael Regan.

The plan would not only “improve air quality nationwide, but it will bring substantial health benefits to communities all across the country, especially our front-line communities ... that have unjustly borne the burden of pollution for decades,” Regan said in a speech at the University of Maryland.

President Joe Biden called the plan “a major step forward in the climate crisis and protecting public health.”

If finalized, the proposed regulation would mark the first time the federal government has restricted carbon dioxide emissions from existing power plants, which generate about 25% of U.S. greenhouse gas pollution, second only to the transportation sector. The rule also would apply to future electric plants and would avoid up to 617 million metric tons of carbon dioxide through 2042, equivalent to the annual emissions of 137 million passenger vehicles, the EPA said.
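The vehicle comparison is easy to sanity-check: dividing the avoided tonnage by the vehicle count implies roughly 4.5 metric tons of CO2 per car per year, close to the roughly 4.6-ton figure the EPA commonly cites for a typical passenger vehicle:

```python
avoided_tonnes = 617e6  # metric tons of CO2 avoided through 2042
vehicles = 137e6        # passenger vehicles driven for one year
print(f"{avoided_tonnes / vehicles:.1f} t CO2 per vehicle-year")  # 4.5
```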

Almost all coal plants — along with large, frequently used gas-fired plants — would have to cut or capture nearly all their carbon dioxide emissions by 2038, the EPA said. Plants that cannot meet the new standards would be forced to retire.

The plan is likely to be challenged by industry groups and Republican-leaning states. They have accused the Democratic administration of overreach on environmental regulations and warn of a pending reliability crisis for the electric grid. The power plant rule is one of at least a half-dozen EPA regulations limiting power plant emissions and wastewater treatment.

“It’s truly an onslaught” of government regulation “designed to shut down the coal fleet prematurely,” said Rich Nolan, president and CEO of the National Mining Association.

Regan denied that the power plant rule was aimed at shutting down the coal sector, but acknowledged, “We will see some coal retirements.”

 
