Ofgem starts bidding war for offshore wind

By Industrial Info Resources


Contracts worth more than €2.23 billion (US$3.05 billion) are up for grabs following the release of the second round of tenders for high-voltage transmission links for six offshore windfarms in the UK.

Energy watchdog Ofgem predicts that there will be fierce competition for the contracts, which will see companies bidding to win the rights to connect windfarms that have a combined generating capacity of 2.6 gigawatts (GW) to the mainland grid. The first-round tender, which was launched last December, proved hugely competitive, with more than €4.7 billion (US$6.43 billion) of investment chasing the €1.3 billion (US$1.78 billion) worth of transmission contracts.

Companies will be competing for the right to own and operate the links to 2.8 GW of offshore windfarms for the next 20 years. The first winners will be announced next summer. Tenders for the first three projects — Gwynt-y-Mor, Lincs and London Array — have already begun. Tenders for the other three, Humber Gateway, Race Bank and West of Duddon Sands, are expected to begin in spring 2012.

"We have 40 of Europe's wind and we have 11,000 kilometres of coastline," said Charles Hendry, Minister of State for Energy on the launch of the second tender. "We ought to be using those resources for our future energy security, but to do this we need to get the investment in the infrastructure that will make this happen. I hope the second round of tendering for owning and operating the links to offshore windfarms will be as successful as the first, where investment interest was four times the necessary level. This competition also means savings for generators and consumers, which I very much welcome."

Ofgem's Chief Executive Alistair Buchanan said: "Britain needs to attract £200 billion (US$320 billion) of investment in its energy industry over the next 10 years; £20 billion (US$32 billion) of that will be for offshore transmission links. Therefore, it is very encouraging that we have seen such strong competition for the first round of transmission links. This looks set to continue for the second round, and healthy competition will keep the costs of the links as low as possible and give generators confidence that the offshore regime is proving very attractive to investors and is bringing new players into the UK transmission market."
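The figures quoted above can be sanity-checked with simple arithmetic. The Python sketch below is illustrative only: the exchange rate is inferred from the article's own €2.23 billion to US$3.05 billion conversion, and the first-round oversubscription works out to roughly 3.6, consistent with the minister's rounded "four times" claim.

```python
# Illustrative arithmetic only: figures come from the article above, and the
# exchange rate is inferred from its own EUR 2.23bn ~= USD 3.05bn conversion.
EUR_TO_USD = 3.05 / 2.23            # ~1.37

round_two_contracts_eur = 2.23e9    # second-round transmission links
round_one_contracts_eur = 1.30e9    # first-round transmission links
round_one_interest_eur = 4.7e9      # investment chasing the first round

print(f"Second round in USD: {round_two_contracts_eur * EUR_TO_USD / 1e9:.2f} billion")
print(f"First-round oversubscription: {round_one_interest_eur / round_one_contracts_eur:.1f}x")
```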

Related News

B.C. Hydro misled regulator: report

BC Hydro SAP Oversight Report assesses B.C. Utilities Commission findings on misleading testimony, governance failures, public funds oversight, IT project risk, compliance gaps, audit controls, ratepayer impacts, and regulatory accountability in major enterprise software decisions.

 

Key Points

A summary of BCUC findings on BC Hydro's SAP IT project oversight, governance lapses, and regulatory compliance.

✅ BCUC probed testimony, cost overruns, and governance failures

✅ Project split to avoid scrutiny; incomplete records and late corrections

✅ Reforms pledged: stronger business cases, compliance, audit controls

 

B.C. Hydro misled the province’s independent regulator about an expensive technology program, thereby avoiding scrutiny on how it spent millions of dollars in public money, according to a report by the B.C. Utilities Commission.

The Crown power corporation gave inaccurate testimony to regulators about the software it had chosen, called SAP, for an information technology project that has cost $197 million, said the report.

“The way the SAP decision was made prevented its appropriate scrutiny by B.C. Hydro’s board of directors and the BCUC,” the commission found.

“B.C. Hydro’s CEO and CFO and its (audit and risk management board committee) members did not exhibit good business judgment when reviewing and approving the SAP decision without an expenditure approval or business case.”

The report was the result of a complaint made in 2016 by then-opposition NDP MLA Adrian Dix, who alleged B.C. Hydro lied to the regulatory commission to try to get approval for a risky IT project in 2008 that then went over budget and resulted in the firing of Hydro’s chief information officer.

The commission spent two years investigating. Its report outlined how B.C. Hydro split the IT project into smaller components to avoid scrutiny, failed to produce the proper planning document when asked, didn’t disclose cost increases of up to $38 million, gave incomplete testimony and did not quickly correct the record when it realized the mistakes.

“Essentially all of the things I asserted were substantiated, and so I’m pleased,” Dix, who is now minister of health, said on Monday. “I think ratepayers can be pleased with it, because even though it was an elaborate process, it involves hundreds of millions of spending by a public utility and it clearly required oversight.”

The BCUC stopped short of agreeing with Dix’s allegation that the errors were deliberate. Instead it pointed toward a culture at B.C. Hydro of confusion, misunderstanding and fear of dealing with the independent regulatory process.

“Therefore, the panel finds that there was a culture of reticence to inform the BCUC when there was doubt about something, even among individuals that understood or should have understood the role of the BCUC,” read the report.

“Because of this doubt and uncertainty among B.C. Hydro staff, the panel finds no evidence to support a finding that the BCUC was intentionally misled. The panel finds B.C. Hydro’s culture of reticence to be inappropriate.”

By law, B.C. Hydro is supposed to get approval from the commission for rate changes and major expenditures. Its officials are often put under oath when providing information.

B.C. Hydro apologized for its conduct in 2016. The Crown corporation said Monday it supports the commission’s findings and has made improvements to management of IT projects, including more rigorous business case analyses.

“We participated fully in the commission’s process and acknowledged throughout the inquiry that we could have performed better during the regulatory hearings in 2008,” said spokesperson Tanya Fish.

“Since then, we have taken steps to ensure we meet the highest standards of openness and transparency during regulatory proceedings, including implementing a (thorough) awareness program to support staff in providing transparent and accurate testimony at all times during a regulatory process.”

The Ministry of Energy, which is responsible for B.C. Hydro, said in a statement it accepts all of the BCUC recommendations and will include the findings as part of a review it is conducting into Hydro’s operations, finances and regulatory oversight.

Dix, who is now grappling with complex IT project management in his Health Ministry, said the lessons learned by B.C. Hydro and outlined in the report are important.

“I think the report is useful reading on all those scores,” he said. “It’s a case study in what shouldn’t happen in a major IT project.”

 

 

Related News


Tracking Progress on 100% Clean Energy Targets

100% Clean Energy Targets drive renewable electricity, decarbonization, and cost savings through state policies, CCAs, RECs, and mandates, with timelines and interim goals that boost jobs, resilience, and public health across cities, counties, and utilities.

 

Key Points

Policies for cities and states to reach 100% clean power by set dates, using mandates, RECs, and interim goals.

✅ Define eligible clean vs renewable resources

✅ Mandate vs goal framework with enforcement

✅ Timelines with interim targets and escape clauses

 

“An enormous amount of authority still rests with the states for determining your energy future. So we can build these policies that will become a postcard from the future for the rest of the country,” said David Hochschild, chair of the California Energy Commission, speaking last week at a UCLA summit on state and local progress toward 100 percent clean energy.

According to a new report from the UCLA Luskin Center for Innovation, 13 states, districts and territories, as well as more than 200 cities and counties, have committed to a 100 percent clean electricity target — and dozens of cities have already hit it.

This means that one of every three Americans, or roughly 111 million U.S. residents representing 34 percent of the population, live in a community that has committed to or has already achieved 100 percent clean electricity.

“We’re going to look back on this moment as the moment when local action and state commitments began to push the entire nation toward this goal,” said J.R. DeShazo, director of the UCLA Luskin Center for Innovation.

Not all 100 percent targets are alike, however. The report notes that these targets vary based on 1) what resources are eligible, 2) how binding the 100 percent target is, and 3) how and when the target will be achieved.

These distinctions will carry a lot of weight as the policy discussion shifts from setting goals to actually meeting targets. They also have implications for communities in terms of health benefits, cost savings and employment opportunities.
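To make the report's three distinctions concrete, here is a minimal sketch of how such a policy inventory could be represented in code. The class and field names are hypothetical, and the two example entries simply restate figures mentioned later in this article (Washington, D.C.'s 2032 renewable mandate and California's 2045 carbon-free requirement with its 2030 interim target).

```python
# A minimal sketch, not an official taxonomy: field names are hypothetical and
# mirror the three distinctions described above (eligible resources, how
# binding the target is, and the timeline with interim goals).
from dataclasses import dataclass, field
from typing import List

@dataclass
class CleanEnergyTarget:
    jurisdiction: str
    eligible_resources: str       # "renewable" (wind, solar, geothermal) vs. broader "clean"
    binding: str                  # "mandate", "goal", or "executive order"
    target_year: int
    interim_target_years: List[int] = field(default_factory=list)

# Example entries based on figures cited in this article.
targets = [
    CleanEnergyTarget("Washington, D.C.", "renewable", "mandate", 2032),
    CleanEnergyTarget("California", "clean (carbon-free)", "mandate", 2045, [2030]),
]

mandates = [t for t in targets if t.binding == "mandate"]
print(f"{len(mandates)} of {len(targets)} example targets are binding mandates")
```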

 

100% targets come in different forms

One key attribute is whether a target is based on "renewable" or "clean" energy resources. Some 100 percent targets, like Hawaii’s and Rhode Island’s 2030 plan, are focused exclusively on renewable energy, or sources that cannot be depleted, such as wind, solar and geothermal. But most jurisdictions use the broader term “clean energy,” which can also include resources like large hydroelectric generation and nuclear power.

States also vary in their treatment of renewable energy certificates, used to track and assign ownership to renewable energy generation and use. Unbundled RECs allow for the environmental attributes of the renewable energy resource to be purchased separately from the physical electricity delivery.

The binding nature of these targets is also noteworthy. Seven states, as well as Puerto Rico and the District of Columbia, have passed 100 percent clean energy transition laws. Of the jurisdictions that have passed 100 percent legislation, all but one specifies that the target is a “mandate,” according to the report. Nevada is the only state to call the target a “goal.”

Governors in four other states have signed executive orders with 100 percent clean energy goals.

Target timelines also vary. Washington, D.C. has set the most ambitious target date, with a mandate to achieve 100 percent renewable electricity by 2032. Other states and cities have set deadline years between 2040 and 2050. All "100 percent" state laws, and some city and county policies, also include interim targets to keep clean energy deployment on track.

In addition, some locations have included some form of escape clause. For instance, Salt Lake City, which last month passed a resolution establishing a goal of powering the county with 100 percent clean electricity by 2030, included “exit strategies” in its policy in order to encourage stakeholder buy-in, said Mayor Jackie Biskupski, speaking last week at the UCLA summit.

“We don’t think they’ll get used, but they’re there,” she said.

Other locales, meanwhile, have decided to go well beyond 100 percent clean electricity. The State of California and 44 cities have set even more challenging targets to also transition their entire transportation, heating and cooling sectors to 100 percent clean energy sources.

Businesses are simultaneously electing to adopt more clean and renewable energy. Six utilities across the United States have set their own 100 percent clean or carbon-free electricity targets. UCLA researchers did not include populations served by these utilities in their analysis of locations with state and city 100 percent clean commitments.

 

“We cannot wait”

All state and local policies that require a certain share of electricity to come from renewable energy resources have contributed to more efficient project development and financing mechanisms, which have supported continued technology cost declines and contributed to a near doubling of renewable energy generation since 2008.

Many communities are switching to clean energy in order to save money, now that the cost calculation is increasingly in favor of renewables over fossil fuels. Additional benefits include local job creation, cleaner air and electricity system resilience due to greater reliance on local energy resources.

Another major motivator is climate change. The electricity sector is responsible for 28 percent of U.S. greenhouse gas emissions, second only to transportation. Decarbonizing the grid also helps to clean up the transportation sector as more vehicles move to electricity as their fuel source.

“The now-constant threat of wildfires, droughts, severe storms and habitat loss driven by climate change signals a crisis we can no longer ignore,” said Carla Peterman, senior vice president of regulatory affairs at investor-owned utility Southern California Edison. “We cannot wait and we should not wait when there are viable solutions to pursue now.”

Prior to joining SCE on October 1, Peterman served as a member of the California Public Utilities Commission, which implements and administers renewable portfolio standard (RPS) compliance rules for California’s retail sellers of electricity. California’s target requires 60 percent of the state’s electricity to come from renewable energy resources by 2030, and all the state's electricity to come from carbon-free resources by 2045.  

 

How CCAs are driving renewable energy deployment

One way California communities are working to meet the state’s ambitious targets is through community-choice aggregation, via which cities and counties can take control of their energy procurement decisions to suit their preferences. Investor-owned utilities no longer purchase energy for these jurisdictions, but they continue to operate the transmission and distribution grid for all electricity users.

A second paper released by the Luskin Center for Innovation in recent days examines how community-choice aggregators are affecting levels of renewable energy deployment in California and contributing to the state’s 100 percent target.

The paper finds that 19 CCAs have launched in California since 2010, growing to include more than 160 towns, cities and counties. Of those communities, 64 have a 100 percent renewable or clean energy policy as their default energy program.

Because of these policies, the UCLA paper finds that “CCAs have had both direct and indirect effects that have led to increases in the clean energy sold in excess of the state’s RPS.”

From 2011 to 2018, CCAs directly procured 24 terawatt-hours of RPS-eligible electricity, 11 TWh of which have been voluntary or in excess of RPS compliance, according to the paper.

The formation of CCAs has also had an indirect effect on investor-owned utilities. As customers have left investor-owned utilities to join CCAs, the utilities have been left holding contracts for more renewable energy than they need to comply with California’s clean energy targets. UCLA researchers estimate that this indirect effect of CCA formation has left IOUs holding 13 terawatt-hours in excess of RPS requirements.
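As a rough bookkeeping exercise, the figures quoted from the paper can be combined as follows. Treating the voluntary CCA procurement and the excess IOU contracts as additive is this sketch's own simplification, not necessarily how the paper accounts for them.

```python
# Rough bookkeeping of the UCLA paper's figures as quoted above; a sketch,
# not the paper's methodology.
cca_direct_procurement_twh = 24   # RPS-eligible electricity procured by CCAs, 2011-2018
cca_voluntary_excess_twh = 11     # portion voluntary / beyond RPS compliance
iou_indirect_excess_twh = 13      # renewables left with IOUs beyond RPS needs

total_excess_twh = cca_voluntary_excess_twh + iou_indirect_excess_twh
print(f"Clean energy beyond RPS requirements (direct + indirect): {total_excess_twh} TWh")
```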

The paper concludes that CCAs have helped to accelerate California’s ability to meet state renewable energy targets over the past decade. However, the future contributions of CCAs to the RPS are more uncertain as communities make new power-purchasing decisions and utilities seek to reduce their excess renewable energy contracts.

“CCAs offer a way for communities to put their desire for clean energy into action. They're growing fast in California, one of only eight states where this kind of mechanism is allowed," said UCLA's Kelly Trumbull, an author of the report. "State and federal policies could be reformed to better enable communities to meet local demand for renewable energy.”

 

Related News


Avista Commissions Largest Solar Array in Washington

Adams Nielson Solar Array, a 28 MW DC utility-scale project in Lind, WA, spans 200 acres with 81,700 panels, powering about 4,000 homes, supporting Avista’s Solar Select program and renewable energy, sustainability, and carbon reduction.

 

Key Points

Adams Nielson Solar Array is a 28 MW DC facility in Lind, WA, powering ~4,000 homes via Avista’s Solar Select.

✅ 81,700 panels across 200 acres in Eastern Washington

✅ Offsets emissions equal to removing 7,300 cars annually

✅ Collaboration by Avista, Strata Solar, WUTC, WSU Energy

 

Official commissioning of the Adams Nielson solar array located in Lind, WA occurred today. The 28-megawatt DC array comprises 81,700 panels that span 200 acres and generates enough electricity to supply the equivalent of approximately 4,000 homes annually.

“Avista’s interest in the development of Solar Select, a voluntary commercial solar program, is consistent with the Company’s ongoing commitment to provide customers with renewable energy choices at reasonable cost,” said Dennis Vermillion, president, Avista Corporation. “In recent years, an increasing number of Avista customers have expressed their expectations and challenges in acquiring renewable energy. Avista is pleased to lead this effort and develop renewable energy products that meet our customers’ needs today and into the future.” This interest is being generated by a mix of local and national customers across a variety of industries, including Huckleberry’s, Gonzaga University, Community Colleges of Spokane, Hotstart, Central Pre-Mix Concrete, a CRH Co., independently owned McDonald's franchise locations, Spokane City, Main Market and Community Building and VA Medical Center.

Jim Simon, director of sustainability at Gonzaga University, said, “The Solar Select program helps Gonzaga University move even closer to achieving its goal of climate neutrality by 2050 by continuing to prioritize renewables in our energy portfolio. We are grateful for Avista’s leadership in this project and look forward to other opportunities to reduce our greenhouse gas emissions.”

Spokane Mayor David Condon said, “The City of Spokane is pleased to partner with Avista through the Solar Select Program, as we continue to seek out opportunities that are both environmentally and financially responsible. The City already is a net producer of energy, generating more clean, green energy than our use of electricity, natural gas, and fuel. We are excited to add even more clean energy to power City Hall.”

The Solar Select program created a cost-effective structure to bring solar energy to large business customers in Eastern Washington, allowing them to advance their sustainability goals. The array is projected to deliver the environmental benefit equivalent of more than 7,300 cars removed from the road each year. This renewable energy program was made possible through a collaboration of Avista, Strata Solar, the Washington Utilities and Transportation Commission, and the WSU Energy Program.
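The array's headline numbers can be roughly reconciled with a back-of-envelope estimate. In the sketch below, the capacity factor and per-household consumption are assumptions chosen for illustration (they are not Avista figures), but they land close to the stated ~4,000 homes.

```python
# Back-of-envelope check of the array figures quoted above. The capacity
# factor and per-household consumption are assumptions for illustration only.
array_dc_mw = 28
panel_count = 81_700
assumed_capacity_factor = 0.20        # assumption for Eastern Washington solar
assumed_home_kwh_per_year = 11_000    # assumption for average annual household use

watts_per_panel = array_dc_mw * 1e6 / panel_count
annual_mwh = array_dc_mw * assumed_capacity_factor * 8760
homes_powered = annual_mwh * 1000 / assumed_home_kwh_per_year

print(f"~{watts_per_panel:.0f} W per panel")
print(f"~{annual_mwh:,.0f} MWh/year, roughly {homes_powered:,.0f} homes")
```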

 

Related News


Hydro-Quebec adopts a corporate structure designed to optimize the energy transition

Hydro-Québec Unified Corporate Structure advances the energy transition through integrated planning, strategy, infrastructure delivery, and customer operations, aligning generation, transmission, and distribution while ensuring non-discriminatory grid access and agile governance across assets and behind-the-meter technologies.

 

Key Points

A cross-functional model aligning strategy, planning, and operations to accelerate Quebec's low-carbon transition.

✅ Four groups: strategy, planning, infrastructure, operations.

✅ Ensures non-discriminatory transmission access compliance.

✅ No staff reductions; staged implementation from Feb 28.

 

As Hydro-Québec prepares to play a key role in the transition to a low-carbon economy, the complexity of the work to be done in the coming decade requires that it develop a global vision of its operations and assets, from the drop of water entering its turbines to the behind-the-meter technologies marketed by its subsidiary Hilo. This has prompted the company to implement a new corporate structure that will maximize cooperation and agility, making it possible to bring about the energy transition efficiently with a view to supporting the realization of Quebecers’ collective aspirations.

Toward a single, unified Hydro

Hydro-Québec’s core mission revolves around four major functions that make up the company’s value chain. These functions consist of:

  1. Developing corporate strategies based on current and future challenges and business opportunities
  2. Planning energy needs and effectively allocating financial capital
  3. Designing and building the energy system’s multiple components
  4. Operating assets in an integrated fashion and providing the best customer experience.

Accordingly, Hydro-Québec will henceforth comprise four groups respectively in charge of strategy and development; integrated energy needs planning; infrastructure and the energy system; and operations and customer experience. To enable the company to carry out its mission, these groups will be able to count on the support of other groups responsible for corporate functions.

“For over 20 years, Hydro-Québec has been operating in a vertical structure based on its main activities, namely power generation, transmission and distribution. This approach must now give way to one that provides a cross-functional perspective allowing us to take informed decisions in light of all our needs, as well as those of our customers and the society we have the privilege to serve,” explained Hydro-Québec’s President and Chief Executive Officer, Sophie Brochu.

In terms of gender parity, the management team continues to include both men and women, ensuring a diversity of viewpoints.

Hydro-Québec’s new structure complies with the regulatory requirements of the North American power markets, in particular with regard to the need to provide third parties with non-discriminatory access to the company’s transmission system. The frameworks in place ensure that certain functions remain separate.

These changes, which will be implemented gradually as of Monday, February 28, do not aim to achieve any staff reductions.

 

Related News


Ontario Energy minister downplays dispute between auditor, electricity regulator

Ontario IESO Accounting Dispute highlights tensions over public sector accounting standards, auditor general oversight, electricity market transparency, KPMG advice, rate-regulated accounting, and an alleged $1.3B deficit understatement affecting Hydro bills and provincial finances.

 

Key Points

A PSAS clash between Ontario's auditor general and the IESO, alleging a $1.3B deficit impact and transparency failures.

✅ Auditor alleges deficit understated by $1.3B

✅ Dispute over PSAS vs US-style accounting

✅ KPMG support, transparency and co-operation questioned

 

The bad blood between the Ontario government and auditor general bubbled to the surface once again Monday, with the Liberal energy minister downplaying a dispute between the auditor and the Crown corporation that manages the province's electricity market.

Glenn Thibeault said concerns raised by auditor general Bonnie Lysyk during testimony before a legislative committee last week aren't new and the practices being used by the Independent Electricity System Operator are commonly endorsed by major auditing firms.

"(Lysyk) doesn't like the rate-regulated accounting. We've always said we've relied on the other experts within the field as well, plus the provincial controller," Thibeault said.


"We believe that we are following public sector accounting standards."

Thibeault said that Ontario Power Generation, Hydro One and many other provinces and U.S. states use the same accounting practices.

"We go with what we're being told by those who are in the field, like KPMG, like E&Y," he said.

But a statement from Lysyk's office Monday disputed Thibeault's assessment.

"The minister said the practices being used by the IESO are common in other jurisdictions," the statement said.

"In fact, the situation with the IESO is different because none of the six other jurisdictions with entities similar to the IESOuse Canadian Public Sector Accounting Standards. Five of them are in the United States and use U.S. accounting standards."

Lysyk said last week that the IESO is using "bogus" accounting practices and her office launched a special audit of the agency late last year after it changed its accounting to be more in line with U.S. standards.

Lysyk said the accounting changes made by the IESO impact the province's deficit, understating it by $1.3 billion as of the end of 2017, adding that IESO "stalled" her office when it asked for information and was not co-operative during the audit.

Lysyk's full audit of the IESO is expected to be released in the coming weeks and is among several accounting disputes her office has been engaged in with the Liberal government over the past few years.

Last fall, she accused the government of purposely obscuring the true financial impact of its 25% hydro rate cut by keeping billions in debt used to finance that plan off the province's books. Lysyk had said she would audit the IESO because of its role in the hydro plan's complex accounting scheme.

"Management of the IESO and the board would not co-operate with us, in the sense that they continually say they're co-operating, but they stalled on giving us information," she said last week.

Terry Young, a vice-president with the IESO, said the agency has fully co-operated with the auditor general. The IESO opened up its office to seven staff members from the auditor's office while they did their work.

"We recognize the work that she's doing and to that end we've tried to fully co-operate," he said. "We've given her all of the information that we can."

Young said the change in accounting standards is about ensuring greater transparency in transactions in the energy marketplace.

"It's consistent with many other independent electricity system operators are doing," he said.

Lysyk also criticized IESO's accounting firm, KPMG, for agreeing with the IESO on the accounting standards. She was critical of the firm billing taxpayers for nearly $600,000 of work with the IESO in 2017, compared with its normal yearly audit fee of $86,500.

KPMG spokeswoman Lisa Papas said the accounting issues that IESO addressed during 2017 were complex, contributing to the higher fees.

The accounting practices the auditor is questioning are a "difference of professional judgement," she said.

"The standards for public sector organizations such as IESO are principles-based standards and, accordingly, require the exercise of considerable professional judgement," she said in a statement.

"In many cases, there is more than one acceptable approach that is compliant with the applicable standards."

Progressive Conservative energy critic Todd Smith said the government isn't being transparent with the auditor general or taxpayers.

"Obviously, they have some kind of dispute but the auditor's office is saying that the numbers that the government is putting out there are bogus.

Those are her words," he said. "We've always said that we believe the auditor general's are the true numbers for the province of Ontario."

NDP energy critic Peter Tabuns said the Liberal government has decided to "play with accounting rules" to make its books look better ahead of the spring election.

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy, help only a minority of patients. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
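For readers unfamiliar with the open-loop versus closed-loop distinction, the toy sketch below shows the control idea in a few lines of Python: stimulate only when a sensed biomarker crosses a threshold. The signal, threshold, and pulse parameters are entirely hypothetical, standing in for the far more sophisticated detection algorithms real devices use.

```python
# A toy sketch of the open-loop vs. closed-loop distinction described above.
# The biomarker, threshold, and pulse parameters are hypothetical.
import random

THRESHOLD = 0.8  # hypothetical symptom-biomarker threshold

def read_biomarker() -> float:
    """Stand-in for a sensed neural-signal feature (e.g., band power)."""
    return random.random()

def deliver_pulse(duration_ms: int) -> None:
    print(f"stimulating for {duration_ms} ms")

def closed_loop_step() -> bool:
    """Deliver a brief burst only when the biomarker crosses the threshold."""
    if read_biomarker() > THRESHOLD:
        deliver_pulse(duration_ms=100)
        return True
    return False

# Open-loop stimulation, by contrast, would call deliver_pulse() continuously
# regardless of what the sensed signal is doing.
for _ in range(10):
    closed_loop_step()
```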

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, is unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain in the brain states, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
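As a simplified illustration of that analysis idea, the sketch below correlates synthetic 0-to-10 pain ratings with a synthetic brain-activity feature. A single Pearson correlation on made-up data is only a stand-in for the machine-learning models the researchers actually apply to weeks of recordings.

```python
# Toy illustration only: synthetic data, single-feature correlation.
import random
from statistics import correlation  # Python 3.10+

random.seed(0)
n_samples = 50

# Synthetic brain-activity feature and pain ratings loosely coupled to it.
brain_feature = [random.gauss(0, 1) for _ in range(n_samples)]
pain_scores = [min(10, max(0, round(5 + 2 * x + random.gauss(0, 1)))) for x in brain_feature]

r = correlation(brain_feature, [float(s) for s in pain_scores])
print(f"Pearson r between brain feature and reported pain: {r:.2f}")
```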

 
