Orecon to test wave-energy conversion buoy off English coast

By Industrial Info Resources


Orecon Limited, a spinoff company from Plymouth University, has developed a wave-energy conversion buoy that utilizes three multi-resonant chambers to provide renewable energy.

Orecon has raised more than $19.5 million to fund the construction and deployment of its first commercial 1.5-megawatt (MW) offshore buoy, to be moored southwest of Hayle, England, in conjunction with the South West of England Regional Development Agency's Wave Hub Project.

The buoy utilizes multiple oscillating water columns that drive air turbo-generator sets. The buoys are about 40 meters in diameter and rise more than 5 meters over waves. All of the machinery is above the water. Construction of the first unit is expected to take eight months.

After the prototype is complete, Orecon expects subsequent units to be built in about five months. Deployment is quick and depends largely on how far the buoys must be transported. The actual installation will take about three and a half days: three days to install the six anchors and another 12 hours to attach the buoy.

Construction of the first unit is expected to begin in the second quarter of 2009, with deployment set for April 2010.

Related News

Canadian scientists say power utilities need to adapt to climate change

Canada Power Grid Climate Resilience integrates extreme weather planning, microgrids, battery storage, renewable energy, vegetation management, and undergrounding to reduce outages, harden infrastructure, modernize utilities, and safeguard reliability during storms, ice events, and wildfires.

 

Key Points

Canada's grid resilience hardens utilities against extreme weather using microgrids, storage, renewables, and upgrades.

✅ Grid hardening: microgrids, storage, renewable integration

✅ Vegetation management reduces storm-related line contact

✅ Selective undergrounding where risk and cost justify

 

The increasing intensity of storms that lead to massive power outages highlights the need for Canada’s electrical utilities to be more robust and innovative, climate change scientists say.

“We need to plan to be more resilient in the face of the increasing chances of these events occurring,” University of New Brunswick climate change scientist Louise Comeau said in a recent interview.

The East Coast was walloped this week by the third storm in as many days, with high winds toppling trees and even part of a Halifax church steeple, underscoring the value of storm-season electrical safety tips for residents.

Significant weather events have consistently increased over the last five years, according to the Canadian Electricity Association (CEA), which has tracked such events since 2003.


Nearly a quarter of total outage hours nationally in 2016 – 22 per cent – were caused by two ice storms, a lightning storm, and the Fort McMurray fires, which the CEA said may or may not be classified as a climate event.

“It (climate change) is putting quite a lot of pressure on electricity companies coast to coast to coast to improve their processes and look for ways to strengthen their systems in the face of this evolving threat,” said Devin McCarthy, vice president of public affairs and U.S. policy for the CEA, which represents 40 utilities serving 14 million customers.

The 2016 figures – the most recent available – indicate the average Canadian customer experienced 3.1 outages and 5.66 hours of outage time.
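These per-customer averages correspond to the standard reliability indices SAIFI (interruption frequency) and SAIDI (interruption duration). As a rough illustration only (the CEA's exact methodology is not described here, and the customer count below is assumed from the figure cited above), both reduce to simple ratios:

```python
def saifi(total_customer_interruptions: float, customers_served: float) -> float:
    """System Average Interruption Frequency Index: interruptions per customer."""
    return total_customer_interruptions / customers_served

def saidi(total_customer_interruption_hours: float, customers_served: float) -> float:
    """System Average Interruption Duration Index: outage hours per customer."""
    return total_customer_interruption_hours / customers_served

# Hypothetical inputs consistent with the article's 2016 national averages
customers = 14_000_000                 # assumed: CEA members serve ~14M customers
interruptions = 3.1 * customers        # total customer interruptions
outage_hours = 5.66 * customers        # total customer outage hours
avg_frequency = saifi(interruptions, customers)   # ≈ 3.1 outages per customer
avg_duration = saidi(outage_hours, customers)     # ≈ 5.66 hours per customer
```

Utilities typically report these indices annually, often flagging major-event days separately, which is why a couple of ice storms and a wildfire can dominate a whole year's totals.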

McCarthy said electricity companies can’t just build their systems to withstand the worst storm they’d dealt with over the previous 30 years. They must prepare for worse, and address risks highlighted by Site C dam stability concerns as part of long-term planning.

“There needs to be a more forward-looking approach, climate science led, that looks at what do we expect our system to be up against in the next 20, 30 or 50 years,” he said.

Toronto Hydro is either looking at or installing equipment with extreme weather in mind, Elias Lyberogiannis, the utility’s general manager of engineering, said in an email.

That includes stainless steel transformers that are more resistant to corrosion, and breakaway links for overhead service connections, which allow service wires to safely disconnect from poles and prevent damage to service masts.

Comeau said smaller grids, tied to electrical systems operated by larger utilities, often utilize renewable energy sources such as solar and wind as well as battery storage technology to power collections of buildings, homes, schools and hospitals.

“Capacity to do that means we are less vulnerable when the central systems break down,” Comeau said.

Nova Scotia Power recently announced an “intelligent feeder” pilot project, which involves the installation of Tesla Powerwall storage batteries in 10 homes in Elmsdale, N.S., and a large grid-sized battery at the local substation. The batteries are connected to an electrical line powered in part by nearby wind turbines.

The idea is to test the capability of providing customers with back-up power, while collecting data that will be useful for planning future energy needs.

Tony O’Hara, NB Power’s vice-president of engineering, said the utility, which recently sounded an alarm on copper theft, was in the late planning stages of a micro-grid for the western part of the province, and is also studying the use of large battery storage banks.

“Those things are coming, that will be an evolution over time for sure,” said O’Hara.

Some solutions may be simpler. Smaller utilities, like Nova Scotia Power, are focusing on strengthening overhead systems, mainly through vegetation management, while in Ontario, Hydro One and Alectra are making major investments to strengthen infrastructure in the Hamilton area.

“The number one cause of outages during storms, particularly those with high winds and heavy snow, is trees making contact with power lines,” said N.S. Power’s Tiffany Chase.

The company has an annual budget of $20 million for tree trimming and removal.

“But the reality is with overhead infrastructure, trees are going to cause damage no matter how robust the infrastructure is,” said Matt Drover, the utility’s director for regional operations.

“We are looking at things like battery storage and a variety of other reliability programs to help with that.”

NB Power also has an increased emphasis on tree trimming and removal, and now spends $14 million a year on it, up from $6 million prior to 2014.

O’Hara said the vegetation program has helped drive the average duration of power outages down since 2014 from about three hours to two hours and 45 minutes.

Some power cables are buried in both Nova Scotia and New Brunswick, mostly in urban areas. But both utilities maintain it’s too expensive to bury entire systems – estimated at $1 million per kilometre by Nova Scotia Power.

The issue of burying more lines was top of mind in Toronto following a 2013 ice storm, but that city’s utility also rejected the idea of a large-scale underground system as too expensive – estimating the cost at around $15 billion – while Ontario customers have seen Hydro One delivery rates rise in recent adjustments.

“Having said that, it is prudent to do so for some installations depending on site specific conditions and the risks that exist,” Lyberogiannis said.

Comeau said lowering risks will both save money and reduce disruption to people’s lives.

“We can’t just do what we used to do,” said Xuebin Zhang, a senior climate change scientist at Environment and Climate Change Canada.

“We have to build in risk management … this has to be a new norm.”

 

Related News


New England Emergency fuel stock to cost millions

Inventoried Energy Program pays ISO-NE generators for fuel security to boost winter reliability, with FERC approval, covering fossil, nuclear, hydropower, and batteries, complementing capacity markets to enhance grid resilience during severe cold snaps.

 

Key Points

ISO-NE program paying generators to hold fuel or energy reserves for emergencies, boosting winter reliability.

✅ FERC-approved stopgap for 2023 and 2024 winter seasons

✅ Pays for on-site fuel or stored energy during cold-trigger events

✅ Open to fossil, nuclear, hydro, batteries; limited gas participation

 

Electricity ratepayers in New England will pay tens of millions of dollars to fossil fuel and nuclear power plants later this decade under a program that proponents say is needed to keep the lights on during severe winters but which critics call a subsidy with little benefit to consumers or the grid, even as Connecticut is pushing a market overhaul across the region.

Last week the Federal Energy Regulatory Commission said ISO-New England, which runs the six-state power grid, can create what it calls the Inventoried Energy Program, or IEP. It will essentially pay certain power plants to stockpile fuel for use in emergencies during two upcoming winters as longer-term solutions are developed.

The federal commission called it a reasonable short-term solution for avoiding brownouts, one that doesn’t favor any given technology.

Not all agree, however, including FERC Commissioner Richard Glick, who wrote a fiery dissent to the other three commissioners.

“The program will hand out tens of millions of dollars to nuclear, coal and hydropower generators without any indication that those payments will cause the slightest change in those generators’ behavior,” Glick wrote. “Handing out money for nothing is a windfall, not a just and reasonable rate.”

The program is the latest reaction by ISO-NE to the winter of 2013-14 when New England almost saw brownouts because of a shortage of natural gas to create electricity during a pair of week-long deep freezes.

ISO-New England says the situation is more critical now because of the possible retirement of the gas-fired Mystic Generating Station in Massachusetts. As with closed nuclear plants such as Vermont Yankee and Pilgrim in Massachusetts, power plant owners say lower electricity prices, partly due to cheap renewables and partly to stagnant demand, mean they can’t be profitable just by selling power.

Programs like the IEP are meant to subsidize such plants – “incentivize” is the industry term – so they’ll stay open if they are needed, even though some argue there is no need to subsidize nuclear in deregulated markets.

The IEP approved last week will apply to the winters of 2023 and 2024, after a different subsidy program expires. Despite industry-group warnings about rushing pricing changes, it sets prices for stocking certain amounts of fuel, plus payments during any “trigger” event, defined as a day when the average of the high and low temperatures at Bradley International Airport in Connecticut is no more than 17 degrees Fahrenheit.
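The trigger condition reduces to a one-line check. This is an illustrative sketch only; the tariff's formal definition governs, and the function name and Fahrenheit inputs are assumptions:

```python
TRIGGER_THRESHOLD_F = 17.0  # per the IEP definition cited in the article

def is_trigger_day(high_f: float, low_f: float) -> bool:
    """A trigger day: the mean of the daily high and low temperatures
    at the reference station is no more than 17 degrees Fahrenheit."""
    return (high_f + low_f) / 2 <= TRIGGER_THRESHOLD_F

print(is_trigger_day(25.0, 5.0))   # (25 + 5) / 2 = 15 -> True
print(is_trigger_day(30.0, 10.0))  # (30 + 10) / 2 = 20 -> False
```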

These payments will be made on top of a complex system of grid auctions used to decide how much various plants get paid for generating electricity at which times.

ISO-NE estimates the new program will cost between $102 million and $148 million each winter, depending on weather and market conditions.

It says the payments are open to plants that burn oil, coal, nuclear fuel, wood chips or trash; utility-scale battery storage facilities; and hydropower dams “that store water in a pond or reservoir.” Natural gas plants can participate if they guarantee to have fuel available, but that seems less likely because of winter heating contracts.

A major complaint among groups that filed petitions opposing the project is that ISO-NE presented little supporting evidence of how the prices, amounts and overall cost were determined. ISO-NE argued that there wasn’t time for such analysis before the Mystic shutdown, and FERC agreed.

“The proposal is a step in the right direction … while ISO-NE finishes developing a long-term market solution,” the commission said in its ruling.

The program is the latest example of the complexities facing the nation’s electricity system as it evolves in the face of solar and wind power, which produce electricity so cheaply that they can render traditional power uneconomic but which can’t always produce power on demand, prompting discussions of Texas grid improvements among policymakers. Another major factor is climate change, which has increased the pressure to support renewable alternatives to plants that burn fossil fuels, as well as stagnant electricity demand caused by increased efficiency.

Opponents, including many environmental groups, say electricity utilities and regulators are too quick to prop up existing systems, as the 145-mile Maine transmission line debate shows, built when electricity was sent one way from a few big plants to many customers. They argue that to combat climate change as well as limit cost, the emphasis must be on developing “non-wire alternatives” such as smart systems for controlling demand, in order to take advantage of the current system in which electricity goes two ways, such as from rooftop solar back into the grid.

 

Related News


Global use of coal-fired electricity set for biggest fall this year

Global Coal Power Decline 2019 signals a record fall in coal-fired electricity as China plateaus, India dips, and the EU and US accelerate renewables, curbing carbon emissions and advancing the global energy transition.

 

Key Points

A record 2019 drop in global coal power as renewables rise and demand slows across China, India, the EU, and the US.

✅ 3% global fall in coal-fired electricity in 2019.

✅ China plateaus; India declines for first time in decades.

✅ EU and US shift to renewables and gas, cutting emissions.

 

The world’s use of coal-fired electricity is on track for its biggest annual fall on record this year after more than four decades of near-uninterrupted growth that has stoked the global climate crisis.

Data shows that coal-fired electricity is expected to fall by 3% in 2019, a drop greater than the combined coal generation of Germany, Spain and the UK last year, which could help stall the world’s rising carbon emissions this year.

The steepest global slump on record is likely to emerge in 2019 as India’s reliance on coal power falls for the first time in at least three decades and China’s coal power demand plateaus, reflecting the broader global energy transition underway.

Both developing nations are using less coal-fired electricity due to slowing economic growth in Asia as well as the rise of cleaner energy alternatives. There is also expected to be unprecedented coal declines across the EU and the US as developed economies turn to clean forms of energy such as low-cost solar power to replace ageing coal plants.

In almost 40 years the world’s annual coal generation has fallen only twice before: in 2009, in the wake of the global financial crisis, and in 2015, following a slowdown in China’s coal plants amid rising levels of deadly air pollution.

The research was undertaken by the Centre for Research on Energy and Clean Air, the Institute for Energy Economics and Financial Analysis and the UK climate thinktank Sandbag.

The researchers found that China’s coal-fired power generation was flatlining, despite an increase in the number of coal plants being built, because they were running at record low rates. China builds the equivalent of one large new coal plant every two weeks, according to the report, but its coal plants run for only 48.6% of the time, compared with a global utilisation rate of 54% on average.
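The utilisation figures quoted (48.6% versus a 54% global average) are capacity factors: actual generation divided by what the fleet could produce running flat out. A minimal sketch, using hypothetical plant numbers chosen to match the Chinese fleet average:

```python
def utilisation_rate(energy_generated_mwh: float, capacity_mw: float, hours: float) -> float:
    """Capacity factor: the fraction of the period a fleet would need to run
    at full capacity to produce the observed generation."""
    return energy_generated_mwh / (capacity_mw * hours)

# Hypothetical: a 1,000 MW plant generating 4,258,000 MWh over a year (8,760 h)
rate = utilisation_rate(4_258_000, 1_000, 8_760)
print(f"{rate:.1%}")  # 48.6%
```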

The findings come after a report from Global Energy Monitor found that the number of coal-fired power plants in the world is growing, because China is building new coal plants five times faster than the rest of the world is reducing their coal-fired power capacity.

The report found that in other countries coal-fired power capacity fell by 8 GW in the 18 months to June, but over the same period China increased its capacity by 42.9 GW.

In a paper for the industry journal Carbon Brief, the researchers said: “A 3% reduction in power sector coal use could imply zero growth in global CO2 emissions, if emissions changes in other sectors mirror those during 2018.”

However, the authors of the report have warned that despite the record coal power slump the world’s use of coal remained far too high to meet the climate goals of the Paris agreement, and some countries are still seeing increases, such as Australia’s emissions rise amid increased pollution from electricity and transport.

The US – which is backing out of the Paris agreement – has made the deepest cuts to coal power of any developed country this year by shutting coal plants down in favour of gas power and renewable energy, with utilities such as Duke Energy facing investor pressure to disclose climate plans. By the end of August the US had reduced coal by almost 14% over the year compared with the same months in 2018.

The EU reported a record slump in coal-fired electricity use in the first half of the year of almost a fifth compared with the same months last year. This trend is expected to accelerate over the second half of the year to average a 23% fall over 2019 as a whole. The EU is using less coal power in favour of gas-fired electricity – which can have roughly half the carbon footprint of coal – and renewable energy, helped by policies such as the UK carbon tax that have slashed coal-fired generation.


 

Related News


Hydro-Quebec adopts a corporate structure designed to optimize the energy transition

Hydro-Québec Unified Corporate Structure advances the energy transition through integrated planning, strategy, infrastructure delivery, and customer operations, aligning generation, transmission, and distribution while ensuring non-discriminatory grid access and agile governance across assets and behind-the-meter technologies.

 

Key Points

A cross-functional model aligning strategy, planning, and operations to accelerate Quebec's low-carbon transition.

✅ Four groups: strategy, planning, infrastructure, operations.

✅ Ensures non-discriminatory transmission access compliance.

✅ No staff reductions; staged implementation from Feb 28.

 

As Hydro-Québec prepares to play a key role in the transition to a low-carbon economy, the complexity of the work to be done in the coming decade requires that it develop a global vision of its operations and assets, from the drop of water entering its turbines to the behind-the-meter technologies marketed by its subsidiary Hilo. This has prompted the company to implement a new corporate structure that will maximize cooperation and agility, including employee-led pandemic support that builds community trust, making it possible to bring about the energy transition efficiently with a view to supporting the realization of Quebecers’ collective aspirations.

Toward a single, unified Hydro

Hydro-Québec’s core mission revolves around four major functions that make up the company’s value chain, alongside policy choices like peak-rate relief during emergencies. These functions consist of:

  1. Developing corporate strategies based on current and future challenges and business opportunities
  2. Planning energy needs and effectively allocating financial capital, factoring in pandemic-related revenue impacts on demand and investment timing
  3. Designing and building the energy system’s multiple components
  4. Operating assets in an integrated fashion and providing the best customer experience by addressing customer choice and flexibility expectations across segments.

Accordingly, Hydro-Québec will henceforth comprise four groups respectively in charge of strategy and development; integrated energy needs planning; infrastructure and the energy system; and operations and customer experience, including billing accuracy concerns that can influence satisfaction. To enable the company to carry out its mission, these groups will be able to count on the support of other groups responsible for corporate functions.

Across Canada, leadership changes at other utilities highlight the need to rebuild ties with governments and investors, as seen with Hydro One's new CEO in Ontario.

“For over 20 years, Hydro-Québec has been operating in a vertical structure based on its main activities, namely power generation, transmission and distribution. This approach must now give way to one that provides a cross-functional perspective allowing us to take informed decisions in light of all our needs, as well as those of our customers and the society we have the privilege to serve,” explained Hydro-Québec’s President and Chief Executive Officer, Sophie Brochu.

In terms of gender parity, the management team continues to include both men and women, ensuring a diversity of viewpoints.

Hydro-Québec’s new structure complies with the regulatory requirements of the North American power markets, in particular with regard to the need to provide third parties with non-discriminatory access to the company’s transmission system. The frameworks in place ensure that certain functions remain separate and help coordinate responses to operational events such as urban distribution outages that challenge continuity of service.

These changes, which will be implemented gradually as of Monday, February 28, do not aim to achieve any staff reductions.

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed loop devices, targeting neural circuits for addiction, depression, Parkinsons, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinsons, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed loop systems adapt stimulation via real time biomarker detection

✅ Emerging uses: addiction, depression, Parkinsons, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do—including how targeted stimulation can improve short-term memory in older adults—to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning, much as utilities use AI to adapt to shifting electricity demand, to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and in a world where electricity drives pandemic readiness, expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data—concerns that echo efforts to develop algorithms to prevent blackouts during rising ransomware threats—and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

In the broader energy context, California has increasingly turned to battery storage to stabilize its strained grid.

Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson’s manifests slightly differently, a lesson that applies to many other diseases as well, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what happens after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of this technology, labs need people who can develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.
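
The distinction between the two modes can be sketched in a few lines of Python. Everything here — the signal values, the threshold, the pulse amplitude — is invented for illustration and does not correspond to any real device's behavior or API:

```python
def open_loop(samples, pulse=1.0):
    """Open loop: stimulate continuously, regardless of the signal."""
    return [pulse for _ in samples]

def closed_loop(samples, threshold=0.8, pulse=1.0):
    """Closed loop: deliver a brief pulse only when a symptom
    biomarker (say, band power in a neural recording) crosses
    a patient-specific threshold."""
    return [pulse if s > threshold else 0.0 for s in samples]

# A hypothetical biomarker trace sampled over time
biomarker = [0.2, 0.5, 0.9, 1.1, 0.4, 0.85]

print(open_loop(biomarker))    # pulses on every sample
print(closed_loop(biomarker))  # pulses only where the signal exceeds 0.8
```

The closed-loop version stimulates only on the third, fourth, and sixth samples; the rest of the time the device stays quiet, which is the point of the design.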

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain states in the brain, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
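
The core of that correlation step can be illustrated with a toy calculation: check how strongly a single recorded brain-activity feature tracks the patient's self-reported ratings. The numbers below are invented stand-ins; the actual research applies machine learning across many features and months of recordings rather than one correlation over six samples:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

pain_ratings = [2, 4, 7, 9, 3, 6]               # patient's 0-to-10 reports
band_power   = [0.1, 0.3, 0.8, 0.9, 0.2, 0.5]   # hypothetical neural feature

r = pearson(pain_ratings, band_power)
print(round(r, 2))  # 0.98 -- this feature closely tracks reported pain
```

A feature whose correlation sits near zero would be discarded; one near 1.0 (or -1.0) is a candidate signature of the patient's subjective pain.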

 


Data Center Boom Poses a Power Challenge for U.S. Utilities

U.S. Data Center Power Demand is straining electric utilities and grid reliability as AI, cloud computing, and streaming surge, driving transmission and generation upgrades, demand response, and renewable energy sourcing amid rising electricity costs.

 

Key Points

Rising electricity load from U.S. data centers is straining utilities, grid capacity, and energy prices.

✅ AI, cloud, and streaming spur hyperscale compute loads

✅ Grid upgrades: transmission, generation, and substations

✅ Demand response, efficiency, and renewables mitigate strain

 

U.S. electric utilities are facing a significant new challenge as the explosive growth of data centers puts unprecedented strain on power grids across the nation. According to a new report from Reuters, data centers' power demands are expected to increase dramatically over the next few years, raising concerns about grid reliability and potential increases in electricity costs for businesses and consumers.


What's Driving the Data Center Surge?

The explosion in data centers is being fueled by several factors:

  • Cloud Computing: The rise of cloud computing services, where businesses and individuals store and process data on remote servers, significantly increases demand for data centers.
  • Artificial Intelligence (AI): Data-hungry AI applications and machine learning algorithms are driving a massive need for computing power, accelerating the growth of data centers.
  • Streaming and Video Content: The growth of streaming platforms and high-definition video content requires vast amounts of data storage and processing, further boosting demand for data centers.


Challenges for Utilities

Data centers are notorious energy hogs. Their need for a constant, reliable supply of electricity places heavy demand on the grid, often in regions where power infrastructure wasn't designed for such large loads. Utilities must invest significantly in transmission and generation capacity upgrades to meet the demand while ensuring grid stability.

Some experts warn that the growth of data centers could lead to brownouts or outages, especially during peak demand periods in areas where the grid is already strained. Increased electricity demand could also lead to price hikes, with utilities passing the additional costs on to consumers and businesses.


Sustainable Solutions Needed

Utility companies, governments, and the data center industry are scrambling to find sustainable solutions to mitigate these challenges:

  • Energy Efficiency: Data center operators are investing in new cooling and energy management solutions to improve energy efficiency. Some are even exploring renewable energy sources like onsite solar and wind power.
  • Strategic Placement: Authorities are encouraging the development of data centers in areas with abundant renewable energy and access to existing grid infrastructure. This minimizes the need for expensive new transmission lines.
  • Demand Flexibility: Utility companies are experimenting with programs that incentivize data centers to reduce their power consumption during peak demand periods, which could help ease strain on the grid.
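
The demand-flexibility idea in the last bullet reduces to a simple rule: shift deferrable work, such as batch computing jobs, out of peak periods. A minimal sketch, with load figures made up for the example:

```python
def planned_load(base_mw, deferrable_mw, peak_event):
    """Load (MW) a data center draws in one hour: run deferrable
    work only when the utility has not signaled a peak event."""
    return base_mw if peak_event else base_mw + deferrable_mw

# Five hours of operation, with a utility peak event in hours 3-4
peak_hours = [False, False, True, True, False]
profile = [planned_load(40, 15, peak) for peak in peak_hours]
print(profile)  # [55, 55, 40, 40, 55]
```

The total work still gets done; it is simply moved into hours when the grid has headroom, which is why utilities are willing to pay for this flexibility.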


The Future of the Grid

The rapid growth of data centers exemplifies the significant challenges facing the aging U.S. electrical grid. It highlights the need for a modernized power infrastructure capable of accommodating increasing demand spurred by new technologies while addressing climate change impacts that threaten reliability and affordability. The question for utilities, as well as data center operators, is how to balance the increasing need for computing power with the imperative of a sustainable and reliable energy future.

 
