LifeVillage demonstration project planned in Côte d'Ivoire

By Business Wire


Envision Solar International is launching its latest product designed to integrate clean power into buildings and communities: LifeVillage, a modular, self-contained “infrastructure in a crate” that can provide critical utilities in even the most remote areas of the planet.

A demonstration project featuring the LifeVillage is planned for installation in Côte d'Ivoire, on the west coast of Africa.

The LifeVillage concept combines Envision Solar's LifePort and LifePod modular steel-frame structures, whose solar panels provide between 1.5 and 4 kW of electricity, with battery energy storage that supplies power outside the hours when the sun is shining. Finally, the LifeVillage includes a self-contained water treatment unit that captures and reuses water. The whole system can be shipped in two shipping containers for assembly on-site.
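
To put the 1.5 to 4 kW figure in rough perspective, here is a minimal back-of-envelope sketch; the sun-hour, battery-efficiency and autonomy values are illustrative assumptions, not Envision Solar specifications.

```python
# Rough back-of-envelope for an off-grid unit like the LifeVillage.
# Assumed values (NOT vendor specifications): 5 peak sun hours/day,
# 80% round-trip battery efficiency, 2 days of storage autonomy.

PEAK_SUN_HOURS = 5.0          # assumed daily average for West Africa
BATTERY_EFFICIENCY = 0.80     # assumed round-trip efficiency
DAYS_OF_AUTONOMY = 2          # assumed days of backup without sun

for array_kw in (1.5, 4.0):   # the 1.5-4 kW range cited in the article
    daily_kwh = array_kw * PEAK_SUN_HOURS
    usable_after_storage = daily_kwh * BATTERY_EFFICIENCY
    battery_kwh = daily_kwh * DAYS_OF_AUTONOMY
    print(f"{array_kw} kW array: ~{daily_kwh:.1f} kWh/day generated, "
          f"~{usable_after_storage:.1f} kWh/day usable via battery, "
          f"~{battery_kwh:.0f} kWh of storage for {DAYS_OF_AUTONOMY} days of autonomy")
```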

LifeVillage creators Robert Noble, AIA, LEED AP, and William Adelson, AIA, are sustainable architects based in La Jolla, California, who founded Envision Solar to create architectural solutions to facilitate the use of clean energy.

“We were captivated by what occurred in Africa with the telecommunications industry,” notes Adelson. “With the use of mobile telephones, the system is highly decentralized; effectively, satellite technology evolved more quickly than did the ability to install infrastructure in Africa.” Noble adds, “The LifePort and LifePod products were always intended to be enabling technologies for photovoltaics and other clean technology. The LifeVillage, by including energy storage and water treatment, adds utility for an off-grid solution for remote areas without access to traditional infrastructure.”

“The LifeVillage structures can be used to create medical clinics, schools, housing for doctors and teachers, and cell and mobile telephone, radio, TV, WiFi, and WiMax facilities and transmission. The combination of buildings engineered to international building codes, solar energy generation and battery storage provides all that is needed to improve health, safety, education, economics and general quality of life around every project,” Noble says.

Each structure can be assembled and activated within two weeks of its arrival at the site. “The goal is to plant LifeVillages anywhere decentralized critical infrastructure is needed, starting in Western and Central Africa,” comments Noble. Adelson adds, “We hope that the LifeVillage product will provide renewable energy and sustainable infrastructure for growth and advancement of underdeveloped areas.”

For the demonstration project in Côte d'Ivoire, Envision Solar partnered with ZBB Energy Corporation, a Wisconsin-based energy storage innovator. ZBB will provide its long-lasting ZESS-50™ zinc energy storage system. Other potential partners include Nucon Steel for the structural framing, Kyocera for the solar panels, McElroy Roofing for the standing seam metal roof, and Worldwater & Solar Technologies for the water purification and pumping components.

The demonstration project will provide critical community buildings, powered by solar energy with battery support, and with a self-contained water treatment system to allow occupants to recycle and reuse water. A prototype will be constructed in the United States and tested prior to deployment at one hundred villages in Côte d'Ivoire.

The Côte d'Ivoire project, at the specific request of the country's Ministry of Energy, will incorporate a small medical clinic and a schoolhouse, including the housing for a resident doctor or nurse and the housing for a resident teacher. Envision Solar is partnering with Scripps Health and Hospitals of San Diego for the implementation of the clinic and its components.

Related News

When paying $1 for a coal power plant is still paying too much

San Juan Generating Station eyed for $1 coal-plant sale, as Farmington and Acme propose CCS retrofit, meeting emissions caps and renewable mandates by selling captured CO2 for enhanced oil recovery via a nearby pipeline.

 

Key Points

A New Mexico coal plant eyed for $1 and a CCS retrofit to cut emissions and sell CO2 for enhanced oil recovery.

✅ $400M-$800M CCS retrofit; 90% CO2 capture target

✅ CO2 sales for enhanced oil recovery; 20-mile pipeline gap

✅ PNM projects shutdown savings; renewable and emissions mandates

 

One dollar. That’s how much an aging New Mexico coal plant is worth. And by some estimates, even that may be too much.

Acme Equities LLC, a New York-based holding company, is in talks to buy the 847-megawatt San Juan Generating Station for $1, after four of its five owners decided to shut it down. The fifth owner, the nearby city of Farmington, says it’s pursuing the bargain-basement deal with Acme to avoid losing about 1,600 direct and indirect jobs in the area amid a broader just transition debate for energy workers.

 

We respectfully disagree with the notion that the plant is not economical

Acme’s interest comes as others are looking to exit a coal industry that’s been plagued by costly anti-pollution regulations. Acme’s plan: Buy the plant "at a very low cost," invest in carbon capture technology that will lower emissions, and then sell the captured CO2 to oil companies, said Larry Heller, a principal at the holding group.

By doing this, Acme “believes we can generate an acceptable rate of return,” Heller said in an email.

Meanwhile, San Juan’s majority owner, PNM Resources Inc., offers a distinctly different view, echoing declining coal returns reported by other utilities. A 2022 shutdown will push ratepayers to other energy alternatives now being planned, saving them about $3 to $4 a month on average, PNM has said.

“We could not identify a solution that would make running San Juan Generating Station economical,” said Tom Fallgren, a PNM vice president, in an email.

The potential sale comes as a new clean-energy bill, supported by Governor Lujan Grisham, is working its way through the state legislature. It would require the state to get half of its power from renewable sources by 2030, and 100 percent by 2045, even as other jurisdictions explore small modular reactor strategies to meet future demand. At the same time, the legislation imposes an emissions cap that’s about 60 percent lower than San Juan’s current levels.

In response, Acme is planning to spend $400 million to $800 million to retrofit the facility with carbon capture and sequestration technology that would collect carbon dioxide before it’s released into the atmosphere, Heller said. That would put the facility into compliance with the pending legislation and, at the same time, help generate revenue for the plant.

The company estimates the system would cut emissions by as much as 90 percent, and the captured gas could be sold to oil companies, which use it to enhance well recovery. The bottom line, according to Heller: “A winning financial formula.”
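
To see why Heller calls this a formula, here is a hedged back-of-envelope sketch. Apart from the 847-megawatt capacity and the 90 percent capture target cited above, every input (capacity factor, emissions rate, CO2 sales price) is an illustrative assumption, not a figure from Acme or PNM.

```python
# Illustrative CO2 capture and revenue estimate for an 847 MW coal plant.
# All inputs except nameplate capacity and the 90% capture target are
# assumptions for illustration only.

CAPACITY_MW = 847
CAPACITY_FACTOR = 0.60        # assumed annual utilization
EMISSIONS_T_PER_MWH = 1.0     # assumed ~1 tonne CO2 per MWh for coal
CAPTURE_RATE = 0.90           # target cited in the article
CO2_PRICE_PER_TONNE = 30.0    # assumed EOR sales price, USD

annual_mwh = CAPACITY_MW * CAPACITY_FACTOR * 8760
captured_tonnes = annual_mwh * EMISSIONS_T_PER_MWH * CAPTURE_RATE
revenue = captured_tonnes * CO2_PRICE_PER_TONNE

print(f"Generation: {annual_mwh:,.0f} MWh/yr")
print(f"CO2 captured: {captured_tonnes:,.0f} t/yr")
print(f"EOR revenue at ${CO2_PRICE_PER_TONNE}/t: ${revenue:,.0f}/yr")
```

Set against a $400 million to $800 million retrofit, the result's sensitivity to the assumed CO2 price and running hours is one reason the formula looks tricky.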

It’s a tricky formula at best. Carbon-capture technology has been controversial, expensive to install and unproven at scale, and new coal plant openings remain rare. Additionally, to make it work at the San Juan plant, the company would need to figure out how to deliver the CO2 to customers, since the nearest pipeline is about 20 miles (32 kilometers) away.

 

Reducing costs

Acme is also evaluating ways to reduce costs at San Juan, Heller said, including approaches seen at operators extending the life of coal plants under regulatory scrutiny, such as negotiating a cheaper coal-supply contract and qualifying for subsidies.

Farmington’s stake in the plant is less than 10 percent. But under terms of the partnership, the city — population 45,000 — can assume full control of San Juan should the other partners decide to pull out, mirroring policy debates over saving struggling nuclear plants in other regions. That’s given Farmington the legal authority to pursue the plant’s sale to Acme.

 

At the end of the day, nobody wants the energy

“We respectfully disagree with the notion that the plant is not economical,” Farmington Mayor Nate Duckett said by email. Duckett said he’s in a better position than the other owners to assess San Juan’s importance “because we sit at Ground Zero.”

The city’s economy would benefit from keeping open both the plant and a nearby coal mine that feeds it, according to Duckett, with operations that contribute about $170 million annually to the local area.

While the loss of those jobs would be painful to some, Camilla Feibelman, a Sierra Club chapter director, is hard-pressed to see a business case for keeping San Juan open, pointing to sector closures such as the Three Mile Island shutdown as evidence of shifting economics. The plant isn’t economical now, she argues, and would almost certainly be less so after investing the capital to add carbon-capture systems.

 

Related News

View more

Trump Is Seen Replacing Obama’s Power Plant Overhaul With a Tune-Up

Clean Power Plan Rollback signals EPA's shift to inside-the-fence efficiency at coal plants, emphasizing heat-rate improvements over sector-wide decarbonization, renewables, natural gas switching, demand-side efficiency, and carbon capture under Clean Air Act constraints.

 

Key Points

A policy shift by the EPA to replace broad emissions rules with plant-level efficiency standards, limiting CO2 cuts.

✅ Inside-the-fence heat-rate improvements at coal units

✅ Potential CO2 cuts limited to about 6% per plant

✅ Alternatives: fuel switching, renewables, carbon capture

 

President Barack Obama’s signature plan to reduce carbon dioxide emissions from electrical generation took years to develop and touched every aspect of power production and use, from smokestacks to home insulation.

The Trump administration is moving to scrap that plan and has signaled that any alternative it might adopt would take a much less expansive approach, possibly just telling utilities to operate their plants more efficiently.

That’s a strategy environmentalists say is almost certain to fall short of what’s needed.

The Trump administration is making "a wholesale retreat from EPA’s legal, scientific and moral obligation to address the threats of climate change," said former Environmental Protection Agency head Gina McCarthy, the architect of Obama’s Clean Power Plan.

President Donald Trump, echoing his campaign message of ending the “war on coal,” promised to rip up the initiative, which mandated that states change their overall power mix, displacing coal-fired electricity with power from wind, solar and natural gas. The EPA is about to make it official, arguing the prior administration violated the Clean Air Act by requiring those broad changes to the electricity sector, according to a draft obtained by Bloomberg.

 

Possible Replacements

Later, the agency will also ask the public to weigh in on possible replacements. The administration will ask whether the EPA can or should develop a replacement rule -- and, if so, what actions can be mandated at individual power plants, though some policymakers favor a clean electricity standard to drive broader decarbonization.

 

Those changes -- such as adding automation or replacing worn turbine seals -- would yield at most a 6 percent gain in efficiency, along with a corresponding fall in greenhouse gas emissions, according to earlier modeling by the Environmental Protection Agency and other analysts. That compares with the 32 percent drop in emissions by 2030 under Obama’s Clean Power Plan.

"In these existing plants, there’s only so many places to look for savings," said John Larsen, a director of the Rhodium Group, a research firm. "There’s only so many opportunities within a big spinning machine like that."

EPA Administrator Scott Pruitt outlined such an "inside-the-fence-line" approach in 2014, when he served as Oklahoma’s attorney general; it was later embodied in the Affordable Clean Energy rule that industry groups backed. Under his blueprint, states would set emissions standards after a detailed unit-by-unit analysis, looking at what reductions are possible given "the engineering limits of each facility."

The EPA has not decided whether it will promulgate a new rule at all, though it has also proposed new pollution limits for coal and gas plants in separate actions. In a forthcoming advanced notice of proposed rulemaking, the EPA will ask "what inside-the-fence-line options are legal, feasible and appropriate," according to a document obtained by Bloomberg.

Increased efficiency at a coal plant -- known as heat-rate improvement -- translates into fewer carbon-dioxide emissions per unit of electric power generated.
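
That relationship is roughly proportional: CO2 per megawatt-hour is the heat rate multiplied by the fuel's carbon content. A minimal sketch, with an assumed baseline heat rate and coal carbon factor for illustration:

```python
# Emissions intensity scales with heat rate:
# CO2 per MWh = heat rate (MMBtu/MWh) x fuel carbon content (t CO2/MMBtu).
# The baseline heat rate and carbon factor below are assumed, illustrative values.

BASE_HEAT_RATE = 10.5         # MMBtu per MWh, assumed typical coal unit
COAL_CO2_PER_MMBTU = 0.095    # t CO2 per MMBtu, assumed

def co2_per_mwh(heat_rate_mmbtu):
    return heat_rate_mmbtu * COAL_CO2_PER_MMBTU

base = co2_per_mwh(BASE_HEAT_RATE)
improved = co2_per_mwh(BASE_HEAT_RATE * (1 - 0.06))   # 6% heat-rate improvement

print(f"Baseline: {base:.3f} t CO2/MWh")
print(f"After 6% heat-rate improvement: {improved:.3f} t CO2/MWh "
      f"({(1 - improved / base) * 100:.0f}% lower per MWh)")
```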

Under Obama, the EPA envisioned utilities would make some straightforward efficiency improvements at coal-fired power plants as the first step to comply with the Clean Power Plan. But that was expected to coincide with bigger, broader changes -- such as using more cleaner-burning natural gas, adding more renewable power projects and simply encouraging customers to do a better job turning down their thermostats and turning off their lights.

Obama’s EPA didn’t ask utilities to wring every ounce of efficiency they could out of coal-fired power plants because they saw the other options as cheaper. A plant-specific approach "would be grossly insufficient to address the public health and environmental impacts from CO2 emissions," Obama’s EPA said.

That approach might yield modest emissions reductions and, in a perverse twist, might even have the opposite effect. If utilities make coal plants more efficient -- thereby driving down operating costs -- they also make them more competitive with natural gas and renewables, "so they might run more and pollute more," said Conrad Schneider, advocacy director for the Clean Air Task Force.

In a competitive market, any improvement in emissions produced for each unit of energy could be overwhelmed by an increase in electrical output, and debates over changes to electricity pricing under Trump and Perry added further uncertainty.

"A very minor heat rate improvement program would very likely result in increased emissions," Schneider said. "It might be worse than nothing."

Power companies want to get as much electricity as possible from every pound of coal, so they already have an incentive to keep efficiency high, said Jeff Holmstead, a former assistant EPA administrator now at Bracewell LLP. But an EPA regulation known as “new source review” has discouraged some from making those changes, for fear of triggering other pollution-control requirements, he said.

"If EPA’s replacement rule allows companies to improve efficiency without triggering new source review, it would make a real difference in terms of reducing carbon-dioxide emissions," Holmstead said.

 

Modest Impact

A plant-specific approach doesn’t have to mean modest impact.

"If you’re thinking about what can be done at the power plants by themselves, you don’t stop at efficiency tune-ups," said David Doniger, director of the Natural Resources Defense Council’s climate and clean air program. "You look at things like switching to natural gas or installing carbon capture and storage."

Requirements that facilities use carbon capture technology or swap in natural gas for coal could actually come close to hitting the same goals as in Obama’s Clean Power Plan -- if not go even further, Schneider said. They just would cost more.

The benefit of the Clean Power Plan "is that it enabled states to create programs and enabled companies to find a reduction strategy that was the most efficient and made the most sense for their own context," said Kathryn Zyla, deputy director of the Georgetown Climate Center. "And that flexibility was really important for the states and companies."

Some utilities, including Houston-based Calpine Corp., PG&E Corp. and Dominion Resources Inc., backed the Obama-era approach. And they are still pushing the Trump administration to be creative now.

"The Clean Power Plan achieved a thoughtful, balanced approach that gave companies and states considerable flexibility on how best to pursue that goal," said Melissa Lavinson, vice president of federal affairs and policy for PG&E’s Pacific Gas and Electric utility. “We look forward to working with the administration to devise an alternative plan for decarbonizing the U.S. economy."

 

Related News

View more

Is The Global Energy Transition On Track?

Global Decarbonization Strategies align renewable energy, electrification, clean air policies, IMO sulfur cap, LNG fuels, and the EU 2050 roadmap to cut carbon intensity and meet Paris Agreement targets via EVs and efficiency.

 

Key Points

Frameworks that cut emissions via renewables, EVs, efficiency, cleaner marine fuels, and EU policy roadmaps.

✅ Renewables scale as wind and solar outcompete new coal and gas.

✅ Electrification of transport grows as EV costs fall and charging expands.

✅ IMO 2020 sulfur cap and LNG shift cut shipping emissions and particulates.

 

Are we doing enough to save the planet? Silly question. The latest prognosis from the United Nations’ Intergovernmental Panel on Climate Change made for gloomy reading. Fundamental to the Paris Agreement is the target of keeping global average temperatures from rising beyond 2°C. The UN argues that radical measures are needed, and investment incentives for clean electricity are seen as critical by many leaders to accelerate progress to meet that target.

Renewable power and electrification of transport are the pillars of decarbonization. It’s well underway in renewables - the collapse in costs makes wind and solar generation competitive with new build coal and gas.

Renewables’ share of the global power market will triple by 2040 from its current level of 6% according to our forecasts.

The consumption side is slower, awaiting technological breakthroughs and informed by efforts such as New Zealand’s electricity transition to replace fossil fuels with electricity. The lower battery costs needed for electric vehicles (EVs) to compete head-on with and displace internal combustion engine (ICE) cars are some years away. These forces only start to have a significant impact on global carbon intensity in the 2030s. Our forecasts fall well short of the 2°C target, as does the IEA’s base case scenario.

Yet we can’t just wait for new technology to come to the rescue. There are encouraging signs that society sees the need to deal with a deteriorating environment. Three areas of focus came out in discussion during Wood Mackenzie’s London Energy Forum - unrelated, different in scope and scale, each pointing the way forward.

First, clean air in cities.  China has shown how to clean up a local environment quickly. The government reacted to poor air quality in Beijing and other major cities by closing older coal power plants and forcing energy intensive industry and the residential sector to shift away from coal. The country’s return on investment will include a substantial future health care dividend.

European cities are introducing restrictions on diesel cars to improve air quality. London’s 2017 “toxicity charge” is a precursor of an Ultra-Low Emission Zone in 2019, to be extended across much of the city by 2020, and aligns with UK net-zero policy changes that affect transport planning. Paris wants to ban diesel cars from the city centre by 2025 and ICE vehicles by 2030. Barcelona, Madrid, Hamburg and Stuttgart are hatching similar plans.

 

Second, desulphurisation of global shipping. High sulphur fuel oil (HSFO) meets around 3.5 million barrels per day (b/d) of the total marine market of 5 million b/d. A maximum of 3.5% sulphur content is allowed currently. The International Maritime Organisation (IMO) implements a 0.5% limit on all shipping in 2020, dramatically reducing the release of sulphur oxides into the atmosphere.

Some ships will switch to very low sulphur fuel oil, of which only around 1.4 million b/d will be available in 2020. Others will have to choose between investing in scrubbers or buying premium-priced low sulphur marine gas oil.
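
As a rough illustration of the scale of the change, the sketch below estimates the sulphur removed from the fuel stream. The barrel-to-tonne conversion and the simplification that the full 3.5 million b/d moves to 0.5 percent fuel are assumptions, not Wood Mackenzie figures:

```python
# Rough sulphur-mass estimate for the IMO 2020 switch on HSFO volumes.
# The density conversion and the assumption of a full switch to 0.5%-sulphur
# fuel are simplifications for illustration only.

HSFO_BPD = 3_500_000          # barrels/day of high sulphur fuel oil (from article)
TONNES_PER_BARREL = 0.157     # assumed ~6.35 barrels per tonne of fuel oil

fuel_tpd = HSFO_BPD * TONNES_PER_BARREL
sulphur_before = fuel_tpd * 0.035    # 3.5% sulphur cap today
sulphur_after = fuel_tpd * 0.005     # 0.5% cap from 2020

print(f"Fuel burned: ~{fuel_tpd:,.0f} t/day")
print(f"Sulphur in fuel: {sulphur_before:,.0f} t/day -> {sulphur_after:,.0f} t/day "
      f"({(1 - sulphur_after / sulphur_before) * 100:.0f}% reduction)")
```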

Longer-term, lower carbon-intensity gas is a winner as liquefied natural gas becomes the fuel of choice for many newbuilds. Marine LNG demand climbs from near zero to 50 million tonnes per annum (tpa) by 2040 on our forecasts, making shipping a demand centre behind only China, India and Japan. LNG will displace over 1 million b/d of oil demand in shipping by 2040.

Third, Europe’s radical decarbonisation plans. Already in the vanguard of emissions reductions policy, the European Commission is proposing to reduce carbon emissions for new cars and vans by 30% by 2030 versus 2020. The targets come with incentives for car manufacturers linked to the uptake of EVs.

The 2050 roadmap, presently at the concept stage, envisages a far more demanding regime, with EU electricity plans for 2050 implying a much larger power system. The mooted 80% reduction in emissions compared with 1990 will embrace all sectors. Power and transport are already moving in this direction, but the legacy fuel mix in many other sectors will be disrupted, too.

Near zero-energy buildings and homes might be possible with energy efficiency improvements, renewables and heat pumps. Electrification, recycling and bioenergy could reduce fossil fuel use in energy intensive sectors like steel and aluminium, and Europe’s oil majors going electric illustrates how incumbents are adapting. Some sectors will cite the risk decarbonisation poses to Europe’s global competitiveness. If change is to come, industry will need to build new partnerships with society to meet these targets.

The 2050 roadmap signals the ambition and will be game-changing for Europe if it is adopted. It would provide a template for a global roll-out that would go a long way toward meeting the UN’s concerns.

 

Related News

View more

Energize America: Invest in a smarter electricity infrastructure

Smart Grid Modernization unites distributed energy resources, energy storage, EV charging, advanced metering, and bidirectional power flows to upgrade transmission and distribution infrastructure for reliability, resilience, cybersecurity, and affordable, clean power.

 

Key Points

Upgrading grid hardware and software to integrate DERs, storage, and EVs for a reliable and affordable power system.

✅ Enables DER, storage, and EV integration with bidirectional flows

✅ Improves reliability, resilience, and grid cybersecurity

✅ Requires early investment in sensors, inverters, and analytics

 

Much has been written, predicted, and debated in recent years about the future of the electricity system. The discussion isn’t simply about fossil fuels versus renewables, as often dominates mainstream energy discourse. Rather, the discussion is focused on something much larger and more fundamental: the very design of how and where electricity should be generated, delivered, and consumed.

Central to this discussion are arguments in support of, or in opposition to, the traditional model versus that of the decentralized or “emerging” model. But this is a false choice. The only choice that needs making is how to best transition to a smarter grid, and do so in a reliable and affordable manner that reflects grid modernization affordability concerns for utilities today. And the most effective and immediate means to accomplish that is to encourage and facilitate early investment in grid-related infrastructure and technology.

The traditional, or centralized, model has evolved since the days of Thomas Edison, but the basic structure is relatively unchanged: generate electrons at a central power plant, transmit them over a unidirectional system of high-voltage transmission lines, and deliver them to consumers through local distribution networks. The decentralized, or emerging, model envisions a system that moves away from the central power station as the primary provider of electricity to a system in which distributed energy resources, energy storage, electric vehicles, peer-to-peer transactions, connected appliances and devices, and sophisticated energy usage, pricing, and load management software play a more prominent role.

Whether it’s a fully decentralized and distributed power system, or the more likely centralized-decentralized hybrid, it is apparent that the way in which electricity is produced, delivered, and consumed will differ from today’s traditional model. And yet, in many ways, the fundamental design and engineering that makes up today’s electric grid will serve as the foundation for achieving a more distributed future. Indeed, as the transition to a smarter grid ramps up, the grid’s basic structure will remain the underlying commonality, allowing the grid to serve as a facilitator to integrate emerging technologies, including EV charging stations, rooftop solar, demand-side management software, and other distributed energy resources, while maximizing their potential benefits and informing discussions about California’s grid reliability under ambitious transition goals.
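
One way to see the difference between the two models is in how power flow at a distribution node is computed once distributed resources sit behind the meter. The sketch below is a hypothetical illustration; the function name and the feeder numbers are invented for this example, not drawn from the report:

```python
# Minimal illustration of bidirectional flow at a distribution node:
# net load = consumption - local generation - storage discharge + EV charging.
# All names and values are hypothetical.

def net_load_kw(demand, rooftop_solar, battery_discharge, ev_charging):
    """Positive = the node imports from the grid; negative = it exports."""
    return demand - rooftop_solar - battery_discharge + ev_charging

# A hypothetical feeder at three times of day
for hour, demand, solar, batt, ev in [("noon", 40, 65, 0, 5),
                                      ("6 pm", 80, 10, 20, 15),
                                      ("2 am", 25, 0, -10, 30)]:  # -10 = battery charging
    flow = net_load_kw(demand, solar, batt, ev)
    direction = "importing" if flow >= 0 else "exporting"
    print(f"{hour}: net load {flow:+.0f} kW ({direction})")
```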

A loose analogy here is the internet. In its infancy, the internet was used primarily for sending and receiving email, doing homework, and looking up directions. At the time, it was never fully understood that the internet would create a range of services and products that would impact nearly every aspect of everyday life from online shopping, booking travel, and watching television to enabling the sharing economy and the emerging “Internet of Things.”

Uber, Netflix, Amazon, and Nest would not be possible without the internet. But the rapid evolution of the internet did not occur without significant investment in internet-related infrastructure. From dial-up to broadband to Wi-Fi, companies have invested billions of dollars to update and upgrade the system, allowing the internet to maximize its offerings and give way to technological breakthroughs, innovative businesses, and ways to share and communicate like never before.  

The electric grid is similar; it is both the backbone and the facilitator upon which the future of electricity can be built. If the vision for a smarter grid is to deploy advanced energy technologies, create new business models, and transform the way electricity is produced, distributed, and consumed, then updating and modernizing existing infrastructure and building out new intelligent infrastructure need to be top priorities. But this requires money. To be sure, increased investment in grid-related infrastructure is the key component to transitioning to a smarter grid; a grid capable of supporting and integrating advanced energy technologies within a more digital grid architecture that will result in a cleaner, more modern and efficient, and reliable and secure electricity system.

The inherent challenges of deploying new technologies and resources — reliability, bidirectional flow, intermittency, visibility, and communication, to name a few, as well as emerging climate resilience concerns shaping planning today — are not insurmountable and demonstrate exactly why federal and state authorities and electricity sector stakeholders should be planning for and making appropriate investment decisions now. My organization, Alliance for Innovation and Infrastructure, will release a report Wednesday addressing these challenges facing our infrastructure, and the opportunities a distributed smart grid would provide. From upgrading traditional wires and poles and integrating smart power inverters and real-time sensors to deploying advanced communications platforms and energy analytics software, there are numerous technologies currently available and capable of being deployed that warrant investment consideration.

Making these and similar investments will help to identify and resolve reliability issues earlier, and address vulnerabilities identified in the latest power grid report card findings, which in turn will create a stronger, more flexible grid that can then support additional emerging technologies, resulting in a system better able to address integration challenges. Doing so will ease the electricity evolution in the long-term and best realize the full reliability, economic, and environmental benefits that a smarter grid can offer.  

 

Related News

View more

California Blackouts reveal lapses in power supply

California Electricity Reliability covers grid resilience amid heat waves, rolling blackouts, renewable energy integration, resource adequacy, battery storage, natural gas peakers, ISO oversight, and peak demand management to keep homes, businesses, and industry powered.

 

Key Points

Dependable California power delivery despite heat waves, peak demand, and challenges integrating renewables into grid.

✅ Rolling blackouts revealed gaps in resource adequacy.

✅ Early evening solar drop requires fast ramping and storage.

✅ Agencies pledge planning reforms and flexible backup supply.

 

One hallmark of an advanced society is a reliable supply of electrical energy for residential, commercial and industrial consumers. Uncertainty that California electricity will be there when we need it undermines social cohesion and economic progress, as demonstrated by the travails of poor nations with erratic energy supplies.

California got a small dose of that syndrome in mid-August when a record heat wave struck the state and utilities were ordered to impose rolling blackouts to protect the grid from melting down under heavy air conditioning demands.

Gov. Gavin Newsom quickly demanded that the three overseers of electrical service to most of the state - the Public Utilities Commission, the Energy Commission and the California Independent System Operator - explain what went wrong.

"These blackouts, which occurred without prior warning or enough time for preparation, are unacceptable and unbefitting of the nation's largest and most innovative state," Newsom wrote. "This cannot stand. California residents and businesses deserve better from their government."

Initially, there was some finger-pointing among the three entities. The blackouts had been ordered by the California Independent System Operator, which manages the grid, and its president, Steve Berberich, said he had warned the Public Utilities Commission about the potential supply shortfall facing the state.

"We have indicated in filing after filing after filing that the resource adequacy program was broken and needed to be fixed," he said. "The situation we are in could have been avoided."

However, as political heat increased, the three agencies hung together and produced a joint report that admitted to lapses of supply planning and grid management and promised steps to avoid a repeat next summer.

"The existing resource planning processes are not designed to fully address an extreme heat storm like the one experienced in mid August," their report said. "In transitioning to a reliable, clean and affordable resource mix, resource planning targets have not kept pace to lead to sufficient resources that can be relied upon to meet demand in the early evening hours. This makes balancing demand and supply more challenging."

Although California's grid had experienced greater heat-related demands in previous years, most notably 2006, managers then could draw standby power from natural gas-fired plants and import juice from other Western states when necessary.

Since then, the state has shut down a number of gas-fired plants and become more reliant on renewable but less reliable sources such as windmills and solar panels.

August's air conditioning demand peaked just as output from solar panels was declining with the setting of the sun and grid managers couldn't tap enough electrons from other sources to close the gap.
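
The report's point about the early evening hours comes down to simple net-load arithmetic, shown in the sketch below; the hourly figures are invented for illustration and are not CAISO data:

```python
# Illustrative "duck curve" arithmetic: demand peaks as solar output falls,
# so net load (demand minus solar) jumps in the early evening.
# All figures below are made up for illustration, not CAISO data.

hours = ["3 pm", "5 pm", "7 pm", "9 pm"]
demand_gw = [44, 46, 47, 43]     # hypothetical statewide demand
solar_gw = [10, 6, 1, 0]         # hypothetical solar output as the sun sets

for h, d, s in zip(hours, demand_gw, solar_gw):
    print(f"{h}: demand {d} GW, solar {s} GW, net load {d - s} GW")
```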

While the shift to renewables didn't, in itself, cause the blackouts, they underscored the need for a bigger cushion of backup generation or power storage in batteries or some other technology. The Public Utilities Commission, as Berberich suggested, has been somewhat lax in ordering development of backup supply.

In the aftermath of the blackouts, the state Water Resources Control Board, no doubt with direction from Newsom's office, postponed planned shutdowns of more coastal plants, which would have reduced supply flexibility even more.

Shifting to 100% renewable electricity, the state's eventual goal, while maintaining reliability will not get any easier. The state's last nuclear plant, Diablo Canyon, is ticketed for closure, and demand will increase as California eliminates gasoline- and diesel-powered vehicles in favor of "zero emission vehicles" as part of its climate policy push and phases out natural gas in homes and businesses.

Politicians such as Newsom and the legislators in last week's blackout hearing may endorse a carbon-free future in theory, but they know they'll pay the price, even as electricity prices climb, if nothing happens when Californians flip the switch.

 

Related News

View more

Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain Stimulation is transforming neuromodulation, from TMS and DBS to closed loop devices, targeting neural circuits for addiction, depression, Parkinsons, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinsons, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed loop systems adapt stimulation via real time biomarker detection

✅ Emerging uses: addiction, depression, Parkinsons, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do—including how targeted stimulation can improve short-term memory in older adults—to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning, much as utilities use AI to adapt to shifting electricity demand, to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data—concerns that echo efforts to develop algorithms to prevent blackouts during rising ransomware threats—and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

Related: Once a last resort, this pain therapy is getting a new life amid the opioid crisis
“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.

Related: Psychiatric shock therapy, long controversial, may face fresh restrictions
Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.

O’Neill’s tremors are all but gone since he began deep brain stimulation last May, though his left arm shakes when he feels tense.

It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Related: New study revives a Mozart sonata as a potential epilepsy therapy
Each case of Parkinson’s manifests slightly differently, and that’s a bit of knowledge that applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

Related: A new index measures the extent and depth of addiction stigma
More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has, paralleling broader efforts to digitize analog electrical systems across industry. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools, informed by advances in machine learning for the energy transition, to interpret large data sets a brain implant is generating, and to tailor devices based on that information.
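
The open-loop versus closed-loop distinction can be sketched in a few lines; the biomarker threshold, pulse settings and the stand-in stimulator class below are hypothetical placeholders, not any device's actual control logic:

```python
# Sketch of the closed-loop idea: stimulate only when a monitored biomarker
# crosses a threshold, instead of streaming constant current (open loop).
# Threshold and pulse parameters are hypothetical placeholders.

BIOMARKER_THRESHOLD = 0.7   # hypothetical normalized signal level

def closed_loop_step(biomarker_value, stimulator):
    """Deliver a brief pulse only in response to a symptom trigger."""
    if biomarker_value > BIOMARKER_THRESHOLD:
        stimulator.pulse(amplitude_ma=2.0, duration_ms=100)   # hypothetical settings

class LoggingStimulator:
    """Stand-in for a real device interface; it just records pulses."""
    def pulse(self, amplitude_ma, duration_ms):
        print(f"pulse: {amplitude_ma} mA for {duration_ms} ms")

stim = LoggingStimulator()
for sample in [0.2, 0.5, 0.85, 0.4, 0.9]:    # simulated biomarker stream
    closed_loop_step(sample, stim)
```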

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

Related: Largest psilocybin trial finds the psychedelic is effective in treating serious depression
The exact mechanics of what happens between cells when brain circuits … well, short-circuit, are unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain states in the brain, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
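
At its simplest, that analysis amounts to regressing recorded brain activity against the 0-to-10 pain reports. The toy sketch below uses simulated data and ordinary least squares, which is far cruder than the machine-learning pipeline the researchers describe; every value in it is made up:

```python
# Toy version of correlating a neural feature with 0-10 pain ratings.
# Simulated data and plain least squares; the real work uses richer
# features and machine-learning models over weeks of recordings.
import numpy as np

rng = np.random.default_rng(0)
neural_feature = rng.uniform(0, 1, 200)                     # e.g., band power in one region
pain_rating = np.clip(10 * neural_feature + rng.normal(0, 1.5, 200), 0, 10)

# Fit pain ~ a * feature + b and report the correlation
a, b = np.polyfit(neural_feature, pain_rating, 1)
r = np.corrcoef(neural_feature, pain_rating)[0, 1]
print(f"pain ~ {a:.2f} * feature + {b:.2f}, correlation r = {r:.2f}")
```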

 
