Energy Saving Tower Type Oil Pump Driven by Permanent Magnetic Linear Motor

By PR Newswire


Harbin Electric, Inc., a U.S. company with operations based in Harbin, China, announced that it has successfully developed a high-efficiency, energy-saving "Tower Type Oil Pump" driven by permanent magnetic linear motors for the oil and gas industry.

The new generation of Tower Type Oil Pump is the world's first vertical oil pump that adopts a large thrust cylindrical synchronous linear motor, producing measurable energy savings. In addition to the benefits of energy efficiency, key features of the intelligent design include automatic and reversing speeds, as well as significantly reduced operational maintenance costs. This specially designed pump contains unique motor technologies which allow the pump to consume less energy during all phases of operation.

In particular, the patented pumping system intelligently gauges the tension needed to mechanically lift liquid out of the wellhead. By adjusting the power needs of the pump on a real-time basis, it can save 20-30% of the energy consumed by traditional oil pumps.

Harbin Electric's linear motor oil pump is certified by Heilongjiang Science and Technology Bureau ("HSTB"), a branch of The Ministry of Science and Technology of P.R. China, and China National Petroleum Corporation Daqing Petroleum. A prototype of the Tower Type Oil Pump has been running at the Second Extraction Factory of the Daqing Oilfield for one and a half years. The prototype unit has met the functional design requirements in all technical aspects.

Daqing Petroleum is the largest petroleum corporation in China, producing approximately 50 million tons of oil annually. The testing model of the Tower Type Oil Pump was originally customized for Daqing Petroleum.

With the testing of the prototype complete, Daqing Petroleum has placed a purchase order for 13 units to be delivered by the end of 2007. These units are expected to sell at an average price of $52,000.
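Back-of-the-envelope arithmetic on the reported order, a sketch assuming the quoted average price applies uniformly to all 13 units:

```python
# Rough arithmetic on the reported order; assumes the quoted
# average price applies uniformly to every unit.
units = 13
avg_price_usd = 52_000
order_value = units * avg_price_usd
print(f"${order_value:,}")  # -> $676,000
```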

Tianfu Yang, Chairman and Chief Executive Officer of Harbin Electric stated, "We are pleased to introduce to the oil and gas industry this state of the art vertical style oil pump driven by efficient linear motors. This product development is an important milestone for our company as it further positions Harbin Electric as a leader in industrial motor technology. Since we began development of the Tower Type Oil Pump three years ago, our Research & Development team has been actively involved in an effort to deliver a unique solution which has culminated in several new patents for our business and has developed another revenue channel for our company."

Mr. Yang concluded, "In the newly issued '11th Five-Year Plan for National Economic and Social Development Program of China' (2006-2010), the Chinese government explicitly focuses on the need to improve the efficiency of the country's resource use and mandates a reduction in energy consumption per unit of GDP by 20%.

This policy established by the Central Government of China is encouraging for our business as our newly developed oil pump can provide energy savings of over 25% when compared to traditional oil pumps. We believe this product will add to our growth in the years ahead and supports our corporate goal to introduce intelligent, efficient solutions to the global industrial motor marketplace."

Related News

Data Center Boom Poses a Power Challenge for U.S. Utilities

U.S. data center power demand is straining electric utilities and grid reliability as AI, cloud computing, and streaming surge, driving transmission and generation upgrades, demand response, and renewable energy sourcing amid rising electricity costs.

 

Key Points

Electricity load from U.S. data centers is rising, affecting utilities, grid capacity, and energy prices.

✅ AI, cloud, and streaming spur hyperscale compute loads

✅ Grid upgrades: transmission, generation, and substations

✅ Demand response, efficiency, and renewables mitigate strain

 

U.S. electric utilities are facing a significant new challenge as the explosive growth of data centers puts unprecedented strain on power grids across the nation. According to a new report from Reuters, data centers' power demands are expected to increase dramatically over the next few years, raising concerns about grid reliability and potential increases in electricity costs for businesses and consumers.


What's Driving the Data Center Surge?

The explosion in data centers is being fueled by several factors:

  • Cloud Computing: The rise of cloud computing services, where businesses and individuals store and process data on remote servers, significantly increases demand for data centers.
  • Artificial Intelligence (AI): Data-hungry AI applications and machine learning algorithms are driving a massive need for computing power, accelerating the growth of data centers.
  • Streaming and Video Content: The growth of streaming platforms and high-definition video content requires vast amounts of data storage and processing, further boosting demand for data centers.


Challenges for Utilities

Data centers are notorious energy hogs. Their need for a constant, reliable supply of electricity places heavy demand on the grid, often in regions where power infrastructure wasn't designed for such large loads. Utilities must invest significantly in transmission and generation capacity upgrades to meet the demand while ensuring grid stability.

Some experts warn that the growth of data centers could lead to brownouts or outages, especially during peak demand periods in areas where the grid is already strained. Increased electricity demand could also lead to price hikes, with utilities potentially passing the additional costs on to consumers and businesses.


Sustainable Solutions Needed

Utility companies, governments, and the data center industry are scrambling to find sustainable solutions to mitigate these challenges:

  • Energy Efficiency: Data center operators are investing in new cooling and energy management solutions to improve energy efficiency. Some are even exploring renewable energy sources like onsite solar and wind power.
  • Strategic Placement: Authorities are encouraging the development of data centers in areas with abundant renewable energy and access to existing grid infrastructure. This minimizes the need for expensive new transmission lines.
  • Demand Flexibility: Utility companies are experimenting with programs that incentivize data centers to reduce their power consumption during peak demand periods, which could help mitigate power strain.
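The demand-flexibility idea can be sketched numerically. All figures below are illustrative assumptions, not utility data:

```python
# Hypothetical sketch: how much data-center curtailment during the peak
# hour reduces the load a utility must serve. All numbers are assumed.
hourly_load_mw = [900, 950, 1000, 1100, 1050]  # system load by hour
datacenter_mw = 120                             # flexible data-center load
curtail_fraction = 0.25                         # demand-response commitment

peak = max(hourly_load_mw)
new_peak = peak - datacenter_mw * curtail_fraction
print(f"peak shaved from {peak} MW to {new_peak:.0f} MW")
```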


The Future of the Grid

The rapid growth of data centers exemplifies the significant challenges facing the aging U.S. electrical grid. It highlights the need for a modernized power infrastructure capable of accommodating increasing demand spurred by new technologies while addressing climate change impacts that threaten reliability and affordability. The question for utilities, as well as data center operators, is how to balance the increasing need for computing power with the imperative of a sustainable and reliable energy future.

 

Related News


Jolting the brain's circuits with electricity is moving from radical to almost mainstream therapy

Brain stimulation is transforming neuromodulation, from TMS and DBS to closed-loop devices, targeting neural circuits for addiction, depression, Parkinson's, epilepsy, and chronic pain, powered by advanced imaging, AI analytics, and the NIH BRAIN Initiative.

 

Key Points

Brain stimulation uses pulses to modulate neural circuits, easing symptoms in depression, Parkinson's, and epilepsy.

✅ Noninvasive TMS and invasive DBS modulate specific brain circuits

✅ Closed-loop systems adapt stimulation via real-time biomarker detection

✅ Emerging uses: addiction, depression, Parkinson's, epilepsy, chronic pain

 

In June 2015, biology professor Colleen Hanlon went to a conference on drug dependence. As she met other researchers and wandered around a glitzy Phoenix resort’s conference rooms to learn about the latest work on therapies for drug and alcohol use disorders, she realized that out of the 730 posters, there were only two on brain stimulation as a potential treatment for addiction — both from her own lab at Wake Forest School of Medicine.

Just four years later, she would lead 76 researchers on four continents in writing a consensus article about brain stimulation as an innovative tool for addiction. And in 2020, the Food and Drug Administration approved a transcranial magnetic stimulation device to help patients quit smoking, a milestone for substance use disorders.

Brain stimulation is booming. Hanlon can attend entire conferences devoted to the study of what electrical currents do to the intricate networks of highways and backroads that make up the brain’s circuitry. This expanding field of research is slowly revealing truths of the brain: how it works, how it malfunctions, and how electrical impulses, precisely targeted and controlled, might be used to treat psychiatric and neurological disorders.

In the last half-dozen years, researchers have launched investigations into how different forms of neuromodulation affect addiction, depression, loss-of-control eating, tremor, chronic pain, obsessive compulsive disorder, Parkinson’s disease, epilepsy, and more. Early studies have shown subtle electrical jolts to certain brain regions could disrupt circuit abnormalities — the miscommunications — that are thought to underlie many brain diseases, and help ease symptoms that persist despite conventional treatments.

The National Institutes of Health’s massive BRAIN Initiative put circuits front and center, distributing $2.4 billion to researchers since 2013 to devise and use new tools to observe interactions between brain cells and circuits. That, in turn, has kindled interest from the private sector. Among the advances that have enhanced our understanding of how distant parts of the brain talk with one another are new imaging technology and the use of machine learning to interpret complex brain signals and analyze what happens when circuits go haywire.

Still, the field is in its infancy, and even therapies that have been approved for use in patients with, for example, Parkinson’s disease or epilepsy help only a minority of patients, and expectations can outpace evidence. “If it was the Bible, it would be the first chapter of Genesis,” said Michael Okun, executive director of the Norman Fixel Institute for Neurological Diseases at University of Florida Health.

As brain stimulation evolves, researchers face daunting hurdles, and not just scientific ones. How will brain stimulation become accessible to all the patients who need it, given how expensive and invasive some treatments are? Proving to the FDA that brain stimulation works, and does so safely, is complicated and expensive. Even with a swell of scientific momentum and an influx of funding, the agency has so far cleared brain stimulation for only a handful of limited conditions. Persuading insurers to cover the treatments is another challenge altogether. And outside the lab, researchers are debating nascent issues, such as the ethics of mind control, the privacy of a person’s brain data, and how to best involve patients in the study of the human brain’s far-flung regions.

Neurologist Martha Morrell is optimistic about the future of brain stimulation. She remembers the shocked reactions of her colleagues in 2004 when she left full-time teaching at Stanford (she still has a faculty appointment as a clinical professor of neurology) to direct clinical trials at NeuroPace, then a young company making neurostimulator systems to potentially treat epilepsy patients.

“When I started working on this, everybody thought I was insane,” said Morrell. Nearly 20 years in, she sees a parallel between the story of jolting the brain’s circuitry and that of early implantable cardiac devices, such as pacemakers and defibrillators, which initially “were used as a last option, where all other medications have failed.” Now, “the field of cardiology is very comfortable incorporating electrical therapy, device therapy, into routine care. And I think that’s really where we’re going with neurology as well.”


Reaching a ‘slope of enlightenment’
Parkinson’s is, in some ways, an elder in the world of modern brain stimulation, and it shows the potential as well as the limitations of the technology. Surgeons have been implanting electrodes deep in the brains of Parkinson’s patients since the late 1990s, and in people with more advanced disease since the early 2000s.

In that time, it’s gone through the “hype cycle,” said Okun, the national medical adviser to the Parkinson’s Foundation since 2006. Feverish excitement and overinflated expectations have given way to reality, bringing scientists to a “slope of enlightenment,” he said. They have found deep brain stimulation to be very helpful for some patients with Parkinson’s, rendering them almost symptom-free by calming the shaking and tremors that medications couldn’t. But it doesn’t stop the progression of the disease, or resolve some of the problems patients with advanced Parkinson’s have walking, talking, and thinking.

In 2015, the same year Hanlon found only her lab’s research on brain stimulation at the addiction conference, Kevin O’Neill watched one finger on his left hand start doing something “funky.” One finger twitched, then two, then his left arm started tingling and a feeling appeared in his right leg, like it was about to shake but wouldn’t — a tremor.

“I was assuming it was anxiety,” O’Neill, 62, told STAT. He had struggled with anxiety before, and he had endured a stressful year: a separation, selling his home, starting a new job at a law firm in California’s Bay Area. But a year after his symptoms first began, O’Neill was diagnosed with Parkinson’s.


Doctors prescribed him pills that promote the release of dopamine, to offset the death of brain cells that produce this messenger molecule in circuits that control movement. But he took them infrequently because he worried about insomnia as a side effect. Walking became difficult — “I had to kind of think my left leg into moving” — and the labor lawyer found it hard to give presentations and travel to clients’ offices.

A former actor with an outgoing personality, he developed social anxiety and didn’t tell his bosses about his diagnosis for three years, and wouldn’t have, if not for two workdays in summer 2018 when his tremors were severe and obvious.


It was during that period that he learned about deep brain stimulation, at a support group for Parkinson’s patients. “I thought, ‘I will never let anybody fuss with my brain. I’m not going to be a candidate for that,’” he recalled. “It felt like mad scientist science fiction. Like, are you kidding me?”

But over time, the idea became less radical, as O’Neill spoke to DBS patients and doctors and did his own research, and as his symptoms worsened. He decided to go for it. Last May, doctors at the University of California, San Francisco surgically placed three metal leads into his brain, connected by thin cords to two implants in his chest, just near the clavicles. A month later, he went into the lab and researchers turned the device on.

“That was a revelation that day,” he said. “You immediately — literally, immediately — feel the efficacy of these things. … You go from fully symptomatic to non-symptomatic in seconds.”

When his nephew pulled up to the curb to pick him up, O’Neill started dancing, and his nephew teared up. The following day, O’Neill couldn’t wait to get out of bed and go out, even if it was just to pick up his car from the repair shop.

In the year since, O’Neill’s walking has gone from “awkward and painful” to much improved, and his tremors are all but gone. When he is extra frazzled, like while renovating and moving into his new house overlooking the hills of Marin County, he feels tense and his left arm shakes and he worries the DBS is “failing,” but generally he returns to a comfortable, tremor-free baseline.

O’Neill worried about the effects of DBS wearing off but, for now, he can think “in terms of decades, instead of years or months,” he recalled his neurologist telling him. “The fact that I can put away that worry was the big thing.”

He’s just one patient, though. The brain has regions that are mostly uniform across all people. The functions of those regions also tend to be the same. But researchers suspect that how brain regions interact with one another — who mingles with whom, and what conversation they have — and how those mixes and matches cause complex diseases varies from person to person. So brain stimulation looks different for each patient.

Each case of Parkinson’s manifests slightly differently, and that lesson applies to many other diseases, said Okun, who organized the nine-year-old Deep Brain Stimulation Think Tank, where leading researchers convene, review papers, and publish reports on the field’s progress each year.

“I think we’re all collectively coming to the realization that these diseases are not one-size-fits-all,” he said. “We have to really begin to rethink the entire infrastructure, the schema, the framework we start with.”

Brain stimulation is also used frequently to treat people with common forms of epilepsy, and has reduced the number of seizures or improved other symptoms in many patients. Researchers have also been able to collect high-quality data about what happens in the brain during a seizure — including identifying differences between epilepsy types. Still, only about 15% of patients are symptom-free after treatment, according to Robert Gross, a neurosurgery professor at Emory University in Atlanta.

“And that’s a critical difference for people with epilepsy. Because people who are symptom-free can drive,” which means they can get to a job in a place like Georgia, where there is little public transit, he said. So taking neuromodulation “from good to great,” is imperative, Gross said.


Renaissance for an ancient idea
Recent advances are bringing about what Gross sees as “almost a renaissance period” for brain stimulation, though the ideas that undergird the technology are millennia old. Neuromodulation goes back to at least ancient Egypt and Greece, when electrical shocks from a ray, called the “torpedo fish,” were recommended as a treatment for headache and gout. Over centuries, the fish zaps led to doctors burning holes into the brains of patients. Those “lesions” worked, somehow, but nobody could explain why they alleviated some patients’ symptoms, Okun said.

Perhaps the clearest predecessor to today’s technology is electroconvulsive therapy (ECT), which in a rudimentary and dangerous way began being used on patients with depression roughly 100 years ago, said Nolan Williams, director of the Brain Stimulation Lab at Stanford University.

More modern forms of brain stimulation came about in the United States in the mid-20th century. A common, noninvasive approach is transcranial magnetic stimulation, which involves placing an electromagnetic coil on the scalp to transmit a current into the outermost layer of the brain. Vagus nerve stimulation (VNS), used to treat epilepsy, zaps a nerve that contributes to some seizures.

The most invasive option, deep brain stimulation, involves implanting in the skull a device attached to electrodes embedded in deep brain regions, such as the amygdala, that can’t be reached with other stimulation devices. In 1997, the FDA gave its first green light to deep brain stimulation as a treatment for tremor, and then for Parkinson’s in 2002 and the movement disorder dystonia in 2003.

Even as these treatments were cleared for patients, though, what was happening in the brain remained elusive. But advanced imaging tools now let researchers peer into the brain and map out networks — a recent breakthrough that researchers say has propelled the field of brain stimulation forward as much as increased funding has. Imaging of both human brains and animal models has helped researchers identify the neuroanatomy of diseases, target brain regions with more specificity, and watch what was happening after electrical stimulation.

Another key step has been the shift from open-loop stimulation — a constant stream of electricity — to closed-loop stimulation that delivers targeted, brief jolts in response to a symptom trigger. To make use of the futuristic technology, labs need people to develop artificial intelligence tools to interpret the large data sets a brain implant generates, and to tailor devices based on that information.

“We’ve needed to learn how to be data scientists,” Morrell said.

Affinity groups, like the NIH-funded Open Mind Consortium, have formed to fill that gap. Philip Starr, a neurosurgeon and developer of implantable brain devices at the University of California at San Francisco Health system, leads the effort to teach physicians how to program closed-loop devices, and works to create ethical standards for their use. “There’s been extraordinary innovation after 20 years of no innovation,” he said.

The BRAIN Initiative has been critical, several researchers told STAT. “It’s been a godsend to us,” Gross said. The NIH’s Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative was launched in 2013 during the Obama administration with a $50 million budget. BRAIN now spends over $500 million per year. Since its creation, BRAIN has given over 1,100 awards, according to NIH data. Part of the initiative’s purpose is to pair up researchers with medical technology companies that provide human-grade stimulation devices to the investigators. Nearly three dozen projects have been funded through the investigator-devicemaker partnership program and through one focused on new implantable devices for first-in-human use, according to Nick Langhals, who leads work on neurological disorders at the initiative.

The more BRAIN invests, the more research is spawned. “We learn more about what circuits are involved … which then feeds back into new and more innovative projects,” he said.

Many BRAIN projects are still in early stages, finishing enrollment or small feasibility studies, Langhals said. Over the next couple of years, scientists will begin to see some of the fruits of their labor, which could lead to larger clinical trials, or to companies developing more refined brain stimulation implants, Langhals said.

Money from the National Institute of Mental Health, as well as the NIH’s Helping to End Addiction Long-term (HEAL) initiative, has similarly sweetened the appeal of brain stimulation, both for researchers and industry. “A critical mass” of companies interested in neuromodulation technology has mushroomed where, for two decades, just a handful of companies stood, Starr said.

More and more, pharmaceutical and digital health companies are looking at brain stimulation devices “as possible products for their future,” said Linda Carpenter, director of the Butler Hospital TMS Clinic and Neuromodulation Research Facility.


‘Psychiatry 3.0’
The experience with using brain stimulation to stop tremors and seizures inspired psychiatrists to begin exploring its use as a potentially powerful therapy for healing, or even getting ahead of, mental illness.

In 2008, the FDA approved TMS for patients with major depression who had tried, and not gotten relief from, drug therapy. “That kind of opened the door for all of us,” said Hanlon, a professor and researcher at the Center for Research on Substance Use and Addiction at Wake Forest School of Medicine. The last decade saw a surge of research into how TMS could be used to reset malfunctioning brain circuits involved in anxiety, depression, obsessive-compulsive disorder, and other conditions.

“We’re certainly entering into what a lot of people are calling psychiatry 3.0,” Stanford’s Williams said. “Whereas the first iteration was Freud and all that business, the second one was the psychopharmacology boom, and this third one is this bit around circuits and stimulation.”

Drugs alleviate some patients’ symptoms while simultaneously failing to help many others, but psychopharmacology clearly showed “there’s definitely a biology to this problem,” Williams said — a biology that in some cases may be more amenable to brain stimulation.

The exact mechanics of what happens between cells when brain circuits … well, short-circuit, is unclear. Researchers are getting closer to finding biomarkers that warn of an incoming depressive episode, or wave of anxiety, or loss of impulse control. Those brain signatures could be different for every patient. If researchers can find molecular biomarkers for psychiatric disorders — and find ways to preempt those symptoms by shocking particular brain regions — that would reshape the field, Williams said.

Not only would disease-specific markers help clinicians diagnose people, but they could help chip away at the stigma that paints mental illness as a personal or moral failing instead of a disease. That’s what happened for epilepsy in the 1960s, when scientific findings nudged the general public toward a deeper understanding of why seizures happen, and it’s “the same trajectory” Williams said he sees for depression.

His research at the Stanford lab also includes work on suicide, and obsessive-compulsive disorder, which the FDA said in 2018 could be treated using noninvasive TMS. Williams considers brain stimulation, with its instantaneity, to be a potential breakthrough for urgent psychiatric situations. Doctors know what to do when a patient is rushed into the emergency room with a heart attack or a stroke, but there is no immediate treatment for psychiatric emergencies, he said. Williams wonders: What if, in the future, a suicidal patient could receive TMS in the emergency room and be quickly pulled out of their depressive mental spiral?

Researchers are also actively investigating the brain biology of addiction. In August 2020, the FDA approved TMS for smoking cessation, the first such OK for a substance use disorder, which is “really exciting,” Hanlon said. Although there is some nuance when comparing substance use disorders, a primal mechanism generally defines addiction: the eternal competition between “top-down” executive control functions and “bottom-up” cravings. It’s the same process that is at work when one is deciding whether to eat another cookie or abstain — just exacerbated.

Hanlon is trying to figure out if the stop and go circuits are in the same place for all people, and whether neuromodulation should be used to strengthen top-down control or weaken bottom-up cravings. Just as brain stimulation can be used to disrupt cellular misfiring, it could also be a tool for reinforcing helpful brain functions, or for giving the addicted brain what it wants in order to curb substance use.

Evidence suggests many people with schizophrenia smoke cigarettes (a leading cause of early death for this population) because nicotine reduces the “hyperconnectivity” that characterizes the brains of people with the disease, said Heather Ward, a research fellow at Boston’s Beth Israel Deaconess Medical Center. She suspects TMS could mimic that effect, and therefore reduce cravings and some symptoms of the disease, and she hopes to prove that in a pilot study that is now enrolling patients.

If the scientific evidence proves out, clinicians say brain stimulation could be used alongside behavioral therapy and drug-based therapy to treat substance use disorders. “In the end, we’re going to need all three to help people stay sober,” Hanlon said. “We’re adding another tool to the physician’s toolbox.”

Decoding the mysteries of pain
A favorable outcome to the ongoing research, one that would fling the doors to brain stimulation wide open for patients with myriad disorders, is far from guaranteed. Chronic pain researchers know that firsthand.

Chronic pain, among the most mysterious and hard-to-study medical phenomena, was the first use for which the FDA approved deep brain stimulation, said Prasad Shirvalkar, an assistant professor of anesthesiology at UCSF. But when studies didn’t pan out after a year, the FDA retracted its approval.

Shirvalkar is working with Starr and neurosurgeon Edward Chang on a profoundly complex problem: “decoding pain states in the brain, which has never been done,” as Starr told STAT.

Part of the difficulty of studying pain is that there is no objective way to measure it. Much of what we know about pain is from rudimentary surveys that ask patients to rate how much they’re hurting, on a scale from zero to 10.

Using implantable brain stimulation devices, the researchers ask patients for a 0-to-10 rating of their pain while recording up-and-down cycles of activity in the brain. They then use machine learning to compare the two streams of information and see what brain activity correlates with a patient’s subjective pain experience. Implantable devices let researchers collect data over weeks and months, instead of basing findings on small snippets of information, allowing for a much richer analysis.
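A minimal sketch of this kind of correlation analysis, using synthetic data and assumed feature names; nothing here reflects the UCSF team's actual methods:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: relate a patient's 0-10 pain ratings to recorded
# neural band-power features. The features, weights, and data are invented.
n_samples, n_features = 200, 4
band_power = rng.normal(size=(n_samples, n_features))  # e.g. theta/alpha/beta/gamma
true_weights = np.array([1.5, 0.0, -0.8, 0.3])
pain = band_power @ true_weights + rng.normal(scale=0.5, size=n_samples)
pain = np.clip(pain + 5, 0, 10)  # map onto a 0-10 rating scale

# Least-squares fit: which features track the subjective ratings?
X = np.column_stack([band_power, np.ones(n_samples)])
weights, *_ = np.linalg.lstsq(X, pain, rcond=None)
predicted = X @ weights
r = np.corrcoef(pain, predicted)[0, 1]
print(f"correlation between predicted and reported pain: {r:.2f}")
```

In practice the teams describe richer models over weeks of recordings; this only illustrates the basic idea of regressing a subjective rating onto neural features.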

 

Related News


Nonstop Records For U.S. Natural-Gas-Based Electricity

U.S. natural gas demand for power is surging amid summer heat, with tight grid reserves in Texas's ERCOT, EIA reporting coal and nuclear retirements, renewables intermittency, and pipeline expansions supporting combined-cycle capacity and prices.

 

Key Points

Use of natural gas for power is rising, driven by summer heat, plant retirements, and new combined-cycle capacity.

✅ ERCOT reserve margin 9%, below 14% target in Texas

✅ Gas share of U.S. power near 40-43% this summer

✅ Coal and nuclear retirements shift capacity to combined cycle

 

As the hot months linger, natural gas will be leaned on most to supply the electricity needed to run air-conditioning loads on the grid and keep us cool.

And this matters greatly: heat causes most weather-related deaths, according to the National Weather Service.

Generally, U.S. gas demand for power in summer is 35-40% higher than it was five years ago, with much more growth coming (see Figure).

The good news is regions across the country are expected to have plenty of reserves to keep up with power demand.

The only exception is ERCOT, covering 90% of the electric load in Texas, where a 9% reserve margin is expected, below the desired 14%.

Last summer, however, ERCOT’s reserve margin also was below the desired level, yet the grid operator maintained system reliability with no load curtailments.
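For context, a reserve margin is simply spare capacity expressed as a share of forecast peak demand. A minimal sketch of the arithmetic, with made-up MW figures (only the 9% margin and 14% target percentages come from the article):

```python
# Reserve margin = (available capacity - forecast peak demand) / forecast peak demand.
# The MW figures below are illustrative, not actual ERCOT data.
def reserve_margin(capacity_mw, peak_demand_mw):
    return (capacity_mw - peak_demand_mw) / peak_demand_mw

peak_demand = 80_000           # hypothetical forecast summer peak, MW
capacity = peak_demand * 1.09  # capacity level implying the 9% margin cited for ERCOT

margin = reserve_margin(capacity, peak_demand)
target = 0.14                  # ERCOT's desired reserve margin

print(f"reserve margin: {margin:.0%} (target {target:.0%})")
print("below target" if margin < target else "meets target")
```

A margin below target does not guarantee outages; as the article notes, ERCOT ran below its desired level last summer without load curtailments. It simply means less headroom if heat or plant outages exceed forecasts.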

Simply put, other states are very lucky that Texas has kept gas at 50% of its generation, despite having every justification to drastically increase that share.

Holding at about 1,600 Bcf per year, gas demand for power in Texas has been remarkably flat since 2000, especially since Lone Star State production is up 50% over that period.

Increasingly, other U.S. states (and even other countries) want to import huge amounts of gas from Texas, a state that yields over 25% of all U.S. output.

Yet if Texas justifiably ever wants to utilize more of its own gas, others would be significantly impacted.

At ~480 TWh per year, if Texas were a country, it would rank 9th globally for power use, ahead of Brazil, a fast-growing economy with 212 million people, and France, a developed economy with 68 million people.

In the near-term, this explains why a sweltering prolonged heat wave in July in Texas, with a hot Houston summer setting new electricity records, is the critical factor that could push up still very low gas prices.

But for California, our second-highest gas-using state, above-average snowpack should provide stronger hydropower this summer relative to 2018.

Combined, Texas and California consume about 25% of U.S. gas, with Texas' use double that of California.

 

Across the U.S., gas could supply a record 40-43% of U.S. electricity this summer, even as the EIA expects solar and wind to take a larger share of the generation mix.

Our gas used for power has increased 35-40% over the past five years, and January power generation also jumped on the year, highlighting broad momentum.

Our gas used for power has increased 35-40% over the past five years. DATA SOURCE: EIA; JTC

Indeed, U.S. natural gas use for electricity has continued to soar even as overall electricity consumption has trended lower in some years: at nearly 10,700 Bcf last year, it was up 16% from 2017 and easily the highest ever.

Gas is expected to supply 37% of U.S. power this year, versus 27% just five years ago (see Figure), even as coal-fired generation saw a brief uptick in 2021 in EIA data.

Capacity-wise, gas is sure to keep expanding its 45% share of the U.S. power system.

"More than 60% of electric generating capacity installed in 2018 was fueled by natural gas."

We know that natural gas will continue to be the go-to power source: coal and nuclear plants are retiring, and wind and solar, while growing, remain too intermittent, too geographically limited, and too short on transmission to compensate the way natural gas can.

"U.S. coal power capacity has fallen by a third since 2010," and last year "16 gigawatts (16,000 MW) of U.S. coal-fired power plants retired."

This year, some 2,000 MW of coal was retired in February alone, with 7,420 MW expected to be closed in 2019.

Ditto for nuclear.

Nuclear retirements this year include Pilgrim, Massachusetts’s only nuclear plant, and Three Mile Island in Pennsylvania.

This will take a combined ~1,600 MW of nuclear capacity offline.

Another 2,500 MW and 4,300 MW of nuclear are expected to be leaving the U.S. power system in 2020 and 2021, respectively.

As more nuclear plants close, EIA projects that net electricity generation from U.S. nuclear power reactors will fall by 17% by 2025.

From 2019-2025 alone, EIA expects U.S. coal capacity to plummet nearly 25% to 176,000 MW, with nuclear falling 15% to 83,000 MW.

In contrast, new combined-cycle gas plants will grow gas capacity almost 30% to around 310,000 MW.
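As a back-of-envelope check, the implied 2019 starting points can be recovered from those projected 2025 capacities and percentage changes. The derived baselines below are arithmetic done here for illustration, not figures reported by the EIA:

```python
# Recover the implied 2019 capacity from projected 2025 levels and % changes:
# baseline = projected / (1 + fractional_change). Inputs are rounded, so the
# recovered baselines are approximations.
projections = {
    # fuel: (2025 capacity in MW, fractional change 2019-2025)
    "coal":           (176_000, -0.25),
    "nuclear":        (83_000,  -0.15),
    "combined cycle": (310_000, +0.30),
}

for fuel, (mw_2025, change) in projections.items():
    mw_2019 = mw_2025 / (1 + change)
    print(f"{fuel:>14}: {mw_2019:>9,.0f} MW in 2019 -> {mw_2025:,} MW in 2025 ({change:+.0%})")
```

The recovered coal baseline (~235,000 MW) shrinking while combined-cycle gas grows toward 310,000 MW is the capacity shift the article describes.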

Lower and lower projected commodity prices for gas encourage this immense gas build-out, not to mention non-stop increases in efficiency for gas-based units.

Remember that these are official U.S. Department of Energy estimates, not coming from the industry itself.

In other words, our Department of Energy concludes that gas is the future.

Our hotter and hotter summers are therefore more and more becoming "summers for natural gas."

Ultimately, this shows why the anti-pipeline movement is so dangerous.

"Affordable Energy Coalition Highlights Ripple Effect of Natural Gas Moratorium."

In April, President Trump signed two executive orders to promote energy infrastructure by directing federal agencies to remove bottlenecks for gas transport into the Northeast in particular, where New England oil-fired generation has spiked, and to streamline federal reviews of border-crossing pipelines and other infrastructure.

Builders, however, are not relying on outside help: all they know is that more U.S. gas demand is a constant, so more infrastructure is mandatory.

They are moving forward diligently: for example, there are now some 27 pipelines worth $33 billion already in the works in Appalachia.

 

Related News


Cannes Film Festival Power Outage Under Investigation 

The Cannes Film Festival power outage disrupted Alpes-Maritimes as an electrical substation fire and a fallen high-voltage line triggered blackouts; an arson probe was launched, grid resilience was tested, traffic and trains were snarled, and the Palais des Festivals ran on backup power.

 

Key Points

A May 24, 2025 blackout in Cannes disrupted events, prompted an arson probe, and exposed grid risks across Alpes-Maritimes.

✅ Substation fire and fallen high-voltage line triggered blackouts

✅ Palais des Festivals ran on independent backup power

✅ Authorities probe suspected arson; security measures reviewed

 

A significant power outage on May 24, 2025, disrupted the final day of the Cannes Film Festival in southeastern France. The blackout, which affected approximately 160,000 households in the Alpes-Maritimes region, including the city of Cannes, occurred just hours before the highly anticipated Palme d'Or ceremony. French authorities are investigating the possibility that the outage was caused by arson.

Details of the Outage

The power disruption began early on Saturday morning with a fire at an electrical substation near Cannes. This incident weakened the local power grid. Shortly thereafter, a high-voltage line fell at another location, further exacerbating the situation. The combined events led to widespread power outages, affecting not only the festival but also local businesses, traffic systems, and public transportation, echoing Heathrow Airport outage warnings raised days before a separate disruption. Traffic lights in parts of Cannes and the nearby city of Antibes stopped working, leading to traffic jams and confusion in city centers. Most shops along the Croisette remained closed, and local food kiosks were only accepting cash. Train service in Cannes was also disrupted. 

Impact on the Festival

Despite the challenges, festival organizers managed to keep the main venue, the Palais des Festivals, operational by switching to an independent power supply. They confirmed that all scheduled events and screenings, including the Closing Ceremony, would proceed as planned, a reminder of how grid operators sometimes avoid rolling blackouts to keep essential services running. The power was restored around 3 p.m. local time, just hours before the ceremony, allowing music to resume and the event to continue without further incident.

Investigations and Suspected Arson

French authorities, including the national gendarmerie, are investigating the possibility that the power outage was the result of arson, aligning with grid attack warnings issued by intelligence services. The prefect for the Alpes-Maritimes region, Laurent Hottiaux, condemned the "serious acts of damage to electrical infrastructures" and stated that all resources are mobilized to identify, track down, arrest, and bring to justice the perpetrators of these acts.

While investigations are ongoing, no official conclusions have been drawn regarding the cause of the outage. Authorities are working to determine whether the incidents were isolated or part of a coordinated effort, a question that also arises when utilities implement PG&E wildfire shutoffs to prevent cascading damage.

Broader Implications

The power outage at the Cannes Film Festival underscores the vulnerability of critical infrastructure to potential acts of sabotage. While the immediate impact on the festival was mitigated, the incident raises concerns about the resilience of energy systems, especially during major public events, and amid severe weather like a B.C. bomb cyclone that leaves tens of thousands without power. It also highlights the importance of having contingency plans in place to ensure the continuity of essential services in the face of unexpected disruptions.

As investigations continue, authorities are urging the public to remain vigilant and report any suspicious activities, while planners also prepare for storm-driven outages that compound emergency response. The outcome of this investigation may have implications for future security measures at large-scale events and the protection of critical infrastructure.

While the Cannes Film Festival was able to proceed with its closing events, the power outage serves as a reminder of the potential threats to public safety, as seen when a Western Washington bomb cyclone left hundreds of thousands without power, and the importance of robust security measures to safeguard against such incidents.

 

 

Related News


Prepare for blackouts across the U.S. as summer takes hold

US Summer Grid Blackout Risk: NERC and FERC warn of strained reliability as drought, heat waves, and transmission constraints hit MISO, hydro, and renewables, elevating blackout exposure and highlighting demand response and storage solutions.

 

Key Points

A forecast of summer power shortfalls across the US grid, driven by heat, drought, transmission limits, and a changing resource mix.

✅ NERC and FERC warn of elevated blackout risk and reliability gaps.

✅ MISO region strained by drought, heat, and limited hydro.

✅ Mitigations: demand response, storage, and stronger transmission.

 

Just when it seemed things couldn’t get worse — gasoline at $5 to $8 a gallon, supply shortages in everything from baby formula to new cars — comes the devastating news that many of us will endure electricity blackouts this summer, and that the U.S. has more blackouts than other developed nations, according to one study.

The alarm was sounded by the nonprofit North American Electric Reliability Corp. and the Federal Energy Regulatory Commission, following a recent power grid report card highlighting vulnerabilities.

The North American electric grid is the largest machine on earth and the most complex, incorporating everything from the wonky pole you see at the roadside with a bird’s nest of wires to some of the most sophisticated engineering ever devised. It runs in real-time, even more so than the air traffic control system: All the airplanes in the sky don’t have to land at the same time, but electricity must be there at the flick of every switch.

Except it may not always be there this summer. Rod Kuckro, a respected energy journalist, says it depends on Mother Nature, with extreme weather impacts increasingly straining the grid, but the prognosis isn’t good.

Speaking on “White House Chronicle,” the weekly news and public affairs program on PBS that I host and produce, Kuckro said: “There is a confluence of factors that could affect energy supply across the majority of the (lower) 48 states. These are continued reduced hydroelectric production in the West, and the continued drought in the Southwest.”

The biggest threat to power supply, according to the NERC and the FERC, is in the vast central region, reaching from Manitoba in Canada, where grids are increasingly exposed to harsh weather in recent years, down to the Gulf of Mexico. It is served by the regional transmission organization, the Midcontinent Independent System Operator.

These operational entities are nonprofit companies that organize and distribute their regions’ bulk power for utilities. In California, it is the California Independent System Operator, working to keep the lights on as the state enters a new energy era; in the Mid-Atlantic, it is PJM; and in the Northeast, it is the New England Independent System Operator (ISO-NE). They generate no power, but they control power flows and could initiate brownouts and blackouts.

With record storm activity and high temperatures predicted this summer, blackouts are likely to be deadly. The old, the young and the sick are all vulnerable. If the electric supply fails, with it goes everything from air conditioning to refrigeration to lights and even the ability to pump gas or access money from ATMs.

The United States, along with other modern nations, runs on electricity and when that falls short, it is catastrophic. It is chaos writ large, especially if the failure lasts more than a few hours.

On the same episode of “White House Chronicle,” Daniel Brooks, vice president of integrated grid and energy systems at the Electric Power Research Institute, also referred to a “confluence of factors” contributing to the impending electricity crisis. Brooks said, “We’re going through a significant change in terms of the energy mix and resources, and the way those resources behave under certain weather conditions.”

If power supply is stressed this summer, change in the generating mix will get a lot of political attention. At heart is the switch from fossil fuel generation to renewables. If there are power outages, a political storm will ensue. The Biden administration will be accused of speeding the switch to renewables, although the utilities don’t say that.

The weather is deteriorating and, as experts note, the grid’s biggest challenge isn’t demand but climate change pressures that compound risks. The grid is stretched in dealing with new realities as well as coping with old bugaboos, like the extreme difficulty of building transmission lines. Better transmission would relieve a lot of grid stress.

Peter Londa, president of Tantalus Systems, which helps its 260 utility customers digitize and cope with the new realities, explained some of the difficulties facing utilities, not only in the shifting sources of generation but also in the new shape of electric demand. For example, he said, electric vehicles, particularly the much-awaited Ford F-150 Lightning pickup, could be an asset to homeowners and utilities, as California increasingly turns to batteries to stabilize its grid. During a blackout, homeowners' EVs could power their homes for days, and they could become a storage resource if thousands of owners signed up with their utilities in a storage program.

The fact is that utilities are facing three major shifts: in generation toward wind and solar, in customer demand, and especially in weather. Mother Nature is on a rampage, and we all must adjust to that.
 

 

Related News


FERC needs to review capacity market performance, GAO recommends

FERC Capacity Markets face scrutiny as GAO flags inconsistent data on resource adequacy and costs, urging performance goals, risk assessment, and better metrics across PJM, ISO-NE, NYISO, and MISO amid cost-recovery proposals.

 

Key Points

FERC capacity markets aim for resource adequacy, but GAO finds weak data and urges goals and performance reviews.

✅ GAO cites inconsistent data on resource adequacy and costs

✅ Calls for performance goals, metrics, and risk assessment

✅ Applies to PJM, ISO-NE, NYISO; MISO market is voluntary

 

Capacity markets may or may not be functioning properly, but FERC can't adequately make that determination, according to a report from the Government Accountability Office (GAO).

"Available information on the level of resource adequacy ... and related costs in regions with and without capacity markets is not comprehensive or consistent," the report found. "Moreover, consistent data on historical trends in resource adequacy and related costs are not available for regions without capacity markets."

The review concluded that FERC collects some useful information in regions with and without capacity markets, but GAO said it "identified problems with data quality, such as inconsistent data."

GAO included three recommendations, including calling for FERC to take steps to improve the quality of data collected, and regularly assess the overall performance of capacity markets by developing goals for those assessments.

"FERC should develop and document an approach to regularly identify, assess, and respond to risks that capacity markets face," the report also recommended. The commission "has not established performance goals for capacity markets, measured progress against those goals, or used performance information to make changes to capacity markets as needed."

The recommendation comes as the agency is grappling with a controversial proposal to assure cost recovery for struggling coal and nuclear plants in the power markets. So far, the proposal would apply only to regions with capacity markets, including PJM Interconnection, the New England ISO, the New York ISO and possibly MISO. However, MISO has only a voluntary capacity market, making it unclear how the proposed rule would be applied there.

 
