Nuclear planning to the year 1,002,008

By Reuters


Will this barren mountain rising up to 4,950 feet from the Mojave Desert look roughly the same in the year 1,002,008? That’s a million years into the future.

The question may sound bizarre but its answer is key to the future of a decades-old, controversial project to store America’s nuclear waste in the belly of Yucca Mountain, on the edge of a nuclear test site and 95 miles from Las Vegas. The narrow road from there winds through a desolate landscape of sparse vegetation — creosote scrub, cactus and gnarled Joshua trees.

“This is probably the world’s most intensely studied mountain,” says Michael Voegele, one of the senior engineers on the project, standing beside the “Yucca Mucker”, a 720-ton cylinder-shaped machine that has drilled a five-mile tunnel into the mountain. “And yet, there will be even more study.”

Indeed. In September, the U.S. Environmental Protection Agency (EPA) revised its original safety standards for what would be the world’s first deep underground nuclear mausoleum. Those standards were meant to protect the health of people living near Yucca Mountain for 10,000 years from the time the mountain is filled with 70,000 tons of radioactive nuclear waste.

Ten thousand years is roughly twice mankind’s recorded history. But a court in Washington ruled in 2004 that protection should reach farther into the future. The new standards “will protect public health and the environment for 1 million years,” according to the EPA. “The Yucca Mountain facility will open only if it meets EPA’s standards….”

The standards specify that for the first 10,000 years, future residents should not be exposed to more than 15 millirem of radioactivity per year. From year 10,001 to one million, the dose limit is now set at 100 millirem a year.

To put those limits into context: Princeton University estimates that the average American is exposed to 350 millirem per year from sources that range from X-rays to food. Bananas, for example, contain potassium, including a naturally radioactive isotope; eating one or two a day adds up to roughly the radioactivity of a chest X-ray over a year.
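To make that comparison concrete, here is a minimal Python sketch (illustrative only) that restates the two EPA dose limits as a share of the average background exposure cited above; all figures come from the article, and the variable names are mine.

```python
# Back-of-the-envelope comparison of the EPA dose limits with the average
# US background exposure cited above (all figures in millirem per year,
# as reported in the article).
AVERAGE_US_EXPOSURE = 350      # Princeton estimate for the average American
DOSE_LIMITS = {
    "first 10,000 years": 15,
    "years 10,001 to 1,000,000": 100,
}

for period, limit in DOSE_LIMITS.items():
    share = limit / AVERAGE_US_EXPOSURE
    print(f"{period}: {limit} millirem/year "
          f"(about {share:.0%} of average US exposure)")
```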

So is a U.S. government agency engaging in scientific fantasy that sets impossible hurdles to building up nuclear power?

“Our fundamental problem is our strict adherence to this number which is given to us by the EPA,” Allison Macfarlane, one of America’s leading experts on the Yucca Mountain project, told a panel on nuclear waste in Washington a few days after the U.S. election. (America’s energy mix and the country’s dependence on foreign oil were major campaign topics.)

“This… number created these huge machinations of making incredibly complex computer models, simulations of what will happen at Yucca Mountain over time. And you know what? Those models are meaningless. We’ve set up this process where we want to say a million years from now we know that Yucca Mountain won’t give anyone a dose of more than 100 millirems a year. And we can’t know that. So we need to rethink that whole process of how we re-evaluate that site.”

Like many other experts, Macfarlane does not consider Yucca Mountain an ideal site for a nuclear cemetery. It is in a seismically active zone, complete with extinct volcanoes. Critics say an earthquake could damage the canisters in which nuclear waste will be kept and release highly toxic radioactive emissions.

Up on the mountain, that prospect is not rated probable. Says Voegele, pointing to large boulders that look as if they are balancing on the ridge: “There’s been no quake strong enough in the past 500,000 years to topple them over. Difficult to see how a quake could shake the mountain.”

At the dawn of the nuclear age, scientists discussed a range of options for the storage of the nuclear waste that began piling up from the military — much of the U.S. naval fleet is powered by nuclear reactors — and civilian power plants. They included burying the material in the ocean floor, placing it in polar ice sheets, and even blasting it into space.

No country has completely solved the problem, but there is consensus that “deep geological disposal” is a better option than the present system of storing the waste in above-ground containers. In the U.S., radioactive waste is kept at 121 sites in 39 states, all awaiting eventual storage inside Yucca Mountain.

Whether that will ever happen is not clear. Apart from technical considerations, Yucca Mountain faces fierce political opposition, not least from President-elect Barack Obama, who has described the project as a multi-billion-dollar mistake and said no U.S. state should be “unfairly burdened with waste from other states.”

That came during the election campaign in a letter to a newspaper in Nevada, a fiercely contested state whose people are almost uniformly opposed to Yucca Mountain.

Obama’s encouragement of an attitude also known as Nimbyism (from Not in My Backyard) helped him beat his pro-Yucca, pro-nuclear energy Republican rival John McCain.

But the project, based on legislation dating back to 1982, can’t be stopped by presidential fiat. The U.S. Department of Energy submitted an application to the Nuclear Regulatory Commission in September to license Yucca Mountain. That process is expected to last three to four years and includes passing judgment on the one-million-year safety standard.

If all goes well, the facility will open in 2020 at the earliest, more than 20 years behind schedule — a blink of an eye on the geological time scale.

Related News

3-layer non-medical masks now recommended by Canada's top public health doctor

Canada Three-Layer Mask Recommendation advises non-medical masks with a polypropylene filter layer and tightly woven cotton, aligned with WHO guidance, to curb COVID-19 aerosols indoors through better fit, coverage, and public health compliance.

 

Key Points

PHAC advises three-layer non-medical masks with a polypropylene filter to improve indoor COVID-19 protection.

✅ Two fabric layers plus a non-woven polypropylene filter

✅ Ensure snug fit: cover nose, mouth, chin without gaps

✅ Aligns with WHO guidance for aerosols and droplets

 

The Public Health Agency of Canada is now recommending Canadians choose three-layer non-medical masks with a filter layer to prevent the spread of COVID-19 as they prepare to spend more time indoors over the winter.

Chief Public Health Officer Dr. Theresa Tam made the recommendation during her bi-weekly pandemic briefing in Ottawa on Tuesday.

"To improve the level of protection that can be provided by non-medical masks or face coverings, we are recommending that you consider a three-layer nonmedical mask," she said.

 


According to the recently updated guidelines, two layers of the mask should be made of a tightly woven fabric, such as cotton or linen, and the middle layer should be a filter-type fabric, such as non-woven polypropylene.

"We're not necessarily saying just throw out everything that you have," Tam told reporters, suggesting adding a filter can help with protection.

The Public Health Agency’s website now includes instructions for making three-layer masks.

The World Health Organization has recommended three layers for non-medical masks since June. When pressed about the sudden change for Canada, Tam said the research has evolved.

"This is an additional recommendation just to add another layer of protection. The science of masks has really accelerated during this particular pandemic. So we're just learning again as we go," she said.

"I do think that because it's winter, because we're all going inside, we're learning more about droplets and aerosols, and how indoor comfort systems from heating to air conditioning costs can influence behaviors."

She also urged Canadians to wear well-fitted masks that cover the nose, mouth and chin without gaping.


 

 

Related News


Rio Tinto Completes Largest Off-Grid Solar Plant in Canada's Northwest Territories

Rio Tinto Off-Grid Solar Power Plant showcases renewable energy at the Diavik Diamond Mine in Canada's Northwest Territories, cutting diesel use, lowering carbon emissions, and boosting remote mining resilience with advanced photovoltaic technology.

 

Key Points

A remote solar PV plant at Diavik mine supplying clean power while cutting diesel use, carbon emissions, and costs.

✅ Largest off-grid solar in Northwest Territories

✅ Replaces diesel generators during peak solar hours

✅ Enhances sustainability and lowers operating costs

 

In a significant step towards sustainable mining practices, Rio Tinto has completed the largest off-grid solar power plant in Canada’s Northwest Territories. This achievement not only highlights the company's commitment to renewable energy but also sets a new standard for the mining industry in remote and off-grid locations.

Located at the remote Diavik Diamond Mine, approximately 220 kilometers south of the Arctic Circle, Rio Tinto's off-grid solar power plant represents a technological feat in harnessing renewable energy in a challenging environment. The plant is designed to reduce reliance on diesel fuel, traditionally used to power the mine's operations, and to mitigate the carbon emissions associated with mining activities.

The decision to build the solar power plant aligns with Rio Tinto's broader sustainability goals and commitment to reducing its environmental footprint. By integrating renewable energy sources like solar power, the company aims to enhance energy efficiency, lower operational costs, and contribute to global efforts to combat climate change.

The Diavik Diamond Mine, jointly owned by Rio Tinto and Dominion Diamond Mines, operates in a remote region where access to traditional energy infrastructure is limited and off-grid solutions are vital for reliability. Historically, diesel generators have been the primary source of power for the mine's operations, posing logistical challenges and environmental impacts due to fuel transportation and combustion.

Rio Tinto's investment in the off-grid solar power plant addresses these challenges by leveraging abundant sunlight in the Northwest Territories to generate clean electricity directly at the mine site. The solar array, equipped with advanced photovoltaic technology, is capable of producing a significant portion of the mine's electricity needs during peak solar hours, reducing reliance on diesel generators and lowering overall carbon emissions.

Moreover, the completion of the largest off-grid solar power plant in Canada's Northwest Territories underscores the feasibility and scalability of renewable energy solutions in remote and resource-intensive industries like mining. The success of this project serves as a model for other mining companies seeking to enhance sustainability practices and operational resilience in challenging geographical locations.

Beyond environmental benefits, Rio Tinto's initiative is expected to have positive economic and social impacts on the local community. By reducing diesel consumption, the company mitigates air pollution and noise levels associated with mining operations, improving environmental quality and contributing to the well-being of nearby residents and wildlife.

Looking ahead, Rio Tinto's investment in renewable energy at the Diavik Diamond Mine sets a precedent for responsible resource development and sustainable mining practices in Canada and globally. As the mining industry continues to evolve, integrating renewable energy solutions like off-grid solar power plants will play a crucial role in achieving long-term environmental sustainability and operational efficiency.

In conclusion, Rio Tinto's completion of the largest off-grid solar power plant in Canada's Northwest Territories marks a significant milestone in the mining industry's transition towards renewable energy. By harnessing solar power to reduce reliance on diesel generators, the company improves operational efficiency and environmental stewardship, setting a positive example for sustainable development in remote regions. As global demand for responsible mining practices grows, initiatives like Rio Tinto's off-grid solar project demonstrate the potential of renewable energy to drive positive change in resource-intensive industries.

 

Related News


Was there another reason for electricity shutdowns in California?

PG&E Wind Shutdown and Renewable Reliability examines PSPS strategy, wildfire risk, transmission line exposure, wind turbine cut-out speeds, grid stability, and California's energy mix amid historic high-wind events and supply constraints across service areas.

 

Key Points

An overview of PG&E's PSPS decisions, wildfire mitigation, and how wind cut-out limits influence grid reliability.

✅ Wind turbines reach cut-out near 55 mph, reducing generation.

✅ PSPS mitigates ignition from damaged transmission infrastructure.

✅ Baseload diversity improves resilience during high-wind events.

 

According to the official, widely reported story, Pacific Gas & Electric (PG&E) initiated power shutoffs across substantial portions of its electric transmission system in northern California as a precautionary measure.

Citing high wind speeds they described as “historic,” the utility claims that if it didn’t turn off the grid, wind-caused damage to its infrastructure could start more wildfires.

Perhaps that’s true. Perhaps. This tale presumes that the folks who designed and maintain PG&E’s transmission system were unaware of, or ignored, the need to design it to withstand severe weather events, and that the Federal Energy Regulatory Commission (FERC) and North American Electric Reliability Corp. (NERC) allowed the utility to do so.

Ignorance and incompetence happen, to be sure, but there’s much about this story that doesn’t smell right—and it’s disappointing that most journalists and elected officials are apparently accepting it without question.

Take, for example, this statement from a Fox News story about the Kincade Fires: “A PG&E meteorologist said it’s ‘likely that many trees will fall, branches will break,’ which could damage utility infrastructure and start a fire.”

Did you ever notice how utilities cut wide swaths of trees away when transmission lines pass through forests? There’s a reason for that: when trees fall and branches break, the grid can still function.

So, if badly designed and poorly maintained infrastructure isn’t the reason PG&E cut power to millions of Californians, what might have prompted them to do so? Could it be that PG&E’s heavy reliance on renewable energy means they don’t have the power to send when a “historic” weather event occurs?

 

Wind Speed Limits

The two most popular forms of renewable energy come with operating limitations. With solar power, the constraint is obvious: the availability of sunlight. One doesn’t generate solar power at night, and energy generation drops off with increasing degrees of cloud cover during the day.

The main operating constraint of wind power is, of course, wind speed. At the low end of the scale, you need about a 6 or 7 mile-per-hour wind to get a turbine moving. This is called the “cut-in speed.” To generate maximum power, about a 30 mph wind is typically required. But if the wind speed is too high, the wind turbine will shut down. This is called the “cut-out speed,” and it’s about 55 miles per hour for most modern wind turbines.

It may seem odd that wind turbines have a cut-out speed, but there’s a very good reason for it. Each wind turbine rotor is connected to an electric generator housed in the turbine nacelle. The connection is made through a gearbox that is sized to turn the generator at the precise speed required to produce 60 Hertz AC power.

The blades of the wind turbine are airfoils, just like the wings of an airplane. Adjusting the pitch (angle) of the blades allows the rotor to maintain constant speed, which, in turn, allows the generator to maintain the constant speed it needs to safely deliver power to the grid. However, there’s a limit to blade pitch adjustment. When the wind is blowing so hard that pitch adjustment is no longer possible, the turbine shuts down. That’s the cut-out speed.
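To make the cut-in, rated and cut-out behavior concrete, here is a minimal Python sketch of a simplified power curve using the approximate speeds quoted above; the cubic ramp below rated speed is a common textbook simplification, not any particular manufacturer's published curve.

```python
# Simplified wind-turbine power curve using the approximate speeds quoted
# above. The cubic ramp between cut-in and rated speed is a generic
# illustration, not a specific turbine's published curve.
CUT_IN_MPH = 7     # below this, the rotor does not turn
RATED_MPH = 30     # at and above this, output is held at rated power
CUT_OUT_MPH = 55   # at and above this, the turbine shuts down

def power_fraction(wind_mph: float) -> float:
    """Return output as a fraction of rated power for a given wind speed."""
    if wind_mph < CUT_IN_MPH or wind_mph >= CUT_OUT_MPH:
        return 0.0   # too little wind, or turbine shut down for safety
    if wind_mph >= RATED_MPH:
        return 1.0   # blade pitch control holds output at rated power
    # Below rated speed, available wind power grows roughly with the cube
    # of wind speed.
    return ((wind_mph - CUT_IN_MPH) / (RATED_MPH - CUT_IN_MPH)) ** 3

for mph in (5, 15, 30, 54, 60):
    print(f"{mph:>2} mph -> {power_fraction(mph):.0%} of rated output")
```

The point the sketch illustrates is the one the article makes: at "historic" wind speeds above the cut-out threshold, output drops to zero rather than increasing.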

Now consider how California’s power generation profile has changed. According to Energy Information Administration data, the state generated 74.3 percent of its electricity from traditional sources (fossil fuels and nuclear) in 2001. Hydroelectric, geothermal, and biomass-generated power accounted for most of the remaining 25.7 percent, with wind and solar providing only 1.98 percent of the total.

By 2018, the state’s renewable portfolio had jumped to 43.8 percent of total generation, with wind and solar accounting for 17.9 percent of the total. That’s a lot of power to depend on from inherently unreliable sources. Thus, it wouldn’t be at all surprising to learn that PG&E didn’t stop delivering power out of fear of starting fires, but because it knew it wouldn’t have power to deliver once high winds shut down all those wind turbines.
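For easier comparison, the EIA-derived shares quoted above can be gathered in one place; the short Python snippet below simply restates them, with the 2001 hydro/geothermal/biomass figure inferred by subtracting wind and solar from the 25.7 percent remainder.

```python
# The article's EIA-derived shares of California's in-state generation,
# restated for comparison (percent of total). The 2001 "other renewables"
# figure is inferred by subtracting wind and solar from the 25.7% remainder.
generation_share = {
    2001: {
        "fossil fuels + nuclear": 74.3,
        "wind + solar": 1.98,
        "hydro, geothermal, biomass (inferred)": 25.7 - 1.98,
    },
    2018: {
        "renewables (all)": 43.8,
        "wind + solar": 17.9,
    },
}

for year, shares in generation_share.items():
    print(year)
    for source, pct in shares.items():
        print(f"  {source}: {pct:g}%")
```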

 

Related News


Russia suspected as hackers breach systems at power plants across US

US Power Grid Cyberattacks target utilities and nuclear plants, probing SCADA, ICS, and business networks at sites like Wolf Creek; suspected Russian actors, malware, and spear-phishing trigger DHS and FBI alerts on critical infrastructure resilience.

 

Key Points

Intrusions on energy networks probing ICS and SCADA, seeking persistence and elevating risks to critical infrastructure.

✅ Wolf Creek nuclear plant targeted; no operational systems breached

✅ Attackers leveraged stolen credentials, malware, and spear-phishing

✅ DHS and FBI issued alerts; utilities enhance cyber resilience

 

Hackers working for a foreign government recently breached at least a dozen US power plants, including the Wolf Creek nuclear facility in Kansas, according to current and former US officials, sparking concerns the attackers were searching for vulnerabilities in the electrical grid.

The rivals could be positioning themselves to eventually disrupt the nation’s power supply, warned the officials, who noted that a general alert was distributed to utilities a week ago. Adding to those concerns, hackers recently infiltrated an unidentified company that makes control systems for equipment used in the power industry, an attack that officials believe may be related.

The chief suspect is Russia, according to three people familiar with the continuing effort to eject the hackers from the computer networks. One of those networks belongs to an ageing nuclear generating facility known as Wolf Creek -- owned by Westar Energy Inc, Great Plains Energy Inc, and Kansas Electric Power Cooperative Inc -- on a lake shore near Burlington, Kansas.

The possibility of a Russia connection is particularly worrying, former and current officials say, because Russian hackers have previously taken down parts of the electrical grid in Ukraine and appear to be testing increasingly advanced tools to disrupt power supplies.

The hacks come as international tensions have flared over US intelligence agencies’ conclusion that Russia tried to influence the 2016 presidential election. The US, which has several continuing investigations into Russia’s activities, is known to possess digital weapons capable of disrupting the electricity grids of rival nations.

“We don’t pay attention to such anonymous fakes,” Kremlin spokesman Dmitry Peskov said, in response to a request to comment on alleged Russian involvement.

It was unclear whether President Donald Trump was planning to address the cyber attacks at his meeting on Friday with Russian President Vladimir Putin. In an earlier speech in Warsaw, Trump called out Russia’s “destabilising activities” and urged the country to join “the community of responsible nations.”

The Department of Homeland Security and Federal Bureau of Investigation said they are aware of a potential intrusion in the energy sector. The alert issued to utilities cited activities by hackers since May.

“There is no indication of a threat to public safety, as any potential impact appears to be limited to administrative and business networks,” the government agencies said in a joint statement.

The Department of Energy also said the impact appears limited to administrative and business networks and said it was working with utilities and grid operators to enhance security and resilience.

“Regardless of whether malicious actors attempt to exploit business networks or operational systems, we take any reports of malicious cyber activity potentially targeting our nation’s energy infrastructure seriously and respond accordingly,” the department said in an emailed statement.

Representatives of the National Security Council, the Director of National Intelligence and the Nuclear Regulatory Commission declined to comment. While Bloomberg News was waiting for responses from the government, the New York Times reported that hacks were targeting nuclear power stations.

The North American Electric Reliability Corp, a nonprofit that works to ensure the reliability of the continent’s power system, said it was aware of the incident and was exchanging information with the industry through a secure portal.

“At this time, there has been no bulk power system impact in North America,” the corporation said in an emailed statement.

In addition, the operational controls at Wolf Creek were not pierced, according to government officials. “There was absolutely no operational impact to Wolf Creek,” Jenny Hageman, a spokeswoman for the nuclear plant, said in a statement to Bloomberg News.

“The reason that is true is because the operational computer systems are completely separate from the corporate network.”

Determining who is behind an attack can be tricky. Government officials look at the sophistication of the tools, among other key markers, when gauging whether a foreign government is sponsoring cyber activities.

Several private security firms, including Symantec researchers, are studying data on the attacks, but none has linked the work to a particular hacking team or country.

“We don’t tie this to any known group at this point,” said Sean McBride, a lead analyst for FireEye Inc, a global cyber security firm. “It’s not to say it’s not related, but we don’t have the evidence at this point.”

US intelligence officials have long been concerned about the security of the country’s electrical grid. The recent attack, striking almost simultaneously at multiple locations, is testing the government’s ability to coordinate an effective response among several private utilities, state and local officials, and industry regulators.

Specialised teams from Homeland Security and the FBI have been scrambled to help extricate the hackers from the power stations, in some cases without informing local and state officials. Meanwhile, the US National Security Agency is working to confirm the identity of the hackers, who are said to be using computer servers in Germany, Italy, Malaysia and Turkey to cover their tracks.

Many of the power plants are conventional, but the targeting of a nuclear facility adds to the pressure. While the core of a nuclear generator is heavily protected, a sudden shutdown of the turbine can trigger safety systems. These safety devices are designed to disperse excess heat while the nuclear reaction is halted, but the safety systems themselves may be vulnerable to attack.

Homeland Security and the FBI sent out a general warning about the cyber attack to utilities and related parties on June 28, though it contained few details or the number of plants affected. The government said it was most concerned about the “persistence” of the attacks on choke points of the US power supply. That language suggests hackers are trying to establish backdoors on the plants’ systems for later use, according to a former senior DHS official who asked not to be identified.

Those backdoors can be used to insert software specifically designed to penetrate a facility’s operational controls and disrupt critical systems, according to Galina Antova, co-founder of Claroty, a New York firm that specialises in securing industrial control systems.

“We’re moving to a point where a major attack like this is very, very possible,” Antova said. “Once you’re into the control systems -- and you can get into the control systems by hacking into the plant’s regular computer network -- then the basic security mechanisms you’d expect are simply not there.”

The situation is a little different at nuclear facilities. Backup power supplies and other safeguards at nuclear sites are meant to ensure that “you can’t really cause a nuclear plant to melt down just by taking out the secondary systems that are connected to the grid,” Edwin Lyman, a nuclear expert with the Union of Concerned Scientists, said in a phone interview.

The operating systems at nuclear plants also tend to be legacy controls built decades ago and don’t have digital control systems that can be exploited by hackers. Wolf Creek, for example, began operations in 1985. “They’re relatively impervious to that kind of attack,” Lyman said.

The alert sent out last week inadvertently identified Wolf Creek as one of the victims of the attack. An analysis of one of the tools used by the hackers had the stolen credentials of a plant employee, a senior engineer. A US official acknowledged the error was not caught until after the alert was distributed.

According to a security researcher who has seen the report, the malware that activated the engineer’s username and password was designed to be used once the hackers were already inside the plant’s computer systems.

The tool tries to connect to non-public computers, and may have been intended to identify systems related to Wolf Creek’s generation plant, a part of the facility typically more modern than the nuclear reactor control room, according to a security expert who asked not to be identified because the alert is not public.

Even if there is no indication that the hackers gained access to those control systems, the design of the malware suggests they may have at least been looking for ways to do so, the expert said.

Stan Luke, the mayor of Burlington, the largest community near Wolf Creek, which is surrounded by corn fields and cattle pastures, said he learned about a cyber threat at the plant only recently, and then only through golfing buddies.

With a population of just 2,700, Burlington boasts a community pool with three water slides and a high school football stadium that would be the envy of any junior college. Luke said those amenities lead back to the tax dollars poured into the community by Wolf Creek, Coffey County’s largest employer with some 1,000 workers, 600 of whom live in the county.

E&E News first reported on digital attacks targeting US nuclear plants, adding that the campaign was code-named Nuclear 17. A senior US official told Bloomberg that there was a bigger breach of conventional plants, which could affect multiple regions.

Industry experts and US officials say the attack is being taken seriously, in part because of recent events in Ukraine. Antova said that the Ukrainian power grid has been disrupted at least twice, first in 2015, and then in a more automated attack last year, suggesting the hackers are testing methods.

Scott Aaronson, executive director for security and business continuity at the Edison Electric Institute, an industry trade group, said utilities, grid operators and federal officials were already dissecting the attack on Ukraine’s electric sector to apply lessons in North America before the US government issued the latest warning to “energy and critical manufacturing sectors”. The current threat is unrelated to recently publicised ransomware incidents or the CrashOverride malware, Mr Aaronson said in an emailed statement.

Neither attack in Ukraine caused long-term damage. But with each escalation, the hackers may be gauging the world’s willingness to push back.

“If you think about a typical war, some of the acts that have been taken against critical infrastructure in Ukraine and even in the US, those would be considered crossing red lines,” Antova said.

 

Related News


Europe’s Big Oil Companies Are Turning Electric

European Oil Majors Energy Transition highlights BP, Shell, and Total rapidly scaling renewables, wind and solar assets, hydrogen, electricity, and EV charging while cutting upstream capex, aligning with net-zero goals and utility-style energy services.

 

Key Points

It is the shift by BP, Shell, Total and peers toward renewables, electricity, hydrogen, and EV charging to meet net-zero goals.

✅ Offshore wind, solar, and hydrogen projects scale across Europe

✅ Capex shifts, fossil output declines, net-zero targets by 2050

✅ EV charging, utilities, and power trading become core services

 

Under pressure from governments and investors, industry leaders like BP and Shell are accelerating their production of cleaner energy.

This may turn out to be the year that oil giants, especially in Europe, started looking more like electric companies.

Late last month, Royal Dutch Shell won a deal to build a vast wind farm off the coast of the Netherlands. Earlier in the year, France’s Total, which owns a battery maker, agreed to make several large investments in solar power in Spain and a wind farm off Scotland. Total also bought an electric and natural gas utility in Spain and is joining Shell and BP in expanding its electric vehicle charging business.

At the same time, the companies are ditching plans to drill more wells as they chop back capital budgets. Shell recently said it would delay new fields in the Gulf of Mexico and in the North Sea, while BP has promised not to hunt for oil in any new countries.

Prodded by governments and investors to address climate change concerns about their products, Europe’s oil companies are accelerating their production of cleaner energy — usually electricity, sometimes hydrogen — and promoting natural gas, which they argue can be a cleaner transition fuel from coal and oil to renewables.

For some executives, the sudden plunge in demand for oil caused by the pandemic — and the accompanying collapse in earnings — is another warning that unless they change the composition of their businesses, they risk being dinosaurs headed for extinction.

This evolving vision is more striking because it is shared by many longtime veterans of the oil business.

“During the last six years, we had extreme volatility in the oil commodities,” said Claudio Descalzi, 65, the chief executive of Eni, who has been with that Italian company for nearly 40 years. He said he wanted to build a business increasingly based on green energy rather than oil.

“We want to stay away from the volatility and the uncertainty,” he added.

Bernard Looney, a 29-year BP veteran who became chief executive in February, recently told journalists, “What the world wants from energy is changing, and so we need to change, quite frankly, what we offer the world.”

The bet is that electricity will be the prime means of delivering cleaner energy in the future and, therefore, will grow rapidly.

American giants like Exxon Mobil and Chevron have been slower than their European counterparts to commit to climate-related goals that are as far reaching, analysts say, partly because they face less government and investor pressure (although Wall Street investors are increasingly vocal of late).

“We are seeing a much bigger differentiation in corporate strategy” separating American and European oil companies “than at any point in my career,” said Jason Gammel, a veteran oil analyst at Jefferies, an investment bank.

Companies like Shell and BP are trying to position themselves for an era when they will rely much less on extracting natural resources from the earth than on providing energy as a service tailored to the needs of customers — more akin to electric utilities than to oil drillers.

They hope to take advantage of the thousands of engineers on their payrolls to manage the construction of new types of energy plants; their vast networks of retail stations to provide services like charging electric vehicles; and their trading desks, which typically buy and hedge a wide variety of energy futures, to arrange low-carbon energy supplies for cities or large companies.

All of Europe’s large oil companies have now set targets to reduce the carbon emissions that contribute to climate change. Most have set a “net zero” ambition by 2050, a goal also embraced by governments like the European Union and Britain.

The companies plan to get there by selling more and more renewable energy and by investing in carbon-free electricity across their portfolios, and, in some cases, by offsetting emissions with so-called nature-based solutions like planting forests to soak up carbon.

Electricity is the key to most of these strategies. Hydrogen, a clean-burning gas that can store energy and generate electric power for vehicles, also plays an increasingly large role.

The coming changes are clearest at BP. Mr. Looney said this month that he planned to increase investment in low-emission businesses like renewable energy by tenfold in the next decade to $5 billion a year, while cutting back oil and gas production by 40 percent. By 2030, BP aims to generate renewable electricity comparable to a few dozen large offshore wind farms.

Mr. Looney, though, has said oil and gas production need to be retained to generate cash to finance the company’s future.

Environmentalists and analysts described Mr. Looney’s statement that BP’s oil and gas production would decline in the future as a breakthrough that would put pressure on other companies to follow.

BP’s move “clearly differentiates them from peers,” said Andrew Grant, an analyst at Carbon Tracker, a London nonprofit. He noted that most other oil companies had so far been unwilling to confront “the prospect of producing less fossil fuels.”

While there is skepticism in both the environmental and the investment communities about whether century-old companies like BP and Shell can learn new tricks, they do bring scale and know-how to the task.

“To make a switch from a global economy that depends on fossil fuels for 80 percent of its energy to something else is a very, very big job,” said Daniel Yergin, the energy historian whose forthcoming book, “The New Map,” covers the global energy transition now occurring. But he noted, “These companies are really good at big, complex engineering management that will be required for a transition of that scale.”

Financial analysts say the dreadnoughts are already changing course.

“They are doing it because management believes it is the right thing to do and also because shareholders are severely pressuring them,” said Michele Della Vigna, head of natural resources research at Goldman Sachs.

Already, he said, investments by the large oil companies in low-carbon energy have risen to as much as 15 percent of capital spending, on average, for 2020 and 2021 and around 50 percent if natural gas is included.

Oswald Clint, an analyst at Bernstein, forecast that the large oil companies would expand their renewable-energy businesses like wind, solar and hydrogen by around 25 percent or more each year over the next decade.

Shares in oil companies, once stock market stalwarts, have been marked down by investors in part because of the risk that climate change concerns will erode demand for their products. European electric companies are perceived as having done more than the oil industry to embrace the new energy era.

“It is very tricky for an investor to have confidence that they can pull this off,” Mr. Clint said, referring to the oil industry’s aspirations to change.

But, he said, he expects funds to flow back into oil stocks as the new businesses gather momentum.

At times, supplying electricity has been less profitable than drilling for oil and gas. Executives, though, figure that wind farms and solar parks are likely to produce more predictable revenue, partly because customers want to buy products labeled green.

Mr. Descalzi of Eni said converted refineries in Venice and Sicily that the company uses to make lower-carbon fuel from plant matter have produced better financial results in this difficult year than its traditional businesses.

Oil companies insist that they must continue with some oil and gas investments, not least because those earnings can finance future energy sources. “Not to make any mistake,” Patrick Pouyanné, chief executive of Total, said to analysts recently: Low-cost oil projects will be a part of the future.

During the pandemic, BP, Total and Shell have all scrutinized their portfolios, partly to determine if climate change pressures and lingering effects from the pandemic mean that petroleum reserves on their books — developed for perhaps billions of dollars, when oil was at the center of their business — might never be produced or earn less than previously expected. These exercises have led to tens of billions of dollars of write-offs for the second quarter, and there are likely to be more as companies recalibrate their plans.

“We haven’t seen the last of these,” said Luke Parker, vice president for corporate analysis at Wood Mackenzie, a market research firm. “There will be more to come as the realities of the energy transition bite.”

 

Related News


Study: US Power Grid Has More Blackouts Than ENTIRE Developed World

US Power Grid Blackouts highlight aging infrastructure, rising outages, and declining reliability per DOE and NERC data, with weather-driven failures, cyberattack risk, and underinvestment stressing utilities, transmission lines, and modernization efforts.

 

Key Points

US power grid blackouts are outages caused by aging grid assets, severe weather, and cyber threats reducing reliability.

✅ DOE and NERC data show rising outage frequency and duration.

✅ Weather now drives 68-73% of major failures since 2008.

✅ Modernization, hardening, and cybersecurity investments are critical.

 

The United States power grid has more blackouts than any other country in the developed world, according to new data that spotlight the country’s aging and unreliable electric system.

The data by the Department of Energy (DOE) and the North American Electric Reliability Corporation (NERC) shows that Americans face more power grid failures lasting at least an hour than residents of other developed nations.

And it’s getting worse.

Going back three decades, the US grid loses power 285 percent more often than it did in 1984, when record keeping began, International Business Times reported. The power outages cost businesses in the United States as much as $150 billion per year, according to the Department of Energy.

Customers in Japan lose power for an average of 4 minutes per year, compared with customers in the US upper Midwest (92 minutes) and upper Northwest (214 minutes), University of Minnesota Professor Massoud Amin told the Times. Amin is director of the Technological Leadership Institute at the school.


The grid is becoming less dependable each year, he said.

“Each one of these blackouts costs tens to hundreds of millions, up to billions, of dollars in economic losses per event,” Amin said. “… We used to have two to five major weather events per year [that knocked out power], from the ‘50s to the ‘80s. Between 2008 and 2012, major outages caused by weather increased to 70 to 130 outages per year. Weather used to account for about 17 to 21 percent of all root causes. Now, in the last five years, it’s accounting for 68 to 73 percent of all major outages.”
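Amin's reliability figures are easier to compare side by side; the short Python snippet below simply restates the outage minutes and weather-share ranges reported above.

```python
# The outage figures quoted above, collected in one place
# (annual outage minutes and weather's share of major-outage causes,
# as reported in the article).
average_outage_minutes = {
    "Japan": 4,
    "US upper Midwest": 92,
    "US upper Northwest": 214,
}
weather_share_percent = {
    "1950s-1980s": (17, 21),
    "2008-2012": (68, 73),
}

japan = average_outage_minutes["Japan"]
for region, minutes in average_outage_minutes.items():
    print(f"{region}: {minutes} min/year ({minutes / japan:.0f}x Japan)")

for period, (low, high) in weather_share_percent.items():
    print(f"{period}: weather behind {low}-{high}% of major outages")
```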

As previously reported by Off The Grid News, the power grid received a “D+” on the American Society of Civil Engineers’ (ASCE) 2013 infrastructure report card. That rating means the energy infrastructure is in “poor to fair condition and mostly below standard, with many elements approaching the end of their service life.” It further means a “large portion of the system exhibits significant deterioration” with a “strong risk of failure.”

“America relies on an aging electrical grid and pipeline distribution systems, some of which originated in the 1880s,” the 2013 ASCE report read. “Investment in power transmission has increased since 2005, but ongoing permitting issues, weather events, and limited maintenance have contributed to an increasing number of failures and power interruptions.”

As The Times noted, the US power grid as it exists today was built shortly after World War II, with a design dating back to Thomas Edison. While Edison was a genius, he and his contemporaries could not have envisioned all the strains the modern world would place upon the grid, including the multitude of tech gadgets many Americans treat as an extension of their body. While demands on the grid have grown substantially, the infrastructure itself has not kept pace.

There are approximately 5 million miles of electrical transmission lines throughout the United States, and thousands of power generating plants dot the landscape. The grid is managed by a group of 3,300 different utilities that serve about 150 million customers, The Times said. The entire power grid system is currently valued at $876 billion.

Many believe the grid is vulnerable to an attack on substations and other threats.

Former Department of Homeland Security Secretary Janet Napolitano once said that a power grid cyber attack is a matter of “when,” not “if.”

 
