Alberta government tells utilities to prove fair costs

By Alberta Energy


Changes by Alberta's provincial government will protect consumers by bringing greater transparency to electricity costs. Under new regulations, the onus is now on electricity transmission companies to prove the cost of transmission lines is reasonable. Under the old regulations, it was up to consumer groups to challenge the cost - a cost that is ultimately passed on to Albertans.

“We have taken action to ensure that Albertans aren’t on the hook for unjustified costs associated with building transmission lines,” said Alberta's Energy Minister Ken Hughes. “Transmission companies now must defend every cent they charge consumers. This brings more transparency and delivers on promises made following the retail market review.”

As part of its broader powers to scrutinize the cost of new transmission projects, the Alberta Utilities Commission will be able to assess information earlier in the process, so it can ensure that all costs are necessary.

In January 2013, government announced changes to transmission cost oversight and struck an MLA Implementation Team to protect Alberta consumers from volatile electricity costs.

An estimated $14 billion in new transmission infrastructure will be built over the next decade to meet the growing electricity demands of Albertans and industry within the province.

Related News

Carbon capture: How can we remove CO2 from the atmosphere?

CO2 Removal Technologies address climate change via negative emissions, including carbon capture, reforestation, soil carbon, biochar, BECCS, DAC, and mineralization, helping meet Paris Agreement targets while managing costs, land use, and infrastructure demands.

 

Key Points

Methods to extract or sequester atmospheric CO2, combining natural and engineered approaches to limit warming.

✅ Includes reforestation, soil carbon, biochar, BECCS, DAC, mineralization

✅ Balances climate goals with costs, land, energy, and infrastructure

✅ Key to Paris Agreement targets under 1.5-2.0 °C warming

 

The world is, on average, 1.1 degrees Celsius warmer today than it was in 1850. If this trend continues, our planet will be 2 – 3 degrees hotter by the end of this century, according to the Intergovernmental Panel on Climate Change (IPCC).

The main reason for this temperature rise is the higher level of atmospheric carbon dioxide, which causes the atmosphere to trap heat radiating from the Earth toward space. Since 1850, the proportion of CO2 in the air has risen from 0.029% to 0.041% (288 ppm to 414 ppm), with record greenhouse gas concentrations documented year after year.
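For readers converting between the two notations: one percent of the atmosphere equals 10,000 parts per million, so the figures above are related by a simple division. A quick sketch:

```python
# ppm-to-percent conversion: 1% = 10,000 ppm, so percent = ppm / 10_000.
for ppm in (288, 414):
    print(f"{ppm} ppm = {ppm / 10_000:.4f}%")
# 288 ppm = 0.0288%  (the 0.029% quoted above)
# 414 ppm = 0.0414%  (the 0.041% quoted above)
```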

This is directly related to the burning of coal, oil and gas, which were created from forests, plankton and plants over millions of years. Back then, those organisms stored CO2 and kept it out of the atmosphere; as fossil fuels are burned, that CO2 is released again. Other contributing factors include industrialized agriculture, slash-and-burn land clearing, and emissions of SF6 from electrical equipment.

Over the past 50 years, more than 1200 billion tons of CO2 have been emitted into the planet's atmosphere — 36.6 billion tons in 2018 alone, though global emissions flatlined in 2019 before rising again. As a result, the global average temperature has risen by 0.8 degrees in just half a century.


Keeping atmospheric CO2 to a minimum
In 2015, the world came together to sign the Paris Climate Agreement, which set the goal of limiting global temperature rise to well below 2 degrees — 1.5 degrees, if possible.

The agreement limits the amount of CO2 that can be released into the atmosphere, providing a benchmark for the global energy transition now underway. According to the IPCC, if a maximum of around 300 billion tons were emitted, there would be a 50% chance of limiting global temperature rise to 1.5 degrees. If CO2 emissions remain the same, however, the CO2 'budget' would be used up in just seven years.
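As a back-of-the-envelope check of that "seven years" figure, here is a minimal sketch using the numbers quoted in this article; the exact result depends on which emissions estimate is assumed:

```python
# Carbon budget arithmetic with the article's figures (illustrative only).
remaining_budget_gt = 300.0   # Gt CO2 for a 50% chance of staying under 1.5 C
annual_emissions_gt = 36.6    # Gt CO2 emitted in 2018, per the article

years_left = remaining_budget_gt / annual_emissions_gt
print(f"Budget exhausted in about {years_left:.1f} years at constant emissions")
# ~8.2 years with these inputs; the "seven years" above implies a slightly
# smaller remaining budget or a higher emissions estimate.
```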

According to the IPCC's report on the 1.5 degree target, negative emissions are also necessary to achieve the climate targets.


Using reforestation to remove CO2
One planned measure to stop too much CO2 from being released into the atmosphere is reforestation. According to studies, 3.6 billion tons of CO2 — around 10% of current CO2 emissions — could be saved every year during the growth phase. However, a study by researchers at the Swiss Federal Institute of Technology, ETH Zurich, stresses that achieving this would require the use of land areas equivalent in size to the entire US.

Young trees at a reforestation project in Africa (picture-alliance/OKAPIA KG): reforestation has the potential to capture CO2 at scale, but it would require a large amount of land.


More humus in the soil
Humus in the soil stores a lot of carbon. But this is being released through the industrialization of agriculture. The amount of humus in the soil can be increased by using catch crops and plants with deep roots as well as by working harvest remnants back into the ground and avoiding deep plowing. According to a study by the German Institute for International and Security Affairs (SWP) on using targeted CO2 extraction as a part of EU climate policy, between two and five billion tons of CO2 could be saved with a global build-up of humus reserves.


Biochar shows promise
Some scientists see biochar as a promising technology for keeping CO2 out of the atmosphere. Biochar is created when organic material is heated and pressurized in a zero or very low-oxygen environment. In powdered form, the biochar is then spread on arable land where it acts as a fertilizer. This also increases the amount of carbon content in the soil. According to the same study from the SWP, global application of this technology could save between 0.5 and two billion tons of CO2 every year.


Storing CO2 in the ground
Storing CO2 deep in the Earth is already well-known and practiced on Norway's oil fields, for example. However, the process is still controversial, as storing CO2 underground can lead to earthquakes and long-term leakage. A different method is currently being practiced in Iceland, in which CO2 is injected into porous basalt rock, where it mineralizes into stone. Both methods still require more research, though new DOE funding is supporting work on carbon capture, utilization and storage.

Capturing CO2 to be stored underground is done using chemical processes that extract the gas directly from the ambient air; some researchers are also exploring CO2-to-electricity concepts for utilization. The method is known as direct air capture (DAC) and is already practiced in parts of Europe. As there is no physical limit to the amount of CO2 that can be captured this way, it is considered to have great potential. The main disadvantage, however, is the cost — currently around €550 ($650) per ton. Some scientists believe that mass production of DAC systems could bring prices down to €50 per ton by 2050. It is already considered a key technology for future climate protection.

The inside of a carbon capture facility in the Netherlands (RWE AG): carbon capture facilities are still very expensive and take up a huge amount of space.
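To put those per-ton prices in perspective, here is a minimal cost sketch using the article's figures; the tonnage is an arbitrary example, not a deployment plan:

```python
# Direct air capture cost comparison, using the per-ton prices quoted above.
cost_today = 550   # EUR per ton of CO2, current estimate
cost_2050 = 50     # EUR per ton, projected with mass production

tons = 1_000_000   # removing one million tons of CO2 (arbitrary example)
print(f"Today: EUR {tons * cost_today / 1e9:.2f} billion")
print(f"2050:  EUR {tons * cost_2050 / 1e9:.2f} billion")
# The projected elevenfold price drop is what would move DAC from a niche
# demonstration technology toward large-scale climate protection.
```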

Another way of extracting CO2 from the air is via biomass. Plants grow and are burned in a power plant to produce electricity. CO2 is then extracted from the exhaust gas of the power plant and stored deep in the Earth, with new U.S. power plant rules poised to test such carbon capture approaches.

The big problem with this technology, known as bio-energy with carbon capture and storage (BECCS), is the huge amount of space required. According to Felix Creutzig of the Mercator Research Institute on Global Commons and Climate Change (MCC) in Berlin, it will therefore only play "a minor role" among CO2 removal technologies.


CO2 bound by rock minerals
In this process, carbonate and silicate rocks are mined, ground up and scattered on agricultural land or on the surface of the ocean, where they bind CO2 over a period of years. According to researchers, by the middle of this century it would be possible to capture two to four billion tons of CO2 every year using this technique. The main challenges are the quantities of rock required and the infrastructure needed to process and distribute it. Concrete deployment plans have yet to be developed.


Not an option: Fertilizing the sea with iron
The idea is to use iron to fertilize the ocean, increasing its nutrient content so that plankton grow more vigorously and capture more CO2. However, both the process and its possible side effects are highly controversial. "This is rarely treated as a serious option in research," conclude SWP study authors Oliver Geden and Felix Schenuit.

 

Related News


How to Get Solar Power on a Rainy Day? Beam It From Space

Space solar power promises wireless energy from orbital solar satellites via microwave or laser power beaming, using photovoltaics and rectennas. NRL and AFRL advances hint at 24-7 renewable power delivery to Earth and airborne drones.

 

Key Points

Space solar power beams orbital solar energy to Earth via microwaves or lasers, enabling continuous wireless electricity.

✅ Harvests sunlight in orbit and transmits via microwaves or lasers

✅ Provides 24-7 renewable power, independent of weather or night

✅ Enables wireless power for remote sites, grids, and drones

 

Earlier this year, a small group of spectators gathered in David Taylor Model Basin, the Navy’s cavernous indoor wave pool in Maryland, to watch something they couldn’t see. At each end of the facility there was a 13-foot pole with a small cube perched on top. A powerful infrared laser beam shot out of one of the cubes, striking an array of photovoltaic cells inside the opposite cube. To the naked eye, however, it looked like a whole lot of nothing. The only evidence that anything was happening came from a small coffee maker nearby, which was churning out “laser lattes” using only the power generated by the system.

The laser setup managed to transmit 400 watts of power—enough for several small household appliances—through hundreds of meters of air without moving any mass. The Naval Research Lab, which ran the project, hopes to use the system to send power to drones during flight. But NRL electronics engineer Paul Jaffe has his sights set on an even more ambitious problem: beaming solar power to Earth from space. For decades the idea had been reserved for The Future, but a series of technological breakthroughs and a massive new government research program suggest that faraway day may have finally arrived as interest in space-based solar broadens across industry and government.

Since the idea for space solar power first cropped up in Isaac Asimov’s science fiction in the early 1940s, scientists and engineers have floated dozens of proposals to bring the concept to life, including inflatable solar arrays and robotic self-assembly. But the basic idea is always the same: A giant satellite in orbit harvests energy from the sun and converts it to microwaves or lasers for transmission to Earth, where it is converted into electricity. The sun never sets in space, so a space solar power system could supply renewable power to anywhere on the planet, day or night, rain or shine.
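Rough numbers make the appeal concrete. A panel in sunlight above the atmosphere receives about 1,360 W/m² continuously, while the same panel on the ground averages far less once night, weather and atmospheric losses are included. A minimal sketch, where the ground-level average is an assumed round figure rather than a measured value:

```python
# Rough comparison of orbital vs. ground-level solar energy capture.
SOLAR_CONSTANT = 1360   # W/m^2 above the atmosphere (well-established value)
GROUND_AVERAGE = 200    # W/m^2 averaged over night, weather, seasons (assumed)

hours_per_year = 24 * 365
orbital_kwh = SOLAR_CONSTANT * hours_per_year / 1000   # kWh per m^2 per year
ground_kwh = GROUND_AVERAGE * hours_per_year / 1000

print(f"Orbit:  {orbital_kwh:,.0f} kWh/m^2/yr")
print(f"Ground: {ground_kwh:,.0f} kWh/m^2/yr (~{orbital_kwh / ground_kwh:.0f}x less)")
```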

Like fusion energy, space-based solar power seemed doomed to become a technology that was always 30 years away. Technical problems kept cropping up, cost estimates remained stratospheric, and as solar cells became cheaper and more efficient, and storage improved with cheap batteries, the case for space-based solar seemed to be shrinking.

That didn’t stop government research agencies from trying. In 1975, after partnering with the Department of Energy on a series of space solar power feasibility studies, NASA beamed 30 kilowatts of power over a mile using a giant microwave dish. Beamed energy is a crucial aspect of space solar power, but this test remains the most powerful demonstration of the technology to date. “The fact that it’s been almost 45 years since NASA’s demonstration, and it remains the high-water mark, speaks for itself,” Jaffe says. “Space solar wasn’t a national imperative, and so a lot of this technology didn’t meaningfully progress.”

John Mankins, a former physicist at NASA and director of Solar Space Technologies, witnessed how government bureaucracy killed space solar power development firsthand. In the late 1990s, Mankins authored a report for NASA that concluded it was again time to take space solar power seriously and led a project to do design studies on a satellite system. Despite some promising results, the agency ended up abandoning it.

In 2005, Mankins left NASA to work as a consultant, but he couldn’t shake the idea of space solar power. He did some modest space solar power experiments himself and even got a grant from NASA’s Innovative Advanced Concepts program in 2011. The result was SPS-ALPHA, which Mankins called “the first practical solar power satellite.” The idea, says Mankins, was “to build a large solar-powered satellite out of thousands of small pieces.” His modular design brought the cost of hardware down significantly, at least in principle.

Jaffe, who was just starting to work on hardware for space solar power at the Naval Research Lab, got excited about Mankins’ concept. At the time he was developing a “sandwich module” consisting of a small solar panel on one side and a microwave transmitter on the other. His electronic sandwich demonstrated all the elements of an actual space solar power system and, perhaps most important, it was modular. It could work beautifully with something like Mankins' concept, he figured. All they were missing was the financial support to bring the idea from the laboratory into space.

Jaffe invited Mankins to join a small team of researchers entering a Defense Department competition, in which they were planning to pitch a space solar power concept based on SPS-ALPHA. In 2016, the team presented the idea to top Defense officials and ended up winning four out of the seven award categories. Both Jaffe and Mankins described it as a crucial moment for reviving the US government’s interest in space solar power.

They might be right. In October, the Air Force Research Lab announced a $100 million program to develop hardware for a solar power satellite. It’s an important first step toward the first demonstration of space solar power in orbit, and Mankins says it could help solve what he sees as space solar power’s biggest problem: public perception. The technology has always seemed like a pie-in-the-sky idea, and the cost of setting up a solar array on Earth is plummeting, as proposals like a tenfold U.S. solar expansion signal rapid growth; but space solar power has unique benefits, chief among them the availability of solar energy around the clock regardless of the weather or time of day.

It can also provide renewable energy to remote locations, such as forward operating bases for the military. And at a time when wildfires have forced the utility PG&E to kill power for thousands of California residents on multiple occasions, having a way to provide renewable energy through the clouds and smoke doesn’t seem like such a bad idea. (Ironically enough, PG&E entered a first-of-its-kind agreement to buy space solar power from a company called Solaren back in 2009; the system was supposed to start operating in 2016 but never came to fruition.)

“If space solar power does work, it is hard to overstate what the geopolitical implications would be,” Jaffe says. “With GPS, we sort of take it for granted that no matter where we are on this planet, we can get precise navigation information. If the same thing could be done for energy, it would be revolutionary.”

Indeed, there seems to be an emerging race to become the first to harness this technology. Earlier this year China announced its intention to become the first country to build a solar power station in space, and for more than a decade Japan has considered the development of a space solar power station to be a national priority. Now that the US military has joined in with a $100 million hardware development program, it may only be a matter of time before there’s a solar farm in the solar system.

 

Related News


PG&E says power lines may have started 2 California fires

PG&E Wildfire Blackouts highlight California power shutoffs as high winds and suspected transmission line faults trigger evacuations, CPUC investigations, and grid safety reviews, with utilities weighing risk, compliance, and resilience during Santa Ana conditions.

 

Key Points

PG&E Wildfire Blackouts are outages during wind-driven fire threats linked to power lines, spurring CPUC investigations.

✅ Wind and line faults suspected amid Lafayette evacuations

✅ CPUC to probe shutoffs, notifications, and compliance

✅ Utilities plan more outages as Santa Ana winds return

 

Pacific Gas & Electric Co. power lines may have started two wildfires over the weekend in the San Francisco Bay Area, the utility said Monday, even though widespread blackouts were in place to prevent downed lines from starting fires during dangerously windy weather.

The fires described in PG&E reports to state regulators match blazes that destroyed a tennis club and forced evacuations in Lafayette, about 20 miles (32 kilometres) east of San Francisco.

The fires began in a section of town where PG&E had opted to keep the lights on. The sites were not designated as a high fire risk, the company said.

Powerful winds were driving multiple fires across California and forcing power shut-offs intended to prevent blazes.

More than 900,000 power customers -- an estimated 2.5 million people -- were in the dark at the height of the latest planned blackout, nearly all of them in PG&E's territory in Northern and central California. By Monday evening a little less than half of those had their service back. But some 1.5 million people in 29 counties will be hit with more shut-offs starting Tuesday because another round of strong winds is expected, the utility said.

Southern California Edison had cut off power to 25,000 customers and warned that it was considering disconnecting about 350,000 more as Santa Ana winds return midweek.

PG&E is under severe financial pressure after its equipment was blamed for a series of destructive wildfires during the past three years. Its stock dropped 24% Monday to close at $3.80 and was down more than 50% since Thursday.

The company reported last week that a transmission tower may have caused a Sonoma County fire that has forced 156,000 people to evacuate.

PG&E told the California Public Utilities Commission that a worker responded to a fire in Lafayette late Sunday afternoon and was told firefighters believed contact between a power line and a communication line may have caused it.

A worker went to another fire about an hour later and saw a fallen pole and transformer. Contra Costa Fire Department personnel on site told the worker they were looking at the transformer as a potential ignition source, a company official wrote.

Separately, the company told regulators that it had failed to notify 23,000 customers, including 500 with medical conditions, before shutting off their power earlier this month during windy weather.

Before a planned blackout, power companies are required to notify customers and take extra care to get in touch with those with medical problems who may not be able to handle extended periods without air conditioning or may need power to run medical devices.

PG&E said some customers had no contact information on file. Others were incorrectly thought to be getting electricity.

After that outage, workers discovered 43 cases of wind-related damage to power lines, transformers and other equipment.

Jennifer Robison, a PG&E spokeswoman, said the company is working with independent living centres to determine how best to serve people with disabilities.

The company faced a growing backlash from regulators and lawmakers, and a judge's order on wildfire risk spending added pressure as well.

U.S. Rep. Josh Harder, a Democrat from Modesto, said he plans to introduce legislation that would raise PG&E's taxes if it pays bonuses to executives while engaging in blackouts.

The Public Utilities Commission plans to open a formal investigation into the blackouts within the next month, allowing regulators to gather evidence and question utility officials. If rules are found to have been broken, regulators can impose fines of up to $100,000 per violation per day, said Terrie Prosper, a spokeswoman for the commission.

The commission said Monday it also plans to review the rules governing blackouts, will look to prevent utilities from charging customers when the power is off and will convene experts to find grid improvements that might lessen blackouts during next year's fire season.

The state can't continue experiencing such widespread blackouts, "nor should Californians be subject to the poor execution that PG&E in particular has exhibited," Marybel Batjer, president of the California Public Utilities Commission, said in a statement.

 

Related News


Tracking Progress on 100% Clean Energy Targets

100% Clean Energy Targets drive renewable electricity, decarbonization, and cost savings through state policies, CCAs, RECs, and mandates, with timelines and interim goals that boost jobs, resilience, and public health across cities, counties, and utilities.

 

Key Points

Policies for cities and states to reach 100% clean power by set dates, using mandates, RECs, and interim goals.

✅ Define eligible clean vs renewable resources

✅ Mandate vs goal framework with enforcement

✅ Timelines with interim targets and escape clauses

 

“An enormous amount of authority still rests with the states for determining your energy future. So we can build these policies that will become a postcard from the future for the rest of the country,” said David Hochschild, chair of the California Energy Commission, speaking last week at a UCLA summit on state and local progress toward 100 percent clean energy.

According to a new report from the UCLA Luskin Center for Innovation, 13 states, districts and territories, as well as more than 200 cities and counties, have committed to a 100 percent clean electricity target — and dozens of cities have already hit it.

This means that one of every three Americans, or roughly 111 million U.S. residents representing 34 percent of the population, live in a community that has committed to or has already achieved 100 percent clean electricity, including communities like Frisco, Colorado that have set ambitious targets.

“We’re going to look back on this moment as the moment when local action and state commitments began to push the entire nation toward this goal,” said J.R. DeShazo, director of the UCLA Luskin Center for Innovation.

Not all 100 percent targets are alike, however. The report notes that these targets vary based on 1) what resources are eligible, 2) how binding the 100 percent target is, and 3) how and when the target will be achieved.

These distinctions will carry a lot of weight as the policy discussion shifts from setting goals to actually meeting targets. They also have implications for communities in terms of health benefits, cost savings and employment opportunities.

 

100% targets come in different forms

One key attribute is whether a target is based on "renewable" or "clean" energy resources. Some 100 percent targets, like Hawaii’s and Rhode Island’s 2030 plan, are focused exclusively on renewable energy, or sources that cannot be depleted, such as wind, solar and geothermal. But most jurisdictions use the broader term “clean energy,” which can also include resources like large hydroelectric generation and nuclear power.

States also vary in their treatment of renewable energy certificates, used to track and assign ownership to renewable energy generation and use. Unbundled RECs allow for the environmental attributes of the renewable energy resource to be purchased separately from the physical electricity delivery.

The binding nature of these targets is also noteworthy. Seven states, as well as Puerto Rico and the District of Columbia, have passed 100 percent clean energy transition laws. Of the jurisdictions that have passed 100 percent legislation, all but one specifies that the target is a “mandate,” according to the report. Nevada is the only state to call the target a “goal.”

Governors in four other states have signed executive orders with 100 percent clean energy goals.

Target timelines also vary. Washington, D.C. has set the most ambitious target date, with a mandate to achieve 100 percent renewable electricity by 2032. Other states and cities have set deadline years between 2040 and 2050. All "100 percent" state laws, and some city and county policies, also include interim targets to keep clean energy deployment on track.

In addition, some locations have included some form of escape clause. For instance, Salt Lake City, which last month passed a resolution establishing a goal of powering the county with 100 percent clean electricity by 2030, included “exit strategies” in its policy in order to encourage stakeholder buy-in, said Mayor Jackie Biskupski, speaking last week at the UCLA summit.

“We don’t think they’ll get used, but they’re there,” she said.

Other locales, meanwhile, have decided to go well beyond 100 percent clean electricity. The State of California and 44 cities have set even more challenging targets to also transition their entire transportation, heating and cooling sectors to 100 percent clean energy sources, and proposals like requiring solar panels on new buildings underscore how policy can accelerate progress across sectors.

Businesses are simultaneously electing to adopt more clean and renewable energy. Six utilities across the United States have set their own 100 percent clean or carbon-free electricity targets. UCLA researchers did not include populations served by these utilities in their analysis of locations with state and city 100 percent clean commitments.

 

“We cannot wait”

All state and local policies that require a certain share of electricity to come from renewable energy resources have contributed to more efficient project development and financing mechanisms, which have supported continued technology cost declines and contributed to a near doubling of renewable energy generation since 2008.

Many communities are switching to clean energy in order to save money, now that the cost calculation is increasingly in favor of renewables over fossil fuels, as more jurisdictions get on the road to 100% renewables worldwide. Additional benefits include local job creation, cleaner air and electricity system resilience due to greater reliance on local energy resources.

Another major motivator is climate change. The electricity sector is responsible for 28 percent of U.S. greenhouse gas emissions, second only to transportation. Decarbonizing the grid also helps to clean up the transportation sector as more vehicles move to electricity as their fuel source.

“The now-constant threat of wildfires, droughts, severe storms and habitat loss driven by climate change signals a crisis we can no longer ignore,” said Carla Peterman, senior vice president of regulatory affairs at investor-owned utility Southern California Edison. “We cannot wait and we should not wait when there are viable solutions to pursue now.”

Prior to joining SCE on October 1, Peterman served as a member of the California Public Utilities Commission, which implements and administers renewable portfolio standard (RPS) compliance rules for California’s retail sellers of electricity. California’s target requires 60 percent of the state’s electricity to come from renewable energy resources by 2030, and all the state's electricity to come from carbon-free resources by 2045.  

 

How CCAs are driving renewable energy deployment

One way California communities are working to meet the state’s ambitious targets is through community-choice aggregation (CCA), via which cities and counties can take control of their energy procurement decisions. Investor-owned utilities no longer purchase energy for these jurisdictions, but they continue to operate the transmission and distribution grid for all electricity users.

A second paper released by the Luskin Center for Innovation in recent days examines how community-choice aggregators are affecting levels of renewable energy deployment in California and contributing to the state’s 100 percent target.

The paper finds that 19 CCAs have launched in California since 2010, growing to include more than 160 towns, cities and counties. Of those communities, 64 have a 100 percent renewable or clean energy policy as their default energy program.

Because of these policies, the UCLA paper finds that “CCAs have had both direct and indirect effects that have led to increases in the clean energy sold in excess of the state’s RPS.”

From 2011 to 2018, CCAs directly procured 24 terawatt-hours of RPS-eligible electricity, 11 TWh of which have been voluntary or in excess of RPS compliance, according to the paper.

The formation of CCAs has also had an indirect effect on investor-owned utilities. As customers have left investor-owned utilities to join CCAs, the utilities have been left holding contracts for more renewable energy than they need to comply with California’s clean energy targets, amid rising solar and wind curtailments that complicate procurement decisions. UCLA researchers estimate that this indirect effect of CCA formation has left IOUs holding 13 terawatt-hours in excess of RPS requirements.

The paper concludes that CCAs have helped to accelerate California’s ability to meet state renewable energy targets over the past decade. However, the future contributions of CCAs to the RPS are more uncertain as communities make new power-purchasing decisions and utilities seek to reduce their excess renewable energy contracts.

“CCAs offer a way for communities to put their desire for clean energy into action. They're growing fast in California, one of only eight states where this kind of mechanism is allowed," said UCLA's Kelly Trumbull, an author of the report. "State and federal policies could be reformed to better enable communities to meet local demand for renewable energy.”

 

Related News


Big prizes awarded to European electricity prediction specialists

Electricity Grid Flow Prediction leverages big data, machine learning, and weather analytics to forecast power flows across smart grids, enhancing reliability, reducing blackouts and curtailment, and optimizing renewable integration under EU Horizon 2020 innovation.

 

Key Points

Short-term forecasting of power flows using big data, weather inputs, and machine learning to stabilize smart grids.

✅ Uses big data, weather, and ML for 6-hour forecasts

✅ Improves reliability, cuts blackouts and energy waste

✅ Supports smart grids, renewables, and grid balancing

 

Three European prediction specialists have won prizes worth €2 million for developing the most accurate predictions of electricity flow through a grid.

The three winners of the Big Data Technologies Horizon Prize received their awards at a ceremony on 12th November in Austria.

The first prize of €1.2 million went to Professor José Vilar from Spain, while Belgians Sofie Verrewaere and Yann-Aël Le Borgne came in joint second place and won €400,000 each.

The challenge was open to individuals, groups and organisations from countries taking part in the EU’s research and innovation programme, Horizon 2020.

Carlos Moedas, Commissioner for Research, Science and Innovation, said: “Energy is one of the crucial sectors that are being transformed by the digital grid worldwide.

“This Prize is a good example of how we support a positive transformation through the EU’s research and innovation programme, Horizon 2020.

“For the future, we have designed our next programme, Horizon Europe, to put even more emphasis on the merger of the physical and digital worlds across sectors such as energy, transport and health.”

The challenge for the applicants was to create AI-driven software that could predict the likely flow of electricity through a grid, taking into account a number of factors including the weather and the generation source (i.e. wind turbines, solar cells, etc.).

Using large quantities of data from electricity grids and smart meters, combined with additional data such as weather conditions, applicants had to develop software that could predict the flow of energy through the grid over a six-hour period.
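The winners' models are not described here, but the shape of the task can be illustrated with a minimal lag-based forecaster. Everything in this sketch — the synthetic data, the 24-hour lag window, the linear model — is an assumption for illustration, not the prize-winning approach:

```python
# Minimal sketch of a 6-hour-ahead grid flow forecast from lagged observations.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)
# Synthetic hourly grid flow: a daily cycle plus weather-like noise.
flow = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

HORIZON, N_LAGS = 6, 24  # predict 6 h ahead from the previous 24 h
X = np.array([flow[t - N_LAGS:t] for t in range(N_LAGS, flow.size - HORIZON)])
y = flow[N_LAGS + HORIZON:]

model = LinearRegression().fit(X[:-500], y[:-500])  # hold out the last 500 h
print("holdout R^2:", model.score(X[-500:], y[-500:]))
```

Real entries had to cope with weather inputs, dispersed generation sources and non-stationary demand, which is what made the competition hard.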

Commissioner for Digital Economy and Society Mariya Gabriel said: “The wide range of possible applications of these winning submissions could bring tangible benefits to all European citizens, including efforts to tackle climate change with machine learning across sectors.”

The decision to focus on energy grids for this particular prize was driven by a clear market need.

Today’s energy is produced at millions of interconnected, dispersed and unpredictable sites such as wind turbines and solar cells, so it is harder to ensure that electricity supply matches demand at all times.

This complexity means that huge amounts of data are produced at the energy generation sites, in the grid and at the place where the energy is consumed.

Being able to make accurate, short-term predictions about power grid traffic is therefore vital to reduce the risk of blackouts and to limit wasted energy, for instance by enabling utilities to use AI for energy savings.

Reliable predictions can also be used in fields such as biology and healthcare. The predictions can help to diagnose and cure diseases as well as to allocate resources where they are most needed.

Ultimately, the winning ideas are set to be picked up by the energy sector in the hope of creating smarter, more economical and more reliable power grids.

 

Related News


Was there another reason for electricity shutdowns in California?

PG&E Wind Shutdown and Renewable Reliability examines PSPS strategy, wildfire risk, transmission line exposure, wind turbine cut-out speeds, grid stability, and California's energy mix amid historic high-wind events and supply constraints across service areas.

 

Key Points

An overview of PG&E's PSPS decisions, wildfire mitigation, and how wind cut-out limits influence grid reliability.

✅ Wind turbines reach cut-out near 55 mph, reducing generation.

✅ PSPS mitigates ignition from damaged transmission infrastructure.

✅ Baseload diversity improves resilience during high-wind events.

 

According to the official, widely reported story, Pacific Gas & Electric (PG&E) initiated power shutoffs across substantial portions of its electric transmission system in northern California as a precautionary measure.

Citing high wind speeds they described as “historic,” the utility claims that if it didn’t turn off the grid, wind-caused damage to its infrastructure could start more wildfires.

Perhaps that’s true. Perhaps. This tale presumes that the folks who designed and maintain PG&E’s transmission system are unaware of or ignored the need to design it to withstand severe weather events, and that the Federal Energy Regulatory Commission (FERC) and North American Electric Reliability Corp. (NERC) allowed the utility to do so.

Ignorance and incompetence happen, to be sure, but there’s much about this story that doesn’t smell right—and it’s disappointing that most journalists and elected officials are apparently accepting it without question.

Take, for example, this statement from a Fox News story about the Kincade Fires: “A PG&E meteorologist said it’s ‘likely that many trees will fall, branches will break,’ which could damage utility infrastructure and start a fire.”

Did you ever notice how utilities cut wide swaths of trees away when transmission lines pass through forests? There’s a reason for that: when trees fall and branches break, the grid can still function.

So, if badly designed and poorly maintained infrastructure isn’t the reason PG&E cut power to millions of Californians, what might have prompted them to do so? Could it be that PG&E’s heavy reliance on renewable energy means they don’t have the power to send when a “historic” weather event occurs?

 

Wind Speed Limits

The two most popular forms of renewable energy come with operating limitations, which is why some energy leaders urge us to keep electricity options open when planning the grid. With solar power, the constraint is obvious: the availability of sunlight. One doesn’t generate solar power at night and energy generation drops off with increasing degrees of cloud cover during the day.

The main operating constraint of wind power is, of course, wind speed. At the low end of the scale, you need about a 6 or 7 miles-per-hour wind to get a turbine moving. This is called the “cut-in speed.” To generate maximum power, about a 30 mph wind is typically required. But if the wind speed is too high, the wind turbine will shut down. This is called the “cut-out speed,” and it’s about 55 miles per hour for most modern wind turbines.
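A simplified power curve makes these regimes concrete. The sketch below uses the speeds quoted above and a textbook cubic approximation below rated speed; real manufacturer curves differ:

```python
# Simplified wind-turbine power curve using the speeds quoted above
# (cut-in ~7 mph, rated ~30 mph, cut-out ~55 mph). A sketch of the shape,
# not a datasheet.
def turbine_power_fraction(wind_mph: float) -> float:
    CUT_IN, RATED, CUT_OUT = 7.0, 30.0, 55.0
    if wind_mph < CUT_IN or wind_mph >= CUT_OUT:
        return 0.0            # too slow to turn, or shut down for safety
    if wind_mph >= RATED:
        return 1.0            # pitch control holds output at rated power
    # Below rated speed, available power scales roughly with speed cubed.
    return (wind_mph / RATED) ** 3

for mph in (5, 15, 30, 54, 56):
    print(f"{mph:>3} mph -> {turbine_power_fraction(mph):.2f} of rated power")
```

Note the hard edge at the cut-out speed: one extra mile per hour of wind takes a turbine from full output to zero, which is exactly the behavior at issue during a “historic” wind event.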

It may seem odd that wind turbines have a cut-out speed, but there’s a very good reason for it. Each wind turbine rotor is connected to an electric generator housed in the turbine nacelle. The connection is made through a gearbox that is sized to turn the generator at the precise speed required to produce 60 Hertz AC power.
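The gearbox target speed follows from the standard synchronous-speed relation n = 120f/p, where f is the grid frequency in Hertz and p the generator's pole count. A small sketch with illustrative pole counts (the pole counts are assumptions, not data from the article):

```python
# Synchronous speed in rpm for a generator on a 60 Hz grid: n = 120 * f / p.
def synchronous_rpm(freq_hz: float, poles: int) -> float:
    return 120.0 * freq_hz / poles

for poles in (2, 4, 6):
    print(f"{poles}-pole generator at 60 Hz: {synchronous_rpm(60, poles):.0f} rpm")
# A rotor turning at ~15 rpm would need roughly a 120:1 gearbox ratio
# to drive a 4-pole generator at its required 1800 rpm.
```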

The blades of the wind turbine are airfoils, just like the wings of an airplane. Adjusting the pitch (angle) of the blades allows the rotor to maintain constant speed, which, in turn, allows the generator to maintain the constant speed it needs to safely deliver power to the grid. However, there’s a limit to blade pitch adjustment. When the wind is blowing so hard that pitch adjustment is no longer possible, the turbine shuts down. That’s the cut-out speed.

Now consider how California’s power generation profile has changed. According to Energy Information Administration data, the state generated 74.3 percent of its electricity from traditional sources—fossil fuels and nuclear—in 2001. Hydroelectric, geothermal and biomass-generated power accounted for most of the remaining 25.7 percent, with wind and solar providing only 1.98 percent of the total.

By 2018, the state’s renewable portfolio had jumped to 43.8 percent of total generation, with wind and solar now accounting for 17.9 percent of the total. That’s a lot of power to depend on from inherently unreliable sources. Thus, it wouldn’t be at all surprising to learn that PG&E didn’t stop delivering power out of fear of starting fires, but because it knew it wouldn’t have power to deliver once high winds shut down all those wind turbines.

 
