Hadron Collider struggles with bad connections

By New York Times


The biggest, most expensive physics machine in the world is riddled with thousands of bad electrical connections.

Many of the magnets meant to whiz high-energy subatomic particles around a 17-mile underground racetrack have mysteriously lost their ability to operate at high energies.

Some physicists are deserting the European project, at least temporarily, to work at a smaller, rival machine across the ocean.

After 15 years and $9 billion, and a showy “switch-on” ceremony last September, the Large Hadron Collider, the giant particle accelerator outside Geneva, has yet to collide any particles at all.

But soon?

Scientists and engineers at the European Center for Nuclear Research, or CERN, are to announce how and when their machine will start running this winter.

That will be a Champagne moment. But scientists say it could be years, if ever, before the collider runs at full strength, stretching out the time it should take to achieve the collider’s main goals, like producing a particle known as the Higgs boson, thought to be responsible for imbuing other elementary particles with mass, or identifying the dark matter that astronomers say makes up 25 percent of the cosmos.

The energy shortfall could also limit the collider’s ability to test more exotic ideas, like the existence of extra dimensions beyond the three of space and one of time that characterize life.

“The fact is, it’s likely to take a while to get the results we really want,” said Lisa Randall, a Harvard physicist who is an architect of the extra-dimension theory.

The collider was built to accelerate protons to energies of seven trillion electron volts and smash them together in search of particles and forces that reigned earlier than the first trillionth of a second of time, but the machine could run as low as four trillion electron volts for its first year. Upgrades would come a year or two later.

Physicists on both sides of the Atlantic say they are confident that the European machine will produce groundbreaking science — eventually — and quickly catch up to an American rival, even at the lower energy. All big accelerators have gone through painful beginnings.

“These are baby problems,” said Peter Limon, a physicist at the Fermi National Accelerator Laboratory in Batavia, Ill., who helped build the collider.

But some physicists admit to being impatient. “I’ve waited 15 years,” said Nima Arkani-Hamed, a leading particle theorist at the Institute for Advanced Study in Princeton. “I want it to get up and running. We can’t tolerate another disaster. It has to run smoothly from now on.”

The delays are hardest on younger scientists, who may need data to complete a thesis or work toward tenure. Slowing a recent physics brain drain from the United States to Europe, some have gone to work at Fermilab, where the rival Tevatron accelerator has been smashing together protons and antiprotons for the last decade.

Colliders get their oomph from Einstein’s equivalence of mass and energy, both expressed in the currency of electron volts. The CERN collider was designed to investigate what happens at energies and distances where the reigning theory, known as the Standard Model, breaks down and gives nonsense answers.
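A rough sense of scale for that currency, sketched in Python. The proton rest energy used here is a standard physical constant, not a figure from the article:

```python
# E = mc^2 puts a proton's rest mass at about 0.938 GeV (a standard
# physical constant, not a figure from the article). At the LHC design
# energy of 7 TeV, each proton therefore carries roughly 7,460 times
# its own rest energy.
proton_rest_energy_gev = 0.938272
beam_energy_gev = 7_000
lorentz_factor = beam_energy_gev / proton_rest_energy_gev
print(f"~{lorentz_factor:,.0f}x the proton's rest energy")
```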

The collider’s own prodigious energies are in some ways its worst enemy. At full strength, the energy stored in its superconducting magnets would equal that of an Airbus A380 flying at 450 miles an hour, and the proton beam itself could pierce 100 feet of solid copper.
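The Airbus comparison checks out on the back of an envelope; a minimal sketch, assuming a takeoff mass of about 560 tonnes (an assumption, not a figure from the article):

```python
# Kinetic energy E = (1/2) m v^2 for the aircraft in the comparison.
mass_kg = 560_000            # assumed A380 takeoff mass, ~560 tonnes
speed_m_s = 450 * 0.44704    # 450 mph converted to metres per second
energy_j = 0.5 * mass_kg * speed_m_s ** 2
print(f"{energy_j / 1e9:.1f} GJ")  # on the order of 10 GJ, the scale of
                                   # the energy stored in the LHC magnets
```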

In order to carry enough current, the collider’s magnets are cooled by liquid helium to a temperature of 1.9 degrees above absolute zero, at which point the niobium-titanium cables in them lose all electrical resistance and become superconducting.

Any perturbation, however, such as a bad soldering job on a splice, can cause resistance, heating the cable until it loses its superconductivity in what physicists call a “quench.” That is what happened on September 19, when the junction between two magnets vaporized in a shower of sparks, soot and liberated helium.

Technicians have spent most of the last year cleaning up and inspecting thousands of splices in the collider. About 5,000 will have to be redone, Steve Myers, head of CERN’s accelerator division, said in an interview.

The exploding splices have diverted engineers’ attention from the mystery of the underperforming magnets. Before the superconducting magnets are installed, engineers “train” each one by ramping up its electrical current until the magnet fails, or “quenches.” Thus the magnet gradually grows comfortable with higher and higher current.

All of the magnets for the collider were trained to an energy above seven trillion electron volts before being installed, Dr. Myers said, but when engineers tried to take one of the ring’s eight sectors to a higher energy last year, some magnets unexpectedly failed.

In an e-mail exchange, Lucio Rossi, head of magnets for CERN, said that 49 magnets had lost their training in the sectors tested and that it was impossible to estimate how many in the entire collider had gone bad. He said the magnets in question had all met specifications and that the problem might stem from having sat outside for a year before they could be installed.

Retraining magnets is costly and time consuming, experts say, and it might not be worth the wait to get all the way to the original target energy. “It looks like we can get to 6.5 relatively easily,” Dr. Myers said, but seven trillion electron volts would require “a lot of training.”

Many physicists say they would be perfectly happy if the collider never got above five trillion electron volts. “If that were the case,” said Joe Lykken, a Fermilab theorist who is on one of the CERN collider teams, “it’s not the end of the world. I am not pessimistic at all.”

For the immediate future, however, physicists are not even going to get that. Dr. Myers said he thought the splices as they are could handle four trillion electron volts.

“We could be doing physics at the end of November,” he said in July, before new vacuum leaks pushed the schedule back a few additional weeks.

“It’s not the design energy of the machine, but it’s 4 times higher than the Tevatron,” he said.

Pauline Gagnon, an Indiana University physicist who works at CERN, said she would happily take that energy level. “The public pays for this,” she said in an e-mail message, “and we need to start delivering.”

Related News

Why the promise of nuclear fusion is no longer a pipe dream

ITER Nuclear Fusion advances tokamak magnetic confinement, heating deuterium-tritium plasma with superconducting magnets, targeting net energy gain, tritium breeding, and steam-turbine power, while complementing laser inertial confinement milestones for grid-scale electricity and 2025 startup goals.

 

Key Points

ITER Nuclear Fusion is a tokamak project confining D-T plasma with magnets to achieve net energy gain and clean power.

✅ Tokamak magnetic confinement with high-temp superconducting coils

✅ Deuterium-tritium fuel cycle with on-site tritium breeding

✅ Targets net energy gain and grid-scale, low-carbon electricity

 

It sounds like the stuff of dreams: a virtually limitless source of energy that doesn’t produce greenhouse gases or radioactive waste. That’s the promise of nuclear fusion, often described as the holy grail of clean energy by proponents, which for decades has been nothing more than a fantasy due to insurmountable technical challenges. But things are heating up in what has turned into a race to create what amounts to an artificial sun here on Earth, one that can provide power for our kettles, cars and light bulbs.

Today’s nuclear power plants create electricity through nuclear fission, in which atoms are split. Nuclear fusion, however, involves combining atomic nuclei to release energy. It’s the same reaction that’s taking place at the Sun’s core. But overcoming the natural repulsion between atomic nuclei and maintaining the right conditions for fusion to occur isn’t straightforward. And doing so in a way that produces more energy than the reaction consumes has been beyond the grasp of the finest minds in physics for decades.

But perhaps not for much longer. Some major technical challenges have been overcome in the past few years and governments around the world have been pouring money into fusion power research as part of a broader green industrial revolution under way in several regions. There are also over 20 private ventures in the UK, US, Europe, China and Australia vying to be the first to make fusion energy production a reality.

“People are saying, ‘If it really is the ultimate solution, let’s find out whether it works or not,’” says Dr Tim Luce, head of science and operation at the International Thermonuclear Experimental Reactor (ITER), being built in southeast France. ITER is the biggest throw of the fusion dice yet.

Its $22bn (£15.9bn) build cost is being met by the governments of two-thirds of the world’s population, including the EU, the US, China and Russia. When it’s fired up in 2025, it’ll be the world’s largest fusion reactor. If it works, ITER will transform fusion power from being the stuff of dreams into a viable energy source.


Constructing a nuclear fusion reactor
ITER will be a tokamak reactor – thought to be the best hope for fusion power. Inside a tokamak, a gas, often a hydrogen isotope called deuterium, is subjected to intense heat and pressure, forcing electrons out of the atoms. This creates a plasma – a superheated, ionised gas – that has to be contained by intense magnetic fields.

The containment is vital, as no material on Earth could withstand the intense heat (100,000,000°C and above) that the plasma has to reach so that fusion can begin. It’s close to 10 times the heat at the Sun’s core, and temperatures like that are needed in a tokamak because the gravitational pressure within the Sun can’t be recreated.
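That comparison can be sanity-checked; the Sun’s core temperature of roughly 15 million °C used below is a standard astrophysics figure, not one given in the article:

```python
# Ratio of the required plasma temperature to the Sun's core temperature.
plasma_temp_c = 100_000_000   # tokamak plasma temperature, from the article
sun_core_temp_c = 15_000_000  # assumed standard value for the Sun's core
ratio = plasma_temp_c / sun_core_temp_c
print(f"{ratio:.1f}x hotter than the Sun's core")  # about 6.7x, i.e.
                                                   # "close to 10 times"
```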

When atomic nuclei do start to fuse, vast amounts of energy are released. While the experimental reactors currently in operation release that energy as heat, in a fusion reactor power plant, the heat would be used to produce steam that would drive turbines to generate electricity.

Tokamaks aren’t the only fusion reactors being tried. Another type of reactor uses lasers to heat and compress a hydrogen fuel to initiate fusion. In August 2021, one such device at the National Ignition Facility, at the Lawrence Livermore National Laboratory in California, generated 1.35 megajoules of energy. This record-breaking figure brings fusion power a step closer to net energy gain, but most hopes are still pinned on tokamak reactors rather than lasers.

In June 2021, China’s Experimental Advanced Superconducting Tokamak (EAST) reactor maintained a plasma for 101 seconds at 120,000,000°C. Before that, the record was 20 seconds. Ultimately, a fusion reactor would need to sustain the plasma indefinitely – or at least for eight-hour ‘pulses’ during periods of peak electricity demand.

A real game-changer for tokamaks has been the magnets used to produce the magnetic field. “We know how to make magnets that generate a very high magnetic field from copper or other kinds of metal, but you would pay a fortune for the electricity. It wouldn’t be a net energy gain from the plant,” says Luce.


One route for nuclear fusion is to use atoms of deuterium and tritium, both isotopes of hydrogen. They fuse under incredible heat and pressure, and the resulting products release energy as heat


The solution is to use high-temperature, superconducting magnets made from superconducting wire, or ‘tape’, that has no electrical resistance. These magnets can create intense magnetic fields and don’t lose energy as heat.

“High temperature superconductivity has been known about for 35 years. But the manufacturing capability to make tape in the lengths that would be required to make a reasonable fusion coil has just recently been developed,” says Luce. One of ITER’s magnets, the central solenoid, will produce a field of 13 tesla – 280,000 times Earth’s magnetic field.
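Inverting the article’s ratio recovers a familiar number; the 25 to 65 microtesla range for Earth’s surface field is general background, not from the article:

```python
# What value of Earth's field does "280,000 times" imply for a 13 T coil?
solenoid_field_t = 13.0   # ITER central solenoid, from the article
ratio = 280_000
implied_earth_field_ut = solenoid_field_t / ratio * 1e6  # in microtesla
print(f"{implied_earth_field_ut:.1f} microtesla")  # about 46, a mid-range
                                                   # value for Earth's field
```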

The inner walls of ITER’s vacuum vessel, where the fusion will occur, will be lined with beryllium, a metal that won’t contaminate the plasma much if the plasma touches it. At the bottom is the divertor, which will keep the temperature inside the reactor under control.

“The heat load on the divertor can be as large as in a rocket nozzle,” says Luce. “Rocket nozzles work because you can get into orbit within minutes and in space it’s really cold.” In a fusion reactor, a divertor would need to withstand this heat indefinitely and at ITER they’ll be testing one made out of tungsten.

Meanwhile, in the US, the National Spherical Torus Experiment – Upgrade (NSTX-U) fusion reactor will be fired up in the autumn of 2022. One of its priorities will be to see whether lining the reactor with lithium helps to keep the plasma stable.


Choosing a fuel
Instead of just using deuterium as the fusion fuel, ITER will use deuterium mixed with tritium, another hydrogen isotope. The deuterium-tritium blend offers the best chance of getting significantly more power out than is put in. Proponents of fusion power say one reason the technology is safe is that the fuel needs to be constantly fed into the reactor to keep fusion happening, making a runaway reaction impossible.

Deuterium can be extracted from seawater, so there’s a virtually limitless supply of it. But only 20kg of tritium are thought to exist worldwide, so fusion power plants will have to produce it (ITER will develop technology to ‘breed’ tritium). While some radioactive waste will be produced in a fusion plant, it’ll have a lifetime of around 100 years, rather than the thousands of years from fission.

At the time of writing in September, researchers at the Joint European Torus (JET) fusion reactor in Oxfordshire were due to start their deuterium-tritium fusion reactions. “JET will help ITER prepare a choice of machine parameters to optimise the fusion power,” says Dr Joelle Mailloux, one of the scientific programme leaders at JET. These parameters will include finding the best combination of deuterium and tritium, and establishing how the current is increased in the magnets before fusion starts.

The groundwork laid down at JET should accelerate ITER’s efforts to accomplish net energy gain. ITER will produce ‘first plasma’ in December 2025 and be cranked up to full power over the following decade. Its plasma temperature will reach 150,000,000°C and its target is to produce 500 megawatts of fusion power for every 50 megawatts of input heating power.
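Those two power figures are usually quoted as the fusion gain factor Q; a one-line sketch using only the numbers from the article:

```python
# Fusion gain Q = fusion power out / external heating power in.
fusion_power_mw = 500   # ITER target output, from the article
heating_power_mw = 50   # input heating power, from the article
q = fusion_power_mw / heating_power_mw
print(f"Q = {q:.0f}")  # Q = 10; scientific breakeven is Q = 1
```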

“If ITER is successful, it’ll eliminate most, if not all, doubts about the science and liberate money for technology development,” says Luce. That technology development will be demonstration fusion power plants that actually produce electricity. “ITER is opening the door and saying, yeah, this works – the science is there.”

 

Related News


Cost, safety drive line-burying decisions at Tucson Electric Power

TEP Undergrounding Policy prioritizes selective underground power lines to manage wildfire risk, engineering costs, and ratepayer impacts, balancing transmission and distribution reliability with right-of-way, safety, and vegetation management per Arizona regulators.

 

Key Points

A selective TEP approach to bury lines where safety, engineering, and cost justify undergrounding.

✅ Selective undergrounding for feeders near substations

✅ Balances wildfire mitigation, reliability, and ratepayer costs

✅ Follows ACC rules, BLM and USFS vegetation management

 

Though wildfires in California caused by power lines have prompted calls for more underground lines, Tucson Electric Power Co. plans to keep to its policy of burying lines selectively for safety.

Like many other utilities, TEP typically doesn’t install its long-range, high-voltage transmission lines and distribution equipment underground because of higher costs that would be passed on to ratepayers, TEP spokesman Joe Barrios said.

But the company will sometimes bury lower-voltage lines and equipment where it is cost-effective or needed for safety, or if customers or developers are willing to pay the higher installation costs.

Underground installations generally include additional engineering expenses, right-of-way acquisition, and added labor and materials, Barrios said.

“This practice avoids passing along unnecessary costs to customers through their rates, so that all customers are not asked to subsidize a discretionary expenditure that primarily benefits residents or property owners in one small area of our service territory,” he said, adding that the Arizona Corporation Commission has supported the company’s policy.

Even so, TEP will place equipment underground in some circumstances if engineering or safety concerns justify the additional cost of underground installation, Barrios said.

In fact, lower-voltage “feeder” lines emerging from distribution substations are typically installed underground until the lines reach a point where they can be safely brought above ground, he added.

While PG&E in California has shut off power during windy weather to avoid wildfires in forested areas traversed by its power lines, TEP doesn’t face the same kind of wildfire risk, Barrios said.

Most of TEP’s 5,000 miles of transmission and distribution lines aren’t located in heavily forested areas that would raise fire concerns, he said.

However, TEP has an active program of monitoring transmission lines and trimming vegetation to maintain a fire-safety buffer zone, in compliance with federal regulations and in cooperation with the U.S. Bureau of Land Management and the U.S. Forest Service.

 

Related News


As New Zealand gets serious about climate change, can electricity replace fossil fuels in time?

New Zealand Energy Transition will electrify transport and industry with renewables, grid-scale solar, wind farms, geothermal, batteries, demand response, pumped hydro, and transmission upgrades to manage dry-year risk and winter peak loads.

 

Key Points

A shift to renewables and smart demand to decarbonise transport and industry while ensuring reliable, affordable power.

✅ Electrifies transport and industrial heat with renewables

✅ Uses demand response, batteries, and pumped hydro for resilience

✅ Targets 99%+ renewable supply, managing dry-year and peak loads

 

As fossil fuels are phased out over the coming decades, the Climate Change Commission (CCC) suggests electricity will take up much of the slack, powering our vehicle fleet and replacing coal and gas in industrial processes.

But can the electricity system really provide for this increased load where and when it is needed? The answer is “yes”, with some caveats.

Our research examines climate change impacts on the New Zealand energy system. It shows we’ll need to pay close attention to demand as well as supply. And we’ll have to factor in the impacts of climate change when we plan for growth in the energy sector.

 

Demand for electricity to grow
While electricity use has not increased in NZ in the past decade, many agencies project steeply rising demand in coming years. This is partly due to increasing population and gross domestic product, but mostly due to the anticipated electrification of transport and industry, which could result in a doubling of demand by mid-century.

It’s hard to get a sense of the scale of the new generation required, but if wind was the sole technology employed to meet demand by 2050, between 10 and 60 new wind farms would be needed nationwide.

Of course, we won’t only build wind farms. Grid-scale solar, rooftop solar, new geothermal, some new small hydro plants and possibly tidal and wave power will all have a part to play.

 

Managing the demand
As well as providing more electricity supply, demand management and batteries will also be important. Our modelling shows peak demand (which usually occurs when everyone turns on their heaters and ovens at 6pm in winter) could be up to 40% higher by 2050 than it is now.

But meeting this daily period of high demand could see expensive plant sitting idle for much of the time (with the last 25% of generation capacity only used about 10% of the time).

This is particularly a problem in a renewable electricity system when the hydro lakes are dry, as hydro is one of the few renewable electricity sources that can be stored during the day (as water behind the dam) and used over the evening peak (by generating with that stored water).

Demand response will therefore be needed. For example, this might involve an industrial plant turning off when there is too much load on the electricity grid.

 

But by 2050, a significant number of households will also need smart appliances and meters that automatically use cheaper electricity at non-peak times. For example, washing machines and electric car chargers could run automatically at 2am, rather than 6pm when demand is high.

Our modelling shows a well set up demand response system could mitigate dry-year risk (when hydro lakes are low on water) in coming decades, where currently gas and coal generation is often used.

Instead of (or as well as) having demand response and battery systems to combat dry-year risk, a pumped storage system could be built. This is where water is pumped uphill when hydro lake inflows are plentiful, and used to generate electricity during dry periods.

The NZ Battery project is currently considering the potential for this in New Zealand.

 

Almost (but not quite) 100% renewable
Dry-year risk would be greatly reduced and there would be “greater greenhouse gas emissions savings” if the Interim Climate Change Committee’s (ICCC) 2019 recommendation to aim for 99% renewable electricity was adopted, rather than aiming for 100%.

A small amount of gas-peaking plant would therefore be retained. The ICCC said going from 99% to 100% renewable electricity by overbuilding would only avoid a very small amount of carbon emissions, at a very high cost.

Our modelling supports this view. The CCC’s draft advice on the issue also makes the point that, although 100% renewable electricity is the “desired end point”, timing is important to enable a smooth transition.

Despite these views, Energy Minister Megan Woods has said the government will be keeping the target of a 100% renewable electricity sector by 2030.

 

Impacts of climate change
In future, the electricity system will have to respond to changing climate patterns as well, becoming resilient to climate risks over time.

The National Institute of Water and Atmospheric Research predicts winds will increase in the South Island and decrease in the far north in coming decades.

The catchments feeding the biggest hydro lakes will get wetter (more rain in their headwaters), and the seasonality of inflows will change due to changes in the amount of snow in these catchments.

Our modelling shows the electricity system can adapt to those changing conditions. One good news story (unless you’re a skier) is that warmer temperatures will mean less snow storage at lower elevations, and therefore higher lake inflows in the big hydro catchments in winter, leading to a better match between times of high electricity demand and higher inflows.

 

The price is right
The modelling also shows the cost of generating electricity is not likely to increase, because the price of building new sources of renewable energy continues to fall globally.

Because the cost of building new renewables is now cheaper than non-renewables (such as coal-fired plants), investing in carbon-free electricity is increasingly compelling, and renewables are more likely to be built to meet new demand in the near term.

While New Zealand’s electricity system can enable the rapid decarbonisation of (at least) our transport and industrial heat sectors, the industry needs certainty so it can start building to meet demand.

Bipartisan cooperation at government level will be important to encourage significant investment in generation and transmission projects with long lead times and life expectancies.

Infrastructure and markets are needed to support demand response uptake, as well as certainty around the Tiwai exit in 2024 and whether pumped storage is likely to be built.

Our electricity system can support the rapid decarbonisation needed if New Zealand is to do its fair share globally to tackle climate change.

But sound planning, firm decisions and a supportive and relatively stable regulatory framework are all required before shovels can hit the ground.

 

Related News


Why Canada should invest in "macrogrids" for greener, more reliable electricity

Canadian electricity transmission enables grid resilience, long-distance power trade, and decarbonization by integrating renewables, hydroelectric storage, and HVDC links, providing backup during extreme weather and lowering costs to reach net-zero, clean energy targets.

 

Key Points

An interprovincial high-voltage grid that shares clean power to deliver reliable, low-cost decarbonization.

✅ Enables resilience by sharing power across weather zones

✅ Integrates renewables with hydro storage via HVDC links

✅ Lowers decarbonization costs through interprovincial trade

 

As the recent disaster in Texas showed, climate change requires electricity utilities to prepare for extreme events. This “global weirding” is leaving Canadian electricity grids increasingly exposed to harsh weather that leads to more intense storms, higher wind speeds, heatwaves and droughts that can threaten the performance of electricity systems.

The electricity sector must adapt to this changing climate while also playing a central role in mitigating climate change. Greenhouse gas emissions can be reduced a number of ways, but the electricity sector is expected to play a central role in decarbonization, including powering a net-zero grid by 2050 across Canada. Zero-emissions electricity can be used to electrify transportation, heating and industry and help achieve emissions reduction in these sectors.

Enhancing long-distance transmission is viewed as a cost-effective way to enable a clean and reliable power grid, and to lower the cost of meeting our climate targets. Now is the time to strengthen transmission links in Canada, with concepts like a western Canadian electricity grid gaining traction.


Insurance for climate extremes
An early lesson from the Texas power outages is that extreme conditions can lead to failures across all forms of power supply. The state lost the capacity to generate electricity from natural gas, coal, nuclear and wind simultaneously. But it also lacked cross-border transmission to other electricity systems that could have bolstered supply.

Long-distance transmission offers the opportunity to escape the correlative clutch of extreme weather, by accessing energy and spare capacity in areas not beset by the same weather patterns. For example, while Texas was in its deep freeze, relatively balmy conditions in California meant there was a surplus of electricity generation capability in that region — but no means to get it to Texas. Building new transmission lines and connections across broader regions, including projects like a hydropower line to New York that expand access, can act as an insurance policy, providing a back-up for regions hit by the crippling effects of climate change.

A transmission tower crumpled under the weight of ice. The 1998 Quebec ice storm left 3.5 million Quebecers and a million Ontarians, as well as thousands in New Brunswick, without power. CP Photo/Robert Galbraith
Transmission is also vulnerable to climate disruptions, such as crippling ice storms that leave wires temporarily inoperable. This may mean using stronger poles when building transmission, or burying major high-voltage transmission links, or deploying superconducting cables to reduce losses.

In any event, more transmission links between regions can improve resilience by co-ordinating supply across larger regions. Well-connected grids that are larger than the areas disrupted by weather systems can be more resilient to climate extremes.


Lowering the cost of clean power
Adding more transmission can also play a role in mitigating climate change. Numerous studies have found that building a larger transmission grid allows for greater shares of renewables onto the grid, ultimately lowering the overall cost of electricity.

In a recent study, two of us looked at the role transmission could play in lowering greenhouse gas emissions in Canada’s electricity sector. We found the cost of reducing greenhouse gas emissions is lower when new or enhanced transmission links can be built between provinces.

Average cost increase to electricity in Canada at different levels of decarbonization, with new transmission (black) and without new transmission (red). New transmission lowers the cost of reducing greenhouse gas emissions. (Authors), Author provided
Much of the value of transmission in these scenarios comes from linking high-quality wind and solar resources with flexible zero-emission generation that can produce electricity on demand. In Canada, our system is dominated by hydroelectricity, but most of this hydro capacity is located in five provinces: British Columbia, Manitoba, Ontario, Québec and Newfoundland and Labrador.

In the west, Alberta and Saskatchewan are great locations for building low-cost wind and solar farms. Enhanced interprovincial transmission would allow Alberta and Saskatchewan to build more variable wind and solar, with the assurance that they could receive backup power from B.C. and Manitoba when the wind isn’t blowing and the sun isn’t shining.

When wind and solar are plentiful, the flow of low cost energy can reverse to allow B.C. and Manitoba the opportunity to better manage their hydro reservoir levels. Provinces can only benefit from trading with each other if we have the infrastructure to make that trade possible.

A recent working paper examined the role that new transmission links could play in decarbonizing the B.C. and Alberta electricity systems. We again found that enabling greater electricity trade between B.C. and Alberta can reduce the cost of deep cuts to greenhouse gas emissions by billions of dollars a year. Although we focused on the value of the Site C project, in the context of B.C.'s clean energy shift, the analysis showed that new transmission would offer benefits of much greater value than a single hydroelectric project.

The value of enabling new transmission links between Alberta and B.C. as greenhouse gas emissions reductions are pursued.
Getting transmission built
With the benefits that enhanced electricity transmission links can provide, one might think new projects would be a slam dunk. But there are barriers to getting projects built.

First, electricity grids in Canada are managed at the provincial level, most often by Crown corporations. Decisions by the Crowns are influenced not simply by economics, but also by political considerations. If a transmission project enables greater imports of electricity to Saskatchewan from Manitoba, it raises a flag about lost economic development opportunity within Saskatchewan. Successful transmission agreements need to ensure a two-way flow of benefits.

Second, transmission can be expensive. On this front, the Canadian government could open up the purse strings to fund new transmission links between provinces. It has already shown a willingness to do so.

Lastly, transmission lines are long linear projects, not unlike pipelines. Siting transmission lines can be contentious, even when they are delivering zero-emissions electricity. Using infrastructure corridors, such as existing railway rights-of-way or the proposed Canadian Northern Corridor, could help facilitate co-operation between regions and reduce the risks of siting transmission lines.

If Canada can address these barriers to transmission, we should find ourselves in an advantageous position, where we are more resilient to climate extremes and have achieved a lower-cost, zero-emissions electricity grid.

 

Related News


Alberta Introduces New Electricity Rules

Alberta Rate of Last Resort streamlines electricity regulations to stabilize the default rate, curb price volatility, and protect rural communities, low-income households, and seniors while preserving competition in the province's energy market.

 

Key Points

Alberta's Rate of Last Resort sets biennial default electricity prices, curbing volatility and protecting customers.

✅ Biennial default rate to limit price spikes

✅ Focus on rural, senior, and low-income customers

✅ Encourages competitive contracts and market stability

 

The Alberta government is overhauling its electricity regulations as part of a market overhaul aimed at reducing spikes in electricity prices for consumers and businesses. The new rules, set to be introduced this spring, are intended to stabilize the default electricity rate paid by many Albertans.


Background on the Rate of Last Resort

Albertans currently have the option to sign up for competitive contracts with electricity providers. These contracts can sometimes offer lower rates than the default electricity rate, officially known as the Regulated Rate Option (RRO), but competitive rates can fluctuate significantly. Those unable to secure such contracts, and those who remain on the default rate, are experiencing rising electricity prices and high levels of price volatility.

To address this, the Alberta government is renaming the default rate the Rate of Last Resort (RoLR) under the new framework. The new name is intended to dispel the misleading sense of security that, in the government's view, some consumers associate with the current name.


Key Changes Under New Regulations

The new regulations, which include proposed market changes that affect pricing, focus on:

  • Price Stabilization: Default electricity rates will be set every two years for each utility provider, providing greater predictability by enabling a consumer price cap and reducing the potential for extreme price swings.
  • Rural and Underserved Communities: The changes are intended to particularly benefit rural Albertans and those on the default rate, including low-income individuals and seniors. These groups often lack access to the competitive rates offered by some providers and have been disproportionately affected by recent price increases.
  • Promoting Economic Stability: The goal is to lower the cost of utilities for all Albertans, leading to overall lower costs of living and doing business. The government anticipates these changes will create a more attractive environment for investment and job creation.


Opposition Views

Critics argue that limiting the flexibility of prices for the default electricity rate could interfere with market dynamics and stifle market competition among providers. Some worry it could ultimately lead to higher prices in the long term. Others advocate directly subsidizing low-income households rather than introducing broad price controls.


Balancing Affordability and the Market

The Alberta government maintains that the proposed changes will strike a balance between ensuring affordable electricity for vulnerable Albertans and preserving a competitive energy market. Provincial officials emphasize that the new regulations should not deter consumers from seeking out competitive rates if they choose to.


The Path Ahead

The new electricity regulations are part of the Alberta government's broader Affordable Utilities Program, alongside electricity policy changes across the province. The legislation is expected to be introduced and debated in the provincial legislature this spring with the potential of coming into effect later in the year. Experts expect these changes will significantly impact the Alberta electricity market and ignite further discussion about how best to manage rising utility costs for consumers and businesses.

 

Related News


Renewables surpass coal in US energy generation for first time in 130 years

Renewables Overtake Coal in the US, as solar, wind, and hydro expand grid share; EIA data show an energy transition accelerated by COVID-19, slashing emissions, displacing fossil fuels, and reshaping electricity generation and climate policy.

 

Key Points

US renewable energy generation surpassed coal for the first time in over a century, a pivotal milestone in the energy transition.

✅ EIA data show renewables topped coal consumption in 2019.

✅ Solar, wind, and hydro displaced aging, costly coal plants.

✅ COVID-19 demand drop accelerated the energy transition.

 

Solar, wind and other renewable sources have toppled coal in energy generation in the United States for the first time in over 130 years, with the coronavirus pandemic accelerating a decline in coal that has profound implications for the climate crisis.

Not since wood was the main source of American energy in the 19th century had a renewable resource been used more heavily than coal, but 2019 saw a historic reversal, building on wind and solar reaching 10% of U.S. generation in 2018, according to US government figures.

Coal consumption fell by 15%, down for the sixth year in a row, while renewables edged up by 1%, even as U.S. electricity use trended lower. This meant renewables surpassed coal for the first time since at least 1885, a year when Mark Twain published The Adventures of Huckleberry Finn and America’s first skyscraper was erected in Chicago.

Electricity generation from coal fell to its lowest level in 42 years in 2019, and the US Energy Information Administration (EIA) forecasts that renewables will eclipse coal as an electricity source this year, with a global crossover projected by 2025. By 21 May, renewables had been used more heavily than coal on 100 days of the year.

“Coal is on the way out, we are seeing the end of coal,” said Dennis Wamsted, analyst at the Institute for Energy Economics and Financial Analysis. “We aren’t going to see a big resurgence in coal generation, the trend is pretty clear.”

The ongoing collapse of coal would have been nearly unthinkable a decade ago, when the fuel source accounted for nearly half of America's generated electricity. Despite an anticipated brief uptick in 2021, that proportion may fall to under 20% this year, with analysts predicting a further halving within the coming decade.

A rapid slump since then has not been reversed despite the efforts of the Trump administration, which has dismantled a key Barack Obama-era climate rule to reduce emissions from coal plants and eased requirements that prevent coal operations discharging mercury into the atmosphere and waste into streams.

Coal releases more planet-warming carbon dioxide than any other energy source, with scientists warning its use must be rapidly phased out to achieve net-zero emissions globally by 2050 and avoid the worst ravages of the climate crisis.

Countries including the UK and Germany are in the process of winding down their coal sectors, and in Europe renewables are increasingly crowding out gas as well, although in the US coal still enjoys strong political support from Trump.

“It’s a big moment for the market to see renewables overtake coal,” said Ben Nelson, lead coal analyst at Moody’s. “The magnitude of intervention to aid coal has not been sufficient to fundamentally change its trajectory, which is sharply downwards.”

Nelson said he expects coal production to plummet by a quarter this year but stressed that declaring the demise of the industry is “a very tough statement to make” due to ongoing exports of coal and its use in steel-making. There are also rural communities with power purchase agreements with coal plants, meaning these contracts would have to end before coal use was halted.

The coal sector has been beset by a barrage of problems, predominantly from cheap, abundant gas that has displaced it as a go-to energy source. The Covid-19 outbreak has exacerbated this trend, even if global power demand has since recovered above pre-pandemic levels. With electricity demand plunging as factories, offices and retailers shut, utilities have plenty of spare energy to choose from, and coal is routinely the last to be picked because it is more expensive to run than gas, solar, wind or nuclear.

Many US coal plants are ageing and costly to operate, forcing hundreds of closures over the past decade. Just this year, power companies have announced plans to shutter 13 coal plants, including the large Edgewater facility outside Sheboygan, Wisconsin, the Coal Creek Station plant in North Dakota and the Four Corners generating station in New Mexico – one of America’s largest emitters of carbon dioxide.

The last coal facility left in New York state closed earlier this year.

The additional pressure of the pandemic “will likely shutter the US coal industry for good”, said Yuan-Sheng Yu, senior analyst at Lux Research. “It is becoming clear that Covid-19 will lead to a shake-up of the energy landscape and catalyze the energy transition, with investors eyeing new energy sector plays as we emerge from the pandemic.”

Climate campaigners have cheered the decline of coal but in the US the fuel is largely being replaced by gas, which burns more cleanly than coal but still emits a sizable amount of carbon dioxide and methane, a powerful greenhouse gas, in its production, whereas in the EU wind and solar overtook gas last year.

Renewables accounted for 11% of total US energy consumption last year – a share that will have to radically expand if dangerous climate change is to be avoided. Petroleum made up 37% of the total, followed by gas at 32%. Renewables marginally edged out coal, while nuclear stood at 8%.
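The shares quoted above can be cross-checked with simple arithmetic. In the snippet below, coal's roughly 11% slice is an assumption (it is not stated explicitly in the text, only that renewables marginally edged it out); the other figures are those reported here.

```python
# 2019 US primary energy consumption shares, percent (rounded).
# Coal's ~11% is an assumed figure consistent with "renewables
# marginally edged out coal"; the other shares are quoted above.
shares = {"petroleum": 37, "natural gas": 32, "renewables": 11, "coal": 11, "nuclear": 8}

assert abs(sum(shares.values()) - 100) <= 1    # about a point lost to rounding
assert shares["renewables"] >= shares["coal"]  # the narrow crossover reported above
print(shares)
```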

“Getting past coal is a big first hurdle but the next round will be the gas industry,” said Wamsted. “There are emissions from gas plants and they are significant. It’s certainly not over.”
 

 
