Court urged to side with power plants against fish

By Associated Press


The Bush administration asked the Supreme Court to let New York's Indian Point and other older power plants draw in billions of gallons of water for cooling without installing technology that would best protect fish and aquatic organisms.

Lawyers for the government and electricity producers urged the justices to overturn a lower court ruling that says the Clean Water Act does not let the government pit the cost of upgrading an estimated 554 power plants against the benefits of protecting fish and aquatic organisms when limiting water use.

They argued that for the last 30 years the Environmental Protection Agency has weighed the costs of controlling power plant withdrawals from rivers, streams and other waterways against the benefits of saving more aquatic wildlife in setting technology requirements.

The law already allows such cost-benefit analyses to be performed when facilities discharge pollutants into waterways that could affect human health, the attorneys said.

"There is no reason Congress would want greater protection for fish from intake structures than for people through the discharge of pollutants," said Deputy Solicitor General Daryl Joseffer.

Environmentalists want the decision upheld, an outcome that could prompt the EPA, under a new administration, to require existing power plants to install more costly and protective technology. All new power plants must use closed-cycle cooling, which recycles cooling water and draws less from waterways.

Richard Lazarus, an attorney representing environmental groups, said that by comparing costs to benefits the EPA has underregulated water intake from power plants.

"EPA has no authority in any circumstance to decide that fish aren't worth a certain amount of cost," he said. Lazarus, however, said that other parts of the law let the agency evaluate the burden on industry.

Maureen Mahoney, who presented the case for power producers including Entergy Corp. and PSEG Fossil LLC, said that if the EPA is not allowed to consider costs it would lead to "irrational results" such as 200-foot cooling towers in historic towns.

Some justices expressed skepticism about whether it was possible to weigh the cost of buying and installing new cooling water intake structures against the value of fish and other aquatic organisms.

"The difficulty that I have is if you are going to apply... a cost benefit analysis, I'm not sure how it would work," said Justice David Souter. "Are a thousand plankton worth a million dollars? I don't know."

The nation's power plants use billions of gallons of water from rivers and other waterways each year to cool their facilities. But the flow of water can smash fish against grilles and screens, and smaller aquatic organisms can get sucked into the system itself.

In July 2004, the EPA allowed the industry to forgo the most expensive solution, installing closed-cycle cooling systems that would cost billions of dollars and prompt the shutdown of some power plants.

An EPA analysis cited by Justice Stephen Breyer found that requiring all facilities to use the technology would require building 20 more 400-megawatt power plants to replace the electricity used to operate it. The same analysis also said electricity costs could rise by 2.4 percent to 5.3 percent.
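The arithmetic behind those figures is simple to lay out; the sketch below uses only the numbers quoted from the EPA analysis, except for the $100 monthly bill, which is an illustrative assumption:

```python
# Back-of-envelope check of the EPA figures cited by Justice Breyer.
# Plant count, plant size, and the cost-rise range come from the article;
# the $100 monthly bill is an illustrative assumption.

plants_needed = 20          # additional power plants
plant_size_mw = 400         # megawatts each

replacement_capacity_mw = plants_needed * plant_size_mw
print(replacement_capacity_mw)  # 8000 MW of replacement generation

# Projected rise in electricity costs from the same analysis
rise_low, rise_high = 0.024, 0.053
monthly_bill = 100.0        # assumed residential bill, dollars per month
print(round(monthly_bill * rise_low, 2), round(monthly_bill * rise_high, 2))  # 2.4 5.3
```

On an assumed $100 monthly bill, the projected increase works out to roughly $2.40 to $5.30 per month.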

Breyer said the EPA should be able to take into account the environmental costs and benefits, as long as it doesn't overreach.

"Why not let sleeping dogs lie? Let the agency take it into account the way it's done it to prevent absurd results, but not try to do it so that it's so refined you can't even take account of what a fish is worth unless they happen to be one of the 1.2 percent that goes to market," he said.

The EPA rule allowed companies operating older plants to decide how to comply with the Clean Water Act by selecting among more economical options.

The 2nd U.S. Circuit Court of Appeals in January 2007 ruled against the companies and government, saying the Clean Water Act does not allow cost to be used when deciding what technology would best minimize environmental impacts.


Flowing with current, Frisco, Colorado wants 100% clean electricity

Frisco's 100% renewable electricity goal charts decarbonization via Xcel Energy wind, solar, and battery storage, targeting 100% renewable municipal power by 2025 and community-wide clean electricity by 2035.

 

Key Points

Frisco targets 100% renewable electricity: municipal by 2025, community by 2035, via Xcel decarbonization.

✅ Municipal operations to reach 100% renewable electricity by 2025

✅ Community-wide electricity to be 100% carbon-free by 2035

✅ Partnerships: Xcel Energy, wind, solar, storage, grid markets

 

Frisco has now set a goal of 100-per-cent renewable energy, joining communities across the country. But unlike some other resolutions adopted in the last decade, this one isn't purely aspirational. It's swimming with a strong current.

With the resolution adopted last week by the town council, Frisco joins 10 other Colorado towns and cities, plus Pueblo and Summit counties, in adopting 100-per-cent goals.

The goal is to get the municipality's electricity to 100-per-cent renewable by 2025 and the community's altogether by 2035.

Decarbonizing electricity will be far easier than transportation, and transportation far easier than buildings. Many see carbon-free electricity as crucial to both, a concept called "beneficial electrification."

Electricity for Frisco comes from Xcel Energy, an investor-owned utility that is making giant steps toward decarbonizing its power supply.

Xcel first announced plans to close its workhorse power plants early to take advantage of now-cheap wind and solar resources plus what will be the largest battery storage project east of the Rocky Mountains. All this will be accomplished by 2026 and will put Xcel at 55 per cent renewable generation in Colorado.

In December, a week after Frisco launched the process that produced the resolution, Xcel announced further steps: an 80 per cent reduction in carbon dioxide emissions by 2030 as compared to 2005 levels, and a vow to deliver 100 per cent "carbon-free" energy by 2050.

Frisco's non-binding goals were triggered by Fran Long, who is retired and living in Frisco. For eight years he worked for Xcel, helping shape its response to the declining prices of renewables. In his retirement, he also helped put together the aspirational 100-per-cent renewables goal adopted by Breckenridge.

A task force that Long led identified a three-pronged approach. First, the town government must lead by example. The resolution calls for the town to spend $25,000 to $50,000 annually over the next several years to improve energy efficiency in its municipal facilities. Then, through an Xcel program called Renewable Connect, it can pay a premium to claim that it uses 100-per-cent renewable electricity.

Beyond that, Frisco wants to work with high-end businesses to encourage them to buy output from solar gardens or other sources that will allow them to proclaim 100-per-cent renewable energy. The task force also recommends a marketing program directed at homes and smaller businesses.

Goals of 100-per-cent renewable electricity are problematic for technical and economic reasons. Aspen Electric, which provides electricity for about two-thirds of the town, had by 2015 secured enough wind and hydro, mostly from distant locations, to allow it to proclaim 100 per cent renewables.

In fact, some of those electrons in Aspen almost certainly originate in coal or gas plants. That doesn't make Aspen's claim wrong. But the fact remains that nobody has figured out how, at least at affordable cost, to deliver 100-per-cent clean energy on a broad basis.

Xcel Energy, which supplies more than 60 per cent of electricity in Colorado, one of six states in which it operates, has a taller challenge. But it is a very different utility than it was in 2004, when it spent heavily in advertising to oppose a mandate that it would have to achieve 10 per cent of its electricity from renewable sources by 2020.

Once it lost the election, though, Xcel set out to comply. Integrating renewables proved far easier than was feared. It has more than doubled the original mandate for 2020. Wind delivers 82 per cent of that generation, with another 18 per cent coming from community, rooftop, and utility-scale solar.

The company has become steadily more proficient at juggling different intermittent power supplies while ensuring lights and computers remain on. This is partly the result of practice but also of relatively minor technological wrinkles, such as improved weather forecasting, according to an Energy News Network story published in March.

For example, a Boulder company, Global Weather Corporation, projects wind—and hence electrical production—from turbines for 10 days ahead. It updates its forecasts every 15 minutes.

Forecasts have become so good, said John T. Welch, director of power operations for Xcel in Colorado, that the utility uses 95 per cent to 98 per cent of the electricity generated by turbines. This has allowed the company to use its coal and natural gas plants less.

Moreover, prices of wind and then solar declined slowly at first and then dramatically.

Xcel is now comfortable that existing technology will allow it to push from 55 per cent renewables in 2026 to an 80 per cent carbon reduction goal by 2030.

But when announcing the goal of emissions-free energy by mid-century in December, the company's Minneapolis-based chief executive, Ben Fowke, and Alice Jackson, the chief executive of the company's Colorado subsidiary, freely admitted they did not yet know how they would achieve it. "I have a lot of confidence they will be developed," Fowke said of new technologies.

Everything is on the table, they said, including nuclear, and also fossil fuels if the carbon dioxide can be sequestered. So far, such sequestration technology has proven prohibitively expensive, and they suggested the answer might involve entirely new technology.

Xcel's Welch told Energy News Network that he believes solar must play a larger role, and he believes solar forecasting must improve.

Storage technology must also improve. Batteries, such as those produced by Tesla at its Gigafactory near Reno, can store electricity for hours, maybe even a few days. But batteries that can store large amounts of electricity for months will be needed in Colorado, where wind is plentiful in spring but not so much in summer, when air conditioners crank up.

Increased sharing of cheap renewable generation among utilities will also allow deeper penetration of carbon-free energy. Western US states and Canadian provinces are all on one grid, but the different parts are Balkanized. In other words, California is largely its own energy balancing authority, ensuring electricity supplies match electricity demands. Ditto for Colorado. The Pacific Northwest has its own balancing authority.

If they were all orchestrated as one in an expanded energy market across the West, however, electricity supplies and demands could more easily be matched. California's surplus of solar on summer afternoons, for example, might be moved to Colorado.

Colorado legislators in early May adopted a bill that requires the state's Public Utilities Commission to begin study by late this year of an energy imbalance market or regional transmission organization.

 


Europe Stores Electricity in Natural Gas Pipes

Power-to-gas converts surplus renewable electricity into green hydrogen or synthetic methane via electrolysis and methanation, enabling seasonal energy storage, grid balancing, hydrogen injection into gas pipelines, and decarbonization of heat, transport, and industry.

 

Key Points

Power-to-gas turns excess renewable power into hydrogen or methane for storage, grid support, and clean fuel.

✅ Enables hydrogen injection into existing natural gas networks

✅ Balances grids and provides seasonal energy storage capacity

✅ Supplies low-carbon fuels for industry, heat, and heavy transport

 

Last month Denmark’s biggest energy firm, Ørsted, said wind farms it is proposing for the North Sea will convert some of their excess power into gas. Electricity flowing in from offshore will feed on-shore electrolysis plants that split water to produce clean-burning hydrogen, with oxygen as a by-product. That would supply a new set of customers who need energy, but not as electricity. And it would take some strain off Europe’s power grid as it grapples with an ever-increasing share of hard-to-handle wind and solar output.

Turning clean electricity into energetic gases such as hydrogen or methane is an old idea that is making a comeback as renewable power generation surges and crowds out gas in Europe. That is because gases can be stockpiled within the natural gas distribution system to cover times of weak winds and sunlight. They can also provide concentrated energy to replace fossil fuels for vehicles and industries. Although many U.S. energy experts argue that this “power-to-gas” vision may be prohibitively expensive, some of Europe’s biggest industrial firms are buying into the idea.

European power equipment manufacturers, anticipating a wave of renewable hydrogen projects such as Ørsted’s, vowed in January that all of their gas-fired turbines will be certified by next year to run on up to 20 percent hydrogen, which burns faster than methane-rich natural gas. The natural gas distributors, meanwhile, have said they will use hydrogen to help them fully de-carbonize Europe’s gas supplies by 2050.

Converting power to gas is picking up steam in Europe because the region has more consistent and aggressive climate policies. Most U.S. states have goals to clean up some fraction of their electricity supply; coal- and gas-fired plants contribute a little more than a quarter of U.S. greenhouse gas emissions. In contrast, European countries are counting on carbon reductions of 80 percent or more by midcentury—reductions that will require an economywide switch to low-carbon energy.

Cleaning up energy by stripping the carbon out of fossil fuels is costly. So is building massive new grid infrastructure, including transmission lines and huge batteries. Power-to-gas may be the cheapest way forward. “In order to reach the targets for climate protection, we need even more renewable energy. Green hydrogen is perceived as one of the most promising ways to make the energy transition happen,” says Armin Schnettler, head of energy and electronics research at Munich-based electric equipment giant Siemens.

Europe already has more than 45 demonstration projects to improve power-to-gas technologies and their integration with power grids and gas networks. The principal focus has been to make the electrolyzers that convert electricity to hydrogen more efficient, longer-lasting and cheaper to produce.

The projects are also scaling up the various technologies. Early installations converted a few hundred kilowatts of electricity, but manufacturers such as Siemens are now building equipment that can convert 10 megawatts, which would yield enough hydrogen each year to heat around 3,000 homes or fuel 100 buses, according to financial consultancy Ernst & Young.
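A rough reconstruction shows how a figure like 3,000 homes can arise from a 10-megawatt electrolyzer. The capacity factor, conversion efficiency, and per-home heat demand below are illustrative assumptions, not numbers from the Ernst & Young analysis:

```python
# Rough reconstruction of the "10 MW electrolyzer heats ~3,000 homes" figure.
# All three parameters below are assumptions for illustration only.

electrolyzer_mw = 10
hours_per_year = 8760
capacity_factor = 0.5    # assumed: runs half the time, on surplus renewables
efficiency = 0.7         # assumed: ~70% electricity-to-hydrogen conversion

hydrogen_mwh_per_year = electrolyzer_mw * hours_per_year * capacity_factor * efficiency

home_heat_mwh = 10       # assumed: ~10 MWh of heat per home per year
homes_heated = hydrogen_mwh_per_year / home_heat_mwh

print(round(homes_heated))  # on the order of 3,000 homes
```

Under these assumptions the electrolyzer yields roughly 30,000 MWh of hydrogen a year, enough for about 3,000 homes, matching the order of magnitude cited.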

The improvements have been most dramatic for proton-exchange membrane electrolyzers, which are akin to the fuel cells used in hydrogen vehicles (but optimized to produce hydrogen rather than consume it). The price of proton-exchange electrolyzers has dropped by roughly 40 percent during the past decade, according to a study published in February in Nature Energy. They are also five times more compact than older alkaline electrolysis plants, and they can vary their power consumption within seconds to operate on fluctuating wind and solar generation.

Many European pilot projects are demonstrating “methanation” equipment that converts hydrogen to methane, too, which can be used as a drop-in replacement for natural gas. Europe’s electrolyzer plants, however, are showing that methanation is not as critical to the power-to-gas vision as advocates long believed. Many electrolyzers are injecting their hydrogen directly into natural gas pipelines—something that U.S. gas firms forbid—and they are doing so without impacting either the gas infrastructure or natural gas consumers.

Europe’s first large-scale hydrogen injection began in eastern Germany in 2013 at a two-megawatt electrolyzer installed by Essen-based power firm E.ON. Germany has since ratcheted up the amount of hydrogen it allows in natural gas lines from an initial 2 percent by volume to 10 percent, and other European states have followed suit with their own hydrogen allowances. Christopher Hebling, head of hydrogen technologies at the Freiburg-based Fraunhofer Institute for Solar Energy Systems, predicts that such limits will rise to the 20-percent level anticipated by Europe’s turbine manufacturers.
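One caveat on blend limits: hydrogen carries roughly a third of methane's energy per cubic metre, so a volume fraction overstates the share of energy actually delivered as hydrogen. A short sketch using standard approximate heating values (these figures are not from the article):

```python
# Energy share delivered by hydrogen at a given pipeline blend fraction by volume.
# Heating values are standard approximations, in MJ per cubic metre.

HHV_H2 = 12.7    # hydrogen
HHV_CH4 = 39.8   # methane

def hydrogen_energy_share(volume_fraction: float) -> float:
    e_h2 = volume_fraction * HHV_H2
    e_ch4 = (1.0 - volume_fraction) * HHV_CH4
    return e_h2 / (e_h2 + e_ch4)

# A 10% blend by volume carries only about 3.4% of the delivered energy,
# and the anticipated 20% limit only about 7%.
print(round(hydrogen_energy_share(0.10), 3))
print(round(hydrogen_energy_share(0.20), 3))
```

So even the 20-percent volume limit Hebling anticipates would decarbonize only a modest slice of pipeline energy, which is part of why methanation and deeper changes remain on the table.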

Moving renewable hydrogen and methane via natural gas pipelines promises to cut the cost of switching to renewable energy. For example, gas networks have storage caverns whose reserves could be tapped to run gas-fired electric generation power plants during periods of low wind and solar output. Hebling notes that Germany’s gas network can store 240 terawatt-hours of energy—roughly 25 times more energy than global power grids can presently store by pumping water uphill to refill hydropower reservoirs. Repurposing gas infrastructure to help the power system could save European consumers 138 billion euros ($156 billion) by 2050, according to Dutch energy consultancy Navigant (formerly Ecofys).
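Hebling's comparison implicitly pins down a figure for global pumped-hydro storage; the arithmetic is simple:

```python
# The 240 TWh German gas-network figure versus "roughly 25 times more than
# global pumped hydro" implies about 9.6 TWh of pumped-hydro storage worldwide.

german_gas_storage_twh = 240
ratio_vs_pumped_hydro = 25

implied_pumped_hydro_twh = german_gas_storage_twh / ratio_vs_pumped_hydro
print(implied_pumped_hydro_twh)  # 9.6
```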

For all the pilot plants and promise, renewable hydrogen presently supplies a tiny fraction of Europe’s gas. And, globally, around 4 percent of hydrogen is supplied via electrolysis, with the bulk refined from fossil fuels, according to the International Renewable Energy Agency.

Power-to-gas is catching up, however. According to the February Nature Energy study, renewable hydrogen already pays for itself in some niche applications, and further electrolyzer improvements will progressively extend its market. “If costs continue to decline as they have done in recent years, power-to-gas will become competitive at large scale within the next decade,” says study co-author Gunther Glenk, an economist at the Technical University of Munich.

Glenk says power-to-gas could scale up faster if governments guaranteed premium prices for renewable hydrogen and methane, as they did to mainstream solar and wind power.

Tim Calver, an energy storage researcher turned consultant and Ernst & Young’s executive director in London, agrees that European governments need to step up their support for power-to-gas projects and markets. Calver calls the scale of funding to date “not proportionate to the challenge that we face on long-term decarbonization and the potential role of hydrogen.”

 


Why the promise of nuclear fusion is no longer a pipe dream

ITER, a tokamak using magnetic confinement, will heat deuterium-tritium plasma with superconducting magnets, targeting net energy gain, on-site tritium breeding, and steam-turbine power generation, with first plasma planned for 2025.

 

Key Points

ITER Nuclear Fusion is a tokamak project confining D-T plasma with magnets to achieve net energy gain and clean power.

✅ Tokamak magnetic confinement with high-temp superconducting coils

✅ Deuterium-tritium fuel cycle with on-site tritium breeding

✅ Targets net energy gain and grid-scale, low-carbon electricity

 

It sounds like the stuff of dreams: a virtually limitless source of energy that doesn’t produce greenhouse gases or long-lived radioactive waste. That’s the promise of nuclear fusion, often described as the holy grail of clean energy, which for decades has been nothing more than a fantasy due to insurmountable technical challenges. But things are heating up in what has turned into a race to create what amounts to an artificial sun here on Earth, one that can provide power for our kettles, cars and light bulbs.

Today’s nuclear power plants create electricity through nuclear fission, in which atoms are split. Nuclear fusion, however, involves combining atomic nuclei to release energy. It’s the same reaction that’s taking place at the Sun’s core. But overcoming the natural repulsion between atomic nuclei and maintaining the right conditions for fusion to occur isn’t straightforward. And doing so in a way that produces more energy than the reaction consumes has been beyond the grasp of the finest minds in physics for decades.

But perhaps not for much longer. Some major technical challenges have been overcome in the past few years, and governments around the world have been pouring money into fusion power research. There are also over 20 private ventures in the UK, US, Europe, China and Australia vying to be the first to make fusion energy production a reality.

“People are saying, ‘If it really is the ultimate solution, let’s find out whether it works or not,’” says Dr Tim Luce, head of science and operation at the International Thermonuclear Experimental Reactor (ITER), being built in southeast France. ITER is the biggest throw of the fusion dice yet.

Its $22bn (£15.9bn) build cost is being met by governments representing two-thirds of the world’s population, including the EU, the US, China and Russia, and when it’s fired up in 2025 it’ll be the world’s largest fusion reactor. If it works, ITER will transform fusion power from being the stuff of dreams into a viable energy source.


Constructing a nuclear fusion reactor
ITER will be a tokamak reactor – thought to be the best hope for fusion power. Inside a tokamak, a gas, often a hydrogen isotope called deuterium, is subjected to intense heat and pressure, forcing electrons out of the atoms. This creates a plasma – a superheated, ionised gas – that has to be contained by intense magnetic fields.

The containment is vital, as no material on Earth could withstand the intense heat (100,000,000°C and above) that the plasma has to reach so that fusion can begin. It’s close to 10 times the heat at the Sun’s core, and temperatures like that are needed in a tokamak because the gravitational pressure within the Sun can’t be recreated.

When atomic nuclei do start to fuse, vast amounts of energy are released. While the experimental reactors currently in operation release that energy as heat, in a fusion power plant the heat would be used to produce steam to drive turbines and generate electricity.

Tokamaks aren’t the only fusion reactors being tried. Another type of reactor uses lasers to heat and compress a hydrogen fuel to initiate fusion. In August 2021, one such device at the National Ignition Facility, at the Lawrence Livermore National Laboratory in California, generated 1.35 megajoules of energy. This record-breaking figure brings fusion power a step closer to net energy gain, but most hopes are still pinned on tokamak reactors rather than lasers.

In June 2021, China’s Experimental Advanced Superconducting Tokamak (EAST) reactor maintained a plasma for 101 seconds at 120,000,000°C. Before that, the record was 20 seconds. Ultimately, a fusion reactor would need to sustain the plasma indefinitely – or at least for eight-hour ‘pulses’ during periods of peak electricity demand.

A real game-changer for tokamaks has been the magnets used to produce the magnetic field. “We know how to make magnets that generate a very high magnetic field from copper or other kinds of metal, but you would pay a fortune for the electricity. It wouldn’t be a net energy gain from the plant,” says Luce.


One route for nuclear fusion is to use atoms of deuterium and tritium, both isotopes of hydrogen. They fuse under incredible heat and pressure, and the resulting products release energy as heat


The solution is to use high-temperature, superconducting magnets made from superconducting wire, or ‘tape’, that has no electrical resistance. These magnets can create intense magnetic fields and don’t lose energy as heat.

“High temperature superconductivity has been known about for 35 years. But the manufacturing capability to make tape in the lengths that would be required to make a reasonable fusion coil has just recently been developed,” says Luce. One of ITER’s magnets, the central solenoid, will produce a field of 13 tesla – 280,000 times Earth’s magnetic field.
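The field comparison can be checked directly. The Earth-field value below is an assumed typical mid-latitude figure; the real field varies from roughly 25 to 65 microtesla depending on location:

```python
# Ratio of ITER's central solenoid field to Earth's magnetic field.
solenoid_field_t = 13.0   # tesla, from the article
earth_field_t = 46e-6     # assumed: ~46 microtesla, a typical mid-latitude value

ratio = solenoid_field_t / earth_field_t
print(f"{ratio:,.0f}")  # roughly 280,000, consistent with the figure quoted
```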

The inner walls of ITER’s vacuum vessel, where the fusion will occur, will be lined with beryllium, a metal that won’t contaminate the plasma much if the two come into contact. At the bottom is the divertor, which will keep the temperature inside the reactor under control.

“The heat load on the divertor can be as large as in a rocket nozzle,” says Luce. “Rocket nozzles work because you can get into orbit within minutes and in space it’s really cold.” In a fusion reactor, a divertor would need to withstand this heat indefinitely and at ITER they’ll be testing one made out of tungsten.

Meanwhile, in the US, the National Spherical Torus Experiment – Upgrade (NSTX-U) fusion reactor will be fired up in the autumn of 2022. One of its priorities will be to see whether lining the reactor with lithium helps to keep the plasma stable.


Choosing a fuel
Instead of just using deuterium as the fusion fuel, ITER will use deuterium mixed with tritium, another hydrogen isotope. The deuterium-tritium blend offers the best chance of getting significantly more power out than is put in. Proponents of fusion power say one reason the technology is safe is that the fuel needs to be constantly fed into the reactor to keep fusion happening, making a runaway reaction impossible.

Deuterium can be extracted from seawater, so there’s a virtually limitless supply of it. But only 20kg of tritium are thought to exist worldwide, so fusion power plants will have to produce it (ITER will develop technology to ‘breed’ tritium). While some radioactive waste will be produced in a fusion plant, it’ll have a lifetime of around 100 years, rather than the thousands of years from fission.

At the time of writing in September, researchers at the Joint European Torus (JET) fusion reactor in Oxfordshire were due to start their deuterium-tritium fusion reactions. “JET will help ITER prepare a choice of machine parameters to optimise the fusion power,” says Dr Joelle Mailloux, one of the scientific programme leaders at JET. These parameters will include finding the best combination of deuterium and tritium, and establishing how the current is increased in the magnets before fusion starts.

The groundwork laid down at JET should accelerate ITER’s efforts to accomplish net energy gain. ITER will produce ‘first plasma’ in December 2025 and be cranked up to full power over the following decade. Its plasma temperature will reach 150,000,000°C and its target is to produce 500 megawatts of fusion power for every 50 megawatts of input heating power.
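That target is usually expressed as the fusion gain factor Q, the ratio of fusion power out to heating power in:

```python
# ITER's headline target expressed as the fusion gain factor Q.
fusion_power_mw = 500.0    # target fusion output
heating_power_mw = 50.0    # external input heating power

q_gain = fusion_power_mw / heating_power_mw
print(q_gain)  # 10.0, i.e. ten units of fusion power per unit of heating power
```

A Q above 1 means net energy gain from the plasma; no tokamak has yet sustained that, which is why ITER's Q = 10 target is the headline number.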

“If ITER is successful, it’ll eliminate most, if not all, doubts about the science and liberate money for technology development,” says Luce. That technology development will be demonstration fusion power plants that actually produce electricity. “ITER is opening the door and saying, yeah, this works – the science is there.”

 


Vancouver's Reversal on Gas Appliances

Vancouver's reversal of its natural gas ban highlights the tradeoffs among electrification, heat pumps, emissions, grid reliability, and affordability, reshaping building codes and decarbonization pathways.

 

Key Points

Vancouver ending its ban on natural gas in new homes to balance climate goals with reliability, costs, and technology.

✅ Balances emissions goals with reliability and affordability

✅ Impacts builders, homeowners, and energy infrastructure

✅ Spurs debate on electrification, heat pumps, and grid capacity

 

In a significant policy shift, Vancouver has decided to lift its ban on natural gas appliances in new homes, a move that marks a pivotal moment in the city's energy policy and environmental strategy. This decision, announced recently, has sparked a broader conversation about the future of energy systems and the balance between environmental goals and practical energy needs. Stewart Muir, CEO of Resource Works, argues that this reversal should catalyze a necessary dialogue on energy choices, highlighting both the benefits and challenges of such a policy change.

Vancouver's original ban on natural gas appliances was part of a broader initiative aimed at reducing greenhouse gas emissions and promoting sustainability. The city had adopted stringent regulations to encourage the use of electric heat pumps and other low-carbon technologies in new residential buildings. This move was aligned with Vancouver’s ambitious climate goals, which include achieving carbon neutrality by 2050 and significantly cutting down on fossil fuel use.

However, the recent decision to reverse the ban reflects a growing recognition of the complexities involved in transitioning to entirely new energy systems. The city's administration acknowledged that while electric alternatives offer environmental benefits, they also come with challenges that can affect homeowners, builders, and the broader energy infrastructure.

Stewart Muir argues that Vancouver’s policy shift is not just about natural gas appliances but represents a larger conversation about energy system choices and their implications. He suggests that the reversal of the ban provides an opportunity to address key issues related to energy reliability, affordability, and the practicalities of integrating new technologies into existing systems.

One of the primary reasons behind the reversal is the recognition of the practical limitations and costs associated with transitioning to electric-only systems. For many homeowners and builders, natural gas appliances have long been a reliable and cost-effective option. The initial ban on these appliances led to concerns about increased construction costs and potential disruptions for homeowners who were accustomed to natural gas heating and cooking.

In addition to cost considerations, there are concerns about the reliability and efficiency of electric alternatives. Natural gas has been praised for its stable supply and efficient performance, especially in colder climates where electric heating systems might struggle to maintain consistent temperatures. By reversing the ban, Vancouver acknowledges that a one-size-fits-all approach may not suit every situation, particularly given diverse housing needs and energy demands.

Muir emphasizes that the reversal of the ban should prompt a broader discussion about how to balance environmental goals with practical energy needs. He argues that rather than enforcing a blanket ban on specific technologies, it is crucial to explore a range of solutions that can effectively address climate objectives while accommodating the diverse requirements of different communities and households.

The debate also touches on the role of technological innovation in achieving sustainability goals. As energy technologies continue to evolve, new solutions could offer more efficient and environmentally friendly alternatives. The conversation should include exploring these innovations and considering how they can be integrated into existing energy systems to support long-term sustainability.

Moreover, Muir advocates for a more inclusive approach to energy policy that involves engaging various stakeholders, including residents, businesses, and energy experts. A collaborative approach can help identify practical solutions that address both environmental concerns and the realities of everyday energy use.

In the broader context, Vancouver's decision reflects a growing trend among cities and regions grappling with energy transitions. Many urban centers are evaluating their energy policies and considering adjustments based on new information and emerging technologies. The key is to find a balance that supports climate goals while ensuring that energy systems remain reliable, affordable, and adaptable to changing needs.

As Vancouver moves forward with its revised policy, it will be important to monitor the outcomes and assess the impacts on both the environment and the community. The reversal of the natural gas ban could serve as a case study for other cities facing similar challenges and could provide valuable insights into how to navigate the complexities of energy transitions.

In conclusion, Vancouver’s decision to reverse its ban on natural gas appliances in new homes is a significant development that opens the door for a critical dialogue about energy system choices. Stewart Muir’s call for a broader conversation emphasizes the need to balance environmental ambitions with practical considerations, such as cost, reliability, and technological advancements. As cities continue to navigate their energy futures, finding a pragmatic and inclusive approach will be essential in achieving both sustainability and functionality in energy systems.

 


Electricity Regulation With Equity & Justice For All

Energy equity in utility regulation prioritizes fair rates, clean energy access, and DERs, addressing fixed charges and energy burdens on low-income households through stakeholder engagement and public utility commission reforms.

 

Key Points

Fairly allocates clean energy benefits and rate burdens, ensuring access and protections for low-income households.

✅ Reduces fixed charges that burden low-income households

✅ Funds community participation in utility proceedings

✅ Prioritizes DERs, energy efficiency, and solar in impacted areas

 

By Kiran Julin

Poring over the line items on your monthly electricity bill may not sound like an enticing way to spend an afternoon, but the way electricity bills are structured has a significant impact on equitable energy access and distribution. For example, fixed fees can have a disproportionate impact on low-income households. And combined with other factors, low-income households and households of color are far more likely to report losing home heating service, according to recent federal data.

Advancing Equity in Utility Regulation, a new report published by the U.S. Department of Energy's (DOE's) Lawrence Berkeley National Laboratory (Berkeley Lab), makes a unifying case that utilities, regulators, and stakeholders need to prioritize energy equity in the deployment of clean energy technologies and resources. Equity in this context is the fair distribution of the benefits and burdens of energy production and consumption. The report outlines systemic changes needed to advance equity in electric utility regulation, providing perspectives from four organizations: Portland General Electric, a utility company; the National Consumer Law Center, a consumer advocacy organization; and the Partnership for Southern Equity and the Center for Biological Diversity, social justice and environmental organizations.
 
“While government and ratepayer-funded energy efficiency programs have made strides towards equity by enabling low-income households to access energy-efficiency measures, that has not yet extended in a major way to other clean-energy technologies,” said Lisa Schwartz, a manager and strategic advisor at Berkeley Lab and technical editor of the report. “States and utilities can take the lead to make sure the clean-energy transition does not leave behind low-income households and communities of color. Decarbonization and energy equity goals are not mutually exclusive, and in fact, they need to go hand-in-hand.”

Energy bills and electricity rates are governed by state laws and utility regulators, whose mission is to ensure that utility services are reliable, safe, and fairly priced. Public utility commissions also are increasingly recognizing equity as an important goal, tool, and metric. While states can use existing authorities to advance equity in their decision-making, several, including Illinois, Maine, Oregon, and Washington, have enacted legislation over the last couple of years to more explicitly require utility regulators to consider equity.

“The infrastructure investments that utility companies make today, and regulator decisions about what goes into electricity bills, will have significant impacts for decades to come,” Schwartz said.

Solutions recommended in the report include considering energy justice goals when determining the “public interest” in regulatory decisions, allocating funding for energy justice organizations to participate in utility proceedings, supporting utility programs that increase deployment of energy efficiency and solar for low-income households, and accounting for energy inequities and access in designing electricity rates.

The report is part of the Future of Electric Utility Regulation series that started in 2015, led by Berkeley Lab and funded by DOE, to encourage informed discussion and debate on utility trends and tackling the toughest issues related to state electric utility regulation. An advisory group of utilities, public utility commissioners, consumer advocates, environmental and social justice organizations, and other experts provides guidance.

 

Taking stock of past and current energy inequities

One focus of the report is electricity bills. In addition to charges based on usage, electricity bills usually also include a fixed basic customer charge, the minimum amount a household must pay every month to access electricity. The fixed charge varies widely, from $5 to more than $20 per month. In recent years, utility companies have sought sizable increases in this charge to cover more of their costs.

This fixed charge means that no matter what a household does to use energy more efficiently or to conserve energy, there is always a minimum cost. Moreover, low-income households often live in older, poorly insulated housing. Current levels of public and utility funding for energy-efficiency programs fall far short of the need. The combined result is that the energy burden – or percent of income needed to keep the lights on and their homes at a healthy temperature – is far greater for lower-income households.
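The energy-burden arithmetic above can be sketched in a few lines. The numbers below (a $15 fixed charge, a $0.14/kWh volumetric rate, and the two households' incomes and usage) are hypothetical illustrations, not figures from the Berkeley Lab report:

```python
# Illustrative energy-burden arithmetic; all figures are hypothetical.
# Energy burden = annual energy cost / annual income.

FIXED_CHARGE = 15.00   # $/month basic customer charge (mid-range of the $5 to $20+ span)
RATE_PER_KWH = 0.14    # $/kWh volumetric rate (assumed)

def annual_burden(income: float, monthly_kwh: float) -> float:
    """Fraction of annual income spent on electricity."""
    annual_cost = 12 * (FIXED_CHARGE + RATE_PER_KWH * monthly_kwh)
    return annual_cost / income

low = annual_burden(income=18_000, monthly_kwh=700)    # lower-income household
high = annual_burden(income=90_000, monthly_kwh=700)   # same usage, higher income

print(f"low-income burden:  {low:.1%}")
print(f"high-income burden: {high:.1%}")
```

With identical usage and an identical bill, the burden scales inversely with income, and the fixed charge puts a floor under the bill that no amount of conservation can reduce.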

“While all households require basic lighting, heating, cooling, and refrigeration, low-income households must devote a greater proportion of income to maintain basic service,” explained John Howat and Jenifer Bosco of the National Consumer Law Center, co-authors of Berkeley Lab’s report. Their analysis of data from the most recent U.S. Energy Information Administration Residential Energy Consumption Survey shows households with income less than $20,000 reported losing home heating service at a rate more than five times higher than households with income over $80,000. Households of color were far more likely than those with a white householder to report loss of heating service. In addition, low-income households and households of color are more likely to have to choose between paying their energy bill and paying for other necessities, such as healthcare or food.

(Chart: share of households reporting loss of home heating service, by income level, based on the most recent (2015) data from the U.S. Energy Information Administration. Credit: John Howat/National Consumer Law Center, using EIA data.)

Moreover, while many of the infrastructure investment decisions that utilities make, such as whether and where to build a new power plant, often have long-term environmental and health consequences, impacted communities often are not at the table. “Despite bearing an inequitable proportion of the negative impacts of environmental injustices related to fossil fuel-based energy production and climate change, marginalized communities remain virtually unrepresented in the energy planning and decision-making processes that drive energy production, distribution, and regulation,” wrote Chandra Farley, CEO of ReSolve and a co-author of the report.


Engaging impacted communities
Each of the perspectives in the report identifies a need for meaningful engagement of underrepresented and disadvantaged communities in energy planning and utility decision-making. “Connecting the dots between energy, racial injustice, economic disinvestment, health disparities, and other associated equity challenges becomes a clarion call for communities that are being completely left out of the clean energy economy,” wrote Farley, who previously served as the Just Energy Director at Partnership for Southern Equity. “We must prioritize the voices and lived experiences of residents if we are to have more equity in utility regulation and equitably transform the energy sector.”

In another essay in the report, Nidhi Thaker and Jake Wise from Portland General Electric identify the importance of collaborating directly with the communities they serve. In 2021, the Oregon Legislature passed Oregon HB 2475, which allows the Oregon Public Utility Commission to allocate ratepayer funding for organizations representing people most affected by a high energy burden, enabling them to participate in utility regulatory processes.

The report explains why energy equity requires correcting inequities resulting from past and present failures as well as rethinking how we achieve future energy and decarbonization goals. “Equity in energy requires adopting an expansive definition of the ‘public interest’ that encompasses energy, climate, and environmental justice. Energy equity also means prioritizing the deployment of distributed energy resources and clean energy technologies in areas that have been hit first and worst by the existing fossil fuel economy,” wrote Jean Su, energy justice director and senior attorney at the Center for Biological Diversity.

This report was supported by DOE’s Grid Modernization Laboratory Consortium, with funding from the Office of Energy Efficiency and Renewable Energy and the Office of Electricity.

 


Customers on the hook for $5.5 billion in deferred BC Hydro operating costs: report

BC Hydro Deferred Regulatory Assets detail $5.5 billion in costs under rate-regulated accounting, to be recovered from ratepayers, highlighting B.C. Utilities Commission oversight, audit scrutiny, financial reporting impacts, and public utility governance.

 

Key Points

BC Hydro defers costs as regulatory assets to recover from ratepayers, influencing rates and financial reporting.

✅ $5.5B in deferred costs recorded as net regulatory assets

✅ Rate impacts tied to B.C. Utilities Commission oversight

✅ Auditor General to assess accounting and governance

 

Auditor General Carol Bellringer says BC Hydro has deferred $5.5 billion in expenses that it plans to recover from ratepayers in the future.

Bellringer focuses on the deferred expenses in a report on the public utility's use of rate-regulated accounting to control electricity rates for customers.

"As of March 31, 2018, BC Hydro reported a total net regulatory asset of $5.455 billion, which is what ratepayers owe," says the report. "BC Hydro expects to recover this from ratepayers in the future. For BC Hydro, this is an asset. For ratepayers, this is a debt."

She says rate-regulated accounting is used widely across North America, but cautions that Hydro has largely overridden the role of the independent B.C. Utilities Commission to regulate rates.

"We think it's important for the people of B.C. and our members of the legislative assembly to better understand rate-regulated accounting in order to appreciate the impact it has on the bottom line for BC Hydro, for government as a whole, for ratepayers and for taxpayers," Bellringer said in a conference call with reporters.

Last June, the B.C. government launched a two-phase review of BC Hydro to find cost savings and examine the direction of the Crown utility.

The review came shortly after a planned government rate freeze was overturned by the utilities commission, which resulted in a three per cent rate increase in April 2018.

A statement by BC Hydro and the government says a key objective of the review due this month is to enhance the regulatory oversight of the commission.

Bellringer's office will become BC Hydro's auditor next year — and will be assessing the impact of regulation on the utility's financial reporting.

"It is a complex area and confidence in the regulatory system is critical to protect the public interest," wrote Bellringer.

 
