President Barack Obama's goal of having 1 million plug-in vehicles on U.S. roads by 2015 is on its way to being met, a Department of Energy official said.
"It's looking good," said Assistant Energy Secretary David Sandalow when asked by reporters on the chances of meeting the goal set by Obama.
"If you look at the plans of the major automotive manufacturers, there's a clear pathway to a million vehicles," Sandalow said.
Sandalow spoke to reporters after his keynote address to the Society of Automotive Engineers in Detroit.
Obama's goal of 1 million plug-in electric and hybrid cars by 2015 was a campaign pledge that he has restated since becoming president in January 2009. The goal was widely seen as well beyond optimistic forecasts for expansion of the alternative vehicles at the time, and there is skepticism that it can be met.
"The pace of innovation in this industry is extraordinary, and the interest around the world is extraordinary," he said. "But, partly it depends on American entrepreneurship and innovation. That's always served us very well in the past and I think it will do so in the future."
The DOE will soon announce how it will handle cuts to its energy efficiency and renewable energy program, Sandalow said, adding that he was not ready to say how much of those cuts will fall on the automotive sector.
The cuts will not slow the Obama administration's effort on energy efficiency, he said.
"We will march forward aggressively to promote clean energy and we've got the budget to do it," Sandalow said.
California Biofuels vs EV Subsidies examines tradeoffs in decarbonization, greenhouse gas reductions, clean energy deployment, charging infrastructure, energy security, lifecycle emissions, and transportation sector policy to meet climate goals and accelerate sustainable mobility.
Key Points
Policy tradeoffs weighing biofuels and EVs to cut GHGs, boost energy security, and advance clean transportation.
✅ Near-term blending cuts emissions from existing fleets
✅ EVs scale with a cleaner grid and charging buildout
✅ Lifecycle impacts and costs guide optimal subsidy mix
California is at the forefront of the transition to a greener economy, driven by its ambitious goals to reduce greenhouse gas emissions and combat climate change. As part of its strategy, the state is grappling with the question of whether it should subsidize out-of-state biofuels or in-state electric vehicles (EVs) to meet these goals. Both options come with their own sets of benefits and challenges, and the decision carries significant implications for the state’s environmental, economic, and energy landscapes.
The Case for Biofuels
Biofuels have long been promoted as a cleaner alternative to traditional fossil fuels like gasoline and diesel. They are made from organic materials such as agricultural crops, algae, and waste, which means they can potentially reduce carbon emissions in comparison to petroleum-based fuels. In the context of California, biofuels—particularly ethanol and biodiesel—are viewed as a way to decarbonize the transportation sector, which is one of the state’s largest sources of greenhouse gas emissions.
Subsidizing out-of-state biofuels can help California reduce its reliance on imported oil while promoting the development of biofuel industries in other states. This approach may have immediate benefits, as biofuels are widely available and can be blended with conventional fuels to lower carbon emissions right away. It also allows the state to diversify its energy sources, improving energy security by reducing dependency on oil imports.
Moreover, biofuels can be produced in many regions across the United States, including rural areas. By subsidizing out-of-state biofuels, California could foster economic development in these regions, creating jobs and stimulating agricultural innovation. This approach could also support farmers who grow the feedstock for biofuel production, boosting the agricultural economy in the U.S.
However, there are drawbacks. The environmental benefits of biofuels are often debated. Critics argue that the production of biofuels—particularly those made from food crops like corn—can contribute to deforestation, water pollution, and increased food prices. Additionally, biofuels are not a silver bullet in the fight against climate change, as their production and combustion still release greenhouse gases. When considering whether to subsidize biofuels, California must also account for the full lifecycle emissions associated with their production and use.
The Case for Electric Vehicles
In contrast to biofuels, electric vehicles (EVs) offer a more direct pathway to reducing emissions from transportation. EVs are powered by electricity, and when coupled with renewable energy sources like solar or wind power, they can provide a nearly zero-emission solution for personal and commercial transportation. California has already invested heavily in EV infrastructure, expanding its network of charging stations and offering incentives for consumers to purchase EVs.
Subsidizing in-state EVs could stimulate job creation and innovation within California's thriving clean-tech industry. The state has already become a hub for electric vehicle makers, including Tesla, Rivian, and several battery manufacturers. Supporting the EV industry could further strengthen California's position as a global leader in green technology, attracting investment and fostering growth in related sectors such as battery manufacturing, renewable energy, and smart grid technology.
Additionally, the environmental benefits of EVs are substantial. As the electric grid becomes cleaner with an increasing share of renewable energy, EVs will become even greener, with lower lifecycle emissions than biofuels. By prioritizing EVs, California could further reduce its carbon footprint while also achieving its long-term climate goals, including reaching carbon neutrality by 2045.
However, there are challenges. Scaling up EV adoption in California is a significant undertaking, requiring major investments in charging infrastructure, technology, and consumer incentives. The cost of EVs, although decreasing, remains a barrier for many consumers, and there are concerns about the environmental impact of lithium mining, which is essential for EV batteries. While renewable energy is expanding, California's grid still relies on fossil fuels to some degree, meaning the full emissions benefit of EVs will not be realized until the grid runs entirely on clean energy.
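The grid-dependence point can be made concrete with a back-of-the-envelope lifecycle comparison. Every number below is an illustrative assumption, not California data: per-km EV emissions are simply grid carbon intensity times electricity consumption, so they fall as the grid gets cleaner.

```python
# Sketch: per-km CO2 for an EV vs. a gasoline car as the grid gets cleaner.
# All figures are illustrative assumptions, not official values.

def ev_gco2_per_km(grid_gco2_per_kwh, kwh_per_km=0.18):
    """EV emissions: grid carbon intensity x electricity consumption."""
    return grid_gco2_per_kwh * kwh_per_km

def gasoline_gco2_per_km(l_per_100km=8.0, gco2_per_litre=2310):
    """Combustion emissions from fuel burned per km."""
    return l_per_100km / 100 * gco2_per_litre

for grid in (450, 250, 50):  # g CO2/kWh: fossil-heavy -> mostly renewable
    print(f"grid {grid} g/kWh: EV {ev_gco2_per_km(grid):.0f} g/km "
          f"vs gasoline {gasoline_gco2_per_km():.0f} g/km")
```

Even on a fairly dirty grid the assumed EV comes out ahead per kilometre, and the gap widens as renewables grow, which is the crux of the "EVs get greener over time" argument.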
A Balancing Act
The debate between subsidizing out-of-state biofuels and in-state electric vehicles is ultimately a question of how best to allocate California's resources to meet its climate and economic goals. Biofuels may offer a quicker fix for reducing emissions from existing vehicles, but their long-term benefits are limited compared to the transformative potential of electric vehicles.
However, biofuels still have a role to play in decarbonizing hard-to-abate sectors like aviation and heavy-duty transportation, where electrification may not be as feasible in the near future. Thus, a mixed strategy that includes both subsidies for EVs and biofuels may be the most effective approach.
Ultimately, California's decision will likely depend on a combination of factors, including technological advancements, the pace of renewable energy deployment, and the state's ability to balance short-term needs with long-term environmental goals. The road ahead is not easy, but California's leadership in clean energy will be crucial in shaping the nation's response to climate change.
U.S. Data Center Power Demand is straining electric utilities and grid reliability as AI, cloud computing, and streaming surge, driving transmission and generation upgrades, demand response, and renewable energy sourcing amid rising electricity costs.
Key Points
The rising electricity load from U.S. data centers, affecting utilities, grid capacity, and energy prices.
✅ AI, cloud, and streaming spur hyperscale compute loads
✅ Grid upgrades: transmission, generation, and substations
✅ Demand response, efficiency, and renewables mitigate strain
U.S. electric utilities are facing a significant new challenge as the explosive growth of data centers puts unprecedented strain on power grids across the nation. According to a new report from Reuters, data centers' power demands are expected to increase dramatically over the next few years, raising concerns about grid reliability and potential increases in electricity costs for businesses and consumers.
What's Driving the Data Center Surge?
The explosion in data centers is being fueled by several factors:
Cloud Computing: The rise of cloud computing services, where businesses and individuals store and process data on remote servers, significantly increases demand for data centers.
Artificial Intelligence (AI): Data-hungry AI applications and machine learning algorithms are driving a massive need for computing power, accelerating the growth of data centers.
Streaming and Video Content: The growth of streaming platforms and high-definition video content requires vast amounts of data storage and processing, further boosting demand for data centers.
Challenges for Utilities
Data centers are notorious energy hogs. Their need for a constant, reliable supply of electricity places heavy demand on the grid, often in regions where power infrastructure wasn't designed for such large loads. Utilities must invest significantly in transmission and generation capacity upgrades to meet the demand while ensuring grid stability.
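For a rough sense of scale, a facility's grid draw is often approximated as IT load times PUE (power usage effectiveness, the overhead factor for cooling and power distribution). The figures below are illustrative assumptions, not numbers from the Reuters report:

```python
# Sketch: rough grid load from a hypothetical data center campus.
# Facility draw = IT load x PUE; all numbers are illustrative assumptions.

def facility_mw(it_load_mw, pue):
    """Total electrical demand including cooling and overhead."""
    return it_load_mw * pue

campus_it_mw = 300   # assumed IT load of a large AI campus
pue = 1.3            # an assumed modern hyperscale PUE
annual_mwh = facility_mw(campus_it_mw, pue) * 8760  # runs around the clock

print(f"Peak draw: {facility_mw(campus_it_mw, pue):.0f} MW")
print(f"Annual energy: {annual_mwh / 1e6:.2f} TWh")
```

A single campus on these assumptions draws on the order of a mid-sized power plant's output continuously, which is why utilities treat each new data center as a generation and transmission planning problem.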
Some experts warn that the growth of data centers could lead to brownouts or outages, especially during peak demand periods in areas where the grid is already strained. Increased electricity demand could also lead to price hikes, with utilities passing the additional costs on to consumers and businesses.
Sustainable Solutions Needed
Utility companies, governments, and the data center industry are scrambling to find sustainable solutions to mitigate these challenges:
Energy Efficiency: Data center operators are investing in new cooling and energy management solutions to improve energy efficiency. Some are even exploring renewable energy sources like onsite solar and wind power.
Strategic Placement: Authorities are encouraging the development of data centers in areas with abundant renewable energy and access to existing grid infrastructure. This minimizes the need for expensive new transmission lines.
Demand Flexibility: Utility companies are experimenting with programs that incentivize data centers to reduce their power consumption during peak demand periods, which could help mitigate strain on the grid.
The Future of the Grid
The rapid growth of data centers exemplifies the significant challenges facing the aging U.S. electrical grid. It highlights the need for a modernized power infrastructure capable of accommodating rising demand from new technologies while addressing climate impacts that threaten reliability and affordability. The question for utilities, as well as data center operators, is how to balance the increasing need for computing power with the imperative of a sustainable and reliable energy future.
ITER Nuclear Fusion advances tokamak magnetic confinement, heating deuterium-tritium plasma with superconducting magnets, targeting net energy gain, tritium breeding, and steam-turbine power, while complementing laser inertial confinement milestones for grid-scale electricity and 2025 startup goals.
Key Points
ITER Nuclear Fusion is a tokamak project confining D-T plasma with magnets to achieve net energy gain and clean power.
✅ Tokamak magnetic confinement with high-temp superconducting coils
✅ Deuterium-tritium fuel cycle with on-site tritium breeding
✅ Targets net energy gain and grid-scale, low-carbon electricity
It sounds like the stuff of dreams: a virtually limitless source of energy that doesn't produce greenhouse gases or long-lived radioactive waste. That's the promise of nuclear fusion, often described as the holy grail of clean energy, which for decades has remained a fantasy due to formidable technical challenges. But things are heating up in what has turned into a race to create what amounts to an artificial sun here on Earth, one that can provide power for our kettles, cars and light bulbs.
Today's nuclear power plants create electricity through nuclear fission, in which atoms are split. Nuclear fusion, however, involves combining atomic nuclei to release energy. It's the same reaction that takes place at the Sun's core. But overcoming the natural repulsion between atomic nuclei and maintaining the right conditions for fusion to occur isn't straightforward. And doing so in a way that produces more energy than the reaction consumes has been beyond the grasp of the finest minds in physics for decades.
But perhaps not for much longer. Some major technical challenges have been overcome in the past few years, and governments around the world have been pouring money into fusion power research. More than 20 private ventures in the UK, US, Europe, China and Australia are also vying to be the first to make fusion energy production a reality.
“People are saying, ‘If it really is the ultimate solution, let’s find out whether it works or not,’” says Dr Tim Luce, head of science and operation at the International Thermonuclear Experimental Reactor (ITER), being built in southeast France. ITER is the biggest throw of the fusion dice yet.
Its $22bn (£15.9bn) build cost is being met by governments representing two-thirds of the world's population, including the EU, the US, China and Russia, and when it's fired up in 2025 it'll be the world's largest fusion reactor. If it works, ITER will transform fusion power from the stuff of dreams into a viable energy source.
Constructing a nuclear fusion reactor
ITER will be a tokamak reactor – thought to be the best hope for fusion power. Inside a tokamak, a gas, often a hydrogen isotope called deuterium, is subjected to intense heat and pressure, forcing electrons out of the atoms. This creates a plasma – a superheated, ionised gas – that has to be contained by intense magnetic fields.
The containment is vital, as no material on Earth could withstand the intense heat (100,000,000°C and above) that the plasma has to reach so that fusion can begin. It’s close to 10 times the heat at the Sun’s core, and temperatures like that are needed in a tokamak because the gravitational pressure within the Sun can’t be recreated.
When atomic nuclei do start to fuse, vast amounts of energy are released. While the experimental reactors currently in operation release that energy as heat, in a fusion power plant the heat would be used to produce steam to drive turbines and generate electricity.
Tokamaks aren’t the only fusion reactors being tried. Another type of reactor uses lasers to heat and compress a hydrogen fuel to initiate fusion. In August 2021, one such device at the National Ignition Facility, at the Lawrence Livermore National Laboratory in California, generated 1.35 megajoules of energy. This record-breaking figure brings fusion power a step closer to net energy gain, but most hopes are still pinned on tokamak reactors rather than lasers.
In June 2021, China’s Experimental Advanced Superconducting Tokamak (EAST) reactor maintained a plasma for 101 seconds at 120,000,000°C. Before that, the record was 20 seconds. Ultimately, a fusion reactor would need to sustain the plasma indefinitely – or at least for eight-hour ‘pulses’ during periods of peak electricity demand.
A real game-changer for tokamaks has been the magnets used to produce the magnetic field. “We know how to make magnets that generate a very high magnetic field from copper or other kinds of metal, but you would pay a fortune for the electricity. It wouldn’t be a net energy gain from the plant,” says Luce.
One route for nuclear fusion is to use atoms of deuterium and tritium, both isotopes of hydrogen. They fuse under incredible heat and pressure, and the resulting products release energy as heat.
The solution is to use high-temperature, superconducting magnets made from superconducting wire, or ‘tape’, that has no electrical resistance. These magnets can create intense magnetic fields and don’t lose energy as heat.
“High temperature superconductivity has been known about for 35 years. But the manufacturing capability to make tape in the lengths that would be required to make a reasonable fusion coil has just recently been developed,” says Luce. One of ITER’s magnets, the central solenoid, will produce a field of 13 tesla – 280,000 times Earth’s magnetic field.
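As a quick sanity check on that comparison (assuming a mid-range Earth surface field of about 46 µT, since the real value varies from roughly 25 to 65 µT by location):

```python
# Check the "280,000 times Earth's magnetic field" figure for ITER's
# 13-tesla central solenoid. Earth's surface field varies by location;
# ~46.4 microtesla is an assumed mid-range value.
central_solenoid_t = 13.0
earth_field_t = 46.4e-6
ratio = central_solenoid_t / earth_field_t
print(f"{ratio:,.0f}x Earth's field")  # on the order of 280,000x
```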
The inner walls of ITER's vacuum vessel, where the fusion will occur, will be lined with beryllium, a metal that won't contaminate the plasma much if the plasma touches it. At the bottom is the divertor, which will keep the temperature inside the reactor under control.
“The heat load on the divertor can be as large as in a rocket nozzle,” says Luce. “Rocket nozzles work because you can get into orbit within minutes and in space it’s really cold.” In a fusion reactor, a divertor would need to withstand this heat indefinitely and at ITER they’ll be testing one made out of tungsten.
Meanwhile, in the US, the National Spherical Torus Experiment – Upgrade (NSTX-U) fusion reactor will be fired up in the autumn of 2022. One of its priorities will be to see whether lining the reactor with lithium helps to keep the plasma stable.
Choosing a fuel
Instead of just using deuterium as the fusion fuel, ITER will use deuterium mixed with tritium, another hydrogen isotope. The deuterium-tritium blend offers the best chance of getting significantly more power out than is put in. Proponents of fusion power say one reason the technology is safe is that the fuel needs to be constantly fed into the reactor to keep fusion happening, making a runaway reaction impossible.
Deuterium can be extracted from seawater, so there’s a virtually limitless supply of it. But only 20kg of tritium are thought to exist worldwide, so fusion power plants will have to produce it (ITER will develop technology to ‘breed’ tritium). While some radioactive waste will be produced in a fusion plant, it’ll have a lifetime of around 100 years, rather than the thousands of years from fission.
At the time of writing in September, researchers at the Joint European Torus (JET) fusion reactor in Oxfordshire were due to start their deuterium-tritium fusion reactions. “JET will help ITER prepare a choice of machine parameters to optimise the fusion power,” says Dr Joelle Mailloux, one of the scientific programme leaders at JET. These parameters will include finding the best combination of deuterium and tritium, and establishing how the current is increased in the magnets before fusion starts.
The groundwork laid down at JET should accelerate ITER’s efforts to accomplish net energy gain. ITER will produce ‘first plasma’ in December 2025 and be cranked up to full power over the following decade. Its plasma temperature will reach 150,000,000°C and its target is to produce 500 megawatts of fusion power for every 50 megawatts of input heating power.
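The 500 MW out for 50 MW in target is usually expressed as the fusion gain factor Q, the ratio of fusion power produced to external heating power supplied:

```python
# ITER's headline target expressed as fusion gain Q:
# fusion power out divided by external heating power in.
fusion_power_mw = 500
heating_power_mw = 50
q = fusion_power_mw / heating_power_mw
print(f"Target gain Q = {q:.0f}")  # Q = 10
```

Q greater than 1 means the plasma releases more fusion power than is pumped in to heat it; no tokamak has yet demonstrated that, which is why ITER's Q = 10 target is the headline goal.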
“If ITER is successful, it’ll eliminate most, if not all, doubts about the science and liberate money for technology development,” says Luce. That technology development will be demonstration fusion power plants that actually produce electricity. “ITER is opening the door and saying, yeah, this works – the science is there.”
Western Denmark Negative Electricity Prices stem from wind energy oversupply, grid congestion, and limited interconnector capacity via Nord Pool and TenneT, underscoring electrification needs, renewable integration, special regulation, and system flexibility.
Key Points
Sub-zero power prices driven by wind oversupply, weak interconnectors, low demand, and balancing needs.
✅ Caused by high wind output, low demand, and export bottlenecks
✅ Limited Nord Pool interconnector capacity depresses prices
✅ Special regulation and district heating absorb excess power
Reduced capacity on the cable connections to Norway and Sweden, together with low electricity consumption and high electricity production, has pushed electricity prices in Western Denmark to negative levels.
It is a sign that the electrification of society is urgently needed, says Soren Klinge, head of the electricity market at Wind Denmark.
The heavy winds during the first weekend of July have not only had consequences for the Danes who had been looking forward to spending their first summer days in the garden or at the beach. They have also pushed prices in the electricity market to negative levels, which West Danish wind turbine owners in particular have noticed.
“The electricity market is currently affected by an unfortunate coincidence of factors that have a negative impact on the electricity price: reduced export capacity to the other Nordic countries, low electricity consumption and high electricity generation. Unfortunately, the coincidence of these three factors means that the price base falls completely out of the market. This is another sign that the electrification of society is urgently needed,” explains Soren Klinge, electricity market manager at Wind Denmark.
According to the European power exchange Nord Pool Spot, the cable connection to Sweden is expected to return to full capacity from 19 July. The connection between Jutland and Norway is only expected to return to full capacity in early September.
2,000 MWh/hour in special regulation
During the windy weather on Monday morning, 6 July, up to 2,000 MWh/hour was activated nationally in the form of so-called special regulation. Special regulation means that the German system operator TenneT orders Danish cogeneration plants and wind turbines to curtail their generation, to help balance the German power system during such events. In addition, electric boilers at the cogeneration plants contribute by drawing electricity from the grid and converting it to district heating for the benefit of Danish homes and businesses.
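The balancing arithmetic in an oversupply hour can be sketched as a simple energy balance: whatever wind output exceeds domestic demand plus available export capacity must be absorbed by power-to-heat boilers or curtailed via special regulation. All MWh figures below are illustrative assumptions, not actual Danish market data:

```python
# Sketch of the balancing arithmetic during an oversupply hour.
# All MWh figures are illustrative assumptions, not TenneT/market data.
wind_mwh = 3800             # assumed hourly wind output
demand_mwh = 2500           # assumed domestic consumption
export_capacity_mwh = 300   # reduced interconnector capacity

surplus = wind_mwh - demand_mwh - export_capacity_mwh
boiler_capacity_mwh = 400   # electric boilers making district heating
special_regulation = max(0, surplus - boiler_capacity_mwh)
print(f"Surplus: {surplus} MWh; down-regulated: {special_regulation} MWh")
```

When the export bottleneck shrinks the third term, the surplus grows, and anything the boilers cannot soak up must be curtailed, which is what drives prices negative.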
“The Danish wind turbines are probably the source of most of the special regulation, because there are very few cogeneration units left to down-regulate. Of course, it is positive to see that we have a high degree of flexibility in the wind-based power system at home. That being said, Denmark will not really get ahead with the green transition until we are able to raise electricity consumption based on renewable energy.”
U.S. Power Grid D+ Rating underscores aging infrastructure, rising outages, cyber threats, EMP and solar flare risks, strained transmission lines, vulnerable transformers, and slow permitting, amplifying reliability concerns and resilience needs across national energy systems.
Key Points
ASCE's D+ grade flags aging infrastructure, rising outages, and cyber, EMP, and weather risks needing investment.
✅ Major outages rising; weather remains top disruption driver.
✅ Cybersecurity gaps via smart grid, EV charging, SCADA.
The U.S. power grid just received its “grade card” from the American Society of Civil Engineers (ASCE) and it barely passed.
The overall rating of our antiquated electrical system was a D+. Major power outages in the United States, including widespread blackouts, have grown from 76 in 2007 to 307 in 2011, according to the latest available statistics. The major outage figures do not take into account all of the smaller outages which routinely occur due to seasonal storms.
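Those two figures imply a steep growth rate; a quick calculation using only the numbers quoted above:

```python
# Growth implied by the quoted outage counts: 76 (2007) to 307 (2011).
outages_2007, outages_2011 = 76, 307
years = 2011 - 2007
cagr = (outages_2011 / outages_2007) ** (1 / years) - 1
print(f"{outages_2011 / outages_2007:.1f}x in {years} years "
      f"(~{cagr:.0%} per year)")
```

That is roughly a fourfold rise in four years, or about 42% compound annual growth in major outages.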
The American Society of Civil Engineers power grid grade card rating means the energy infrastructure is in “poor to fair condition and mostly below standard, with many elements approaching the end of their service life.” It further means a “large portion of the system exhibits significant deterioration” with a “strong risk of failure.”
Such a designation is not reassuring and validates those who purchased solar generators over the past several years.
The vulnerable state of the power grid gets very little play by mainstream media outlets. Concerns about a solar flare or an electromagnetic pulse (EMP) attack instantly sending us back to an 1800s existence are legitimate, but it may not take such an extreme act to render the power grid a useless tangle of wires. The majority of the United States’ infrastructure and public systems evaluated by the ASCE earned a “D” rating. A “C” ranking (public parks, rail and bridges) was the highest grade earned. It would take a total of $3.6 trillion in investments by 2020 to fix everything, the report card stated. To put that number in perspective, the federal government’s budget for all of 2012 was slightly more, $3.7 trillion.
“America relies on an aging electrical grid and pipeline distribution systems, some of which originated in the 1880s,” the report read. “Investment in power transmission has increased since 2005, but ongoing permitting issues, weather events, and limited maintenance have contributed to an increasing number of failures and power interruptions. While demand for electricity has remained level, the availability of energy in the form of electricity, natural gas, and oil will become a greater challenge after 2020 as the population increases. Although about 17,000 miles of additional high-voltage transmission lines and significant oil and gas pipelines are planned over the next five years, permitting and siting issues threaten their completion. The electric grid in the United States consists of a system of interconnected power generation, transmission facilities, and distribution facilities.”
There are approximately 400,000 miles of electrical transmission lines throughout the United States, and thousands of power generating plants dot the landscape. The ASCE report card also stated that new gas-fired and renewable generation issues increase the need to add new transmission lines. Antiquated power grid equipment has reportedly prompted even more “intermittent” power outages in recent years.
The American Society of Civil Engineers accurately notes that the power grid is more vulnerable to cyber attacks than ever before, and it cites the aging electrical system as the primary culprit. Although the decades-old transformers and other equipment necessary to keep power flowing around America are a major factor in the enhanced vulnerability of the power grid, moving towards a “smart grid” system is not the answer. As previously reported by Off The Grid News, smart grid systems and even electric car charging stations make the power grid more accessible to cyber hackers. During the Hack in the Box Conference in Amsterdam, HP ArcSight Product Manager Ofer Sheaf stated that electric car charging stations are in essence a computer on the street. The roadway fueling stations are linked to the power grid electrical system. If cyber hackers gain access to the power grid via the charging stations, they could stop the flow of power to a specific area or alter energy distribution levels and overload the system.
While a relatively small number of electric car charging stations exist in America now, that soon will change. Ongoing efforts by both federal and state governments to reduce our reliance on fossil fuels have resulted in grants and privately funded vehicle charging station projects. New York Governor Andrew Cuomo in April announced plans to build 360 such electrical stations in his state. A total of 3,000 car charging stations are in the works statewide and are slated for completion over the next five years.
SHIELD Act
Weather-related events were the primary cause of power outages from 2007 to 2012, according to the infrastructure report card. Power grid reliability issues are emerging as the greatest threat to the electrical system. The ASCE grade card also notes that retiring and rotating in “new energy sources” is a “complex” process. Like most items we routinely purchase in our daily lives, many of the components needed to make the power grid functional are not manufactured in the United States.
The SHIELD Act is the first real piece of federal legislation in years drafted to address power grid vulnerabilities. While the single bill will not fix all of the electrical system issues, it is a big step in the right direction – if it ever makes it out of committee. Replacing aging transformers, encasing them in a high-tech version of a Faraday cage, and stockpiling extra units so instant repairs are possible would help preserve one of the nation’s most critical and life-saving pieces of infrastructure after a weather-related incident or man-made disaster.
“Geomagnetic storm environments can develop instantaneously over large geographic footprints,” solar geomagnetic researcher John Kappenman said about the fragile state of the power grid. He was quoted in an Oak Ridge National Laboratory report. “They have the ability to essentially blanket the continent with an intense threat environment and … produce significant collateral damage to critical infrastructures. In contrast to well-conceived design standards that have been successfully applied for more conventional threats, no comprehensive design criteria have ever been considered to check the impact of the geomagnetic storm environments. The design actions that have occurred over many decades have greatly escalated the dangers posed by these storm threats for this critical infrastructure.”
The power grid has grown tenfold in size during the past 50 years. While solar flares, cyber attacks, and an EMP are perhaps the most extensive and frightening threats to the electrical system, large portions of the infrastructure could just as easily fail due to weather-related events. If the grid does go down, it will spawn civil unrest, a multitude of fires, and shortages of food and clean water; the system is basically a ticking time bomb.
BC Hydro LNG Load Forecast signals rising electricity demand from LNG Canada, Woodfibre, and Tilbury, aligning Site C dam capacity with BCUC review, hydroelectric supply, and a potential fourth project in feasibility study British Columbia.
Key Points
BC Hydro's projection of LNG-driven power demand, guiding Site C capacity, BCUC review, and grid planning.
✅ Includes LNG Canada, Woodfibre, and Tilbury load requests
✅ Aligns Site C hydroelectric output with industrial electrification
✅ Notes feasibility study for a fourth LNG project
Despite recent project cancellations, BC Hydro still expects three LNG projects — and possibly a fourth, which is undergoing a feasibility study — will need power from its controversial and expensive Site C hydroelectric dam.
In a letter sent to the British Columbia Utilities Commission (BCUC) on Oct. 3, BC Hydro’s chief regulatory officer Fred James said the provincially owned utility’s load forecast includes power demand for three proposed liquefied natural gas projects because they continue to ask the company for power.
The letter and attached report provide some detail on which of the LNG projects proposed in B.C. are more likely to be built, given recent project cancellations.
The documents are also an attempt to explain why BC Hydro continues to forecast a surge in electricity demand in the province, even though massive LNG projects proposed by Malaysia's state-owned oil company Petronas and China's CNOOC Nexen have been cancelled.
An explanation is needed because B.C.'s new NDP government had promised the BCUC would review the need for the $9-billion Site C dam, which was commissioned to provide power for the province's nascent LNG industry. The commission had specifically asked for an explanation of BC Hydro's electric load forecast as it relates to LNG projects by Wednesday.
The three projects that continue to ask BC Hydro for electricity are Shell Canada Ltd.’s LNG Canada project, the Woodfibre LNG project and a future expansion of FortisBC’s Tilbury LNG storage facility.
None of those projects has officially been sanctioned, but “service requests from industrial sector customers, including LNG, are generally included in our industrial load forecast,” the report noted.
In a redacted section of the report, BC Hydro also raises the possibility of a fourth LNG project, which is exploring the need for power in B.C.
“BC Hydro is currently undertaking feasibility studies for another large LNG project, which is not currently included in its Current Load Forecast,” one section of the report notes, though the remainder of the section is redacted.
The Site C dam, which has become a source of controversy in B.C. and was an important election issue, is currently under construction and expected to be in service by 2024, a timeline that had been considered sufficient to provide LNG projects with power by the time they are operational.
BC Hydro’s letter to the BCUC refers to media and financial industry reports that indicate global LNG markets will require more supply by 2023.
“While there remains significant uncertainty, global LNG demand will continue to grow and there is opportunity for B.C. LNG,” the report notes.