Internet 'brownouts' feared

By Toronto Star


Rising demand for bandwidth-hogging Internet activities such as swapping music files and watching YouTube videos threatens to outstrip the Web's infrastructure within three years, creating the spectre of service "brownouts" and potentially thwarting the development of the next Google-sized application, an industry-funded study warns.

Despite all the talk about the Internet's infinite possibilities, a study by U.S. firm Nemertes Research found that projected increases in Internet traffic are poised to eclipse the capacity of the Web's broadband access infrastructure – essentially the points where users "plug in" to the Internet via cable, phone or fibre optic lines – as early as 2010.

The study predicts that, in the absence of billions of dollars' worth of new investment, Internet users will begin to notice a marked degradation in their online experiences within three to five years as bandwidth-heavy applications such as VoIP (Voice-over Internet Protocol), video-on-demand and various file-sharing schemes become more popular among individuals and corporations alike.

That could take the form of Web pages that take longer to load, interruptions in videos that are downloaded or streamed and stalled e-commerce transactions, said Mike Jude, a senior analyst with Nemertes. "If we don't do anything, we're going to start looking like the dying days of dial-up access," he said.

"The Internet won't fail, but it will degrade. Our expectation is that around 2010 or 2011, that degradation will become apparent to most users."

Nemertes said it received funding for the study from its client base of vendors, service providers, and not-for-profit organizations in the industry.

That includes the Internet Innovation Alliance, which some have criticized as being biased in favour of the U.S. telecommunications industry.

The study warns that the North American economy may suffer if Internet entrepreneurs are prevented from developing the next so-called "killer application" such as Google's market-dominating search engine, Facebook's rapidly growing social networking tool or eBay's popular auction site.

The concern is not that the Internet's core optical backbone, or trunk lines, can't handle the extra traffic, but that there are limitations at the "edge" of the Web where service providers, including telephone and cable operators, deploy switching equipment and networks of broadband "pipes" that connect to homes and businesses.

In order to keep things flowing smoothly, the study forecasts that service providers in the United States alone will need to spend as much as $55 billion (US) over and above the $72 billion (US) they were already planning to invest in broadband access.

Globally, the required investment by telecommunications and cable firms is estimated to be $137 billion (US), according to the study.

There is much discussion in the North American industry about what to do about the infamous "last mile," or the part of the network that extends from service providers' switching equipment to tens of thousands of homes within a given service area.

Because most homes rely on aging copper telephone wires or coaxial cable lines rather than being connected directly to high-bandwidth fibre optic lines, there's a limit to how much Internet capacity can be delivered to customers alongside other services, creating an Internet bottleneck.

In the U.S., Verizon Communications Inc. has decided to spend millions bringing fibre optic cables to the home – a painstaking process that will require rewiring of entire neighbourhoods, house by house.

In Canada, meanwhile, there's speculation that Bell Canada Inc. and Telus Corp. may opt for a similar strategy in key markets in order to better compete with cable operators, whose lines have greater bandwidth capabilities.

Jude said researchers arrived at their conclusions by taking published industry figures on planned infrastructure investments and comparing them to traffic-growth projections culled from historical data and interviews with businesses about their plans for the Web.

One thing the study does not take into account, however, is the emergence of new technologies that improve the efficiency of existing infrastructure, or of new methods of broadband access such as wireless.

Critics seized upon the omission.

"I think it's a little premature to suggest that in 2010 we're going to hit a crossover point and all of a sudden things are going to slow down," said Carmi Levy, the senior vice-president of strategic consulting for AR Communications Inc.

"I think that significantly underestimates the capability of the market to introduce products and services to meet this growing demand."

Levy noted that the industry has so far easily kept pace with demand.

Related News

Opinion: Germany's drive for renewable energy is a cautionary tale

Germany's Energiewende offers lessons on climate-policy tradeoffs: wind and solar face grid constraints, coal phase-out delays, rising electricity prices and public opposition, informing Canada's thinking on diversification, hydro, oil and gas, and a balanced transition.

 

Key Points

Insights from Germany's renewable shift on costs, grid limits, and emissions to guide Canada's balanced energy policy.

✅ Evidence: high power prices, delayed coal exit, limited grid buildout

✅ Land, materials, and wildlife impacts challenge wind and solar scale-up

✅ Diversification: hydro, nuclear, gas, and storage balance reliability

 

News that Greta Thunberg is visiting Alberta should be welcomed by all Canadians.

The teenaged Swedish environmentalist has focused global attention on the climate change debate like never before. So as she tours our province, where selling renewable energy could be Alberta's next big thing, what better time for a reality check than to look at the country that is furthest ahead in adopting the steps that Greta is advocating.

That country is Germany. And it’s not a pretty sight.

Germany embraced the shift toward renewable energy before anyone else, and did so with gusto. The result?

Germany’s largest newsmagazine, Der Spiegel, published an article on May 3 of this year entitled “A Botched Job in Germany.” The cover showed broken wind turbines and half-finished transmission towers against a dark silhouette of Berlin.

Germany’s renewable energy transition, the Energiewende, is a bust, despite the country having spent and committed a total of US$580 billion to it from 2000 to 2025.

Why is that? Because it’s been a nightmare of foolish dreams based on hope rather than fact, resulting in stalled projects and dreadfully poor returns.

Last year Germany admitted it had to delay its phase-out of coal and would not meet its 2020 greenhouse gas emissions reduction commitment. Only eight per cent of the transmission lines needed to support this new approach to powering Germany have been built.

Opposition to renewables is growing as electricity prices have risen to among the highest in the world. Wind energy projects in Germany now face the same opposition that pipelines do here in Canada.

Opposition to renewables in Germany, reports Forbes, is coming from people who live in rural or suburban areas, in opposition to the “urbane, cosmopolitan elites who fetishize their solar roofs and Teslas as a sign of virtue.” Sound familiar?

So, if renewables cannot successfully power Germany, one of the richest and most technologically advanced countries in the world, who can do it better?

The biggest problem with using wind and solar power on a large scale is that the physics just don’t work. They need too much land and equipment to produce sufficient amounts of electricity.

Solar farms take 450 times more land than nuclear power plants to produce the same amount of electricity. Wind farms take 700 times more land than natural gas wells.

The amount of metal required to build these sites is enormous, requiring new mines. Wind farms are killing hundreds of endangered birds.

No amount of marketing or spin can change the poor physics of resource-intensive and land-intensive renewables.

But, wait. Isn’t Norway, Greta’s neighbour, dumping its energy investments and moving into alternative energy like wind farms in a big way?

No, not really. The fact is, only 0.8 per cent of Norway’s power comes from wind turbines. The country is blessed with a lot of hydroelectric power, but that’s a historical strength owing to the country’s geography, nothing new.

And yet we’re being told the US$1-trillion Oslo-based Government Pension Fund Global is moving out of the energy sector to instead invest in wind, solar and other alternative energy technologies. According to 350.org activist Nicolo Wojewoda this is “yet another nail in the coffin of the coal, oil, and gas industry.”

Well, no.

Norway’s pension fund is indeed investing in new energy forms, but not while pulling out of traditional investments in oil and gas. Rather, as any prudent fund manager will, it is diversifying: making modest investments in emerging industries that will likely pay off down the road while maintaining its existing holdings, spreading risk around. Unfortunately for climate alarmists, the reality is far more nuanced and not nearly as explosive as they’d like us to think.

Yet that’s enough for them to spin this tale to argue Canada should exit oil and gas investment and put all of our money into wind and solar.

That is not to say renewable energy projects like wind and solar don’t have a place. They do, and we must continue to innovate and research lower-polluting ways to power our societies.

But as it actually is in Norway, investment in renewables should supplement, not replace, fossil fuel energy systems. We need both.

And that’s the message that Greta should hear when she arrives in Canada.

Rick Peterson is the Edmonton-based founder and Beth Bailey is a Calgary-based supporter of Suits and Boots, a national not-for-profit group of investment industry professionals that supports resource sector workers and their families.

 


Opinion: The awesome, revolutionary electric-car revolution that doesn't actually exist

The Ecofiscal Commission's policy shift acknowledges the limits of carbon pricing, endorsing "signal boosters" such as subsidies, EV incentives and coal bans, amid advisory changes and public pushback, to accelerate emissions cuts beyond market-based taxes and regulations.

 

Key Points

An updated stance recognizing carbon pricing limits and backing EV incentives, subsidies, and rules to reduce emissions.

✅ Carbon pricing plus subsidies, EV incentives

✅ Advisory shift; Jack Mintz departs

✅ Focus on emissions cuts, coal power bans

 

Something strange happened at the Ecofiscal Commission recently. Earlier this month, the carbon-tax advocacy group featured on its website as one of its advisers the renowned Canadian economist (and FP Comment columnist) Jack M. Mintz. The other day, suddenly and without fanfare, Mintz was gone from the website, and the commission’s advisory board.

Advisers come and advisers go, of course, but it turns out there was an impetus for Mintz’s departure. The Ecofiscal Commission in its latest report, dropped just before Canada Day, seemingly shifted from its position that carbon prices were so excellent at mimicking market forces that the tax could repeal and replace virtually the entire vast expensive gallimaufry of subsidies, caps, rules and regulations that are costing Canada a fortune in business and bureaucrats. As some Ecofiscal commissioners wrote just a few months ago, policies that “dictate specific technologies or methods for reducing emissions constrain private choice and increase costs” and were a bad idea.

But, in this latest report, the commission is now musing about the benefits of carbon-tax “signal boosters”: that is, EV subsidies and rules to, for instance, get people to start buying electric vehicles (EVs), as well as bans on coal-fired power. “Even well designed carbon pricing can have limitations,” rationalized the commission. Mintz said he had “misgivings” about the change of tack. He decided it best if he focus his advisory energies elsewhere.

It’s hard to blame the commission for falling like everyone else for the electric-car mania that’s sweeping the nation and the world. Electric cars offer a sexiness that dreary old carbon taxes can never hope to match — especially in light of a new Angus Reid poll last week that showed the majority of Canadians now want governments to shelve any plans for carbon taxes.

So far, because nobody’s really driving these miracle machines, said mania has been limited to breathless news reports about how the electric-vehicle revolution is about to rock our world. EVs comprise just two-tenths of a per cent of all passenger vehicles in North America, despite the media’s endless hype and efforts of green-obsessed governments to cover much of the price tag, like Ontario’s $14,000 rebate for Tesla buyers. In Europe, where virtue-signalling urban environmentalism is the coolest, they’re not feeling the vehicular electricity much more: EVs account for barely one per cent of personal vehicles in France, the U.K. and Germany. When Hong Kong cancelled Tesla rebates in April, sales fell to zero.

Going by the ballyhoo, you’d think EVs were at an inflection point and an unstoppable juggernaut. But it’s one that has yet to even get started. In his 2011 State of the Union address, then president Barack Obama predicted one million electric cars on the road by 2015. Four years later, there wasn’t even a third that many. California offered so many different subsidies for electric vehicles that low-income families could get rebates of up to US$13,500, but it still isn’t even close to reaching its target of having zero-emission vehicles make up 15 per cent of California auto sales by 2025, being stuck at three per cent since 2014. Ontario’s Liberal government last year announced to much laughter its plan to ensure that every family would have at least one zero-emission vehicle (ZEV) by 2024, and Quebec made a plan to make ZEVs worth 15.5 per cent of sales by 2020. Let’s see how that’s going: Currently, ZEVs make up 0.16 per cent of new vehicle sales in Ontario and 0.38 per cent in Quebec.

The latest sensational but bogus EV news out last week was France’s government announcing the “end of the sale of gasoline and diesel cars by 2040,” and Volvo apparently announcing that as of 2019, all its models would be “electric.” Both announcements made international headlines. Both are baloney. France provided no actual details about this plan (will it literally become a crime to sell a gasoline car? Will hybrids, run partly on gasoline, be allowed?), but more importantly, as automotive writer Ed Wiseman pointed out in The Guardian, a lot will happen in technology and automotive use over the next 23 years that France has no way to predict, with changes in self-driving cars, public car-sharing and fuel technologies. Imagine making rules for today’s internet back in 1994.

Volvo, meanwhile, looked to be recycling and repackaging years-old news to seize on today’s infatuation with electric vehicles to burnish its now Chinese-owned brand. Since 2010, Volvo’s plan has been to focus on engines that were partly electric, with electric turbochargers, but still based on gasoline. Volvo doesn’t actually have an all-electric model, but the gasoline-swigging engine of its popular XC90 SUV is, partly, electrical. When Volvo said all its models would in two years be “electric,” it meant this kind of engine, not that it was phasing out the internal-combustion gasoline engine. But that is what it wanted reporters to think, and judging by all the massive and inaccurate coverage, it worked.

The real story being missed is just how pathetic things look right now for electric cars. Gasoline prices in the U.S. turned historically cheap in 2015 and stayed cheap, icing demand for gasless cars. Tesla, whose founder’s self-promotion had made the niche carmaker magically more valuable than powerhouses like Ford and GM, haemorrhaged US$12 billion in market value last week after tepid sales figures brought some investors back to Earth, even as the company’s new Model 3 began rolling off the line.

Not helping is that environmental claims about environmental cars are falling apart. In June, Tesla was rocked by a controversial Swedish study that found that making one of its car batteries released as much CO2 as eight years of gasoline-powered driving. And Bloomberg reported last week on a study by Chinese engineers that found that electric vehicles, because of battery manufacturing and charging by fossil-fuel-powered electricity sources, emit 50-per-cent more carbon than do internal-combustion engines. Still, the electric-vehicle hype not only continues unabated, it gets bigger and louder every day. If some car company figures out how to harness it, we’d finally have a real automotive revolution on our hands.

Kevin Libin, Financial Post

 


Almost 500-mile-long lightning bolt crossed three US states

The WMO has confirmed the longest lightning flash on record: a 477.2-mile megaflash spanning Mississippi, Louisiana and Texas, detected by satellite sensors, highlighting megaflash-prone storm regions and advances in extreme weather monitoring.

 

Key Points

The WMO has verified a 477.2-mile megaflash across Mississippi, Louisiana and Texas, detected via satellites.

✅ Spanned 477.2 miles across Mississippi, Louisiana, and Texas

✅ Verified by WMO using space-based lightning detection

✅ Occurs in megaflash-prone regions like the U.S. Great Plains

 

A nearly 500-mile-long bolt of lightning that lit up the sky across three US states has set a new world record for the longest flash, scientists have confirmed.

The lightning bolt extended a total of 477.2 miles (768 km) and spread across Mississippi, Louisiana, and Texas.

The previous record was 440.6 miles (709 km), recorded in Brazil in 2018.
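As a quick sanity check on the distances quoted above, the mile-to-kilometre conversions and the new record's margin over the old one can be verified directly. This is a minimal sketch; the 1.609344 km-per-mile factor is the standard international definition:

```python
# Convert the reported record distances and compute the new record's margin.
MILES_TO_KM = 1.609344  # exact km per international mile

new_record_mi = 477.2   # 2020 US megaflash (Mississippi, Louisiana, Texas)
old_record_mi = 440.6   # 2018 flash recorded in Brazil

new_record_km = new_record_mi * MILES_TO_KM
old_record_km = old_record_mi * MILES_TO_KM
margin_mi = new_record_mi - old_record_mi

print(f"New record: {new_record_km:.0f} km")  # ~768 km, matching the article
print(f"Old record: {old_record_km:.0f} km")  # ~709 km, matching the article
print(f"Margin:     {margin_mi:.1f} miles")   # ~36.6 miles
```

Both rounded figures agree with the article's parenthetical kilometre values.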

Lightning rarely extends over 10 miles and usually lasts under a second.

Another lightning flash recorded in 2020 - in Uruguay and Argentina - has also set a new record for duration at 17.1 seconds. The previous record was 16.7 seconds.

"These are extraordinary records from lightning flash events," Professor Randall Cerveny, the WMO's rapporteur of weather and climate extremes, said.

According to the WMO, both records took place in areas prone to intense storms that produce 'megaflashes', namely the Great Plains region of the United States and the La Plata basin of South America's southern cone.

Professor Cerveny added that greater extremes are likely to exist and are likely to be recorded in the future thanks to advances in space-based lightning detection technology.

The WMO warned that lightning was a hazard and urged people in both regions and around the world to take caution during storms.

"These extremely large and long-duration lightning events were not isolated but happened during active thunderstorms," lightning specialist Ron Holle said in a WMO statement.

"Any time there is thunder heard, it is time to reach a lightning-safe place".

Previously accepted WMO 'lightning extremes' include a 1975 incident in which 21 people were killed by a single flash of lightning as they huddled inside a tent in Zimbabwe.

In another incident, 469 people were killed when lightning struck the Egyptian town of Dronka in 1994, causing burning oil to flood the town.

The WMO notes that the only lightning-safe locations are "substantial" buildings with wiring and plumbing, rather than structures such as bus stops or those found at beaches.

Fully enclosed metal-topped vehicles are also considered reliably safe.

 


Current Model For Storing Nuclear Waste Is Incomplete

Nuclear Waste Corrosion accelerates as stainless steel, glass, and ceramics interact in aqueous conditions, driving localized corrosion in repositories like Yucca Mountain, according to Nature Materials research on high-level radioactive waste storage.

 

Key Points

Degradation of waste forms and canisters from water-driven chemistry, causing accelerated, localized corrosion in storage.

✅ Stainless steel-glass contact triggers severe localized attack

✅ Ceramics and steel co-corrosion observed under aqueous conditions

✅ Yucca Mountain-like chemistry accelerates waste form degradation

 

The materials the United States and other countries plan to use to store high-level nuclear waste will likely degrade faster than anyone previously knew because of the way those materials interact, new research shows.

The findings, published today in the journal Nature Materials (https://www.nature.com/articles/s41563-019-0579-x), show that corrosion of nuclear waste storage materials accelerates because of changes in the chemistry of the nuclear waste solution, and because of the way the materials interact with one another.

"This indicates that the current models may not be sufficient to keep this waste safely stored," said Xiaolei Guo, lead author of the study and deputy director of Ohio State's Center for Performance and Design of Nuclear Waste Forms and Containers, part of the university's College of Engineering. "And it shows that we need to develop a new model for storing nuclear waste."


The team's research focused on storage materials for high-level nuclear waste -- primarily defense waste, the legacy of past nuclear arms production. The waste is highly radioactive. While some types of the waste have half-lives of about 30 years, others -- for example, plutonium -- have a half-life that can be tens of thousands of years. The half-life of a radioactive element is the time needed for half of the material to decay.
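The definition above corresponds to simple exponential decay: after a time t, the fraction of material remaining is (1/2)^(t/T), where T is the half-life. A minimal sketch of that arithmetic, using the article's 30-year figure and, for contrast, roughly the 24,100-year half-life of plutonium-239:

```python
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    """Fraction of a radioactive sample left after t_years of decay."""
    return 0.5 ** (t_years / half_life_years)

# After exactly one half-life, half the material remains, by definition.
assert fraction_remaining(30, 30) == 0.5

# A 30-year half-life component is ~99.9% decayed after 300 years...
print(f"{fraction_remaining(300, 30):.4f}")      # ~0.0010
# ...but plutonium-239 (half-life ~24,100 years) barely changes in that time.
print(f"{fraction_remaining(300, 24_100):.4f}")  # ~0.9914
```

The contrast is why storage materials must stay intact not for centuries but for tens of thousands of years.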

The United States currently has no disposal site for that waste; according to the U.S. Government Accountability Office, it is typically stored near the nuclear power plants where it is produced. A permanent site has been proposed for Yucca Mountain in Nevada, though plans have stalled. Countries around the world have debated the best way to deal with nuclear waste; only one, Finland, has started construction on a long-term repository for high-level nuclear waste.

But the long-term plan for high-level defense waste disposal and storage around the globe is largely the same. It involves mixing the nuclear waste with other materials to form glass or ceramics, and then encasing those pieces of glass or ceramics -- now radioactive -- inside metallic canisters. The canisters would then be buried deep underground in a repository to isolate the waste.


In this study, the researchers found that when exposed to an aqueous environment, glass and ceramics interact with stainless steel to accelerate corrosion, especially of the glass and ceramic materials holding nuclear waste.


The study qualitatively measured the difference between accelerated corrosion and natural corrosion of the storage materials. Guo called it "severe."

"In the real-life scenario, the glass or ceramic waste forms would be in close contact with stainless steel canisters. Under specific conditions, the corrosion of stainless steel will go crazy," he said. "It creates a super-aggressive environment that can corrode surrounding materials."

To analyze corrosion, the research team pressed glass or ceramic "waste forms" -- the shapes into which nuclear waste is encapsulated -- against stainless steel and immersed them in solutions for up to 30 days, under conditions that simulate those under Yucca Mountain, the proposed nuclear waste repository.

Those experiments showed that when glass and stainless steel were pressed against one another, stainless steel corrosion was "severe" and "localized," according to the study. The researchers also noted cracks and enhanced corrosion on the parts of the glass that had been in contact with stainless steel.

Part of the problem lies in the Periodic Table. Stainless steel is made primarily of iron mixed with other elements, including nickel and chromium. Iron has a chemical affinity for silicon, which is a key element of glass.

The experiments also showed that when ceramics -- another potential holder for nuclear waste -- were pressed against stainless steel under conditions that mimicked those beneath Yucca Mountain, both the ceramics and stainless steel corroded in a "severe localized" way.

Other Ohio State researchers involved in this study include Gopal Viswanathan, Tianshu Li and Gerald Frankel.

This work was funded in part by the U.S. Department of Energy Office of Science.


 


Extensive Disaster Planning at Electric & Gas Utilities Means Lights Will Stay On

Utility Pandemic Preparedness strengthens grid resilience through continuity planning, critical infrastructure protection, DOE-DHS coordination, onsite sequestration, skeleton crews, and deferred maintenance to ensure reliable electric and gas service for commercial and industrial customers.

 

Key Points

Plans that sustain grid operations during outbreaks using staffing limits, access controls, and deferred maintenance.

✅ Deferred maintenance and restricted site access

✅ Onsite sequestering and skeleton crew operations

✅ DOE-DHS coordination and control center staffing

 

Commercial and industrial businesses can rest assured that the current pandemic poses no real threat to our utilities, as disaster planning has been key at electric and gas utilities in recent years, writes Forbes. Beginning a decade ago, the utility and energy industries developed detailed pandemic plans, which include putting off maintenance and routine activities until the worst of the pandemic has passed, restricting site access to essential personnel, and being able to run on a skeleton crew as more and more people become ill.

One possible outcome of the current situation is that the US electric industry may require essential staff to live onsite at power plants and control centers if the outbreak worsens; bedding, food and other supplies are being stockpiled, Reuters reported. The Great River Energy cooperative, for example, has had a plan to sequester essential staff in place since the H1N1 swine flu crisis in 2009. The cooperative, which runs 10 power plants in Minnesota, says its disaster planning ensured it has enough cots, blankets and other necessities on site to keep staff healthy.

Electricity providers are now taking part in twice-weekly phone calls with officials at the DOE, the Department of Homeland Security, and other agencies, according to the Los Angeles Times. By planning for a variety of worst-case scenarios, “I have confidence that the sector will be prepared to respond no matter how this evolves,” says Scott Aaronson, VP of security and preparedness for the Edison Electric Institute.

 


Why the promise of nuclear fusion is no longer a pipe dream

ITER aims to advance tokamak magnetic confinement, heating deuterium-tritium plasma with superconducting magnets and targeting net energy gain, tritium breeding and steam-turbine power, complementing laser inertial-confinement milestones on the path to grid-scale electricity and a 2025 startup.

 

Key Points

ITER Nuclear Fusion is a tokamak project confining D-T plasma with magnets to achieve net energy gain and clean power.

✅ Tokamak magnetic confinement with high-temp superconducting coils

✅ Deuterium-tritium fuel cycle with on-site tritium breeding

✅ Targets net energy gain and grid-scale, low-carbon electricity

 

It sounds like the stuff of dreams: a virtually limitless source of energy that doesn’t produce greenhouse gases or radioactive waste. That’s the promise of nuclear fusion, often described as the holy grail of clean energy by proponents, which for decades has been nothing more than a fantasy due to insurmountable technical challenges. But things are heating up in what has turned into a race to create what amounts to an artificial sun here on Earth, one that can provide power for our kettles, cars and light bulbs.

Today’s nuclear power plants create electricity through nuclear fission, in which atoms are split. Nuclear fusion, however, involves combining atomic nuclei to release energy. It’s the same reaction that’s taking place at the Sun’s core. But overcoming the natural repulsion between atomic nuclei and maintaining the right conditions for fusion to occur isn’t straightforward. And doing so in a way that produces more energy than the reaction consumes has been beyond the grasp of the finest minds in physics for decades.

But perhaps not for much longer. Some major technical challenges have been overcome in the past few years and governments around the world have been pouring money into fusion power research. There are also over 20 private ventures in the UK, US, Europe, China and Australia vying to be the first to make fusion energy production a reality.

“People are saying, ‘If it really is the ultimate solution, let’s find out whether it works or not,’” says Dr Tim Luce, head of science and operation at the International Thermonuclear Experimental Reactor (ITER), being built in southeast France. ITER is the biggest throw of the fusion dice yet.

Its $22bn (£15.9bn) build cost is being met by the governments of two-thirds of the world’s population, including the EU, the US, China and Russia, and when it’s fired up in 2025 it’ll be the world’s largest fusion reactor. If it works, ITER will transform fusion power from the stuff of dreams into a viable energy source.


Constructing a nuclear fusion reactor
ITER will be a tokamak reactor – thought to be the best hope for fusion power. Inside a tokamak, a gas, often a hydrogen isotope called deuterium, is subjected to intense heat and pressure, forcing electrons out of the atoms. This creates a plasma – a superheated, ionised gas – that has to be contained by intense magnetic fields.

The containment is vital, as no material on Earth could withstand the intense heat (100,000,000°C and above) that the plasma has to reach so that fusion can begin. It’s close to 10 times the heat at the Sun’s core, and temperatures like that are needed in a tokamak because the gravitational pressure within the Sun can’t be recreated.

When atomic nuclei do start to fuse, vast amounts of energy are released. The experimental reactors currently in operation release that energy as heat; in a fusion power plant, the heat would be used to produce steam to drive turbines and generate electricity.

Tokamaks aren’t the only fusion reactors being tried. Another type of reactor uses lasers to heat and compress a hydrogen fuel to initiate fusion. In August 2021, one such device at the National Ignition Facility, at the Lawrence Livermore National Laboratory in California, generated 1.35 megajoules of energy. This record-breaking figure brings fusion power a step closer to net energy gain, but most hopes are still pinned on tokamak reactors rather than lasers.

In June 2021, China’s Experimental Advanced Superconducting Tokamak (EAST) reactor maintained a plasma for 101 seconds at 120,000,000°C. Before that, the record was 20 seconds. Ultimately, a fusion reactor would need to sustain the plasma indefinitely – or at least for eight-hour ‘pulses’ during periods of peak electricity demand.

A real game-changer for tokamaks has been the magnets used to produce the magnetic field. “We know how to make magnets that generate a very high magnetic field from copper or other kinds of metal, but you would pay a fortune for the electricity. It wouldn’t be a net energy gain from the plant,” says Luce.


One route for nuclear fusion is to use atoms of deuterium and tritium, both isotopes of hydrogen. They fuse under incredible heat and pressure, and the resulting products release energy as heat


The solution is to use high-temperature, superconducting magnets made from superconducting wire, or ‘tape’, that has no electrical resistance. These magnets can create intense magnetic fields and don’t lose energy as heat.

“High temperature superconductivity has been known about for 35 years. But the manufacturing capability to make tape in the lengths that would be required to make a reasonable fusion coil has just recently been developed,” says Luce. One of ITER’s magnets, the central solenoid, will produce a field of 13 tesla – 280,000 times Earth’s magnetic field.
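As a quick sanity check (mine, not the article’s), the ‘280,000 times’ comparison can be reproduced in a few lines. Earth’s surface field actually varies from roughly 25 to 65 microtesla; the value assumed below (~46 µT) is an illustrative midpoint:

```python
# Compare ITER's central solenoid field with Earth's magnetic field.
EARTH_FIELD_T = 46e-6   # assumed typical surface field, in tesla (~46 microtesla)
SOLENOID_T = 13.0       # ITER central solenoid, from the article

ratio = SOLENOID_T / EARTH_FIELD_T
print(f"{ratio:,.0f}x Earth's magnetic field")  # roughly 280,000x
```

With a slightly stronger assumed surface field the ratio drops toward 200,000; the article’s figure sits comfortably within that range.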

The inner walls of ITER’s vacuum vessel, where the fusion will occur, will be lined with beryllium, a metal that won’t contaminate the plasma much if the two come into contact. At the bottom is the divertor, which will keep the temperature inside the reactor under control.

“The heat load on the divertor can be as large as in a rocket nozzle,” says Luce. “Rocket nozzles work because you can get into orbit within minutes and in space it’s really cold.” In a fusion reactor, a divertor would need to withstand this heat indefinitely and at ITER they’ll be testing one made out of tungsten.

Meanwhile, in the US, the National Spherical Torus Experiment-Upgrade (NSTX-U) fusion reactor will be fired up in the autumn of 2022. One of its priorities will be to see whether lining the reactor with lithium helps to keep the plasma stable.


Choosing a fuel
Instead of just using deuterium as the fusion fuel, ITER will use deuterium mixed with tritium, another hydrogen isotope. The deuterium-tritium blend offers the best chance of getting significantly more power out than is put in. Proponents of fusion power say one reason the technology is safe is that the fuel needs to be constantly fed into the reactor to keep fusion happening, making a runaway reaction impossible.

Deuterium can be extracted from seawater, so there’s a virtually limitless supply of it. But only 20kg of tritium are thought to exist worldwide, so fusion power plants will have to produce it (ITER will develop technology to ‘breed’ tritium). While some radioactive waste will be produced in a fusion plant, it’ll have a lifetime of around 100 years, rather than the thousands of years from fission.
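A rough back-of-envelope illustration (my numbers, not the article’s) shows why the fuel supply is described as virtually limitless: each deuterium-tritium reaction releases a well-established 17.6 MeV, so even a 500 MW fusion plant would burn only grams of fuel per day:

```python
# Estimate the D-T fuel burn rate needed to sustain 500 MW of fusion power.
MEV_TO_J = 1.602e-13                    # 1 MeV in joules
E_PER_REACTION_J = 17.6 * MEV_TO_J      # ~2.8e-12 J per D-T fusion event
P_FUSION_W = 500e6                      # ITER's target fusion power

reactions_per_s = P_FUSION_W / E_PER_REACTION_J

AMU_KG = 1.66054e-27                    # atomic mass unit in kg
FUEL_MASS_KG = (2.014 + 3.016) * AMU_KG # one deuteron + one triton consumed per reaction

burn_rate_g_per_day = reactions_per_s * FUEL_MASS_KG * 86400 * 1000
print(f"{reactions_per_s:.2e} reactions/s, ~{burn_rate_g_per_day:.0f} g of fuel per day")
```

That works out to roughly 130 g of fuel per day, compared with the tonnes of coal an equivalent fossil-fuel plant burns every minute.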

At the time of writing in September, researchers at the Joint European Torus (JET) fusion reactor in Oxfordshire were due to start their deuterium-tritium fusion reactions. “JET will help ITER prepare a choice of machine parameters to optimise the fusion power,” says Dr Joelle Mailloux, one of the scientific programme leaders at JET. These parameters will include finding the best combination of deuterium and tritium, and establishing how the current is increased in the magnets before fusion starts.

The groundwork laid down at JET should accelerate ITER’s efforts to accomplish net energy gain. ITER will produce ‘first plasma’ in December 2025 and be cranked up to full power over the following decade. Its plasma temperature will reach 150,000,000°C and its target is to produce 500 megawatts of fusion power for every 50 megawatts of input heating power.
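ITER’s 500 MW-from-50 MW target is usually expressed as the fusion gain factor Q, the ratio of fusion power out to heating power in. A trivial sketch:

```python
# ITER's headline target expressed as the fusion gain factor Q = P_fusion / P_heating.
p_fusion_mw = 500.0   # target fusion power output
p_heating_mw = 50.0   # input heating power

q = p_fusion_mw / p_heating_mw
print(f"Q = {q:.0f}")  # Q = 10; any Q > 1 means more fusion power out than heating in
```

For comparison, no experiment to date has demonstrated sustained Q above 1, which is why ITER’s Q = 10 target would be such a milestone.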

“If ITER is successful, it’ll eliminate most, if not all, doubts about the science and liberate money for technology development,” says Luce. That technology development will be demonstration fusion power plants that actually produce electricity. “ITER is opening the door and saying, yeah, this works – the science is there.”
