When Douglas Worts learned that the City of Toronto was going to fix the pavement on his street, he knew what he had to do: he called his councillor to get it stopped.
Worts has nothing against good roads. But he looks at his street – Laurier Ave. in the Parliament-Wellesley area – as more than a roadway.
He thinks it has the potential to heat and cool his house and others, by providing the footings for a geothermal heating system.
Now the city is interested in the idea, and has given $25,000 to Worts and his neighbours, through the Don Vale Cabbagetown Residents Association, to carry out a feasibility study.
Worts had never thought much about geothermal heating and cooling until he happened to hear that it was being considered for the University of Ontario Institute of Technology in Oshawa.
He talked up the idea at the Laurier street party in 2007, and some neighbours expressed interest.
He explained that down past the frost line, the Earth keeps a temperature that's warmer than winter air and cooler than summer air.
Geothermal systems take advantage of that by pumping fluid through underground pipes to carry the seasonal warmth or coolness to the surface.
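In rough numbers, the heat such a loop delivers is just flow rate times the fluid's heat capacity times the temperature change. The sketch below illustrates the arithmetic; the flow rate and temperature lift are hypothetical values chosen for illustration, not figures from the project.

```python
# Back-of-envelope heat delivery for a ground-loop system.
# Flow rate and temperature lift below are illustrative assumptions.

def loop_heat_rate(flow_kg_s: float, delta_t_c: float, cp_j_kg_k: float = 4186.0) -> float:
    """Heat carried by the circulating fluid in watts: Q = m_dot * cp * dT."""
    return flow_kg_s * cp_j_kg_k * delta_t_c

# A water loop moving 0.3 kg/s, warmed 3 degrees C by the ground in winter:
q = loop_heat_rate(0.3, 3.0)
print(f"{q:.0f} W")  # about 3,767 W of heat drawn from the ground
```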
Worts is keen on geothermal because the Laurier Ave. homes, built in 1888, are not energy-efficient by today's standards.
Worts thought tapping a green energy source like geothermal made a lot of sense.
One obstacle to geothermal at Laurier Ave. is geographic: There just isn't much surface area along the narrow street, where houses lack front yards or driveways.
Worts figured the roadway itself would be far more accessible for drilling rigs.
And the project would set an example of how geothermal could also have application in dense urban neighbourhoods.
"This is a perfect size street to be doing this kind of experiment," says Worts.
Staff at the energy efficiency office at city hall have been helpful, Worts said, and are willing to give residents a permit to drill on the street.
The holes will have to be very deep – about 175 metres or 575 feet, Worts says – because there's no room to run buried pipe sideways.
Each home will need its own system, because setting up a single system with common ownership proved legally complex, and not everyone on the street wants to convert to geothermal.
Worts says 16 of the 22 residents have shown serious interest.
Their councillor, Pam McConnell, supports the project.
"I think it's fabulous," she said in an interview. "It's a small street, but it could have major implications in quite a large circumference around Cabbagetown."
McConnell strongly approves of using the city street for the drill holes, because the project is in keeping with city policy on curbing carbon emissions.
"If we need to give up a little space in our right of way, that's fine with me," she said.
"I don't think it impacts the use of the street or the sidewalk. It doesn't impact the public realm, and has very important public benefits."
But money remains an obstacle – even doing a detailed feasibility study is expensive, and the Laurier Ave. residents were hobbled by lacking a formal organization.
A solution to that problem appeared one day when Sameer Dhargalkar, a Laurier resident and co-backer of the geothermal project with Worts, was walking his dog.
In Wellesley Park, he struck up a conversation with another dog owner, Lee Garrison, who heads the Don Vale Cabbagetown Residents Association.
"We just started talking out of the blue," Garrison recalls.
When the geothermal project came up, "I said: 'Let's talk some more, because I'm head of the residents' association and we've been wanting for a while to find some flagship projects to kick-start a green initiative in Cabbagetown.'"
The residents' association is now a partner in the project and provides the funding link with the city.
However, money is still an issue.
A consultant has estimated the cost of a geothermal unit at $27,000 per household.
Worts figures that with grant incentives, and with the savings from drilling many holes at once, the cost would fall to $17,000 or less.
Worts hopes the city or some other sponsor can be persuaded to loan this upfront money to owners.
He says a house spending $2,000 a year on heating and cooling might slice that to $800 with geothermal.
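Those numbers make the payback arithmetic easy to check. The sketch below uses only the figures quoted here: a $17,000 per-home cost after incentives and bulk drilling, and a heating-and-cooling bill that drops from $2,000 to $800 a year.

```python
# Simple payback on a per-home geothermal system, using the article's figures.

def simple_payback_years(upfront: float, annual_savings: float) -> float:
    """Years for accumulated savings to cover the upfront cost (no discounting)."""
    return upfront / annual_savings

upfront = 17_000              # estimated cost after incentives and shared drilling
annual_savings = 2_000 - 800  # yearly bill before vs. after geothermal
print(f"{simple_payback_years(upfront, annual_savings):.1f} years")  # 14.2 years
```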
Alberta Last-Resort Power Rate Reform outlines consumer protection against market volatility, price spikes, and wholesale rate swings, promoting fixed-rate plans, price caps, transparency, and stable pricing mechanisms within Alberta's deregulated power market.
Key Points
Alberta Last-Resort Power Rate Reform seeks stable, transparent pricing and stronger consumer protections.
✅ Caps or hedges shield bills from wholesale price spikes
✅ Expand fixed-rate options and enrollment nudges
✅ Publish clear, real-time pricing and market risk alerts
Alberta’s electricity market is facing growing instability, with rising prices leaving many consumers struggling. The province's rate of last resort, a government-set price for people who haven’t chosen a fixed electricity plan, has become a significant concern. Due to volatile market conditions, this rate has surged, causing financial strain for households. Experts, like energy policy analyst Blake Shaffer, argue that the current market structure needs reform. They suggest creating more stability in pricing, ensuring better protection for consumers against unexpected price spikes, and addressing the flaws that lead to market volatility.
As electricity prices climb, many consumers are feeling the pressure. In Alberta, where energy deregulation is the norm in the electricity market, people without fixed-rate plans are automatically switched to the last-resort rate when their contracts expire. This price is based on fluctuating wholesale market rates, which can spike unexpectedly, leaving consumers vulnerable to sharp price increases. For those on tight budgets, such volatility makes it difficult to predict costs, leading to higher financial stress.
Blake Shaffer, a prominent energy policy expert, has been vocal about the need to address these issues. He has highlighted that while some consumers benefit from fixed-rate plans, those who cannot afford them or who are unaware of their options often find themselves stuck with the unpredictable last-resort rate. This rate can be substantially higher than what a fixed-plan customer would pay, often due to rapid shifts in energy demand and supply imbalances.
Shaffer suggests that the province’s electricity market needs a restructuring to make it more consumer-friendly and less vulnerable to extreme price hikes. He argues that introducing more transparency in pricing and offering more stable options for consumers could help. In addition, there could be better incentives for consumers to stay informed about their electricity plans, which would help reduce the number of people unintentionally placed on the last-resort rate.
One potential solution proposed by Shaffer and others is the creation of a more predictable and stable pricing mechanism, where consumers could have access to reasonable rates that aren’t so closely tied to the volatility of the wholesale market. This could involve capping prices or offering government-backed insurance against large price fluctuations, making electricity more affordable for those who are most at risk.
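The capping idea can be sketched in a few lines: the billed rate tracks the wholesale rate until it hits a ceiling. The cap level and the wholesale prices below are hypothetical, chosen only to show how a spike is absorbed.

```python
# Hypothetical price-cap mechanism: billed rate follows wholesale, clipped at a cap.

def billed_rate(wholesale_cents_kwh: float, cap_cents_kwh: float) -> float:
    """Pass the wholesale rate through, but never bill above the cap."""
    return min(wholesale_cents_kwh, cap_cents_kwh)

wholesale = [9.5, 11.0, 32.0, 14.0]            # cents/kWh, with a spike in month 3
billed = [billed_rate(w, cap_cents_kwh=13.0) for w in wholesale]
print(billed)  # [9.5, 11.0, 13.0, 13.0] -- the spike never reaches the bill
```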
The increasing reliance on market-driven prices has also raised concerns about the overall direction of Alberta’s energy policy. As a province with a large reliance on oil and gas, Alberta’s energy sector is tightly connected to global energy trends. While this has its benefits, it also means that Alberta’s electricity prices are heavily influenced by factors outside the control of local consumers, such as geopolitical issues or extreme weather events. This makes it hard for residents to predict and plan their energy usage and costs.
For many Albertans, the current state of the electricity market feels precarious. As more people face unexpected price hikes, calls for a market overhaul continue to grow louder across Alberta. Shaffer and others believe that a new framework is necessary—one that balances the interests of consumers, the government, and energy companies, while ensuring that basic energy needs are met without overwhelming households with excessive costs.
In conclusion, Alberta’s last-resort electricity rate system is an increasing burden for many. While some may benefit from fixed-rate plans, others are left exposed to market volatility. Blake Shaffer advocates for reform to create a more stable, transparent, and affordable electricity market, one that could better protect consumers from the high risks associated with deregulated pricing. Addressing these challenges will be crucial in ensuring that energy remains accessible and affordable for all Alberta residents.
Bright Feeds Solar Upgrade integrates a 300-kW DC PV system and 625 solar panels at the Berlin, CT plant, supplying one-third of power, cutting carbon emissions, and advancing clean, renewable energy in agriculture.
Key Points
An initiative powering Bright Feeds' Berlin plant with a 300-kW DC PV array, reducing costs and carbon emissions.
✅ 300-kW DC PV with 625 panels by Solect Energy
✅ Supplies ~33% of facility power; lowers operating costs
Bright Feeds, a New England-based startup, has successfully transitioned its Berlin, Connecticut, animal feed production facility to solar energy. The company installed a 300-kilowatt direct current (DC) solar photovoltaic (PV) system at its 25,000-square-foot plant. This move aligns with Bright Feeds' commitment to sustainability and reducing its carbon footprint.
Solar Installation Details
The solar system comprises 625 solar panels and was developed and installed by Solect Energy, a Massachusetts-based company. Over its lifetime, the system is projected to offset more than 2,100 tons of carbon emissions, contributing significantly to the company's environmental goals. This initiative not only reduces energy expenses but also supports Bright Feeds' mission to promote clean energy solutions in the agricultural sector.
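The quoted figures can be cross-checked with quick arithmetic. The implied per-panel rating follows directly from the system size; the annual-output line assumes roughly 1,300 kWh per installed kW, a common rule-of-thumb yield for the region rather than a figure from the article.

```python
# Quick cross-checks on the quoted system figures.
system_w = 300_000                 # 300 kW DC, as stated
panels = 625
per_panel_w = system_w / panels
print(f"{per_panel_w:.0f} W per panel")   # 480 W, a plausible modern module rating

# Assumed yield: ~1,300 kWh per installed kW per year (rule of thumb, not sourced).
annual_kwh = 300 * 1_300
print(f"~{annual_kwh:,} kWh per year")
```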
Bright Feeds' Sustainable Operations
At its Berlin facility, Bright Feeds employs advanced artificial intelligence and drying technology to transform surplus food into an all-natural, nutrient-rich alternative to soy and corn in animal feed. The company supplies its innovative feed product to a broad range of customers across the Northeast, including animal feed distributors and dairy farms. By processing food that would otherwise go to waste, the facility diverts tens of thousands of tons of food from the regional waste stream each year. When operating at full capacity, the environmental benefit of the plant’s process is comparable to taking more than 33,000 cars off the road annually.
Industry Impact
Bright Feeds' adoption of solar energy sets a precedent for sustainability in the agricultural sector. The integration of renewable energy sources into production processes not only reduces operational costs but also demonstrates a commitment to environmental stewardship. As the demand for sustainable practices grows, other companies in the industry may look to Bright Feeds as a model for integrating clean energy solutions into their operations.
Bright Feeds' initiative to power its Berlin facility with solar energy underscores the company's dedication to sustainability and innovation. By harnessing the power of the sun, Bright Feeds is not only reducing its carbon footprint but also contributing to a cleaner, more sustainable future for the agricultural industry. This move serves as an example for other companies seeking to align their operations with environmental responsibility and renewable energy adoption.
ITER Nuclear Fusion advances tokamak magnetic confinement, heating deuterium-tritium plasma with superconducting magnets, targeting net energy gain, tritium breeding, and steam-turbine power, while complementing laser inertial confinement milestones for grid-scale electricity and 2025 startup goals.
Key Points
ITER Nuclear Fusion is a tokamak project confining D-T plasma with magnets to achieve net energy gain and clean power.
✅ Tokamak magnetic confinement with high-temp superconducting coils
✅ Deuterium-tritium fuel cycle with on-site tritium breeding
✅ Targets net energy gain and grid-scale, low-carbon electricity
It sounds like the stuff of dreams: a virtually limitless source of energy that doesn’t produce greenhouse gases or radioactive waste. That’s the promise of nuclear fusion, often described as the holy grail of clean energy by proponents, which for decades has been nothing more than a fantasy due to insurmountable technical challenges. But things are heating up in what has turned into a race to create what amounts to an artificial sun here on Earth, one that can provide power for our kettles, cars and light bulbs.
Today’s nuclear power plants create electricity through nuclear fission, in which atoms are split. Nuclear fusion, however, involves combining atomic nuclei to release energy. It’s the same reaction that’s taking place at the Sun’s core. But overcoming the natural repulsion between atomic nuclei and maintaining the right conditions for fusion to occur isn’t straightforward. And doing so in a way that produces more energy than the reaction consumes has been beyond the grasp of the finest minds in physics for decades.
But perhaps not for much longer. Some major technical challenges have been overcome in the past few years and governments around the world have been pouring money into fusion power research. There are also over 20 private ventures in the UK, US, Europe, China and Australia vying to be the first to make fusion energy production a reality.
“People are saying, ‘If it really is the ultimate solution, let’s find out whether it works or not,’” says Dr Tim Luce, head of science and operation at the International Thermonuclear Experimental Reactor (ITER), being built in southeast France. ITER is the biggest throw of the fusion dice yet.
Its $22bn (£15.9bn) build cost is being met by the governments of two-thirds of the world’s population, including the EU, the US, China and Russia, and when it’s fired up in 2025 it’ll be the world’s largest fusion reactor. If it works, ITER will transform fusion power from being the stuff of dreams into a viable energy source.
Constructing a nuclear fusion reactor
ITER will be a tokamak reactor – thought to be the best hope for fusion power. Inside a tokamak, a gas, often a hydrogen isotope called deuterium, is subjected to intense heat and pressure, forcing electrons out of the atoms. This creates a plasma – a superheated, ionised gas – that has to be contained by intense magnetic fields.
The containment is vital, as no material on Earth could withstand the intense heat (100,000,000°C and above) that the plasma has to reach so that fusion can begin. It’s close to 10 times the heat at the Sun’s core, and temperatures like that are needed in a tokamak because the gravitational pressure within the Sun can’t be recreated.
When atomic nuclei do start to fuse, vast amounts of energy are released. While the experimental reactors currently in operation release that energy as heat, in a fusion power plant the heat would be used to produce steam to drive turbines and generate electricity.
Tokamaks aren’t the only fusion reactors being tried. Another type of reactor uses lasers to heat and compress a hydrogen fuel to initiate fusion. In August 2021, one such device at the National Ignition Facility, at the Lawrence Livermore National Laboratory in California, generated 1.35 megajoules of energy. This record-breaking figure brings fusion power a step closer to net energy gain, but most hopes are still pinned on tokamak reactors rather than lasers.
In June 2021, China’s Experimental Advanced Superconducting Tokamak (EAST) reactor maintained a plasma for 101 seconds at 120,000,000°C. Before that, the record was 20 seconds. Ultimately, a fusion reactor would need to sustain the plasma indefinitely – or at least for eight-hour ‘pulses’ during periods of peak electricity demand.
A real game-changer for tokamaks has been the magnets used to produce the magnetic field. “We know how to make magnets that generate a very high magnetic field from copper or other kinds of metal, but you would pay a fortune for the electricity. It wouldn’t be a net energy gain from the plant,” says Luce.
One route for nuclear fusion is to use atoms of deuterium and tritium, both isotopes of hydrogen. They fuse under incredible heat and pressure, and the resulting products release energy as heat
The solution is to use high-temperature, superconducting magnets made from superconducting wire, or ‘tape’, that has no electrical resistance. These magnets can create intense magnetic fields and don’t lose energy as heat.
“High temperature superconductivity has been known about for 35 years. But the manufacturing capability to make tape in the lengths that would be required to make a reasonable fusion coil has just recently been developed,” says Luce. One of ITER’s magnets, the central solenoid, will produce a field of 13 tesla – 280,000 times Earth’s magnetic field.
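The solenoid comparison can be sanity-checked: dividing 13 tesla by the quoted factor of 280,000 should land near Earth's actual surface field, which is roughly 25 to 65 microtesla depending on location.

```python
# Implied Earth-surface field from the article's comparison: 13 T / 280,000.
earth_field_t = 13 / 280_000
print(f"{earth_field_t * 1e6:.0f} microtesla")  # ~46 uT, within Earth's 25-65 uT range
```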
The inner walls of ITER’s vacuum vessel, where the fusion will occur, will be lined with beryllium, a metal that won’t contaminate the plasma much if they touch. At the bottom is the divertor that will keep the temperature inside the reactor under control.
“The heat load on the divertor can be as large as in a rocket nozzle,” says Luce. “Rocket nozzles work because you can get into orbit within minutes and in space it’s really cold.” In a fusion reactor, a divertor would need to withstand this heat indefinitely and at ITER they’ll be testing one made out of tungsten.
Meanwhile, in the US, the National Spherical Torus Experiment – Upgrade (NSTX-U) fusion reactor will be fired up in the autumn of 2022. One of its priorities will be to see whether lining the reactor with lithium helps to keep the plasma stable.
Choosing a fuel
Instead of just using deuterium as the fusion fuel, ITER will use deuterium mixed with tritium, another hydrogen isotope. The deuterium-tritium blend offers the best chance of getting significantly more power out than is put in. Proponents of fusion power say one reason the technology is safe is that the fuel needs to be constantly fed into the reactor to keep fusion happening, making a runaway reaction impossible.
Deuterium can be extracted from seawater, so there’s a virtually limitless supply of it. But only 20kg of tritium are thought to exist worldwide, so fusion power plants will have to produce it (ITER will develop technology to ‘breed’ tritium). While some radioactive waste will be produced in a fusion plant, it’ll have a lifetime of around 100 years, rather than the thousands of years from fission.
At the time of writing in September, researchers at the Joint European Torus (JET) fusion reactor in Oxfordshire were due to start their deuterium-tritium fusion reactions. “JET will help ITER prepare a choice of machine parameters to optimise the fusion power,” says Dr Joelle Mailloux, one of the scientific programme leaders at JET. These parameters will include finding the best combination of deuterium and tritium, and establishing how the current is increased in the magnets before fusion starts.
The groundwork laid down at JET should accelerate ITER’s efforts to accomplish net energy gain. ITER will produce ‘first plasma’ in December 2025 and be cranked up to full power over the following decade. Its plasma temperature will reach 150,000,000°C and its target is to produce 500 megawatts of fusion power for every 50 megawatts of input heating power.
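Those targets are usually expressed as the fusion gain Q, the ratio of fusion power out to external heating power in; Q greater than 1 means the plasma releases more energy than is pumped into it.

```python
# ITER's target expressed as fusion gain Q = P_fusion / P_heating.

def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Ratio of fusion power produced to external heating power supplied."""
    return p_fusion_mw / p_heating_mw

print(f"Q = {fusion_gain(500, 50):.0f}")  # Q = 10 for ITER's 500 MW from 50 MW target
```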
“If ITER is successful, it’ll eliminate most, if not all, doubts about the science and liberate money for technology development,” says Luce. That technology development will be demonstration fusion power plants that actually produce electricity. “ITER is opening the door and saying, yeah, this works – the science is there.”
US power grid modernization addresses aging infrastructure, climate resilience, extreme weather, EV demand, and clean energy integration, using AI, transmission upgrades, and resilient substations to improve reliability, reduce outages, and enable rapid recovery.
Key Points
US power grid modernization strengthens infrastructure for resilience, reliability, and clean energy under rising demand.
✅ Hardening substations, lines, and transformers against extreme weather
✅ Integrating EV load, DERs, and renewables into transmission and distribution
✅ Using AI, sensors, and automation to cut outages and speed restoration
The power grid in the U.S. is aging and already struggling to meet current demand. It faces a future with more people — people who drive more electric cars and heat homes with more electric furnaces.
Alice Hill says that's not even the biggest problem the country's electricity infrastructure faces.
"Everything that we've built, including the electric grid, assumed a stable climate," she says. "It looked to the extremes of the past — how high the seas got, how high the winds got, the heat."
Hill is an energy and environment expert at the Council on Foreign Relations. She served on the National Security Council staff during the Obama administration, where she led the effort to develop climate resilience. She says past weather extremes can no longer safely guide future electricity planning.
"It's a little like we're building the plane as we're flying because the climate is changing right now, and it's picking up speed as it changes," Hill says.
The newly passed infrastructure package dedicates billions of dollars to updating the energy grid. Hill says utility companies and public planners around the country are already having to adapt. She points to the storm surge of Hurricane Sandy in 2012.
"They thought the maximum would be 12 feet," she says. "That storm surge came in close to 14 feet. It overcame the barriers at the tip of Manhattan, and then the electric grid — a substation blew out. The city that never sleeps [was] plunged into darkness."
Hill noted that Con Edison, the utility company providing New York City with energy, responded with upgrades to its grid: It buried power lines, introduced artificial intelligence, upgraded software to detect failures. But upgrading the way humans assess risk, she says, is harder.
"What happens is that some people tend to think, well, that last storm that we just had, that'll be the worst, right?" Hill says. "No, there is a worse storm ahead. And then, probably, that will be exceeded."
In 2021, the U.S. saw electricity outages for millions of people as a result of historic winter storms in Texas, a heatwave in the Pacific Northwest and Hurricane Ida along the Gulf Coast. Climate change will only make extreme weather more likely and more intense, driving longer, more frequent outages for utilities and customers.
And that has forced utility companies and other entities to grapple with the question: How can we prepare for blackouts we've never experienced before?
A modern power station in Maryland is built for the future
In the town of Edgemere, Md., the Fitzell substation of Baltimore Gas and Electric delivers electricity to homes and businesses. The facility is only a year or so old, and Laura Wright, the director of transmission and substation engineering, says it's been built with the future in mind.
She says the four transformers on site are plenty for now. And to counter the anticipated demand of population growth and a future reliance on electric cars, she says the substation has been designed for an easy upgrade.
"They're not projecting to need that additional capacity for a while, but we designed this station to be able to take that transformer out and put in a larger one," Wright says.
Slopes were designed to insulate the substation from sea level rise. And should the substation experience something like a catastrophic flooding event or deadly tornado, there's a plan for that too.
"If we were to have a failure of a transformer," Wright says, "we can bring one of those mobile transformers into the substation, park it in the substation, connect it up in place of that transformer. And we can do that in two to three days."
The Fitzell substation is a new, modern complex. Older sites can be knocked out for weeks.
That raises the question: Can the amount of money dedicated to the power grid in the new infrastructure legislation actually make meaningful changes to an energy system that suffers more blackouts than those of other developed nations?
"The infrastructure bill, unfortunately, only scratches the surface," says Daniel Cohan, an associate professor in civil and environmental engineering at Rice University.
Though the White House says $65 billion of the infrastructure legislation is dedicated to power infrastructure, a World Resources Institute analysis found that only $27 billion would go to the electric grid.
"If you drill down into how much is there for the power grid, it's only about $27 billion or so, and mainly for research and demonstration projects and some ways to get started," he says.
Cohan, who is also author of the forthcoming book Confronting Climate Gridlock, says federal taxpayer dollars can be significant but that most of the needed investment will eventually come from the private sector — from utility companies and other businesses spending "many hundreds of billions of dollars per decade." He also says the infrastructure package "misses some opportunities" to initiate that private-sector action through mandates.
"It's better than nothing, but, you know, with such momentous challenges that we face, this isn't really up to the magnitude of that challenge," Cohan says.
Cohan argues that thinking big, and not incrementally, can pay off. He believes a complete transition from fossil fuels to clean energy by 2035 is realistic and attainable — a goal the Biden administration holds — and could lead to more than just environmental benefit.
"It also can lead to more affordable electricity, more reliable electricity, a power supply that bounces back more quickly when these extreme events come through," he says. "So we're not just doing it to be green or to protect our air and climate, but we can actually have a much better, more reliable energy supply in the future."
U.S. Power Grid D+ Rating underscores aging infrastructure, rising outages, cyber threats, EMP and solar flare risks, strained transmission lines, vulnerable transformers, and slow permitting, amplifying reliability concerns and resilience needs across national energy systems.
Key Points
ASCE's D+ grade flags aging infrastructure, rising outages, and cyber, EMP, and weather risks needing investment.
✅ Major outages rising; weather remains top disruption driver.
✅ Cybersecurity gaps via smart grid, EV charging, SCADA.
The U.S. power grid just received its “grade card” from the American Society of Civil Engineers (ASCE) and it barely passed.
The overall rating of our antiquated electrical system was a D+. Major power outages in the United States have grown from 76 in 2007 to 307 in 2011, according to the latest available statistics. The major outage figures do not take into account all of the smaller outages which routinely occur due to seasonal storms.
The American Society of Civil Engineers power grid grade card rating means the energy infrastructure is in “poor to fair condition and mostly below standard, with many elements approaching the end of their service life.” It further means a “large portion of the system exhibits significant deterioration” with a “strong risk of failure.”
Such a designation is not reassuring and validates those who purchased solar generators over the past several years.
The vulnerable state of the power grid gets very little play by mainstream media outlets. Concerns about a solar flare or an electromagnetic pulse (EMP) attack instantly sending us back to an 1800s existence are legitimate, but it may not take such an extreme act to render the power grid a useless tangle of wires.
The majority of the United States’ infrastructure and public systems evaluated by the ASCE earned a “D” rating. A “C” ranking (public parks, rail and bridges) was the highest grade earned. It would take a total of $3.6 trillion in investments by 2020 to fix everything, the report card stated. To put that number in perspective, the federal government’s budget for all of 2012 was slightly more, $3.7 trillion.
“America relies on an aging electrical grid and pipeline distribution systems, some of which originated in the 1880s,” the report read. “Investment in power transmission has increased since 2005, but ongoing permitting issues, weather events, and limited maintenance have contributed to an increasing number of failures and power interruptions. While demand for electricity has remained level, the availability of energy in the form of electricity, natural gas, and oil will become a greater challenge after 2020 as the population increases. Although about 17,000 miles of additional high-voltage transmission lines and significant oil and gas pipelines are planned over the next five years, permitting and siting issues threaten their completion. The electric grid in the United States consists of a system of interconnected power generation, transmission facilities, and distribution facilities.”
There are approximately 400,000 miles of electrical transmission lines throughout the United States, and thousands of power generating plants dot the landscape. The ASCE report card also stated that new gas-fired and renewable generation issues increase the need to add new transmission lines. Antiquated power grid equipment has reportedly prompted even more “intermittent” power outages in recent years.
The American Society of Civil Engineers accurately notes that the power grid is more vulnerable to cyber attacks than ever before, and it cites the aging electrical system as the primary culprit. Although the decades-old transformers and other equipment necessary to keep power flowing around America are a major factor in the enhanced vulnerability of the power grid, moving towards a “smart grid” system is not the answer. As previously reported by Off The Grid News, smart grid systems and even electric car charging stations make the power grid more accessible to cyber hackers. During the Hack in the Box Conference in Amsterdam, HP ArcSight Product Manager Ofer Sheaf stated that electric car charging stations are in essence a computer on the street. The roadway fueling stations are linked to the power grid electrical system. If cyber hackers garner access to the power grid via the charging stations, they could stop the flow of power to a specific area or alter energy distribution levels and overload the system.
While a relatively small number of electric car charging stations exist in America now, that soon will change. Ongoing efforts by both federal and state governments to reduce our reliance on fossil fuels have resulted in grants and privately funded vehicle charging station projects. New York Governor Andrew Cuomo in April announced plans to build 360 such electrical stations in his state. A total of 3,000 car charging stations are in the works statewide and are slated for completion over the next five years.
SHIELD Act

Weather-related events were the primary cause of power outages from 2007 to 2012, according to the infrastructure report card. Power grid reliability issues are emerging as the greatest threat to the electrical system. The ASCE grade card also notes that retiring and rotating in “new energy sources” is a “complex” process. Like most items we routinely purchase in our daily lives, many of the components needed to make the power grid functional are not manufactured in the United States.
The SHIELD Act is the first real piece of federal legislation in years drafted to address power grid vulnerabilities. While the single bill will not fix all of the electrical system issues, it is a big step in the right direction – if it ever makes it out of committee. Replacing aging transformers, encasing them in a high-tech version of a Faraday cage, and stockpiling extra units so instant repairs are possible would help preserve one of the nation’s most critical and life-saving pieces of infrastructure after a weather-related incident or man-made disaster.
“Geomagnetic storm environments can develop instantaneously over large geographic footprints,” solar geomagnetic researcher John Kappenman said about the fragile state of the power grid. He was quoted in an Oak Ridge National Laboratory report. “They have the ability to essentially blanket the continent with an intense threat environment and … produce significant collateral damage to critical infrastructures. In contrast to well-conceived design standards that have been successfully applied for more conventional threats, no comprehensive design criteria have ever been considered to check the impact of the geomagnetic storm environments. The design actions that have occurred over many decades have greatly escalated the dangers posed by these storm threats for this critical infrastructure.”
The power grid has grown tenfold in size during the past 50 years. While solar flares, cyber attacks, and an EMP are perhaps the most extensive and frightening threats to the electrical system, large portions of the infrastructure could just as easily fail due to weather-related events. The power grid is basically a ticking time bomb: if it goes down, it will spawn civil unrest, food and clean-water shortages, and a multitude of fires.
Space solar power promises wireless energy from orbital solar satellites via microwave or laser power beaming, using photovoltaics and rectennas. NRL and AFRL advances hint at 24-7 renewable power delivery to Earth and airborne drones.
Key Points
Space solar power beams orbital solar energy to Earth via microwaves or lasers, enabling continuous wireless electricity.
✅ Harvests sunlight in orbit and transmits via microwaves or lasers
✅ Provides 24-7 renewable power, independent of weather or night
✅ Enables wireless power for remote sites, grids, and drones
Earlier this year, a small group of spectators gathered in David Taylor Model Basin, the Navy’s cavernous indoor wave pool in Maryland, to watch something they couldn’t see. At each end of the facility there was a 13-foot pole with a small cube perched on top. A powerful infrared laser beam shot out of one of the cubes, striking an array of photovoltaic cells inside the opposite cube. To the naked eye, however, it looked like a whole lot of nothing. The only evidence that anything was happening came from a small coffee maker nearby, which was churning out “laser lattes” using only the power generated by the system.
The laser setup managed to transmit 400 watts of power—enough for several small household appliances—through hundreds of meters of air without moving any mass. The Naval Research Lab, which ran the project, hopes to use the system to send power to drones during flight. But NRL electronics engineer Paul Jaffe has his sights set on an even more ambitious problem: beaming solar power to Earth from space. For decades the idea had been reserved for The Future, but a series of technological breakthroughs and a massive new government research program suggest that faraway day may have finally arrived.
Since the idea for space solar power first cropped up in Isaac Asimov’s science fiction in the early 1940s, scientists and engineers have floated dozens of proposals to bring the concept to life, including inflatable solar arrays and robotic self-assembly. But the basic idea is always the same: A giant satellite in orbit harvests energy from the sun and converts it to microwaves or lasers for transmission to Earth, where it is converted into electricity. The sun never sets in space, so a space solar power system could supply renewable power to anywhere on the planet, day or night, rain or shine.
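The reason those satellites have to be “giant” follows from simple beam physics: the microwave spot on the ground spreads with distance in proportion to the wavelength and in inverse proportion to the transmitter’s aperture, and every conversion step in the chain loses energy. A rough back-of-envelope sketch of both effects, with every number below an assumption chosen for illustration rather than a figure from any actual design:

```python
import math

def beam_spot_diameter(wavelength_m, distance_m, tx_aperture_m):
    """Approximate diffraction-limited main-lobe diameter at the
    receiver: about 2 * wavelength * distance / transmitter aperture."""
    return 2.0 * wavelength_m * distance_m / tx_aperture_m

# Assumed parameters: a 5.8 GHz microwave link (a band often discussed
# in space solar studies) from geostationary altitude, with a 1 km
# transmitting array in orbit.
c = 299_792_458.0            # speed of light, m/s
freq_hz = 5.8e9              # assumed transmit frequency
wavelength = c / freq_hz     # ~5.2 cm
distance = 35_786_000.0      # GEO altitude, m
tx_aperture = 1_000.0        # assumed transmitter diameter, m

spot = beam_spot_diameter(wavelength, distance, tx_aperture)
print(f"ground spot diameter: {spot / 1000:.1f} km")  # a few km across

# End-to-end efficiency chain (all values assumed for illustration):
solar_to_dc = 0.30    # photovoltaic conversion in orbit
dc_to_rf   = 0.80     # DC-to-microwave conversion
capture    = 0.90     # fraction of the beam landing on the rectenna
rf_to_dc   = 0.80     # rectenna microwave-to-DC conversion
overall = solar_to_dc * dc_to_rf * capture * rf_to_dc
print(f"overall sunlight-to-grid efficiency: {overall:.1%}")
```

Even with these generous assumptions, the ground rectenna ends up kilometers wide and only a modest fraction of the collected sunlight reaches the grid, which is why cost and scale have dogged every proposal since the 1970s.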
Like fusion energy, space-based solar power seemed doomed to become a technology that was always 30 years away. Technical problems kept cropping up, cost estimates remained stratospheric, and as solar cells became cheaper and more efficient, the case for space-based solar seemed to be shrinking.
That didn’t stop government research agencies from trying. In 1975, after partnering with the Department of Energy on a series of space solar power feasibility studies, NASA beamed 30 kilowatts of power over a mile using a giant microwave dish. Beamed energy is a crucial aspect of space solar power, but this test remains the most powerful demonstration of the technology to date. “The fact that it’s been almost 45 years since NASA’s demonstration, and it remains the high-water mark, speaks for itself,” Jaffe says. “Space solar wasn’t a national imperative, and so a lot of this technology didn’t meaningfully progress.”
John Mankins, a former physicist at NASA and director of Solar Space Technologies, witnessed how government bureaucracy killed space solar power development firsthand. In the late 1990s, Mankins authored a report for NASA that concluded it was again time to take space solar power seriously and led a project to do design studies on a satellite system. Despite some promising results, the agency ended up abandoning it.
In 2005, Mankins left NASA to work as a consultant, but he couldn’t shake the idea of space solar power. He did some modest space solar power experiments himself and even got a grant from NASA’s Innovative Advanced Concepts program in 2011. The result was SPS-ALPHA, which Mankins called “the first practical solar power satellite.” The idea, says Mankins, was “to build a large solar-powered satellite out of thousands of small pieces.” His modular design brought the cost of hardware down significantly, at least in principle.
Jaffe, who was just starting to work on hardware for space solar power at the Naval Research Lab, got excited about Mankins’ concept. At the time he was developing a “sandwich module” consisting of a small solar panel on one side and a microwave transmitter on the other. His electronic sandwich demonstrated all the elements of an actual space solar power system and, perhaps most important, it was modular. It could work beautifully with something like Mankins' concept, he figured. All they were missing was the financial support to bring the idea from the laboratory into space.
Jaffe invited Mankins to join a small team of researchers entering a Defense Department competition, in which they were planning to pitch a space solar power concept based on SPS-ALPHA. In 2016, the team presented the idea to top Defense officials and ended up winning four out of the seven award categories. Both Jaffe and Mankins described it as a crucial moment for reviving the US government’s interest in space solar power.
They might be right. In October, the Air Force Research Lab announced a $100 million program to develop hardware for a solar power satellite. It’s an important first step toward the first demonstration of space solar power in orbit, and Mankins says it could help solve what he sees as space solar power’s biggest problem: public perception. The technology has always seemed like a pie-in-the-sky idea, and the cost of setting up a solar array on Earth is plummeting; but space solar power has unique benefits, chief among them the availability of solar energy around the clock regardless of the weather or time of day.
It can also provide renewable energy to remote locations, such as forward operating bases for the military. And at a time when wildfires have forced the utility PG&E to kill power for thousands of California residents on multiple occasions, having a way to provide renewable energy through the clouds and smoke doesn’t seem like such a bad idea. (Ironically enough, PG&E entered a first-of-its-kind agreement to buy space solar power from a company called Solaren back in 2009; the system was supposed to start operating in 2016 but never came to fruition.)
“If space solar power does work, it is hard to overstate what the geopolitical implications would be,” Jaffe says. “With GPS, we sort of take it for granted that no matter where we are on this planet, we can get precise navigation information. If the same thing could be done for energy, it would be revolutionary.”
Indeed, there seems to be an emerging race to become the first to harness this technology. Earlier this year China announced its intention to become the first country to build a solar power station in space, and for more than a decade Japan has considered the development of a space solar power station to be a national priority. Now that the US military has joined in with a $100 million hardware development program, it may only be a matter of time before there’s a solar farm in the solar system.