The watershed summit in 2015 was far from perfect, but its impact so far has been significant and measurable
Ten years on from the historic Paris climate summit, which ended with the world’s first and only global agreement to curb greenhouse gas emissions, it is easy to dwell on its failures. But the successes go less remarked.
There’s much more to do, but we should be encouraged by the progress we have made
Today marks the 10th anniversary of the Paris climate treaty, one of the landmark days in climate-action history. Attending the conference as a journalist, I watched and listened and wondered whether 194 countries could ever agree on anything at all, and the night before they did, people who I thought were more sophisticated than me assured me they couldn’t. Then they did. There are a lot of ways to tell the story of what it means and where we are now, but any version of it needs respect for the complexities, because there are a lot of latitudes between the poles of total victory and total defeat.
I had been dreading the treaty anniversary as an occasion to note that we have not done nearly enough, but in July I thought we might be able to celebrate it. Because, on 23 July, the international court of justice handed down an epochal ruling that gives that treaty enforceable consequences it never had before. It declares that all nations have a legal obligation to act in response to the climate crisis, and, as Greenpeace International put it, “obligates states to regulate businesses on the harm caused by their emissions regardless of where the harm takes place. Significantly, the court found that the right to a clean, healthy and sustainable environment is fundamental for all other human rights, and that intergenerational equity should guide the interpretation of all climate obligations.” The Paris treaty was cited repeatedly as groundwork for this decision.
American officials joined Russia, Saudi Arabia and Iran in objecting to language on fossil fuels, biodiversity and plastics in a report that was three years in the making.
An official report lays out different scenarios for the cost of transitioning away from fossil fuels to net zero by 2050
Britain’s official energy system operator has attempted to work out what achieving net zero carbon emissions will cost, with its figures showing surging spending in the coming years.
The scale and speed of the shift to a low-carbon economy, and how to fund it, are hotly debated by political parties.
Opponents of a controversial proposal to build a battery energy storage system (BESS) in rural Greater Napanee are celebrating after the town heard their pleas.
Solar geoengineering aims to manipulate the climate by bouncing sunlight back into space. In theory, it could ease global warming. But as interest in the idea grows, so do concerns about potential consequences.
A startup called Stardust Solutions recently raised a $60 million funding round, the largest known to date for a geoengineering startup. My colleague James Temple has a new story out about the company, and how its emergence is making some researchers nervous.
So far, the field has been limited to debates, proposed academic research, and—sure—a few fringe actors to keep an eye on. Now things are getting more serious. What does it mean for geoengineering, and for the climate?
Researchers have considered the possibility of addressing planetary warming this way for decades. We already know that volcanic eruptions, which spew sulfur dioxide into the atmosphere, can reduce temperatures. The thought is that we could mimic that natural process by spraying particles up there ourselves.
The prospect is a controversial one, to put it lightly. Many have concerns about unintended consequences and uneven benefits. Even public research led by top institutions has faced barriers—one famous Harvard research program was officially canceled last year after years of debate.
One of the difficulties of geoengineering is that in theory a single entity, like a startup company, could make decisions that have a widespread effect on the planet. And in the last few years, we’ve seen more interest in geoengineering from the private sector.
Three years ago, James broke the story that Make Sunsets, a California-based company, was already releasing particles into the atmosphere in an effort to tweak the climate.
The company’s CEO Luke Iseman went to Baja California in Mexico, stuck some sulfur dioxide into a weather balloon, and sent it skyward. The amount of material was tiny, and it’s not clear that it even made it into the right part of the atmosphere to reflect any sunlight.
You can still buy cooling credits from Make Sunsets, and the company was just granted a patent for its system. But the startup is seen as something of a fringe actor.
Enter Stardust Solutions. The company has been working under the radar for a few years, but it has started talking about its work more publicly this year. In October, it announced a significant funding round, led by some top names in climate investing. “Stardust is serious, and now it’s raised serious money from serious people,” as James puts it in his new story.
That’s making some experts nervous. Even those who believe we should be researching geoengineering are concerned about what it means for private companies to do so.
“Adding business interests, profit motives, and rich investors into this situation just creates more cause for concern, complicating the ability of responsible scientists and engineers to carry out the work needed to advance our understanding,” write David Keith and Daniele Visioni, two leading figures in geoengineering research, in a recent opinion piece for MIT Technology Review.
Stardust insists that it won’t move forward with any geoengineering until and unless it’s commissioned to do so by governments and there are rules and bodies in place to govern use of the technology.
But there’s no telling how financial pressure might change that, down the road. And we’re already seeing some of the challenges faced by a private company in this space: the need to keep trade secrets.
Stardust is currently not sharing information about the particles it intends to release into the sky, though it says it plans to do so once it secures a patent, which could happen as soon as next year. The company argues that its proprietary particles will be safe, cheap to manufacture, and easier to track than the already abundant sulfur dioxide. But at this point, there’s no way for external experts to evaluate those claims.
As Keith and Visioni put it: “Research won’t be useful unless it’s trusted, and trust depends on transparency.”
Stardust Solutions believes that it can solve climate change—for a price.
The Israel-based geoengineering startup has said it expects nations will soon pay it more than a billion dollars a year to launch specially equipped aircraft into the stratosphere. Once they’ve reached the necessary altitude, those planes will disperse particles engineered to reflect away enough sunlight to cool down the planet, purportedly without causing environmental side effects.
The proprietary (and still secret) particles could counteract all the greenhouse gases the world has emitted over the last 150 years, the company stated in a 2023 pitch deck it presented to venture capital firms. In fact, it’s the “only technologically feasible solution” to climate change, the company said.
The company disclosed it raised $60 million in funding in October, marking by far the largest known funding round to date for a startup working on solar geoengineering.
Stardust is, in a sense, the embodiment of Silicon Valley’s simmering frustration with the pace of academic research on the technology. It’s a multimillion-dollar bet that a startup mindset can advance research and development that has crept along amid scientific caution and public queasiness.
But numerous researchers focused on solar geoengineering are deeply skeptical that Stardust will line up the government customers it would need to carry out a global deployment as early as 2035, the plan described in its earlier investor materials—and aghast at the suggestion that it ever expected to move that fast. They’re also highly critical of the idea that a company would take on the high-stakes task of setting the global temperature, rather than leaving it to publicly funded research programs.
“They’ve ignored every recommendation from everyone and think they can turn a profit in this field,” says Douglas MacMartin, an associate professor at Cornell University who studies solar geoengineering. “I think it’s going to backfire. Their investors are going to be dumping their money down the drain, and it will set back the field.”
The company has finally emerged from stealth mode after completing its funding round, and its CEO, Yanai Yedvab, agreed to give one of the company’s first extensive interviews to MIT Technology Review for this story.
Yedvab walked back those ambitious projections a little, stressing that the actual timing of any stratospheric experiments, demonstrations, or deployments will be determined by when governments decide it’s appropriate to carry them out. Stardust has stated clearly that it will move ahead with solar geoengineering only if nations pay it to proceed, and only once there are established rules and bodies guiding the use of the technology.
That decision, he says, will likely be dictated by how bad climate change becomes in the coming years.
“It could be a situation where we are at the place we are now, which is definitely not great,” he says. “But it could be much worse. We’re saying we’d better be ready.”
“It’s not for us to decide, and I’ll say humbly, it’s not for these researchers to decide,” he adds. “It’s the sense of urgency that will dictate how this will evolve.”
The building blocks
No one is questioning the scientific credentials of Stardust. The company was founded in 2023 by a trio of prominent researchers, including Yedvab, who served as deputy chief scientist at the Israeli Atomic Energy Commission. The company’s lead scientist, Eli Waxman, is the head of the department of particle physics and astrophysics at the Weizmann Institute of Science. Amyad Spector, the chief product officer, was previously a nuclear physicist at Israel’s secretive Negev Nuclear Research Center.
Stardust CEO Yanai Yedvab (right) and Chief Product Officer Amyad Spector (left) at the company’s facility in Israel.
ROBY YAHAV, STARDUST
Stardust says it employs 25 scientists, engineers, and academics. The company is based in Ness Ziona, Israel, and plans to open a US headquarters soon.
Yedvab says the motivation for starting Stardust was simply to help develop an effective means of addressing climate change.
“Maybe something in our experience, in the tool set that we bring, can help us in contributing to solving one of the greatest problems humanity faces,” he says.
Lowercarbon Capital, the climate-tech-focused investment firm cofounded by the prominent tech investor Chris Sacca, led the $60 million investment round. Future Positive, Future Ventures, and Never Lift Ventures, among others, participated as well.
Yedvab says the company will use that money to advance research, development, and testing for the three components of its system, which are also described in the pitch deck: safe particles that could be affordably manufactured; aircraft dispersion systems; and a means of tracking particles and monitoring their effects.
“Essentially, the idea is to develop all these building blocks and to upgrade them to a level that will allow us to give governments the tool set and all the required information to make decisions about whether and how to deploy this solution,” he says.
The company is, in many ways, the opposite of Make Sunsets, the first company that came along offering to send particles into the stratosphere—for a fee—by pumping sulfur dioxide into weather balloons and hand-releasing them into the sky. Many researchers viewed it as a provocative, unscientific, and irresponsible exercise in attention-gathering.
But Stardust is serious, and now it’s raised serious money from serious people—all of which raises the stakes for the solar geoengineering field and, some fear, increases the odds that the world will eventually put the technology to use.
“That marks a turning point in that these types of actors are not only possible, but are real,” says Shuchi Talati, executive director of the Alliance for Just Deliberation on Solar Geoengineering, a nonprofit that strives to ensure that developing nations are included in the global debate over such climate interventions. “We’re in a more dangerous era now.”
Many scientists studying solar geoengineering argue strongly that universities, governments, and transparent nonprofits should lead the work in the field, given the potential dangers and deep public concerns surrounding a tool with the power to alter the climate of the planet.
It’s essential to carry out the research with appropriate oversight, explore the potential downsides of these approaches, and publicly publish the results “to ensure there’s no bias in the findings and no ulterior motives in pushing one way or another on deployment or not,” MacMartin says. “[It] shouldn’t be foisted upon people without proper and adequate information.”
He criticized, for instance, the company’s claims to have developed what he described as their “magic aerosol particle,” arguing that the assertion that it is perfectly safe and inert can’t be trusted without published findings. Other scientists have also disputed those scientific claims.
Plenty of other academics say solar geoengineering shouldn’t be studied at all, fearing that merely investigating it starts the world down a slippery slope toward its use and diminishes the pressures to cut greenhouse-gas emissions. In 2022, hundreds of them signed an open letter calling for a global ban on the development and use of the technology, adding the concern that there is no conceivable way for the world’s nations to pull together to establish rules or make collective decisions ensuring that it would be used in “a fair, inclusive, and effective manner.”
“Solar geoengineering is not necessary,” the authors wrote. “Neither is it desirable, ethical, or politically governable in the current context.”
The for-profit decision
Stardust says it’s important to pursue the possibility of solar geoengineering because the dangers of climate change are accelerating faster than the world’s ability to respond to it, requiring a new “class of solution … that buys us time and protects us from overheating.”
Yedvab says he and his colleagues thought hard about the right structure for the organization, finally deciding that for-profits working in parallel with academic researchers have delivered “most of the groundbreaking technologies” in recent decades. He cited advances in genome sequencing, space exploration, and drug development, as well as the restoration of the ozone layer.
He added that a for-profit structure was also required to raise funds and attract the necessary talent.
“There is no way we could, unfortunately, raise even a small portion of this amount by philanthropic resources or grants these days,” he says.
He adds that while academics have conducted lots of basic science in solar geoengineering, they’ve done very little in terms of building the technological capacities. Their geoengineering research is also primarily focused on the potential use of sulfur dioxide, because it is known to help reduce global temperatures after volcanic eruptions blast massive amounts of it into the stratosphere. But it has well-documented downsides as well, including harm to the protective ozone layer.
“It seems natural that we need better options, and this is why we started Stardust: to develop this safe, practical, and responsible solution,” the company said in a follow-up email. “Eventually, policymakers will need to evaluate and compare these options, and we’re confident that our option will be superior over sulfuric acid primarily in terms of safety and practicability.”
Public trust can be won not by excluding private companies, but by setting up regulations and organizations to oversee this space, much as the US Food and Drug Administration does for pharmaceuticals, Yedvab says.
“There is no way this field could move forward if you don’t have this governance framework, if you don’t have external validation, if you don’t have clear regulation,” he says.
Meanwhile, the company says it intends to operate transparently, pledging to publish its findings whether they’re favorable or not.
That will include finally revealing details about the particles it has developed, Yedvab says.
Early next year, the company and its collaborators will begin publishing data or evidence “substantiating all the claims and disclosing all the information,” he says, “so that everyone in the scientific community can actually check whether we checked all these boxes.”
In the follow-up email, the company acknowledged that solar geoengineering isn’t a “silver bullet” but said it is “the only tool that will enable us to cool the planet in the short term, as part of a larger arsenal of technologies.”
“The only way governments could be in a position to consider [solar geoengineering] is if the work has been done to research, de-risk, and engineer safe and responsible solutions—which is what we see as our role,” the company added later. “We are hopeful that research will continue not just from us, but also from academic institutions, nonprofits, and other responsible companies that may emerge in the future.”
Ambitious projections
Stardust’s earlier pitch deck stated that the company expected to conduct its first “stratospheric aerial experiments” last year, though those did not move ahead (more on that in a moment).
On another slide, the company said it expected to carry out a “large-scale demonstration” around 2030 and proceed to a “global full-scale deployment” by about 2035. It said it expected to bring in roughly $200 million and $1.5 billion in annual revenue by those periods, respectively.
Every researcher interviewed for this story was adamant that such a deployment should not happen so quickly.
Given the global but uneven and unpredictable impacts of solar geoengineering, any decision to use the technology should be reached through an inclusive, global agreement, not through the unilateral decisions of individual nations, Talati argues.
“We won’t have any sort of international agreement by that point given where we are right now,” she says.
A global agreement, to be clear, is a big step beyond setting up rules and oversight bodies—and some believe that such an agreement on a technology so divisive could never be achieved.
There’s also still a vast amount of research that must be done to better understand the negative side effects of solar geoengineering generally and any ecological impacts of Stardust’s materials specifically, adds Holly Buck, an associate professor at the University at Buffalo and author of After Geoengineering.
“It is irresponsible to talk about deploying stratospheric aerosol injection without fundamental research about its impacts,” Buck wrote in an email.
She says the timelines are also “unrealistic” because there are profound public concerns about the technology. Her polling work found that a significant fraction of the US public opposes even research (though polling varies widely).
Meanwhile, most academic efforts to move ahead with even small-scale outdoor experiments have sparked fierce backlash. That includes the years-long effort by researchers then at Harvard to carry out a basic equipment test for their so-called SCoPEx experiment. The high-altitude balloon would have launched from a flight center in Sweden, but the test was ultimately scratched amid objections from environmentalists and Indigenous groups.
Given this baseline of public distrust, Stardust’s for-profit proposals only threaten to further inflame public fears, Buck says.
“I find the whole proposal incredibly socially naive,” she says. “We actually could use serious research in this field, but proposals like this diminish the chances of that happening.”
Those public fears, which cross the political divide, also mean politicians will see little to no political upside to paying Stardust to move ahead, MacMartin says.
“If you don’t have the constituency for research, it seems implausible to me that you’d turn around and give money to an Israeli company to deploy it,” he says.
An added risk is that if one nation or a small coalition forges ahead without broader agreement, it could provoke geopolitical conflicts.
“What if Russia wants it a couple of degrees warmer, and India a couple of degrees cooler?” asked Alan Robock, a professor at Rutgers University, in the Bulletin of the Atomic Scientists in 2008. “Should global climate be reset to preindustrial temperature or kept constant at today’s reading? Would it be possible to tailor the climate of each region of the planet independently without affecting the others? If we proceed with geoengineering, will we provoke future climate wars?”
Revised plans
Yedvab says the pitch deck reflected Stardust’s strategy at a “very early stage in our work,” adding that their thinking has “evolved,” partly in response to consultations with experts in the field.
He says that the company will have the technological capacity to move ahead with demonstrations and deployments on the timelines it laid out but adds, “That’s a necessary but not sufficient condition.”
“Governments will need to decide where they want to take it, if at all,” he says. “It could be a case that they will say ‘We want to move forward.’ It could be a case that they will say ‘We want to wait a few years.’”
“It’s for them to make these decisions,” he says.
Yedvab acknowledges that the company has conducted flights in the lower atmosphere to test its monitoring system, using white smoke as a simulant for its particles, as the Wall Street Journal reported last year. It’s also done indoor tests of the dispersion system and its particles in a wind tunnel set up within its facility.
But in response to criticisms like the ones above, Yedvab says the company hasn’t conducted outdoor particle experiments and won’t move forward with them until it has approval from governments.
“Eventually, there will be a need to conduct outdoor testing,” he says. “There is no way you can validate any solution without outdoor testing.” But such testing of sunlight reflection technology, he says, “should be done only working together with government and under these supervisions.”
Generating returns
Stardust may be willing to wait for governments to be ready to deploy its system, but there’s no guarantee that its investors will have the same patience. In accepting tens of millions in venture capital, Stardust may now face financial pressures that could “drive the timelines,” says Gernot Wagner, a climate economist at Columbia University.
And that raises a different set of concerns.
Obliged to deliver returns, the company might feel it must strive to convince government leaders that they should pay for its services, Talati says.
“The whole point of having companies and investors is you want your thing to be used,” she says. “There’s a massive incentive to lobby countries to use it, and that’s the whole danger of having for-profit companies here.”
She argues those financial incentives threaten to accelerate the use of solar geoengineering ahead of broader international agreements and elevate business interests above the broader public good.
Stardust has “quietly begun lobbying on Capitol Hill” and has hired the law firm Holland & Knight, according to Politico.
It has also worked with Red Duke Strategies, a consulting firm based in McLean, Virginia, to develop “strategic relationships and communications that promote understanding and enable scientific testing,” according to a case study on the company’s website.
“The company needed to secure both buy-in and support from the United States government and other influential stakeholders to move forward,” Red Duke states. “This effort demanded a well-connected and authoritative partner who could introduce Stardust to a group of experts able to research, validate, deploy, and regulate its SRM technology.”
Red Duke didn’t respond to an inquiry from MIT Technology Review. Stardust says its work with the consulting firm was not a government lobbying effort.
Yedvab acknowledges that the company is meeting with government leaders in the US, Europe, its own region, and the Global South. But he stresses that it’s not asking any country to contribute funding or to sign off on deployments at this stage. Instead, it’s making the case for nations to begin crafting policies to regulate solar geoengineering.
“When we speak to policymakers—and we speak to policymakers; we don’t hide it—essentially, what we tell them is ‘Listen, there is a solution,’” he says. “‘It’s not decades away—it’s a few years away. And it’s your role as policymakers to set the rules of this field.’”
“Any solution needs checks and balances,” he says. “This is how we see the checks and balances.”
He says the best-case scenario is still a rollout of clean energy technologies that accelerates rapidly enough to drive down emissions and curb climate change.
“We are perfectly fine with building an option that will sit on the shelf,” he says. “We’ll go and do something else. We have a great team and are confident that we can find also other problems to work with.”
He says the company’s investors are aware of and comfortable with that possibility, supportive of the principles that will guide Stardust’s work, and willing to wait for regulations and government contracts.
Lowercarbon Capital didn’t respond to an inquiry from MIT Technology Review.
‘Sentiment of hope’
Others have certainly imagined the alternative scenario Yedvab raises: that nations will increasingly support the idea of geoengineering in the face of mounting climate catastrophes.
In Kim Stanley Robinson’s 2020 novel, The Ministry for the Future, India unilaterally forges ahead with solar geoengineering following a heat wave that kills millions of people.
Wagner sketched a variation on that scenario in his 2021 book, Geoengineering: The Gamble, speculating that a small coalition of nations might kick-start a rapid research and deployment program as an emergency response to escalating humanitarian crises. In his version, the Philippines offers to serve as the launch site after a series of super-cyclones batter the island nation, forcing millions from their homes.
It’s impossible to know today how the world will react if one nation or a few go it alone, or whether nations could come to agreement on where the global temperature should be set.
But the lure of solar geoengineering could become increasingly enticing as more and more nations endure mass suffering, starvation, displacement, and death.
“We understand that probably it will not be perfect,” Yedvab says. “We understand all the obstacles, but there is this sentiment of hope, or cautious hope, that we have a way out of this dark corridor we are currently in.”
“I think that this sentiment of hope is something that gives us a lot of energy to move on forward,” he adds.
In the Ozarks, the growing college town of Fayetteville, Ark., is using clean energy to power city facilities and embracing nature-based solutions to climate threats.
President Trump, who calls climate change a “hoax,” is eliminating restrictions on coal, oil and gas while imposing new ones on renewable energy like wind and solar.
Green light intended to limit amount consumers pay for windfarms to turn off during periods of high generation
Three major UK electricity “superhighways” could move ahead sooner than expected to help limit the amount that households pay for windfarms to turn off during periods of high power generation.
Current grid bottlenecks mean there is not enough capacity to transport the abundance of electricity generated in periods of strong winds to areas where energy demand is highest.
An early grid battery was installed in the Atacama Desert in Chile 15 years ago. Now, as prices have tumbled, grid batteries are increasingly being used around the world.
Employees working on battery units at the solar project, which is owned by AES, a Virginia company that holds utilities and power plants across the world.
A number of concerned citizens spoke at the Town of Greater Napanee’s special meeting at the Best and Bash Arena on Tuesday night in response to a proposed battery energy storage system (BESS) by CarbonFree.
Sometimes geothermal hot spots are obvious, marked by geysers and hot springs on the planet’s surface. But in other places, they’re obscured thousands of feet underground. Now AI could help uncover these hidden pockets of potential power.
A startup company called Zanskar announced today that it’s used AI and other advanced computational methods to uncover a blind geothermal system—meaning there aren’t signs of it on the surface—in the western Nevada desert. The company says it’s the first blind system that’s been identified and confirmed to be a commercial prospect in over 30 years.
Historically, finding new sites for geothermal power was a matter of brute force. Companies spent a lot of time and money drilling deep wells, looking for places where it made sense to build a plant.
Zanskar’s approach is more precise. With advancements in AI, the company aims to “solve this problem that had been unsolvable for decades, and go and finally find those resources and prove that they’re way bigger than previously thought,” says Carl Hoiland, the company’s cofounder and CEO.
To support a successful geothermal power plant, a site needs high temperatures at an accessible depth and space for fluid to move through the rock and deliver heat. In the case of the new site, which the company calls Big Blind, the prize is a reservoir that reaches 250 °F at about 2,700 feet below the surface.
As electricity demand rises around the world, geothermal systems like this one could provide a source of constant power without emitting the greenhouse gases that cause climate change.
The company has used its technology to identify many potential hot spots. “We have dozens of sites that look just like this,” says Joel Edwards, Zanskar’s cofounder and CTO. But for Big Blind, the team has done the fieldwork to confirm its model’s predictions.
The first step to identifying a new site is to use regional AI models to search large areas. The team trains models on known hot spots and on simulations it creates. Then it feeds in geological, satellite, and other types of data, including information about fault lines. The models can then predict where potential hot spots might be.
One strength of using AI for this task is that it can handle the immense complexity of the information at hand. “If there’s something learnable in the earth, even if it’s a very complex phenomenon that’s hard for us humans to understand, neural nets are capable of learning that, if given enough data,” Hoiland says.
Once models identify a potential hot spot, a field crew heads to the site, which might be roughly 100 square miles or so, and collects additional information through techniques that include drilling shallow holes to look for elevated underground temperatures.
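Zanskar hasn’t published its model internals, so the specifics below are illustrative only. But the workflow the founders describe—train on known hot spots plus simulated examples, feed in geological and satellite-derived features, then rank candidate locations for field crews—maps onto a standard supervised-learning loop. A minimal sketch of that shape, with invented feature names and synthetic data standing in for real geophysical layers:

```python
# Illustrative sketch only: Zanskar's actual models, features, and training
# data are proprietary. This shows the general pipeline the article
# describes -- learn from labeled (and simulated) hot spots, then score
# unexplored locations. All numbers here are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000  # synthetic training examples (known sites plus simulations)

# Hypothetical per-location features: regional heat flow, distance to the
# nearest mapped fault, a crustal-thinning index, and a satellite-derived
# surface-temperature anomaly.
X = np.column_stack([
    rng.normal(80, 25, n),   # heat_flow (mW/m^2)
    rng.exponential(5, n),   # fault_distance (km)
    rng.uniform(0, 1, n),    # thinning_index
    rng.normal(0, 0.5, n),   # surface_temp_anomaly (K)
])

# Toy labeling rule standing in for real ground truth: hot spots tend to
# combine high heat flow with nearby faults (pathways for fluid to move).
score = 0.04 * (X[:, 0] - 80) - 0.3 * X[:, 1] + 1.5 * X[:, 2]
y = (score + rng.normal(0, 0.5, n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# Score a batch of unexplored locations; the top-ranked cells are where a
# field crew would go next to drill shallow temperature-gradient holes.
candidates = np.column_stack([
    rng.normal(80, 25, 500),
    rng.exponential(5, 500),
    rng.uniform(0, 1, 500),
    rng.normal(0, 0.5, 500),
])
probs = model.predict_proba(candidates)[:, 1]
print("most promising candidate cells:", np.argsort(probs)[-5:][::-1])
```

The real system presumably works on gridded geophysical data at far larger scale; the point is only the division of labor, with the model narrowing vast areas down to a short list that fieldwork can affordably confirm.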
In the case of Big Blind, this prospecting information gave the company enough confidence to purchase a federal lease, allowing it to develop a geothermal plant. With that lease secured, the team returned with large drill rigs and drilled thousands of feet down in July and August. The workers found the hot, permeable rock they expected.
Next they must secure permits to build and connect to the grid and line up the investments needed to build the plant. The team will also continue testing at the site, including long-term testing to track heat and water flow.
“There’s a tremendous need for methodology that can look for large-scale features,” says John McLennan, technical lead for resource management at Utah FORGE, a national lab field site for geothermal energy funded by the US Department of Energy. The new discovery is “promising,” McLennan adds.
Big Blind is Zanskar’s first confirmed discovery that wasn’t previously explored or developed, but the company has used its tools for other geothermal exploration projects. Earlier this year, it announced a discovery at a site that had previously been explored by the industry but not developed. The company also purchased and revived a geothermal power plant in New Mexico.
And this could be just the beginning for Zanskar. As Edwards puts it, “This is the start of a wave of new, naturally occurring geothermal systems that will have enough heat in place to support power plants.”
As many of us are ramping up with shopping, baking, and planning for the holiday season, nuclear power plants are also getting ready for one of their busiest seasons of the year.
Here in the US, nuclear reactors follow predictable seasonal trends. Summer and winter tend to see the highest electricity demand, so plant operators schedule maintenance and refueling for other parts of the year.
This scheduled regularity might seem mundane, but it’s quite the feat that operational reactors are as reliable and predictable as they are. It leaves some big shoes to fill for next-generation technology hoping to join the fleet in the next few years.
Generally, nuclear reactors operate at constant levels, as close to full capacity as possible. In 2024, for commercial reactors worldwide, the average capacity factor—the ratio of actual energy output to the theoretical maximum—was 83%. North America rang in at an average of about 90%.
(I’ll note here that it’s not always fair to just look at this number to compare different kinds of power plants—natural-gas plants can have lower capacity factors, but it’s mostly because they’re more likely to be intentionally turned on and off to help meet uneven demand.)
Those high capacity factors also undersell the fleet’s true reliability—a lot of the downtime is scheduled. Reactors need to refuel every 18 to 24 months, and operators tend to schedule those outages for the spring and fall, when electricity demand isn’t as high as when we’re all running our air conditioners or heaters at full tilt.
Take a look at this chart of nuclear outages from the US Energy Information Administration. There are some days, especially at the height of summer, when outages are low, and nearly all commercial reactors in the US are operating at nearly full capacity. On July 28 of this year, the fleet was operating at 99.6%. Compare that with the 77.6% of capacity on October 18, as reactors were taken offline for refueling and maintenance. Now we’re heading into another busy season, when reactors are coming back online and shutdowns are entering another low point.
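For anyone who wants to check that arithmetic, capacity factor is simple to compute: actual output divided by the theoretical maximum over a period. A toy example with made-up reactor figures (not EIA data):

```python
# Toy capacity-factor calculation for a small, invented fleet. The formula
# matches the definition above: actual output / theoretical maximum output.
HOURS_PER_YEAR = 8760

reactors = [
    # (rated capacity in MW, hours offline this year)
    (1000, 0),     # ran all year
    (1150, 720),   # a ~30-day spring refueling outage
    (950, 1440),   # refueling plus unplanned repairs
]

actual_mwh = sum(mw * (HOURS_PER_YEAR - down) for mw, down in reactors)
maximum_mwh = sum(mw * HOURS_PER_YEAR for mw, _ in reactors)
print(f"fleet capacity factor: {actual_mwh / maximum_mwh:.1%}")  # ~91.9%
```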
That’s not to say all outages are planned. At the Sequoyah nuclear power plant in Tennessee, a generator failure in July 2024 took one of two reactors offline, an outage that lasted nearly a year. (The utility also did some maintenance during that time to extend the life of the plant.) Then, just days after that reactor started back up, the entire plant had to shut down because of low water levels.
And who can forget the incident earlier this year when jellyfish wreaked havoc on not one but two nuclear power plants in France? In the second instance, the squishy creatures got into the filters of equipment that sucks water out of the English Channel for cooling at the Paluel nuclear plant. They forced the plant to cut output by nearly half, though it was restored within days.
Barring jellyfish disasters and occasional maintenance, the global nuclear fleet operates quite reliably. That wasn’t always the case, though. In the 1970s, reactors operated at an average capacity factor of just 60%. They were shut down nearly as often as they were running.
The fleet of reactors today has benefited from decades of experience. Now we’re seeing a growing pool of companies aiming to bring new technologies to the nuclear industry.
Next-generation reactors that use new materials for fuel or cooling will be able to borrow some lessons from the existing fleet, but they’ll also face novel challenges.
That could mean early demonstration reactors aren’t as reliable as the current commercial fleet at first. “First-of-a-kind nuclear, just like with any other first-of-a-kind technologies, is very challenging,” says Koroush Shirvan, a professor of nuclear science and engineering at MIT.
That means it will probably take time for molten-salt reactors, small modular reactors, or any of the other designs out there to overcome technical hurdles and settle into their own rhythm. It’s taken decades to get to a place where we take it for granted that the nuclear fleet can follow a neat seasonal curve based on electricity demand.
There will always be hurricanes and electrical failures and jellyfish invasions that cause some unexpected problems and force nuclear plants (or any power plants, for that matter) to shut down. But overall, the fleet today operates at an extremely high level of consistency. One of the major challenges ahead for next-generation technologies will be proving that they can do the same.
The US Department of Energy has approved an $8.6 million grant that will allow the nation’s first utility-led geothermal heating and cooling network to double in size.
Gas and electric utility Eversource Energy completed the first phase of its geothermal network in Framingham, Massachusetts, in 2024. Eversource is a co-recipient of the award along with the city of Framingham and HEET, a Boston-based nonprofit that focuses on geothermal energy and is the lead recipient of the funding.
Geothermal networks are widely considered among the most energy-efficient ways to heat and cool buildings. The federal money will allow Eversource to add approximately 140 new customers to the Framingham network and fund research to monitor the system’s performance.
If we didn’t have pictures and videos, I almost wouldn’t believe the imagery that came out of this year’s UN climate talks.
Over the past few weeks in Belém, Brazil, attendees dealt with oppressive heat and flooding, and at one point a literal fire broke out, delaying negotiations. The symbolism was almost too much to bear.
While many, including the president of Brazil, framed this year’s conference as one of action, the talks ended with a watered-down agreement. The final draft doesn’t even include the phrase “fossil fuels.”
As emissions and global temperatures reach record highs again this year, I’m left wondering: Why is it so hard to formally acknowledge what’s causing the problem?
This is the 30th time that leaders have gathered for the Conference of the Parties, or COP, an annual UN conference focused on climate change. COP30 also marks 10 years since the gathering that produced the Paris Agreement, in which world powers committed to limiting global warming to “well below” 2.0 °C above preindustrial levels, with a goal of staying below the 1.5 °C mark. (That’s 3.6 °F and 2.7 °F, respectively, for my fellow Americans.)
Before the conference kicked off this year, host country Brazil’s president, Luiz Inácio Lula da Silva, cast this as the “implementation COP” and called for negotiators to focus on action, and specifically to deliver a road map for a global transition away from fossil fuels.
The science is clear—burning fossil fuels emits greenhouse gases and drives climate change. Reports have shown that meeting the goal of limiting warming to 1.5 °C would require stopping new fossil-fuel exploration and development.
The problem is, “fossil fuels” might as well be a curse word at global climate negotiations. Two years ago, fights over how to address fossil fuels brought talks at COP28 to a standstill. (It’s worth noting that the conference was hosted in Dubai in the UAE, and its president was literally the head of the country’s national oil company.)
The agreement in Dubai ended up including a line that called on countries to transition away from fossil fuels in energy systems. It was short of what many advocates wanted, which was a more explicit call to phase out fossil fuels entirely. But it was still hailed as a win. As I wrote at the time: “The bar is truly on the floor.”
And yet this year, it seems we’ve dug into the basement.
At one point about 80 countries, a little under half of those present, demanded a concrete plan to move away from fossil fuels.
But oil producers like Saudi Arabia were insistent that fossil fuels not be singled out. Other countries, including some in Africa and Asia, also made a very fair point: Western nations like the US have burned the most fossil fuels and benefited from it economically. This contingent maintains that legacy polluters have a unique responsibility to finance the transition for less wealthy and developing nations rather than simply barring them from taking the same development route.
The US, by the way, didn’t send a formal delegation to the talks for the first time in 30 years. But the absence spoke volumes. In a statement to the New York Times that sidestepped the COP talks, White House spokesperson Taylor Rogers said that President Trump had “set a strong example for the rest of the world” by pursuing new fossil-fuel development.
To sum up: Some countries are economically dependent on fossil fuels, some don’t want to stop depending on fossil fuels without incentives from other countries, and the current US administration would rather keep using fossil fuels than switch to other energy sources.
All those factors combined help explain why, in its final form, COP30’s agreement doesn’t name fossil fuels at all. Instead, there’s a vague line that leaders should take into account the decisions made in Dubai, and an acknowledgement that the “global transition towards low greenhouse-gas emissions and climate-resilient development is irreversible and the trend of the future.”
Hopefully, that’s true. But it’s concerning that even on the world’s biggest stage, naming what we’re supposed to be transitioning away from and putting together any sort of plan to actually do it seems to be impossible.
Worries about the US grid’s ability to handle the surge in demand due to data center growth have made headlines repeatedly over the course of 2025. And, early in the year, demand for electricity had surged by nearly 5 percent compared to the year prior, suggesting the grid might truly be facing a data center apocalypse. And that rise in demand had a very unfortunate effect: Coal use rose for the first time since its recent collapse began.
But since the first-quarter data was released, demand growth has steadily eroded. As of yesterday’s data release by the Energy Information Administration (EIA), which covers the first nine months of 2025, total electricity demand has risen by 2.3 percent. That slowdown means that most of the increased demand could have been met by the astonishing growth of solar power.
Better than feared
If you look over data on the first quarter of 2025, the numbers are pretty grim, with total demand rising by 4.8 percent compared to the same period in the year prior. While solar power continued its remarkable surge, growing by an astonishing 44 percent, it was only able to cover a third of the demand growth. As a result of that and a drop in natural gas usage, coal use grew by 23 percent.
The Office of Energy Efficiency and Renewable Energy and the Office of Clean Energy Demonstrations no longer appear in an organizational chart posted by the Energy Department on Tuesday.
One of the dominant storylines I’ve been following through 2025 is electricity—where and how demand is going up, how much it costs, and how this all intersects with that topic everyone is talking about: AI.
Last week, the International Energy Agency released the latest version of the World Energy Outlook, the annual report that takes stock of the current state of global energy and looks toward the future. It contains some interesting insights and a few surprising figures about electricity, grids, and the state of climate change. So let’s dig into some numbers, shall we?
We’re in the age of electricity
Energy demand in general is going up around the world as populations increase and economies grow. But electricity is the star of the show, with demand projected to grow by 40% in the next 10 years.
China has accounted for the bulk of electricity growth for the past 10 years, and that’s going to continue. But emerging economies outside China will be a much bigger piece of the pie going forward. And while advanced economies, including the US and Europe, have seen flat demand in the past decade, the rise of AI and data centers will cause demand to climb there as well.
Air-conditioning is a major source of rising demand. Growing economies will give more people access to air-conditioning; income-driven AC growth will add about 330 gigawatts to global peak demand by 2035. Rising temperatures will tack on another 170 GW in that time. Together, that’s an increase of about 500 GW, or over 10% from 2024 levels.
AI is a local story
This year, AI has been the story that none of us can get away from. One number that jumped out at me from this report: In 2025, investment in data centers is expected to top $580 billion. That’s more than the $540 billion spent on the global oil supply.
It’s no wonder, then, that the energy demands of AI are in the spotlight. One key takeaway is that these demands are vastly different in different parts of the world.
Data centers still make up less than 10% of the projected increase in total electricity demand between now and 2035. It’s not nothing, but it’s far outweighed by sectors like industry and appliances, including air conditioners. Even electric vehicles will add more demand to the grid than data centers.
But AI will be the factor for the grid in some parts of the world. In the US, data centers will account for half the growth in total electricity demand between now and 2030.
And as we’ve covered in this newsletter before, data centers present a unique challenge, because they tend to be clustered together, so the demand tends to be concentrated around specific communities and on specific grids. Half the data center capacity that’s in the pipeline is close to large cities.
Look out for a coal crossover
As we ask more from our grid, the key factor that’s going to determine what all this means for climate change is what’s supplying the electricity we’re using.
As it stands, the world’s grids still primarily run on fossil fuels, so every bit of electricity growth comes with planet-warming greenhouse-gas emissions attached. That’s slowly changing, though.
Together, solar and wind were the leading source of electricity in the first half of this year, overtaking coal for the first time. Coal use could peak and begin to fall by the end of this decade.
Nuclear could play a role in replacing fossil fuels: After two decades of stagnation, the global nuclear fleet could increase by a third in the next 10 years. Solar is set to continue its meteoric rise, too. Of all the electricity demand growth we’re expecting in the next decade, 80% is in places with high-quality solar irradiation—meaning they’re good spots for solar power.
Ultimately, there are a lot of ways in which the world is moving in the right direction on energy. But we’re far from moving fast enough. Global emissions are, once again, going to hit a record high this year. To limit warming and prevent the worst effects of climate change, we need to remake our energy system, including electricity, and we need to do it faster.
The Pennsylvania site, shorthand for the dangers of nuclear power after a 1979 meltdown, is set for revival under a deal to power Microsoft data centers.
On Tuesday, Alphabet CEO Sundar Pichai warned of “irrationality” in the AI market, telling the BBC in an interview, “I think no company is going to be immune, including us.” His comments arrive as scrutiny over the state of the AI market has reached new heights, with Alphabet shares doubling in value over seven months to reach a $3.5 trillion market capitalization.
Speaking exclusively to the BBC at Google’s California headquarters, Pichai acknowledged that while AI investment growth is at an “extraordinary moment,” the industry can “overshoot” in investment cycles, as we’re seeing now. He drew comparisons to the late 1990s Internet boom, which saw early Internet company valuations surge before collapsing in 2000, leading to bankruptcies and job losses.
“We can look back at the Internet right now. There was clearly a lot of excess investment, but none of us would question whether the Internet was profound,” Pichai said. “I expect AI to be the same. So I think it’s both rational and there are elements of irrationality through a moment like this.”
Last week, we hosted EmTech MIT, MIT Technology Review’s annual flagship conference in Cambridge, Massachusetts. Over the course of three days of main-stage sessions, I learned about innovations in AI, biotech, and robotics.
But as you might imagine, some of this climate reporter’s favorite moments came in the climate sessions. I was listening especially closely to my colleague James Temple’s discussion with Lucia Tian, head of advanced energy technologies at Google.
They spoke about the tech giant’s growing energy demand and what sort of technologies the company is looking at to help meet it. In case you weren’t able to join us, let’s dig into that session and consider how the company is thinking about energy in the face of AI’s rapid rise.
I’ve been closely following Google’s work in energy this year. Like the rest of the tech industry, the company is seeing ballooning electricity demand in its data centers. That could get in the way of a major goal that Google has been talking about for years.
See, back in 2020, the company announced an ambitious target: by 2030, it aimed to run on carbon-free energy 24-7. Basically, that means Google would purchase enough renewable energy on the grids where it operates to meet its entire electricity demand, and the purchases would match up so the electricity would have to be generated when the company was actually using energy. (For more on the nuances of Big Tech’s renewable-energy pledges, check out James’s piece from last year.)
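The difference between a 24-7 pledge and the more common annual-total pledge is easiest to see in a small calculation. In temporal matching (typically scored hour by hour), surplus clean power in a sunny or windy hour can’t paper over a shortfall at night. Here’s a minimal sketch with invented numbers; Google’s actual accounting methodology is more involved:

```python
# Minimal sketch of hourly carbon-free-energy (CFE) matching. In each hour,
# only clean supply up to that hour's demand counts toward the score.
# All numbers are invented for illustration.
demand = [100, 100, 100, 100]        # MWh consumed in four sample hours
clean_supply = [160, 120, 40, 20]    # MWh of contracted carbon-free power

# Annual-style matching: total clean purchases vs. total demand.
annual_score = min(sum(clean_supply), sum(demand)) / sum(demand)

# 24/7-style matching: clean power only counts in the hour it's generated.
matched = sum(min(d, s) for d, s in zip(demand, clean_supply))
hourly_score = matched / sum(demand)

print(f"annual matching: {annual_score:.0%}")   # 85% -- looks nearly done
print(f"hourly 24/7 CFE: {hourly_score:.0%}")   # 65% -- the harder target
```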
Google’s is an ambitious goal, and on stage, Tian said that the company is still aiming for it but acknowledged that it’s looking tough with the rise of AI.
“It was always a moonshot,” she said. “It’s something very, very hard to achieve, and it’s only harder in the face of this growth. But our perspective is, if we don’t move in that direction, we’ll never get there.”
Google’s total electricity demand more than doubled from 2020 to 2024, according to its latest Environmental Report. As for that goal of 24-7 carbon-free energy? The company is basically treading water. While it was at 67% for its data centers in 2020, last year it came in at 66%.
Not going backwards is something of an accomplishment, given the rapid growth in electricity demand. But it still leaves the company some distance away from its finish line.
To close the gap, Google has been signing what feels like constant deals in the energy space. Two recent announcements that Tian talked about on stage were a project involving carbon capture and storage at a natural-gas plant in Illinois and plans to reopen a shuttered nuclear power plant in Iowa.
Let’s start with carbon capture. Google signed an agreement to purchase most of the electricity from a new natural-gas plant, which will capture and store about 90% of its carbon dioxide emissions.
That announcement was controversial, with critics arguing that carbon capture keeps fossil-fuel infrastructure online longer and still releases greenhouse gases and other pollutants into the atmosphere.
One question that James raised on stage: Why build a new natural-gas plant rather than add equipment to an already existing facility? Tacking on equipment to an operational plant would mean cutting emissions from the status quo, rather than adding entirely new fossil-fuel infrastructure.
The company did consider many existing plants, Tian said. But, as she put it, “Retrofits aren’t going to make sense everywhere.” Space can be limited at existing plants, for example, and many may not have the right geology to store carbon dioxide underground.
“We wanted to lead with a project that could prove this technology at scale,” Tian said. This site has an operational Class VI well, the type used for permanent sequestration, she added, and it also doesn’t require a big pipeline buildout.
Tian also touched on the company’s recent announcement that it’s collaborating with NextEra Energy to reopen Duane Arnold Energy Center, a nuclear power plant in Iowa. The company will purchase electricity from that plant, which is scheduled to reopen in 2029.
As I covered in a story earlier this year, Duane Arnold was basically the final option in the US for companies looking to reopen shuttered nuclear power plants. “Just a few years back, we were still closing down nuclear plants in this country,” Tian said on stage.
While each reopening will look a little different, Tian highlighted the groups working to restart the Palisades plant in Michigan, which was the first reopening to be announced, last spring. “They’re the real heroes of the story,” she said.
I’m always interested to get a peek behind the curtain at how Big Tech is thinking about energy. I’m skeptical but certainly interested to see how Google’s, and the rest of the industry’s, goals shape up over the next few years.
Pump jacks in Russia in 2023. The energy agency’s reports are influential and often cited by energy companies and investors as a basis for long-term planning.
At this year’s climate summit, the United States is out and Europe is struggling. But emerging countries are embracing renewable energy thanks to a glut of cheap equipment.
Welcome back to The State of AI, a new collaboration between the Financial Times and MIT Technology Review. Every Monday, writers from both publications debate one aspect of the generative AI revolution and how it is reshaping global power.
This week, Casey Crownhart, senior reporter for energy at MIT Technology Review, and Pilita Clark, an FT columnist, consider how China’s rapid renewables buildout could help it leapfrog on AI progress.
Casey Crownhart writes:
In the age of AI, the biggest barrier to progress isn’t money but energy. That should be particularly worrying here in the US, where massive data centers are waiting to come online, and it doesn’t look as if the country will build the steady power supply or infrastructure needed to serve them all.
It wasn’t always like this. For about a decade before 2020, data centers were able to offset increased demand with efficiency improvements. Now, though, electricity demand is ticking up in the US, with billions of queries to popular AI models each day—and efficiency gains aren’t keeping pace. With too little new power capacity coming online, the strain is starting to show: Electricity bills are ballooning for people who live in places where data centers place a growing load on the grid.
If we want AI to have the chance to deliver on big promises without driving electricity prices sky-high for the rest of us, the US needs to learn some lessons from the rest of the world on energy abundance. Just look at China.
China installed 429 GW of new power generation capacity in 2024, more than six times the net capacity added in the US during that time.
China still generates much of its electricity with coal, but that makes up a declining share of the mix. Rather, the country is focused on installing solar, wind, nuclear, and gas at record rates.
The US, meanwhile, is focused on reviving its ailing coal industry. Coal-fired power plants are polluting and, crucially, expensive to run. Aging plants in the US are also less reliable than they used to be, running at a capacity factor of just 42%, down from 61% in 2014.
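(For readers new to the term: capacity factor is just actual generation divided by what a plant would produce running flat-out all year. A minimal sketch of the calculation, with hypothetical plant numbers chosen to land on the 42% figure above:)

```python
# Capacity factor: actual generation divided by the theoretical maximum if the
# plant ran at full nameplate power for all 8,760 hours of the year.
# The plant below is hypothetical, picked to illustrate the 42% figure.

HOURS_PER_YEAR = 8760

def capacity_factor(generation_mwh: float, nameplate_mw: float) -> float:
    """Fraction of the theoretical maximum output actually generated."""
    return generation_mwh / (nameplate_mw * HOURS_PER_YEAR)

cf = capacity_factor(generation_mwh=1_840_000, nameplate_mw=500)
print(f"capacity factor: {cf:.0%}")  # -> 42%
```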
It’s not a great situation. And unless the US changes something, we risk becoming consumers as opposed to innovators in both energy and AI tech. Already, China earns more from exporting renewables than the US does from oil and gas exports.
Permitting and building new renewable power plants would certainly help, since they’re currently the cheapest and fastest sources to bring online. But wind and solar are politically unpopular with the current administration. Natural gas is an obvious alternative, though there are concerns about delays in securing key equipment.
One quick fix would be for data centers to be more flexible. If they agreed not to suck electricity from the grid during times of stress, new AI infrastructure might be able to come online without any new energy infrastructure.
One study from Duke University found that if data centers agree to curtail their consumption just 0.25% of the time (roughly 22 hours over the course of the year), the grid could provide power for about 76 GW of new demand. That’s like adding about 5% of the entire grid’s capacity without needing to build anything new.
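The arithmetic behind those figures is easy to verify. Here is a back-of-the-envelope sketch; note that the total grid capacity is simply what the study’s own percentages imply, not an official statistic:

```python
# Sanity-checking the Duke flexibility numbers as reported above.

HOURS_PER_YEAR = 8760

curtailment_share = 0.0025                   # curtail 0.25% of the time
curtailed_hours = curtailment_share * HOURS_PER_YEAR
print(f"curtailed hours per year: {curtailed_hours:.0f}")      # ~22 hours

new_demand_gw = 76                           # headroom the study identifies
implied_grid_gw = new_demand_gw / 0.05       # 76 GW is called ~5% of the grid
print(f"implied US grid capacity: {implied_grid_gw:,.0f} GW")  # ~1,520 GW
```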
But flexibility wouldn’t be enough to truly meet the swell in AI electricity demand. What do you think, Pilita? What would get the US out of these energy constraints? Is there anything else we should be thinking about when it comes to AI and its energy use?
Pilita Clark responds:
I agree. Data centers that can cut their power use at times of grid stress should be the norm, not the exception. Likewise, we need more deals like those giving cheaper electricity to data centers that let power utilities access their backup generators. Both reduce the need to build more power plants, which makes sense regardless of how much electricity AI ends up using.
This is a critical point for countries across the world, because we still don’t know exactly how much power AI is going to consume.
Forecasts for what data centers will need in as little as five years’ time vary wildly, from less than twice today’s levels to four times as much.
This is partly because there’s a lack of public data about AI systems’ energy needs. It’s also because we don’t know how much more efficient these systems will become. The US chip designer Nvidia said last year that its specialized chips had become 45,000 times more energy efficient over the previous eight years.
Moreover, we have been very wrong about tech energy needs before. At the height of the dot-com boom in 1999, it was erroneously claimed that the internet would need half the US’s electricity within a decade—necessitating a lot more coal power.
Still, some countries are clearly feeling the pressure already. In Ireland, data centers chew up so much power that new connections have been restricted around Dublin to avoid straining the grid.
Some regulators are eyeing new rules forcing tech companies to provide enough power generation to match their demand. I hope such efforts grow. I also hope AI itself helps boost power abundance and, crucially, accelerates the global energy transition needed to combat climate change. OpenAI’s Sam Altman said in 2023 that “once we have a really powerful superintelligence, addressing climate change will not be particularly difficult.”
The evidence so far is not promising, especially in the US, where renewable projects are being axed. Still, the US may end up being an outlier in a world where ever-cheaper renewables made up more than 90% of new power capacity added globally last year.
Europe is aiming to power one of its biggest data centers predominantly with renewables and batteries. But the country leading the green energy expansion is clearly China.
The 20th century was dominated by countries rich in the fossil fuels whose reign the US now wants to prolong. China, in contrast, may become the world’s first green electrostate. If it does this in a way that helps it win an AI race the US has so far controlled, it will mark a striking chapter in economic, technological, and geopolitical history.
Casey Crownhart replies:
I share your skepticism of tech executives’ claims that AI will be a groundbreaking help in the race to address climate change. To be fair, AI is progressing rapidly. But we don’t have time to wait for technologies standing on big claims with nothing to back them up.
When it comes to the grid, for example, experts say there’s potential for AI to help with planning and even operations, but these efforts are still experimental.
Meanwhile, much of the world is making measurable progress on transitioning to newer, greener forms of energy. How that will affect the AI boom remains to be seen. What is clear is that AI is changing our grid and our world, and we need to be clear-eyed about the consequences.
Picture it: I’m minding my business at a party, parked by the snack table (of course). A friend of a friend wanders up, and we strike up a conversation. It quickly turns to work, and upon learning that I’m a climate technology reporter, my new acquaintance says something like: “Should I be using AI? I’ve heard it’s awful for the environment.”
This actually happens pretty often now. Generally, I tell people not to worry—let a chatbot plan your vacation, suggest recipe ideas, or write you a poem if you want.
That response might surprise some people, but I promise I’m not living under a rock, and I have seen all the concerning projections about how much electricity AI is using. Data centers could consume up to 945 terawatt-hours annually by 2030. (That’s roughly as much electricity as Japan uses in a year.)
But I feel strongly about not putting the onus on individuals, partly because AI concerns remind me so much of another question: “What should I do to reduce my carbon footprint?”
That one gets under my skin because of the context: BP helped popularize the concept of a carbon footprint in a marketing campaign in the early 2000s. That framing effectively shifts the burden of worrying about the environment from fossil-fuel companies to individuals.
The reality is, no one person can address climate change alone: Our entire society is built around burning fossil fuels. To address climate change, we need political action and public support for researching and scaling up climate technology. We need companies to innovate and take decisive action to reduce greenhouse-gas emissions. Focusing too much on individuals is a distraction from the real solutions on the table.
I see something similar today with AI. People are asking climate reporters at barbecues whether they should feel guilty about how often they use chatbots, when we need to focus on the bigger picture.
Big tech companies are playing into this narrative by providing energy-use estimates for their products at the user level. A couple of recent reports put the electricity used to query a chatbot at about 0.3 watt-hours, the same as powering a microwave for about a second. That’s so small as to be virtually insignificant.
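For the curious, here’s the arithmetic behind that microwave comparison, plus what the same per-query figure looks like in aggregate. The microwave wattage and the queries-per-day number are illustrative assumptions, not reported data:

```python
# Per-query energy vs. aggregate energy, using the ~0.3 Wh estimate above.

query_wh = 0.3                     # reported estimate per chatbot query
microwave_watts = 1000             # assumption: a typical microwave's draw
seconds = query_wh * 3600 / microwave_watts
print(f"microwave-equivalent: {seconds:.1f} seconds")    # ~1.1 s

queries_per_day = 1_000_000_000    # assumption: one billion queries a day
daily_mwh = query_wh * queries_per_day / 1_000_000
print(f"aggregate: {daily_mwh:,.0f} MWh per day")        # 300 MWh/day
```

Trivial per query, very much nontrivial in bulk, which is exactly the point that follows.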
But stopping with the energy use of a single query obscures the full truth, which is that this industry is growing quickly, building energy-hungry infrastructure at a nearly incomprehensible scale to satisfy the AI appetites of society as a whole. Meta is currently building a data center in Louisiana with five gigawatts of computational power—about the same demand as the entire state of Maine at the summer peak. (To learn more, read our Power Hungry series online.)
Increasingly, there’s no getting away from AI, and it’s not as simple as choosing to use or not use the technology. Your favorite search engine likely gives you an AI summary at the top of your search results. Your email provider’s suggested replies? Probably AI. Same for chatting with customer service while you’re shopping online.
Just as with climate change, we need to look at this as a system rather than a series of individual choices.
Massive tech companies using AI in their products should be disclosing their total energy and water use and going into detail about how they complete their calculations. Estimating the burden per query is a start, but we also deserve to see how these impacts add up for billions of users, and how that’s changing over time as companies (hopefully) make their products more efficient. Lawmakers should be mandating these disclosures, and we should be asking for them, too.
That’s not to say there’s absolutely no individual action you can take. Just as you could meaningfully reduce your individual greenhouse-gas emissions by taking fewer flights and eating less meat, there are some reasonable things you can do to reduce your AI footprint. Generating videos tends to be especially energy-intensive, as does using reasoning models to engage with long prompts and produce long answers. Asking a chatbot to help plan your day, suggest fun activities to do with your family, or summarize a ridiculously long email has a relatively minor impact.
Ultimately, as long as you aren’t relentlessly churning out AI slop, you shouldn’t be too worried about your individual AI footprint. But we should all be keeping our eye on what this industry will mean for our grid, our society, and our planet.
Last week, an American-Israeli company that claims it’s developed proprietary technology to cool the planet announced it had raised $60 million, by far the largest known venture capital round to date for a solar geoengineering startup.
The company, Stardust, says the funding will enable it to develop a system that could be deployed by the start of the next decade, according to Heatmap, which broke the story.
Heat Exchange
MIT Technology Review’s guest opinion series, offering expert commentary on legal, political and regulatory issues related to climate change and clean energy. You can read the rest of the pieces here.
As scientists who have worked on the science of solar geoengineering for decades, we have grown increasingly concerned about the emerging efforts to start and fund private companies to build and deploy technologies that could alter the climate of the planet. We also strongly dispute some of the technical claims that certain companies have made about their offerings.
Given the potential power of such tools, the public concerns about them, and the importance of using them responsibly, we argue that they should be studied, evaluated, and developed mainly through publicly coordinated and transparently funded science and engineering efforts. In addition, any decisions about whether or how they should be used should be made through multilateral government discussions, informed by the best available research on the promise and risks of such interventions—not the profit motives of companies or their investors.
The basic idea behind solar geoengineering, or what we now prefer to call sunlight reflection methods (SRM), is that humans might reduce climate change by making the Earth a bit more reflective, partially counteracting the warming caused by the accumulation of greenhouse gases.
There is strong evidence, based on years of climate modeling and analyses by researchers worldwide, that SRM—while not perfect—could significantly and rapidly reduce climate changes and avoid important climate risks. In particular, it could ease the impacts in hot countries that are struggling to adapt.
The goals of doing research into SRM can be diverse: identifying risks as well as finding better methods. But research won’t be useful unless it’s trusted, and trust depends on transparency. That means researchers must be eager to examine pros and cons, committed to following the evidence where it leads, and driven by a sense that research should serve public interests, not be locked up as intellectual property.
In recent years, a handful of for-profit startup companies have emerged that are striving to develop SRM technologies or already trying to market SRM services. That includes Make Sunsets, which sells “cooling credits” for releasing sulfur dioxide in the stratosphere. Another company, Sunscreen, which hasn’t yet publicly launched, intends to use aerosols in the lower atmosphere to achieve cooling over small areas, purportedly to help farmers or cities deal with extreme heat.
Our strong impression is that people in these companies are driven by the same concerns about climate change that move us in our research. We agree that more research, and more innovation, is needed. However, we do not think startups—which by definition must eventually make money to stay in business—can play a productive role in advancing research on SRM.
Many people already distrust the idea of engineering the atmosphere—at whatever scale—to address climate change, fearing negative side effects, inequitable impacts on different parts of the world, or the prospect that a world expecting such solutions will feel less pressure to address the root causes of climate change.
Adding business interests, profit motives, and rich investors into this situation just creates more cause for concern, complicating the ability of responsible scientists and engineers to carry out the work needed to advance our understanding.
The only way these startups will make money is if someone pays for their services, so there’s a reasonable fear that financial pressures could drive companies to lobby governments or other parties to use such tools. A decision that should be based on objective analysis of risks and benefits would instead be strongly influenced by financial interests and political connections.
The need to raise money or bring in revenue often drives companies to hype the potential or safety of their tools. Indeed, that’s what private companies need to do to attract investors, but it’s not how you build public trust—particularly when the science doesn’t support the claims.
Notably, Stardust says on its website that it has developed novel particles that can be injected into the atmosphere to reflect away more sunlight, asserting that they’re “chemically inert in the stratosphere, and safe for humans and ecosystems.” According to the company, “The particles naturally return to Earth’s surface over time and recycle safely back into the biosphere.”
But it’s nonsense for the company to claim it can make particles that are inert in the stratosphere. Even diamonds, which are extraordinarily nonreactive, would alter stratospheric chemistry. First, much of that chemistry depends on highly reactive radicals that react with any solid surface; second, any particle may become coated by the background sulfuric acid in the stratosphere. That could accelerate the loss of the protective ozone layer by spreading that existing sulfuric acid over a larger surface area.
(Stardust didn’t provide a response to an inquiry about the concerns raised in this piece.)
In materials presented to potential investors, which we’ve obtained a copy of, Stardust further claims its particles “improve” on sulfuric acid, the most studied material for SRM. But the point of using sulfate in such studies was never that it’s perfect; it’s that its broader climatic and environmental impacts are well understood. That’s because sulfate is widespread on Earth, and there’s an immense body of scientific knowledge about the fate and risks of sulfur that reaches the stratosphere through volcanic eruptions or other means.
If there’s one great lesson of 20th-century environmental science, it’s how crucial it is to understand the ultimate fate of any new material introduced into the environment.
Chlorofluorocarbons and the pesticide DDT both offered safety advantages over competing technologies, but they both broke down into products that accumulated in the environment in unexpected places, causing enormous and unanticipated harms.
The environmental and climate impacts of sulfate aerosols have been studied in many thousands of scientific papers over a century, and this deep well of knowledge greatly reduces the chance of unknown unknowns.
Grandiose claims notwithstanding—and especially considering that Stardust hasn’t disclosed anything about its particles or research process—it would be very difficult to make a pragmatic, risk-informed decision to start SRM efforts with these particles instead of sulfate.
We don’t want to claim that every single answer lies in academia. We’d be fools not to be excited by profit-driven innovation in solar power, EVs, batteries, or other sustainable technologies. But the math for sunlight reflection is just different. Why?
Because the role of private industry was essential in improving the efficiency, driving down the costs, and increasing the market share of renewables and other forms of cleantech. When cost matters and we can easily evaluate the benefits of the product, then competitive, for-profit capitalism can work wonders.
But SRM is already technically feasible and inexpensive, with deployment costs that would be negligible compared with the climate damage it could avert.
The essential questions of whether or how to use it come down to far thornier societal issues: How can we best balance the risks and benefits? How can we ensure that it’s used in an equitable way? How do we make legitimate decisions about SRM on a planet with such sharp political divisions?
Trust will be the most important single ingredient in making these decisions. And trust is the one product for-profit innovation does not naturally manufacture.
Ultimately, we’re just two researchers. We can’t make investors in these startups do anything differently. Our request is that they think carefully, and beyond the logic of short-term profit. If they believe geoengineering is worth exploring, could it be that their support will make it harder, not easier, to do that?
David Keith is a professor of geophysical sciences at the University of Chicago and founding faculty director of the school’s Climate Systems Engineering Initiative. Daniele Visioni is an assistant professor of earth and atmospheric sciences at Cornell University and head of data for Reflective, a nonprofit that develops tools and provides funding to support solar geoengineering research.
Solar panels in China’s northern Inner Mongolia region. Rapid growth of clean energy technologies like solar panels and electric vehicles have slightly reduced forecasts of future emissions in places like China and Europe.
Demand for copper is surging, as is pollution from its dirty production processes. The founders of one startup, Still Bright, think they have a better, cleaner way to generate the copper the world needs.
The company uses water-based reactions, based on battery chemistry technology, to purify copper in a process that could be less polluting than traditional smelting. The hope is that this alternative will also help ease growing strain on the copper supply chain.
“We’re really focused on addressing the copper supply crisis that’s looming ahead of us,” says Randy Allen, Still Bright’s cofounder and CEO.
Copper is a crucial ingredient in everything from electrical wiring to cookware today. And clean energy technologies like solar panels and electric vehicles are introducing even more demand for the metal. Global copper demand is expected to grow by 40% between now and 2040.
As demand swells, so do the climate and environmental impacts of copper processing, the work of refining mined ore into pure metal. There’s also growing concern about the geographic concentration of the copper supply chain. Copper is mined all over the world, and historically, many of those mines had smelters on-site to process what they extracted. (Smelters form pure copper metal by essentially burning concentrated copper ore at high temperatures.) But today, the smelting industry has consolidated, with many mines shipping copper concentrates to smelters in Asia, particularly China.
That’s partly because smelting uses a lot of energy and chemicals, and it can produce sulfur-containing emissions that can harm air quality. “They shipped the environmental and social problems elsewhere,” says Simon Jowitt, a professor at the University of Nevada, Reno, and director of the Nevada Bureau of Mines and Geology.
It’s possible to scrub pollution out of a smelter’s emissions, and smelters are much cleaner than they used to be, Jowitt says. But overall, smelting centers aren’t exactly known for environmental responsibility.
So even countries like the US, which have plenty of copper reserves and operational mines, largely ship copper concentrates, which contain up to around 30% copper, to China or other countries for smelting. (There are just two operational ore smelters in the US today.)
Still Bright avoids the pyrometallurgical process that smelters use in favor of a chemical approach, partly inspired by devices called vanadium flow batteries.
In the startup’s reactor, vanadium reacts with the copper compounds in copper concentrates. The copper metal remains a solid, leaving many of the impurities behind in the liquid phase. The whole thing takes between 30 and 90 minutes. The solid, which contains roughly 70% copper after this reaction, can then be fed into another, established process in the mining industry, called solvent extraction and electrowinning, to make copper that’s over 99% pure.
This is far from the first attempt to use a water-based chemical approach to processing copper. Today, some copper ore is processed with acid, for example, and Ceibo, a startup based in Chile, is trying to use a version of that process on the type of copper that’s traditionally smelted. The difference here lies in the chemistry, notably the choice to use vanadium.
One of Still Bright’s founders, Jon Vardner, was researching copper reactions and vanadium flow batteries when he came up with the idea to marry a copper extraction reaction with an electrical charging step that could recycle the vanadium.
After the vanadium reacts with the copper, the liquid soup can be fed into an electrolyzer, which uses electricity to turn the vanadium back into a form that can react with copper again. It’s basically the same process that vanadium flow batteries use to charge up.
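Still Bright hasn’t disclosed its exact reaction chemistry, but the standard vanadium redox couple makes the loop easy to sketch. Purely as an illustration, assuming dissolved V(II) reduces cupric ions to copper metal, the two halves of the cycle would look something like this:

```latex
% Illustrative redox loop only -- Still Bright's actual chemistry is undisclosed.
% V(II) is a strong enough reductant (E^0 approx. -0.26 V for V3+/V2+) to
% deposit copper (E^0 approx. +0.34 V for Cu2+/Cu).
\begin{align}
  \text{Extraction:}   &\quad \mathrm{Cu^{2+} + 2\,V^{2+} \longrightarrow Cu_{(s)} + 2\,V^{3+}} \\
  \text{Regeneration:} &\quad \mathrm{V^{3+} + e^{-} \longrightarrow V^{2+}}
\end{align}
```

The regeneration half is the same reaction a vanadium flow battery runs at its negative electrode while charging, which is presumably the connection Vardner drew.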
Other chemical processes for copper refining require high temperatures or extremely acidic conditions to get the copper into solution, drive the reaction quickly, and ensure all the copper reacts. Still Bright’s process can run at ambient temperatures.
One of the major benefits of this approach is cutting the pollution from copper refining. Traditional smelting heats the target material to over 1,200 °C (about 2,200 °F), forming sulfur-containing gases that are released into the atmosphere.
Still Bright’s process produces hydrogen sulfide gas as a by-product instead. It’s still a dangerous material, but one that can be effectively captured and converted into useful side products, Allen says.
Another source of potential pollution is the sulfide minerals left over after the refining process, which can form sulfuric acid when exposed to air and water (this is called acid mine drainage, common in mining waste). Still Bright’s process will also produce that material, and the company plans to carefully track it, ensuring that it doesn’t leak into groundwater.
The company is currently testing its process in the lab in New Jersey and designing a pilot facility in Colorado, which will have the capacity to make about two tons of copper per year. Next will be a demonstration-scale reactor, which will have a 500-ton annual capacity and should come online in 2027 or 2028 at a mine site, Allen says. Still Bright recently raised an $18.7 million seed round to help with the scale-up process.
How the scale-up goes will be a crucial test of the technology, and of whether the typically conservative mining industry will jump on board, UNR’s Jowitt says: “You want to see what happens on an industrial scale. And I think until that happens, people might be a little reluctant to get into this.”
Bill Gates doesn’t shy away or pretend modesty when it comes to his stature in the climate world today. “Well, who’s the biggest funder of climate innovation companies?” he asked a handful of journalists at a media roundtable event last week. “If there’s someone else, I’ve never met them.”
The former Microsoft CEO has spent the last decade investing in climate technology through Breakthrough Energy, which he founded in 2015. Ahead of the UN climate meetings kicking off next week, Gates published a memo outlining what he thinks activists and negotiators should focus on and how he’s thinking about the state of climate tech right now. Let’s get into it.
Are we too focused on near-term climate goals?
One of the central points Gates made in his new memo is that he thinks the world is too focused on near-term emissions goals and national emissions reporting.
So in parallel with the national accounting structure for emissions, Gates argues, we should have high-level climate discussions at events like the UN climate conference. Those discussions should take a global view on how to reduce emissions in key sectors like energy and heavy industry.
“The way everybody makes steel, it’s the same. The way everybody makes cement, it’s the same. The way we make fertilizer, it’s all the same,” he says.
As he noted in one recent essay for MIT Technology Review, he sees innovation as the key to cutting the cost of clean versions of energy, cement, vehicles, and so on. And once products get cheaper, they can see wider adoption.
What’s most likely to power our grid in the future?
“In the long run, probably either fission or fusion will be the cheapest way to make electricity,” he says. (It should be noted that, as with most climate technologies, Gates has investments in both fission and fusion companies through Breakthrough Energy Ventures, so he has a vested interest here.)
He acknowledges, though, that reactors likely won’t come online quickly enough to meet rising electricity demand in the US: “I wish I could deliver nuclear fusion, like, three years earlier than I can.”
He also spoke to China’s leadership in both nuclear fission and fusion energy. “The amount of money they’re putting [into] fusion is more than the rest of the world put together times two. I mean, it’s not guaranteed to work. But name your favorite fusion approach here in the US—there’s a Chinese project.”
Can carbon removal be part of the solution?
I had my colleague James Temple’s recent story on what’s next for carbon removal at the top of my mind, so I asked Gates if he saw carbon credits or carbon removal as part of the problematic near-term thinking he wrote about in the memo.
Gates buys offsets to cancel out his own personal emissions, to the tune of about $9 million a year, he said at the roundtable, but doesn’t expect many of those offsets to make a significant dent in climate progress on a broader scale: “That stuff, most of those technologies, are a complete dead end. They don’t get you cheap enough to be meaningful.
“Carbon sequestration at $400, $200, $100, can never be a meaningful part of this game. If you have a technology that starts at $400 and can get to $4, then hallelujah, let’s go. I haven’t seen that one. There are some now that look like they can get to $40 or $50, and that can play somewhat of a role.”
Will AI be good news for innovation?
During the discussion, I started a tally in the corner of my notebook, adding a tick every time Gates mentioned AI. Over the course of about an hour, I got to six tally marks, and I definitely missed making a few.
Gates acknowledged that AI is going to add electricity demand, a challenge for a US grid that hasn’t seen net demand go up for decades. But so too will electric cars and heat pumps.
I was surprised at just how positively he spoke about AI’s potential, though:
“AI will accelerate every innovation pipeline you can name: cancer, Alzheimer’s, catalysts in material science, you name it. And we’re all trying to figure out what that means. That is the biggest change agent in the world today, moving at a pace that is very, very rapid … every breakthrough energy company will be able to move faster because of using those tools, some very dramatically.”
I’ll add that, as I’ve noted here before, I’m skeptical of big claims about AI’s potential to be a silver bullet across industries, including climate tech. (If you missed it, check out this story about AI and the grid from earlier this year.)
It was October 2024, and Hurricane Helene had just devastated the US Southeast. Representative Marjorie Taylor Greene of Georgia found an abstract target on which to pin the blame: “Yes they can control the weather,” she posted on X. “It’s ridiculous for anyone to lie and say it can’t be done.”
There was no word on who “they” were, but maybe it was better that way.
She was repeating what’s by now a pretty familiar and popular conspiracy theory: that shadowy forces are out there, wielding unknown technology to control the weather and wreak havoc on their supposed enemies. This claim, fundamentally preposterous from a scientific standpoint, has grown louder and more common in recent years. It pops up over and over when extreme weather strikes: in Dubai in April 2024, in Australia in July 2022, in the US after California floods and hurricanes like Helene and Milton. In the UK, conspiracy theorists claimed that the government had fixed the weather to be sunny and rain-free during the first covid lockdown in March 2020. Most recently, the theories spread again when disastrous floods hit central Texas this past July. The idea has even inspired some antigovernment extremists to threaten and try to destroy weather radar towers.
This story is part of MIT Technology Review’s series “The New Conspiracy Age,” on how the present boom in conspiracy theories is reshaping science and technology.
But here’s the thing: While Greene and other believers are not correct, this conspiracy theory—like so many others—holds a kernel of much more modest truth behind the grandiose claims.
Sure, there is no current way for humans to control the weather. We can’t cause major floods or redirect hurricanes or other powerful storm systems, simply because the energy involved is far too great for humans to alter significantly.
But there are ways we can modify the weather. The key difference is the scale of what is possible.
The most common weather modification practice is called cloud seeding, and it involves injecting small amounts of salts or other materials into clouds with the goal of juicing levels of rain or snow. This is typically done in dry areas that lack regular precipitation. Research shows that it can in fact work, though advances in technology reveal that its impact is modest—coaxing maybe 5% to 10% more moisture out of otherwise stubborn clouds.
But the fact that humans can influence weather at all gives conspiracy theorists a foothold in the truth. Add to this a spotty history of actual efforts by governments and militaries to control major storms, as well as other emerging but not-yet-deployed-at-any-scale technologies that aim to address climate change … and you can see where things get confusing.
So while more sweeping claims of weather control are ultimately ridiculous from a scientific standpoint, they can’t be dismissed as entirely stupid.
This all helped make the conspiracy theories swirling after the recent Texas floods particularly loud and powerful. Just days earlier, 100 miles away from the epicenter of the floods, in a town called Runge, the cloud-seeding company Rainmaker had flown a single-engine plane and released about 70 grams of silver iodide into some clouds; a modest drizzle of less than half a centimeter of rain followed. But once the company saw a storm front in the forecast, it suspended its work; there was no need to seed with rain already on the way.
“We conducted an operation on July 2, totally within the scope of what we were regulatorily permitted to do,” Augustus Doricko, Rainmaker’s founder and CEO, recently told me. Still, when as much as 20 inches of rain fell soon afterward not too far away, and more than 100 people died, the conspiracy theory machine whirred into action.
As Doricko told the Washington Post in the tragedy’s aftermath, he and his company faced “nonstop pandemonium” on social media; eventually someone even posted photos from outside Rainmaker’s office, along with its address. Doricko told me a few factors played into the pile-on, including a lack of familiarity with the specifics of cloud seeding, as well as what he called “deliberately inflammatory messaging from politicians.” Indeed, theories about Rainmaker and cloud seeding spread online via prominent figures including Greene and former national security advisor Mike Flynn.
Unfortunately, all this is happening at the same time as the warming climate is making heavy rainfall and the floods that accompany it more and more likely. “These events will become more frequent,” says Emily Yeh, a professor of geography at the University of Colorado who has examined approaches and reactions to weather modification around the world. “There is a large, vocal group of people who are willing to believe anything but climate change as the reason for Texas floods, or hurricanes.”
Worsening extremes, increasing weather modification activity, improving technology, a sometimes shady track record—the conditions are perfect for an otherwise niche conspiracy theory to spread to anyone desperate for tidy explanations of increasingly disastrous events.
Here, we break down just what’s possible and what isn’t—and address some of the more colorful reasons why people may believe things that go far beyond the facts.
What we can do with the weather—and who is doing it
The basic concepts behind cloud seeding have been around for about 80 years, and government interest in the topic goes back even longer than that.
The primary practice involves using planes, drones, or generators on the ground to inject tiny particles of stuff, usually silver iodide, into existing clouds. The particles act as nuclei around which moisture can build up, forming ice crystals that can get heavy enough to fall out of the cloud as snow or rain.
“Weather modification is an old field; starting in the 1940s there was a lot of excitement,” says David Delene, a research professor of atmospheric sciences at the University of North Dakota and an expert on cloud seeding. A 1952 US Senate report proposing a committee to study weather modification noted that a small amount of extra rain could “produce electric power worth hundreds of thousands of dollars” and “greatly increase crop yields.” It also cited potential uses like “reducing soil erosion,” “breaking up hurricanes,” and even “cutting holes in clouds so that aircraft can operate.”
But, as Delene adds, “that excitement … was not realized.”
Through the 1980s, extensive research often funded or conducted by Washington yielded a much better understanding of atmospheric science and cloud physics, though it proved extremely difficult to actually demonstrate the efficacy of the technology itself. In other words, scientists learned the basic principles behind cloud seeding, and understood on a theoretical level that it should work—but it was hard to tell how big an impact it was having on rainfall.
There is huge variability between one cloud and another, one storm system and another, one mountain or valley and another; for decades, the tools available to researchers did not really allow for firm conclusions on exactly how much extra moisture, if any, they were getting out of any given operation. Interest in the practice died down to a low hum by the 1990s.
But over the past couple of decades, the early excitement has returned.
Cloud seeding can enhance levels of rain and snow
While the core technology has largely stayed the same, several projects launched in the US and abroad starting in the 2000s have combined statistical modeling with new and improved aircraft-based measurements, ground-based radar, and more to provide better answers on what results are actually achievable when seeding clouds.
“I think we’ve identified unequivocally that we can indeed modify the cloud,” says Jeff French, an associate professor and head of the University of Wyoming’s Department of Atmospheric Science, who has worked for years on the topic. But even as scientists have come to largely agree that the practice can have an impact on precipitation, they also largely recognize that the impact probably has some fairly modest upper limits—far short of massive water surges.
“There is absolutely no evidence that cloud seeding can modify a cloud to the extent that would be needed to cause a flood,” French says. Floods require a few factors, he adds—a system with plenty of moisture available that stays localized to a certain spot for an extended period. “All of these things which cloud seeding has zero effect on,” he says.
The technology simply operates on a different level. “Cloud seeding really is looking at making an inefficient system a little bit more efficient,” French says.
As Delene puts it: “Originally [researchers] thought, well, we could, you know, do 50%, 100% increases in precipitation,” but “I think if you do a good program you’re not going to get more than a 10% increase.”
Asked for his take on a theoretical limit, French was hesitant—“I don’t know if I’m ready to stick my neck out”—but agreed on “maybe 10-ish percent” as a reasonable guess.
Another cloud seeding expert, Katja Friedrich from the University of Colorado–Boulder, says that any grander potential would be obvious by this point: We wouldn’t have “spent the last 100 years debating—within the scientific community—if cloud seeding works,” she writes in an email. “It would have been easy to separate the signal (from cloud seeding) from the noise (natural precipitation).”
It can also (probably) suppress precipitation
Sometimes cloud seeding is used not to boost rain and snow but rather to try to reduce its severity—or, more specifically, to change the size of individual rain droplets or hailstones.
One of the most prominent examples has been in parts of Canada, where hailstorms can be devastating; a 2024 event in Calgary, for instance, was the country’s second-most-expensive disaster ever, with over $2 billion in damages.
Insurance companies in Alberta have been working together for nearly three decades on a cloud seeding program that’s aimed at reducing some of that damage. In these cases, the silver iodide or other particles are meant to act essentially as competition for other “embryos” inside the cloud, increasing the total number of hailstones and thus reducing each individual stone’s average size.
Smaller hailstones mean less damage when they reach the ground. The insurance companies—which continue to pay for the program—say losses have been cut by 50% since the program started, though scientists aren’t quite as confident in its overall success. A 2023 study published in Atmospheric Research examined 10 years of cloud seeding efforts in the province and found that the practice did appear to reduce the potential for damage in about 60% of seeded storms—while in others, it had no effect or was even associated with increased hail (though the authors said this could have been due to natural variation).
Similar techniques are also sometimes deployed to try to improve the daily forecast just a bit. During the 2008 Olympics, for instance, China engaged in a form of cloud seeding aimed at reducing rainfall. As MIT Technology Review detailed back then, officials with the Beijing Weather Modification Office planned to use a liquid-nitrogen-based coolant that could increase the number of water droplets in a cloud while reducing their size; this can get droplets to stay aloft a little longer instead of falling out of the cloud. Though it is tough to prove that it definitively would have rained without the effort, the targeted opening ceremony did stay dry.
So, where is this happening?
The United Nations’ World Meteorological Organization says that some form of weather modification is taking place in “more than 50 countries” and that “demand for these weather modification activities is increasing steadily due to the incidence of droughts and other calamities.”
The biggest user of cloud-seeding tech is arguably China. Following the work around the Olympics, the country announced a huge expansion of its weather modification program in 2020, claiming it would eventually run operations for agricultural relief and other functions, including hail suppression, over an area about the size of India and Algeria combined. Since then, China has occasionally announced bits of progress—including updates to weather modification aircraft and the first use of drones for artificial snow enhancement. Overall, it spends billions on the practice, with more to come.
Elsewhere, desert countries have taken an interest. In 2024, Saudi Arabia announced an expanded research program on cloud seeding—Delene, of the University of North Dakota, was part of a team that conducted experiments in various parts of that country in late 2023. Its neighbor the United Arab Emirates began “rain enhancement” activities back in 1990; this program too has faced outcry, especially after more than a typical year’s worth of rain fell in a single day in 2024, causing massive flooding. (Bloomberg recently published a story about persistent questions regarding the country’s cloud seeding program; in response to the story, French wrote in an email that the “best scientific understanding is still that cloud seeding CANNOT lead to these types of events.” Other experts we asked agreed.)
In the US, a 2024 Government Accountability Office report on cloud seeding said that at least nine states have active programs. These are sometimes run directly by the state and sometimes contracted out through nonprofits like the South Texas Weather Modification Association to private companies, including Doricko’s Rainmaker and North Dakota–based Weather Modification. In August, Doricko told me that Rainmaker had grown to 76 employees since it launched in 2023. It now runs cloud seeding operations in Utah, Idaho, Oregon, California, and Texas, as well as forecasting services in New Mexico and Arizona. And in an answer that may further fuel the conspiracy fire, he added they are also operating in one Middle Eastern country; when I asked which one, he’d only say, “Can’t tell you.”
What we cannot do
The versions of weather modification that the conspiracy theorists envision most often—significantly altering monsoons or hurricanes or making the skies clear and sunny for weeks at a time—have so far proved impossible to carry out. But that’s not necessarily for lack of trying.
The US government attempted to alter a hurricane in 1947 as part of a program dubbed Project Cirrus. In collaboration with GE, government scientists seeded clouds with pellets of dry ice, the idea being that the falling pellets could induce supercooled liquid in the clouds to crystallize into ice. After they did this, the storm took a sharp left turn and struck the area around Savannah, Georgia. This was a significant moment for budding conspiracy theories, since a GE scientist who had been working with the government said he was “99% sure” the cyclone swerved because of their work. Other experts disagreed and showed that such storm trajectories are, in reality, perfectly possible without intervention. Perhaps unsurprisingly, public outrage and threats of lawsuits followed.
It took some time for the hubbub to die down, after which several US government agencies continued—unsuccessfully—trying to alter and weaken hurricanes with a long-running cloud seeding program called Project Stormfury. Around the same time, the US military joined the fray with Operation Popeye, essentially trying to harness weather as a weapon in the Vietnam War—engaging in cloud seeding efforts over Vietnam, Cambodia, and Laos in the late 1960s and early 1970s, with an eye toward increasing monsoon rains and bogging down the enemy. Though it was never really clear whether these efforts worked, the Nixon administration tried to deny them, going so far as to lie to the public and even to congressional committees.
More recently and less menacingly, there have been experiments with Dyn-O-Gel—a Florida company’s super-absorbent powder, intended to be dropped into storm clouds to sop up their moisture. In the early 2000s, the company carried out experiments with the stuff in thunderstorms, and it had grand plans to use it to weaken tropical cyclones. But according to one former NOAA scientist, you would need to drop almost 38,000 tons of it, requiring nearly 380 individual plane trips, in and around even a relatively small cyclone’s eyewall to really affect the storm’s strength. And then you would have to do that again an hour and a half later, and so on. Reality tends to get in the way of the biggest weather modification ideas.
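A quick tally of those figures shows why. The sortie payload and daily tonnage below follow directly from the numbers reported above:

```python
# The scale problem with Dyn-O-Gel, per the former NOAA scientist's figures.

tons_needed = 38_000     # powder per application on a small cyclone's eyewall
plane_trips = 380        # sorties needed to deliver one application
reapply_hours = 1.5      # how often the dose must be repeated

print(f"payload per trip: {tons_needed / plane_trips:.0f} tons")        # 100 tons
print(f"tons needed per day: {tons_needed * 24 / reapply_hours:,.0f}")  # 608,000
```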
Beyond trying to control storms, there are some other potential weather modification technologies out there that are either just getting started or have never taken off. Swiss researchers have tried to use powerful lasers to induce cloud formation, for example; in Australia, where climate change is imperiling the Great Barrier Reef, artificial clouds created when ship-based nozzles spray moisture into the sky have been used to try to protect the vital ecosystem. In each case, the efforts remain small, localized, and not remotely close to achieving the kinds of control the conspiracy theorists allege.
What is not weather modification—but gets lumped in with it
Further fueling weather-control conspiracies is the tendency to conflate cloud seeding and other promising weather modification research with concepts such as chemtrails—a full-on conspiracist fever dream about innocuous condensation trails left by jets—and solar geoengineering, a theoretical stopgap to cool the planet that has been the subject of much discussion and modeling research but has never been deployed in any large-scale way.
One controversial form of solar geoengineering, known as stratospheric aerosol injection, would involve having high-altitude jets drop tiny aerosol particles—sulfur dioxide, most likely—into the stratosphere to act essentially as tiny mirrors. They would reflect a small amount of sunlight back into space, leaving less energy to reach the ground and contribute to warming. To date, attempts to launch physical experiments in this space have been shouted down, and only tiny—though still controversial—commercial efforts have taken place.
One can see why it gets lumped in with cloud seeding: bits of stuff, dumped into the sky, with the aim of altering what happens down below. But the goals are entirely different; geoengineering would alter the global average temperature rather than having measurable effects on momentary cloudbursts or hailstorms. Some research has suggested that the practice could alter monsoon patterns, a significant issue given their importance to much of the world’s agriculture, but it remains a fundamentally different practice from cloud seeding.
Still, the political conversation around supposed weather control often reflects this confusion. Greene, for instance, introduced a bill in July called the Clear Skies Act, which would ban all weather modification and geoengineering activities. (Greene’s congressional office did not respond to a request for comment.) And last year, Tennessee became the first state to enact a law to prohibit the “intentional injection, release, or dispersion, by any means, of chemicals, chemical compounds, substances, or apparatus … into the atmosphere with the express purpose of affecting temperature, weather, or the intensity of the sunlight.” Florida followed suit, with Governor Ron DeSantis signing SB 56 into law in June of this year for the same stated purpose.
This year, lawmakers in more than 20 other states have also proposed some version of a ban on weather modification, often lumping it in with geoengineering, even though caution on the latter is more widely accepted or endorsed. “It’s not a conspiracy theory,” one Pennsylvania lawmaker who cosponsored a similar bill told NBC News. “All you have to do is look up.”
Oddly enough, as Yeh of the University of Colorado points out, the places where bans have passed are states where weather modification isn’t really happening. “In a way, it’s easy for them to ban it, because, you know, nothing actually has to be done,” she says. In general, neither Florida nor Tennessee—nor any other part of the Southeast—needs any help finding rain. Basically, all weather modification activity in the US happens in the drier areas west of the Mississippi.
Finding a culprit
Doricko told me that in the wake of the Texas disaster, he has seen more people become willing to learn about the true capabilities of cloud seeding and move past the more sinister theories about it.
I asked him, though, about some of his company’s flashier branding: Until recently, visitors to the Rainmaker website were greeted right up top with the slogan “Making Earth Habitable.” Might this level of hype contribute to public misunderstanding or fear?
He said he is indeed aware that Earth is, currently, habitable, and called the slogan a “tongue-in-cheek, deliberately provocative statement.” Still, in contrast to the academics who seem more comfortable acknowledging weather modification’s limits, he has continued to tout its revolutionary potential. “If we don’t produce more water, then a lot of the Earth will become less habitable,” he said. “By producing more water via cloud seeding, we’re helping to conserve the ecosystems that do currently exist, that are at risk of collapse.”
While other experts cited that 10% figure as a likely upper limit of cloud seeding’s effectiveness, Doricko said they could eventually approach 20%, though that might be years away. “Is it literally magic? Like, can I snap my fingers and turn the Sahara green? No,” he said. “But can it help make a greener, verdant, and abundant world? Yeah, absolutely.”
It’s not all that hard to see why people still cling to magical thinking here. The changing climate is, after all, offering up what’s essentially weaponized weather, only with a much broader and long-term mechanism behind it. There is no single sinister agency or company with its finger on the trigger, though it can be tempting to look for one; rather, we just have an atmosphere capable of holding more moisture and dropping it onto ill-prepared communities, and many of the people in power are doing little to mitigate the impacts.
“Governments are not doing a good job of responding to the climate crisis; they are often captured by fossil-fuel interests, which drive policy, and they can be slow and ineffective when responding to disasters,” Naomi Smith, a lecturer in sociology at the University of the Sunshine Coast in Australia who has written about conspiracy theories and weather events, writes in an email. “It’s hard to hold all this complexity, and conspiracy theorizing is one way of making it intelligible and understandable.”
“Conspiracy theories give us a ‘big bad’ to point the finger at, someone to blame and a place to put our feelings of anger, despair, and grief,” she writes. “It’s much less satisfying to yell at the weather, or to engage in the sustained collective action we actually need to tackle climate change.”
The sinister “they” in Greene’s accusations is, in other words, a far easier target than the real culprit.
Dave Levitan is an independent journalist, focused on science, politics, and policy. Find his work at davelevitan.com and subscribe to his newsletter at gravityisgone.com.
On a gloomy Saturday morning this past May, a few months after entire blocks of Altadena, California, were destroyed by wildfires, several dozen survivors met at a local church to vent their built-up frustration, anger, blame, and anguish. As I sat there listening to one horror story after another, I almost felt sorry for the very polite consultants who were being paid to sit there, and who couldn’t do a thing about what they were hearing.
Hosted by a third-party arbiter at the behest of Los Angeles County, the gathering was a listening session in which survivors could “share their experiences with emergency alerts and evacuations” for a report on how the response to the Eaton Fire months earlier had succeeded and failed.
It didn’t take long to see just how much failure there had been.
This story is part of MIT Technology Review’s series “The New Conspiracy Age,” on how the present boom in conspiracy theories is reshaping science and technology.
After a small fire started in the bone-dry brush of Pasadena’s Eaton Canyon early in the evening of Tuesday, January 7, 2025, the raging Santa Ana winds blew its embers into nearby Altadena, the historically Black and middle-class town just to the north. By Wednesday morning, much of it was burning. Its residents spent the night making frantic, desperate scrambles to grab whatever they could and get to safety.
In the aftermath, many claimed that they received no warning to evacuate, saw no first responders battling the blazes, and had little interaction with official personnel. Most were simply left to fend for themselves.
Making matters worse, while no place is “good” for a wildfire, Altadena was especially vulnerable. It was densely packed with 100-year-old wooden homes, many of which were decades behind on the code upgrades that would have better protected them. It was full of trees and other plants that had dried out during the rain-free winter. Few residents or officials were prepared for the seemingly remote possibility that the fires that often broke out in the mountains nearby would jump into town. As a result, resources were strained to the breaking point, and many homes simply burned freely.
So the people packed into the room that morning had a lot to be angry about. They unloaded their own personal ordeals, the traumas their community had experienced, and even catastrophes they’d heard about secondhand. Each was like a dagger to the heart, met with head-nods and “uh-huhs” from people all going through the same thing.
LA County left us to die because we couldn’t get alerts!
I’m sleeping in my car because I was a renter and have no insurance coverage!
Millions of dollars in aid were raised for us, and we haven’t gotten anything!
Developers are buying up Altadena and pricing out the Black families who made this place!
The firefighting planes were grounded on purpose by Joe Biden so he could fly around LA!
One of these things was definitely not like the others. And I knew why.
Two trains collide
It’s something of a familiar cycle by now: Tragedy hits; rampant misinformation and conspiracy theories follow. Think of the deluge of “false flag” and “staged gun grab” conspiracy theories after mass shootings, or the rampant disinformation around covid-19 and the 2020 election. It’s often even more acute in the case of a natural disaster, when conspiracy theories about what “really” caused the calamity run right into culture-war-driven climate change denialism. Put together, these theories obscure real causes while elevating fake ones, with both sides battling it out on social media and TV.
I’ve studied these ideas extensively, having spent the last 10 years writing about conspiracy theories and disinformation as a journalist and researcher. I’ve covered everything from the rise of QAnon to whether Donald Trump faked his assassination attempt to the alarming rises in antisemitism, antivaccine conspiracism, and obsession with human trafficking. I’ve written three books, testified to Congress, and even written a report for the January 6th Committee. So this has been my life for quite a while.
Still, I’d never lived it. Not until the Eaton Fire.
My house, a cottage built in 1925, was one of those that burned back in January. Our only official notification to flee had come at 3:25 a.m., nine hours after the fires started. We grabbed what we could in 10 minutes, I locked our front door, and six hours later, it was all gone. We could have died. Eighteen Altadena residents did die—and all but one were in the area that was warned too late.
Previously in my professional life, I’d always been able to look at the survivors of a tragedy, crying on TV about how they’d lost everything, and think sympathetically but distantly, Oh, those poor people. And soon enough, the conspiracy theories I was following about the incident for work would die down, and then it was no longer in my official purview—I could move on to the next disaster and whatever mess came with it.
Now I was one of those poor people. The Eaton Fire had changed everything about my life. Would it change everything about my work as well? It felt as though two trains I’d managed to keep on parallel tracks had collided.
For a long time, I’d been able to talk about the conspiracy theories without letting them in. Now the disinformation was in the room with me, and it was about my life. And I wondered: Did I have a duty to journalism to push back on the wild thinking—or on this particular idea that Biden was responsible?
Or did I have a duty to myself and my sanity to just stay quiet?
Just true enough
In the days following the Eaton Fire, which coincided with another devastating fire in Los Angeles’ Pacific Palisades neighborhood, the Biden plane storyline was just one of countless rumors, false claims, hoaxes, and accusations about what had happened and who was behind them.
Most were culture-war nonsense or political fodder. I also saw clearly fake AI slop (no, the Hollywood sign was not on fire) and bits of TikTok ephemera that could largely be ignored.
Most of these claims came from something like an alternate world, one where forest floors hadn’t been “raked” and where incompetent “DEI firefighters” let houses burn while water waited in a giant spigot that California’s governor, Gavin Newsom, refused to “turn on” because he preferred to protect an endangered fish. There were claims that the fires were set on purpose to clear land for the Olympics, or to cover up evidence of human trafficking. Rumors flew that LA had donated all its firefighting money and gear to Ukraine. Some speculated that the fires were started by undocumented immigrants (one was suspected of causing one of the fires but never charged) or “antifa” or Black Lives Matter activists—never mind that one of the most demographically Black areas in the city was wiped out. Or, as always, it was the Jews. In this case, blame fell on a “wealthy Jewish couple” who supposedly owned most of LA’s water and wouldn’t let it go.
These claims originated from the same “just asking questions” influencers who run the same playbook for every disaster. And they spread rapidly through X, a platform where breaking news had been drowned out by hysterical conspiracism.
But many did have elements of truth to them, wrapped in layers of lies and accusations. A few were just true enough that they couldn’t be dismissed out of hand, even though they weren’t actually true.
So, for the record: Biden did not ground firefighting aircraft in Los Angeles.
According to fact-checking by both USA Today and Reuters, Biden flew into Los Angeles the day before the Eaton Fire broke out (which was also the same day that the Palisades Fire started, roughly 30 miles to the west), to dedicate two new national monuments. He left two days later. And while there were security measures in place, including flight restrictions over the area where he was staying, firefighting planes simply had to coordinate with air traffic controllers to cross into the closed-off space.
But when my sort-of neighbor brought up this particular theory that day in May, I wasn’t able to debunk it. For one thing, this was my first time hearing the rumor. But more than that, what could I say that would assuage this man’s anger? And if he wanted to blame Biden for his house burning down, was it really my place to tell him he was wrong—even if he was?
It’s common for survivors of a disaster to be aware of only parts of the story, struggle to understand the full picture, or fail to fully recollect what happened to them in the moment of survival. Once the trauma ebbs, we’re left looking for answers and clarity and someone who knows what’s going on, because we certainly don’t have a clue. Hoaxes and misinformation stem from anger, confusion, and a lack of clear answers to rapidly evolving questions.
I can confirm that it was dizzying. Rumors and hoaxes were going around in my personal circles too, even if they weren’t so lurid and even if we didn’t really believe them. Bits of half-heard news circulated constantly in our group texts, WhatsApp chains, Facebook groups, and in-person gatherings.
There was confusion over who was responsible for the extent of the devastation, genuine anger about purported LA Fire Department budget cuts (though the cuts were far smaller than conspiracists claimed), and fears that a Trump-controlled federal government would abandon California.
Many of the homes and businesses that we heard had burned down hadn’t, and others that we heard had survived were gone. In an especially heartbreaking early bit of misinformation, a local child-care facility shared a Facebook post stating that FEMA was handing out vouchers to pay 90% of your rent for the next three years—except FEMA doesn’t hand out rent vouchers without an application process. I quietly reached out to the source, who took it down.
In this information vacuum, and given my work, friends started asking me questions, and answering them took energy and time I didn’t have. Honestly, the “disinformation researcher” was largely just as clueless as everyone else.
Some of the questions were harmless enough. At one point a friend texted me about a picture from Facebook of a burned Bible page that survived the fire when everything else had turned to ash. It looked too corny and convenient to be real. But I had also found a burned page of Psalms that had survived. I kept it in a ziplock bag because it seemed like the right thing to do. So I told my friend I didn’t know if it was real. I still don’t—but I also still have that ziplock somewhere.
Under attack
As weeks passed, we began to deal with another major issue where truth and misinformation walked together: the reasonable worry that a new president who constantly belittled California would not be willing to provide relief funds.
Recovery depended on FEMA to distribute grants, on the EPA to clear toxic debris, on the Small Business Administration to make loans for rebuilding or repairing homes, on the Army Corps of Engineers to remove the detritus of burned structures, and so much more. How would this square with the new “government efficiency” mandate touting the trillions of dollars and tens of thousands of jobs to be cut from the federal budget?
Nobody knew—including the many kind government employees who spent months in Altadena helping us recover while silently wondering if they were about to be fired.
Many Altadena residents grew wary of accepting government assistance, particularly in the Black community, which already had a deep and well-earned distrust of the federal government. Many Black residents felt that their needs and stories were being left behind in the recovery, and feared they would be the first to be priced out of whatever Altadena would become.
Outreach in person became critical. I happened to meet the two-star general in charge of the Army Corps’ effort at lunch one day, as he and his team tried to find outside-the-box ways to engage with exhausted and wary residents. He told me they had tried to use technology—texts, emails, clips designed to go viral—but it was too much information, all apparently delivered in the wrong way. Many of the people they needed to reach, particularly older residents, didn’t use social media, weren’t able to communicate well via text, and were easy prey for sophisticated scammers. It was also easy for the real information to get lost as we got bombarded with communications, including many from hoaxers and frauds.
This, too, wasn’t new to me. Many of the movements I’ve covered are awash in grift and worthless wellness products. I know the signs of a scam and a snake-oil salesman. Still, I watched helplessly as my friends and my community, desperate for help, were turned into chum for cash-hungry sharks opening their jaws wide.
The community was hammered by dodgy contractors and fly-by-night debris removal companies, relief scams and phony grants, and spam calls from “repair companies” and builders. We dealt with scammers, grifters, squatters, thieves, and even tow truck companies that simply stole cars parked outside burned lots and held them for ransom. We were also victimized by looting: Abandoned wires on our lot were stripped for copper, and our neighbor’s unlocked garage was ransacked. After a decade of helping people recognize scams and frauds, there was little I could do when they came for us.
The fear of being conned was easily transmittable, even to me personally. After hearing of friends who couldn’t get a FEMA grant because a previous owner of their home had fraudulently filed an application, we delayed our own appointment with FEMA for weeks. The agency’s call had come so out of the blue that we were convinced it was fake. Maybe my job made me overcautious, or maybe we were just paralyzed by the sheer tonnage of decisions and calls that needed to be handled. Whatever the reason, the fear meant we later had to make multiple calls just to get our meeting rescheduled. It’s a small thing, but when you’re as exhausted and dispirited as we were, there are no small things.
Contractors for the US Army Corps of Engineers remove hazardous materials from a home destroyed in the Eaton Fire, near a burned-out car.
Making all this even more frustrating was that the scammers and the people spinning tales of lasers and endangered fish and antifa were very much ignoring the reality: Our planet is trying to kill us. While federal officials recently made an arrest in the Palisades Fire, the direct causes of that fire and the nearby Eaton Fire may take years of investigation and litigation to be fully known. But even now, it can’t reasonably be denied that climate change worsened the winds that made the fires spread more quickly.
The Santa Ana winds bombarding Southern California were among the worst ever to hit the region. Their ferocity drove the embers well beyond the nominal fire danger line, particularly in Altadena. Many landed in brush left brittle and dead by the decades-long drought plaguing California. And there was even more fuel to burn because the previous two winters had been among the wettest in the region’s recent history, spurring lush growth that later dried out. Such rapid swings between wet and dry or cold and hot have become so common around the world that they even have a name: climate whiplash.
Then there are the conspiracy theory gurus who see all this and make money off it, peddling disinformation on their podcasts and livestreams while blaming everyone and everything but the real causes. Many of these figures have spent decades railing against the very idea that the climate could change, and insisting that if it is changing, human consumption and urbanization have nothing to do with it. Faced with a disaster that undeniably showed climate change at work, their business models, which rely on sales of subscriptions and merchandise, demanded that they simply keep up the denial.
As more cities and countries deal with “once in a century” climate disasters, I have no doubt that these figures will continue to deflect attention away from human activity. They will use crackpot science, conspiracy theories, politics, and—increasingly—fake videos depicting whatever AI can generate. They will prey on their audiences’ limited understanding of basic science, their inability to perceive how climate and weather differ, and their fears that globalist power brokers will somehow use the weather against them. And their message will spread with little pushback from social media platforms more concerned with virality and shareholder value than truth.
Resisting the temptation
When you cover disinformation and live through an event creating a massive volume of disinformation, it’s like floating outside your body on an operating table as your heart is being worked on, while also being a heart surgeon. I knew I should be trying to help. But I did not have the mental capacity, the time, or, to be honest, the interest in covering what the worst people on the internet were saying about the worst time of my life. I had very real questions about where my family would live. Thinking about my career was not a priority.
But of course, these experiences cannot now be excised from my career. I’ve spent a lot of time talking about how trauma influences conspiracism; consider how the isolation and boredom of covid created a new generation of conspiracy theory believers. Now I had my own trauma, and avoiding the pit of despair has been a test of my abilities as a journalist and a thinker.
At the same time, I have a much deeper understanding of the psychology at work in conspiracy belief. One of the biggest reasons conspiracy theories take off after a disaster is that they serve to make sense out of something that makes no sense. Neighborhoods aren’t supposed to burn down in an era of highly trained firefighters and seemingly fireproof materials. They especially aren’t supposed to burn down in Los Angeles, one of the wealthiest cities on the planet. These were seven- and eight-figure homes going up like matches. There must be a reason, people figured. Someone, or something, must be responsible.
So, as I emerge from the haze to something resembling “normal,” I feel more compassion and understanding for trauma victims who turn to conspiracy theories. Having faced the literal burning down of my life, I get the urge to assign meaning to such a calamity and point a finger at whoever we think did it to us.
Meanwhile, the people of Altadena and Pacific Palisades continue to slowly put our lives and communities back together. The effects of both our warming planet and our disinformation crisis continue to assert themselves every day. It’s still alluring to look for easy answers in outrageous conspiracy theories, but such answers are not real and offer no actual help—only the illusion of help.
It’s equally tempting for someone who researches and debunks conspiracy theories to mock or belittle the people who believe these ideas. How could anyone be so dumb as to think Joe Biden caused the fire that burned down my home?
I kept my mouth shut that day at the meeting in the church. But, again, I can now sympathize much more deeply with beliefs I’d otherwise dismiss as completely inane.
But even a journalist who lost his house is still a journalist. So I decided early on that what I really needed to do was keep Altadena in the news. I went on TV and radio, blogged, and happily told our story to anyone who asked. I focused on the community, the impact, the people who would be working to recover long after the national spotlight moved to the next shiny object.
If there is a professional lesson to be taken from this nightmare, it might be that the people caught up in tragedies are exactly that: caught up. And those who believe this nonsense find something of value in it. They find hope and comfort and the reassurance that whoever did this to them will get what they deserve.
I could have done it too, throwing away years of experience to embrace conspiracist nihilism in the face of unspeakable trauma. After all, those poor people going through this weren’t just on my TV.
They were my friends. They were me. They could be anyone.
Mike Rothschild is a journalist and an expert on the growth and impact of conspiracy theories and disinformation. He has written three books, including The Storm Is Upon Us, about the QAnon conspiracy movement, and Jewish Space Lasers, about the myths around the Rothschild banking family. He also is a frequent expert witness in legal cases involving conspiracy theories and has spoken at colleges and conferences around the country. He lives in Southern California.
The new projects would include a Westinghouse reactor, like those used in the recent construction of two units at the Alvin W. Vogtle Electric Generating Plant in Waynesboro, Ga.
MIT Technology Review’s What’s Next series looks across industries, trends, and technologies to give you a first look at the future. You can read the rest of them here.
In the early 2020s, a little-known aquaculture company in Portland, Maine, snagged more than $50 million by pitching a plan to harness nature to fight back against climate change. The company, Running Tide, said it could sink enough kelp to the seafloor to sequester a billion tons of carbon dioxide by this year, according to one of its early customers.
Instead, the business shut down its operations last summer, marking the biggest bust to date in the nascent carbon removal sector.
Its demise was the most obvious sign of growing troubles and dimming expectations for a space that has spawned hundreds of startups over the last few years. A handful of other companies have shuttered, downsized, or pivoted in recent months as well. Venture investments have flagged. And the collective industry hasn’t made a whole lot more progress toward that billion-ton benchmark.
The hype phase is over and the sector is sliding into the turbulent business trough that follows, warns Robert Höglund, cofounder of CDR.fyi, a public-benefit corporation that provides data and analysis on the carbon removal industry.
“We’re past the peak of expectations,” he says. “And with that, we could see a lot of companies go out of business, which is natural for any industry.”
The open question is: If the carbon removal sector is heading into a painful if inevitable clearing-out cycle, where will it go from there?
The odd quirk of carbon removal is that it never made a lot of sense as a business proposition: It’s an atmospheric cleanup job, necessary for the collective societal good of curbing climate change. But it doesn’t produce a service or product that any individual or organization strictly needs—or is especially eager to pay for.
To date, a number of businesses have voluntarily agreed to buy tons of carbon dioxide that companies intend to eventually suck out of the air. But whether they’re motivated by sincere climate concerns or pressures from investors, employees, or customers, corporate do-goodism will only scale any industry so far.
Most observers argue that whether carbon removal continues to bobble along or transforms into something big enough to make a dent in climate change will depend largely on whether governments around the world decide to pay for a whole, whole lot of it—or force polluters to.
“Private-sector purchases will never get us there,” says Erin Burns, executive director of Carbon180, a nonprofit that advocates for the removal and reuse of carbon dioxide. “We need policy; it has to be policy.”
What’s the problem?
The carbon removal sector began to scale up in the early part of this decade, as increasingly grave climate studies revealed the need to dramatically cut emissions and suck down vast amounts of carbon dioxide to keep global warming in check.
Specifically, nations may have to continually remove as much as 11 billion tons of carbon dioxide per year by around midcentury to have a solid chance of keeping the planet from warming past 2 °C over preindustrial levels, according to a UN climate panel report in 2022.
A number of startups sprang up to begin developing the technology and building the infrastructure that would be needed, trying out a variety of approaches like sinking seaweed or building carbon-dioxide-sucking factories.
And they soon attracted customers. Companies including Stripe, Google, Shopify, Microsoft, and others began agreeing to pre-purchase tons of carbon removal, hoping to stand up the nascent industry and help offset their own climate emissions. Venture investments also flooded into the space, peaking in 2023 at nearly $1 billion, according to data provided by PitchBook.
From early on, players in the emerging sector sought to draw a sharp distinction between conventional carbon offset projects, which studies have shown frequently exaggerate climate benefits, and “durable” carbon removal that could be relied upon to suck down and store away the greenhouse gas for decades to centuries. There’s certainly a big difference in the price: While buying carbon offsets through projects that promise to preserve forests or plant trees might cost a few dollars per ton, a ton of carbon removal can run hundreds to thousands of dollars, depending on the approach.
That high price, however, brings big challenges. Removing 10 billion tons of carbon dioxide a year at, say, $300 a ton adds up to a global price tag of $3 trillion—a year.
Which brings us back to the fundamental question: Who should or would foot the bill to develop and operate all the factories, pipelines, and wells needed to capture, move, and bury billions upon billions of tons of carbon dioxide?
The state of the market
The market is still growing, as companies voluntarily purchase tons of carbon removal to make strides toward their climate goals. In fact, sales reached an all-time high in the second quarter of this year, mostly thanks to several massive purchases by Microsoft.
But industry sources fear that demand isn’t growing fast enough to support a significant share of the startups that have formed or even the projects being built, undermining the momentum required to scale the sector up to the size needed by midcentury.
To date, all those hundreds of companies that have spun up in recent years have disclosed deals to sell some 38 million tons of carbon dioxide pulled from the air, according to CDR.fyi. That’s roughly the amount the US pumps out in energy-related emissions every three days.
And they’ve only delivered around 940,000 tons of carbon removal. The US emits that much carbon dioxide in less than two hours. (Not every transaction is publicly announced or revealed to CDR.fyi, so the actual figures could run a bit higher.)
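For a sense of how these comparisons pencil out, here is a quick back-of-envelope check. The only outside number is my assumption for US energy-related CO2 emissions, roughly 4.8 billion metric tons a year (the approximate recent level); the rest are the CDR.fyi and cost figures cited above.

```python
# Back-of-envelope check of the scale comparisons above.
# Assumption: US energy-related CO2 emissions ~4.8 billion metric tons/year
# (approximate recent level; the exact figure varies year to year).

US_EMISSIONS_T_PER_YEAR = 4.8e9

contracted_t = 38e6    # tons of removal sold to date (CDR.fyi)
delivered_t = 940e3    # tons of removal actually delivered (CDR.fyi)

days = contracted_t / (US_EMISSIONS_T_PER_YEAR / 365)
hours = delivered_t / (US_EMISSIONS_T_PER_YEAR / (365 * 24))
print(f"Contracted removal ≈ {days:.1f} days of US energy emissions")   # ~2.9 days
print(f"Delivered removal ≈ {hours:.1f} hours of US energy emissions")  # ~1.7 hours

# And the headline cost arithmetic from earlier: 10 billion tons/year at $300/ton
print(f"Annual bill ≈ ${10e9 * 300 / 1e12:.0f} trillion")               # ~$3 trillion
```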
Another concern is that the same handful of big players continue to account for the vast majority of the overall purchases, leaving the health and direction of the market dependent on their whims and fortunes.
Most glaringly, Microsoft has agreed to buy 80% of all the carbon removal purchased to date, according to CDR.fyi. The second-biggest buyer is Frontier, a coalition of companies that includes Google, Meta, Stripe, and Shopify, which has committed to spend $1 billion.
If you strip out those two buyers, the market shrinks from 16 million tons under contract during the first half of this year to just 1.2 million, according to data provided to MIT Technology Review by CDR.fyi.
Signs of trouble
Meanwhile, the investor appetite for carbon removal is cooling. For the 12-month period ending in the second quarter of 2025, venture capital investments in the sector fell more than 13% from the same period last year, according to data provided by PitchBook. That tightening funding will make it harder and harder for companies that aren’t bringing in revenue to stay afloat.
Other companies that have already shut down include the carbon removal marketplace Nori; the direct air capture company Noya; and Alkali Earth, which was attempting to use industrial by-products to tie up carbon dioxide.
Still other businesses are struggling. Climeworks, one of the first companies to build direct-air-capture (DAC) factories, announced it was laying off 10% of its staff in May, as it grapples with challenges on several fronts.
The company’s plans to collaborate on the development of a major facility in the US have been at least delayed as the Trump administration has held back tens of millions of dollars in funding granted in 2023 under the Department of Energy’s Regional Direct Air Capture Hubs program. It now appears the government could terminate the funding altogether, along with perhaps tens of billions of dollars’ worth of additional grants previously awarded for a variety of other US carbon removal and climate tech projects.
“Market rumors have surfaced, and Climeworks is prepared for all scenarios,” Christoph Gebald, one of the company’s co-CEOs, said in a previous statement to MIT Technology Review. “The need for DAC is growing as the world falls short of its climate goals and we’re working to achieve the gigaton capacity that will be needed.”
But purchases from direct-air-capture projects fell nearly 16% last year and account for just 8% of all carbon removal transactions to date. Buyers are increasingly looking to categories that promise to deliver tons faster and for less money, notably including burying biochar or installing carbon capture equipment on bioenergy plants. (Read more in my recent story on that method of carbon removal, known as BECCS, here.)
CDR.fyi recently described the climate for direct air capture in grim terms: “The sector has grown rapidly, but the honeymoon is over: Investment and sales are falling, while deployments are delayed across almost every company.”
“Most DAC companies,” the organization added, “will fold or be acquired.”
What’s next?
In the end, most observers believe carbon removal isn’t really going to take off unless governments bring their resources and regulations to bear. That could mean making direct purchases, subsidizing these sectors, or getting polluters to pay the costs to do so—for instance, by folding carbon removal into market-based emissions reductions mechanisms like cap-and-trade systems.
More government support does appear to be on the way. Notably, the European Commission recently proposed allowing “domestic carbon removal” within its EU Emissions Trading System after 2030, integrating the sector into one of the largest cap-and-trade programs. The system forces power plants and other polluters in member countries to increasingly cut their emissions or pay for them over time, as the cap on pollution tightens and the price on carbon rises.
That could create incentives for more European companies to pay direct-air-capture or bioenergy facilities to draw down carbon dioxide as a means of helping them meet their climate obligations.
There are also indications that the International Civil Aviation Organization, a UN organization that establishes standards for the aviation industry, is considering incorporating carbon removal into its market-based mechanism for reducing the sector’s emissions. That might take several forms, including allowing airlines to purchase carbon removal to offset their use of traditional jet fuel or requiring the use of carbon dioxide obtained through direct air capture in some share of sustainable aviation fuels.
Meanwhile, Canada has committed to spend $10 million on carbon removal and is developing a protocol to allow direct air capture in its national offsets program. And Japan will begin accepting several categories of carbon removal in its emissions trading system.
Despite the Trump administration’s efforts to claw back funding for the development of carbon-sucking projects, the US does continue to subsidize storage of carbon dioxide, whether it comes from power plants, ethanol refineries, direct-air-capture plants, or other facilities. The so-called 45Q tax credit, which is worth up to $180 a ton, was among the few forms of government support for climate-tech-related sectors that survived in the 2025 budget reconciliation bill. In fact, the subsidies for putting carbon dioxide to other uses increased.
Even in the current US political climate, Burns is hopeful that local or federal legislators will continue to enact policies that support specific categories of carbon removal in the regions where they make the most sense, because the projects can provide economic growth and jobs as well as climate benefits.
“I actually think there are lots of models for what carbon removal policy can look like that aren’t just things like tax incentives,” she says. “And I think that this particular political moment gives us the opportunity in a unique way to start to look at what those regionally specific and pathway specific policies look like.”
The dangers ahead
But even if more nations do provide the money or enact the laws necessary to drive the business of durable carbon removal forward, there are mounting concerns that a sector conceived as an alternative to dubious offset markets could increasingly come to replicate their problems.
Various incentives are pulling in that direction.
Financial pressures are building on suppliers to deliver tons of carbon removal. Corporate buyers are looking for the fastest and most affordable way of hitting their climate goals. And the organizations that set standards and accredit carbon removal projects often earn more money as the volume of purchases rises, creating clear conflicts of interest.
Some of the same carbon registries that have long signed off on carbon offset projects have begun creating standards or issuing credits for various forms of carbon removal, including Verra and Gold Standard.
“Reliable assurance that a project’s declared ton of carbon savings equates to a real ton of emissions removed, reduced, or avoided is crucial,” Cynthia Giles, a senior EPA advisor under President Biden, and Cary Coglianese, a law professor at the University of Pennsylvania, wrote in a recent editorial in Science. “Yet extensive research from many contexts shows that auditors selected and paid by audited organizations often produce results skewed toward those entities’ interests.”
Noah McQueen, the director of science and innovation at Carbon180, has stressed that the industry must strive to counter the mounting credibility risks, noting in a recent LinkedIn post: “Growth matters, but growth without integrity isn’t growth at all.”
In an interview, McQueen said that heading off the problem will require developing and enforcing standards to truly ensure that carbon removal projects deliver the climate benefits promised. McQueen added that to gain trust, the industry needs to earn buy-in from the communities in which these projects are built and avoid the environmental and health impacts that power plants and heavy industry have historically inflicted on disadvantaged communities.
Getting it right will require governments to take a larger role in the sector than just subsidizing it, argues David Ho, a professor at the University of Hawaiʻi at Mānoa who focuses on ocean-based carbon removal.
He says there should be a massive, multinational research drive to determine the most effective ways of mopping up the atmosphere with minimal environmental or social harm, likening it to a Manhattan Project (minus the whole nuclear bomb bit).
“If we’re serious about doing this, then let’s make it a government effort,” he says, “so that you can try out all the things, determine what works and what doesn’t, and you don’t have to please your VCs or concentrate on developing [intellectual property] so you can sell yourself to a fossil-fuel company.”
Ho adds that there’s a moral imperative for the world’s historically biggest climate polluters to build and pay for the carbon-sucking and storage infrastructure required to draw down billions of tons of greenhouse gas. That’s because the world’s poorest, hottest nations, which have contributed the least to climate change, will nevertheless face the greatest dangers from intensifying heat waves, droughts, famines, and sea-level rise.
“It should be seen as waste management for the waste we’re going to dump on the Global South,” he says, “because they’re the people who will suffer the most from climate change.”
Correction (October 24): An earlier version of this article referred to Noya as a carbon removal marketplace. It was a direct air capture company.
Rondo Energy just turned on what it says is the world’s largest thermal battery, an energy storage system that can take in electricity and provide a consistent source of heat.
The company announced last week that its first full-scale system is operational, with 100 megawatt-hours of capacity. The thermal battery is powered by an off-grid solar array and will provide heat for enhanced oil recovery (more on this in a moment).
Thermal batteries could help clean up difficult-to-decarbonize sectors like manufacturing and heavy industrial processes like cement and steel production. With Rondo’s latest announcement, the industry has reached a major milestone in its effort to prove that thermal energy storage can work in the real world. Let’s dig into this announcement, what it means to have oil and gas involved, and what comes next.
The concept behind a thermal battery is overwhelmingly simple: Use electricity to heat up some cheap, sturdy material (like bricks) and keep it hot until you want to use that heat later, either directly in an industrial process or to produce electricity.
Rondo’s new system has been operating for 10 weeks and achieved all the relevant efficiency and reliability benchmarks, according to the company. The bricks reach temperatures over 1,000 °C (about 1,800 °F), and over 97% of the energy put into the system is returned as heat.
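The energy stored in hot brick is simple sensible heat, Q = m · c · ΔT, which makes rough sizing easy to sketch. The numbers below are assumptions on my part (generic firebrick properties and a plausible temperature swing), not Rondo’s published specifications:

```python
# Illustrative sizing of a 100 MWh brick thermal battery via Q = m * c * dT.
# Assumed values (not Rondo's actual specs): firebrick specific heat of
# ~1.0 kJ/(kg*K), discharged at 200 °C and charged to 1,000 °C.

CAPACITY_J = 100e6 * 3600      # 100 MWh expressed in joules
SPECIFIC_HEAT = 1000.0         # J/(kg*K), assumed firebrick value
DELTA_T = 1000 - 200           # usable temperature swing, in kelvin

mass_tonnes = CAPACITY_J / (SPECIFIC_HEAT * DELTA_T) / 1000
print(f"Brick mass needed ≈ {mass_tonnes:.0f} metric tons")  # ~450 t
```

Several hundred tons of brick is well within the scale of an industrial kiln, which is part of why the approach is attractive: the storage medium is cheap and the engineering is familiar.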
This is a big step from the 2 MWh pilot system that Rondo started up in 2023, and it’s the first of the mass-produced, full-size heat batteries that the company hopes to put in the hands of customers.
Thermal batteries could be a major tool in cutting emissions: 20% of total energy demand today is used to provide heat for industrial processes, and most of that is generated by burning fossil fuels. So this project’s success is significant for climate action.
There’s one major detail here, though, that dulls some of that promise: This battery is being used for enhanced oil recovery, a process where steam is injected down into wells to get stubborn oil out of the ground.
It’s a tricky look for a climate technology to prove its merit by helping harvest fossil fuels. Some critics argue that these sorts of techniques keep polluting infrastructure running longer.
When I spoke to Rondo founder and chief innovation officer John O’Donnell about the new system, he defended the choice to work with oil and gas.
“We are decarbonizing the world as it is today,” O’Donnell says. To his mind, it’s better to help an oil and gas company use solar power for its operation than leave it to continue burning natural gas for heat. Between cheap solar, expensive natural gas, and policies in California, he adds, Rondo’s technology made sense for the customer.
Having a willing customer pay for a full-scale system has been crucial to Rondo’s effort to show that it can deliver its technology.
And the next units are on the way: Rondo is currently building three more full-scale units in Europe. The company will be able to bring them online cheaper and faster because of what it’s learned from the California project, O’Donnell says.
The company also has the capacity to build more batteries, and to do it quickly: its factory in Thailand can make 2.4 gigawatt-hours’ worth of heat batteries today.
I’ve been following progress on thermal batteries for years, and this project obviously represents a big step forward. For all the promises of cheap, robust energy storage, there’s nothing like actually building a large-scale system and testing it in the field.
It’s definitely hard to get excited about enhanced oil recovery—we need to stop burning fossil fuels, and do it quickly, to avoid the worst impacts of climate change. But I see the argument that as long as oil and gas operations exist, there’s value in cleaning them up.
And as O’Donnell puts it, heat batteries can help: “This is a really dumb, practical thing that’s ready now.”
This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.
The crushed-up soda can disappears in a cloud of steam and—though it’s not visible—hydrogen gas. “I can just keep this reaction going by adding more water,” says Peter Godart, squirting some into the steaming beaker. “This is room-temperature water, and it’s immediately boiling. Doing this on your stove would be slower than this.”
Godart is the founder and CEO of Found Energy, a startup in Boston that aims to harness the energy in scraps of aluminum metal to power industrial processes without fossil fuels. Since 2022, the company has worked to develop ways to rapidly release energy from aluminum on a small scale. Now it’s just switched on a much larger version of its aluminum-powered engine, which Godart claims is the largest aluminum-water reactor ever built.
Early next year, it will be installed to supply heat and hydrogen to a tool manufacturing facility in the southeastern US, using the aluminum waste produced by the plant itself as fuel. (The manufacturer did not want to be named until the project is formally announced.)
If everything works as planned, this technology, which uses a catalyst to unlock the energy stored within aluminum metal, could transform a growing share of aluminum scrap into a zero-carbon fuel. The high heat generated by the engine could be especially valuable to reduce the substantial greenhouse-gas emissions generated by industrial processes, like cement production and metal refining, that are difficult to power with electricity directly.
“We invented the fuel, which is a blessing and a curse,” says Godart, surrounded by the pipes and wires of the experimental reactor. “It’s a huge opportunity for us, but it also means we do have to develop all of the systems around us. We’re redefining what even is an engine.”
Engineers have long eyed using aluminum as a fuel thanks to its superior energy density. Once it has been refined and smelted from ore, aluminum metal contains more than twice as much energy as diesel fuel by volume and almost eight times as much as hydrogen gas. When it reacts with oxygen in water or air, it forms aluminum oxides. This reaction releases heat and hydrogen gas, which can be tapped for zero-carbon power.
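The underlying chemistry is the textbook aluminum–water reaction, 2Al + 6H₂O → 2Al(OH)₃ + 3H₂, which releases heat as well as hydrogen. A minimal sketch with approximate handbook values (my assumptions, not Found Energy’s figures) shows how the energy-density comparison pencils out:

```python
# Energy bookkeeping for aluminum as a fuel, using approximate handbook
# values (my assumptions, not Found Energy's figures).
# Reaction: 2 Al + 6 H2O -> 2 Al(OH)3 + 3 H2, exothermic.

HEAT_PER_MOL_AL = 420e3    # J of heat per mol Al (approximate)
H2_HHV = 286e3             # J per mol H2, higher heating value
H2_PER_AL = 1.5            # mol H2 released per mol Al
MOLAR_MASS_AL = 0.027      # kg/mol
DENSITY_AL = 2700          # kg/m^3

energy_per_mol = HEAT_PER_MOL_AL + H2_PER_AL * H2_HHV       # ~849 kJ/mol Al
per_liter_mj = energy_per_mol / MOLAR_MASS_AL * DENSITY_AL / 1e9
print(f"~{per_liter_mj:.0f} MJ per liter of aluminum")      # ~85 MJ/L
print(f"vs. diesel at ~38 MJ/L: {per_liter_mj / 38:.1f}x")  # roughly 2x, as stated
```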
Liquid metal
The trouble with aluminum as a fuel—and the reason your soda can doesn’t spontaneously combust—is that as soon as the metal starts to react, an oxidized layer forms across its surface that prevents the rest of it from reacting. It’s like a fire that puts itself out as it generates ash. “People have tried it and abandoned this idea many, many times,” says Godart.
Some believe using aluminum as a fuel remains a fool’s errand. “This potential use of aluminum crops up every few years and has no possibility of success even if aluminum scrap is used as the fuel source,” says Geoff Scamans, a metallurgist at Brunel University of London who spent a decade working on using aluminum to power vehicles in the 1980s. He says the aluminum-water reaction isn’t efficient enough for the metal to make sense as a fuel given how much energy it takes to refine and smelt aluminum from ore to begin with: “A crazy idea is always a crazy idea.”
But Godart believes he and his company have found a way to make it work. “The real breakthrough was thinking about catalysis in a different way,” he says: Instead of trying to speed up the reaction by bringing water and aluminum together onto a catalyst, they “flipped it around” and “found a material that we could actually dissolve into the aluminum.”
The liquid metal catalyst at the heart of the company’s approach “permeates the microstructure” of the aluminum, says Godart. As the aluminum reacts with water, the catalyst forces the metal to froth and split open, exposing more unreacted aluminum to the water.
The composition of the catalyst is proprietary, but Godart says it is a “low-melting-point liquid metal that’s not mercury.” His dissertation research focused on using a liquid mixture of gallium and indium as the catalyst, and he says the principle behind the current material is the same.
During a visit in early October, Godart demonstrated the central reaction in the Found R&D lab, which after the company’s $12 million seed round last year now fills the better part of two floors of an industrial building in Boston’s Charlestown neighborhood. Using a pair of tongs to avoid starting the reaction with the moisture on his fingers, he placed a pellet of aluminum treated with the secret catalyst in a beaker and then added water. Immediately, the metal began to bubble with hydrogen. Then the water steamed away, leaving behind a frothing gray mass of aluminum hydroxide.
“One of the impediments to this technology taking off is that [the aluminum-water reaction] was just too sluggish,” says Godart. “But you can see here we’re making steam. We just made a boiler.”
From Europa to Earth
Godart was a scientist at NASA when he first started thinking about fresh ways to unlock the energy stored in aluminum. He was working on building aluminum robots that could consume themselves for fuel when roving on Jupiter’s icy moon Europa. But that work was cut short when Congress reduced funding for the mission.
“I was sort of having this little mini crisis where I was like, I need to do something about climate change, about Earth problems,” says Godart. “And I was like, you know—I bet this aluminum technology would be even better for Earth applications.” After completing a dissertation on aluminum fuels at MIT, he started Found Energy in his house in Cambridge in 2022 (the next year, he earned a place on MIT Technology Review’s annual 35 Innovators under 35 list).
Until this year, the company was working at a tiny scale, tweaking the catalyst and testing different conditions within a small 10-kilowatt reactor to make the reaction release more heat and hydrogen more quickly. Then, in January, it began designing an engine that’s 10 times larger, big enough to supply a useful amount of power for industrial processes beyond the lab.
This larger engine took up most of the lab on the second floor. The reactor vessel resembled a water boiler turned on its side, with piping and wires connected to monitoring equipment that took up almost as much space as the engine itself. On one end, there was a pipe to inject water and a piston to deliver pellets of aluminum fuel into the reactor at variable rates. On the other end, outflow pipes carried away the reaction products: steam, hydrogen gas, aluminum hydroxide, and the recovered catalyst. Godart says none of the catalyst is lost in the reaction, so it can be used again to make more fuel.
The company first switched on the engine to begin testing in July. In September, it managed to power it up to its targeted power of 100 kilowatts—roughly as much as can be supplied by the diesel engine in a small pickup truck. In early 2026, it plans to install the 100-kilowatt engine to supply heat and hydrogen to the tool manufacturing facility. This pilot project is meant to serve as the proof of concept needed to raise the money for a 1-megawatt reactor, 10 times larger again.
The initial pilot will use the engine to supply hot steam and hydrogen. But the energy released in the reactor could be put to use in a variety of ways across a range of temperatures, according to Godart. The hot steam could spin a turbine to produce electricity, or the hydrogen could produce electricity in a fuel cell. By burning the hydrogen within the steam, the engine can produce superheated steam as hot as 1,300 °C, which could be used to generate electricity more efficiently or refine chemicals. Burning the hydrogen alone could generate temperatures of 2,400 °C, hot enough to make steel.
Picking up scrap
Godart says he and his colleagues hope the engine will eventually power many different industrial processes, but the initial target is the aluminum refining and recycling industry itself, as it already handles scrap metal and aluminum oxide supply chains. “Aluminum recyclers are coming to us, asking us to take their aluminum waste that’s difficult to recycle and then turn that into clean heat that they can use to re-melt other aluminum,” he says. “They are begging us to implement this for them.”
Citing nondisclosure agreements, he wouldn’t name any of the companies offering up their unrecyclable aluminum, which he says is something of a “dirty secret” for an industry that’s supposed to be recycling all it collects. But estimates from the International Aluminium Institute, an industry group, suggest that globally a little over 3 million metric tons of aluminum collected for recycling currently goes unrecycled each year; another 9 million metric tons isn’t collected for recycling at all or is incinerated with other waste. Together, that’s a little under a third of the estimated 43 million metric tons of aluminum scrap that currently gets recycled each year.
Even if all that unused scrap were recovered for fuel, it would still supply only a fraction of the overall industrial demand for heat, let alone the overall industrial demand for energy. But the plan isn’t to be limited by available scrap. Eventually, Godart says, the hope is to “recharge” the aluminum hydroxide that comes out of the reactor by using clean electricity to convert it back into aluminum metal and react it again. According to the company’s estimates, this “closed loop” approach could supply all global demand for industrial heat by using and reusing a total of around 300 million metric tons of aluminum—around 4% of Earth’s abundant aluminum reserves.
However, all that recharging would require a lot of energy. “If you’re doing that, [aluminum fuel] is an energy storage technology, not so much an energy providing technology,” says Jeffrey Rissman, who studies industrial decarbonization at Energy Innovation, a think tank in California. As with other forms of energy storage like thermal batteries or green hydrogen, he says, that could still make sense if the fuel can be recharged using low-cost, clean electricity. But that will be increasingly hard to come by amid the scramble for clean power for everything from AI data centers to heat pumps.
Despite these obstacles, Godart is confident his company will find a way to make it work. The existing engine may already be able to squeeze out more power from aluminum than anticipated. “We actually believe this can probably do half a megawatt,” he says. “We haven’t fully throttled it.”
James Dinneen is a science and environmental journalist based in New York City.
Flowers play a key role in most landscapes, from urban to rural areas. There might be dandelions poking through the cracks in the pavement, wildflowers on the highway median, or poppies covering a hillside. We might notice the time of year they bloom and connect that to our changing climate. Perhaps we are familiar with their cycles: bud, bloom, wilt, seed. Yet flowers have much more to tell in their bright blooms: The very shape they take is formed by local and global climate conditions.
The form of a flower is a visual display of its climate, if you know what to look for. In a dry year, its petals’ pigmentation may change. In a warm year, the flower might grow bigger. The flower’s ultraviolet-absorbing pigment increases with higher ozone levels. As the climate changes in the future, how might flowers change?
Anthocyanins are red or indigo pigments that supply antioxidants and photoprotectants, which help a plant tolerate climate-related stresses such as droughts.
An artistic research project called Plant Futures imagines how a single species of flower might evolve in response to climate change between 2023 and 2100—and invites us to reflect on the complex, long-term impacts of our warming world. The project has created one flower for every year from 2023 to 2100. The form of each one is data-driven, based on climate projections and research into how climate influences flowers’ visual attributes.
More ultraviolet pigment protects flowers’ pollen against increasing ozone levels.
Under unpredictable weather conditions, the speculative flowers grow a second layer of petals. In botany, a second layer is called a “double bloom” and arises from random mutations.
Plant Futures began during an artist residency in Helsinki, where I worked closely with the biologist Aku Korhonen to understand how climate change affected the local ecosystem. While exploring the primeval Haltiala forest, research collaborator Monika Seyfried and I learned of the Circaea alpina, a tiny flower that was once rare in that area but has become more common as temperatures have risen in recent years. Yet its habitat is delicate: The plant requires shade and a moist environment, and the spruce population that provides those conditions is declining in the face of new forest pathogens. I wondered: What if the Circaea alpina could survive in spite of climate uncertainty? If the dark, shaded bogs turn into bright meadows and the wet ground dries out, how might the flower adapt in order to survive? This flower’s potential became the project’s grounding point.
The author studying historical Circaea samples in the Luomus Botanical Collections.
Outside the forest, Seyfried and I met with botanical experts in the Luomus Botanical Collections. I studied samples of Circaea flowers from as far back as 1906 and researched historical climate conditions in an attempt to understand how flower size and color related to a year’s temperature and precipitation patterns.
I researched how other flowering plants respond to changes to their climate conditions and wondered how the Circaea would need to adapt to thrive in a future world. If such changes happened, what would the Circaea look like in 2100?
We designed the future flowers through a combination of data-driven algorithmic mapping and artistic control. I worked with the data artist Marcin Ignac from Variable Studio to create 3D flowers whose appearance was connected to climate data. Using Nodes.io, we made a 3D model of the Circaea alpina based on its current morphology and then mapped how those physical parameters might shift as the climate changes. For example, as the temperature rises and precipitation decreases in the data set, the petal color shifts toward red, reflecting how flowers protect themselves with an increase in anthocyanins. Changes in temperature, carbon dioxide levels, and precipitation rates combine to affect the flowers’ size, density of veins, UV pigments, color, and tendency toward double bloom.

2025: Circaea alpina is ever so slightly larger than usual owing to a warmer summer, but it is otherwise close to the typical Circaea flower in size, color, and other attributes.

2064: We see a bigger flower with more petals, given an increase in carbon dioxide levels and temperature. The bull’s-eye pattern, composed of UV pigment, is bigger and messier because of an increase in ozone and solar radiation. A second tier of petals reflects uncertainty in the climate model.

2074: The flower becomes pinker, an antioxidative response to the stress of consecutive dry days and higher temperatures. Its size increases, primarily because of higher levels of carbon dioxide. The double bloom of petals persists as the climate model’s projections increase in uncertainty.

2100: The flower’s veins are densely packed, which could signal appropriation of a technique leaves use to improve water transport during droughts. It could also be part of a strategy to attract pollinators in the face of worsening air quality that degrades the transmission of scents.

2023–2100: Each year, the speculative flower changes. Size, color, and form shift in accordance with the increased temperature and carbon dioxide levels and the changes in precipitation patterns.
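To give a flavor of how such a data-to-form mapping can work, here is a minimal sketch in Python. The coefficients, thresholds, and parameter names are invented for illustration; the project’s actual mapping was built in Nodes.io and has not been published.

```python
# Illustrative sketch of data-driven flower morphology, in the spirit of
# Plant Futures. All coefficients and thresholds here are invented for
# the example; the project's real mapping is not public.
from dataclasses import dataclass

@dataclass
class FlowerParams:
    size_mm: float        # overall bloom diameter
    redness: float        # 0..1, proxy for anthocyanin content
    uv_bullseye: float    # 0..1, extent of UV-absorbing pigment
    double_bloom: bool    # second tier of petals under high uncertainty

def flower_for_year(temp_anomaly_c: float, co2_ppm: float,
                    dry_days: int, ozone_index: float,
                    model_uncertainty: float) -> FlowerParams:
    base_size = 2.0  # mm, roughly the scale of Circaea alpina's tiny bloom
    return FlowerParams(
        # warmth and CO2 fertilization both push size up
        size_mm=base_size * (1 + 0.05 * temp_anomaly_c + 0.0005 * (co2_ppm - 420)),
        # drought stress drives anthocyanin (red/pink) pigmentation
        redness=min(1.0, 0.1 + 0.01 * dry_days),
        # UV-absorbing bull's-eye grows with ozone and solar radiation
        uv_bullseye=min(1.0, 0.2 + 0.3 * ozone_index),
        # double bloom appears when model uncertainty crosses a threshold
        double_bloom=model_uncertainty > 0.5,
    )

print(flower_for_year(temp_anomaly_c=2.8, co2_ppm=600,
                      dry_days=35, ozone_index=1.2, model_uncertainty=0.7))
```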
In this 10-centimeter cube of plexiglass, the future flowers are “preserved,” allowing the viewer to see them in a comparative, layered view.
Based in Copenhagen, Annelie Berner is a designer, researcher, teacher, and artist specializing in data visualization.
Used in aviation, book and claim offers companies the ability to financially support the use of SAF even when it is not physically available at their locations.
As companies that ship goods by air or provide air-freight-related services pursue climate goals aimed at reducing emissions, sustainable aviation fuel (SAF) has never been more important. In its neat form, SAF has the potential to reduce life-cycle GHG emissions by up to 80% compared with conventional jet fuel.
In this exclusive webcast, leaders discuss the urgency for reducing air freight emissions for freight forwarders and shippers, and reasons why companies should use SAF. They also explain how companies can best make use of the book and claim model to support their emissions reduction strategies.
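As a rough illustration of how book and claim decouples the physical fuel from the environmental claim, here is a minimal generic sketch in code. It is a simplified model of the accounting concept, not Avelia’s actual system, and all names in it are hypothetical.

```python
# Generic sketch of book-and-claim accounting (not Avelia's actual system).
# Physical SAF is "booked" where it enters the fuel supply; its environmental
# attributes are "claimed" (retired, exactly once) by a buyer anywhere.

class BookAndClaimRegistry:
    def __init__(self):
        self._available = []   # booked, unclaimed attribute records
        self._claims = []      # retired claims, kept for auditability

    def book(self, batch_id: str, airport: str, tonnes_co2e_avoided: float):
        """Record a physical SAF delivery and mint its attributes."""
        self._available.append(
            {"batch": batch_id, "airport": airport, "tonnes": tonnes_co2e_avoided}
        )

    def claim(self, buyer: str, tonnes: float):
        """Retire attributes against a buyer; each tonne is claimed only once."""
        remaining = tonnes
        while remaining > 0 and self._available:
            rec = self._available[0]
            take = min(rec["tonnes"], remaining)
            rec["tonnes"] -= take
            remaining -= take
            self._claims.append({"buyer": buyer, "batch": rec["batch"], "tonnes": take})
            if rec["tonnes"] <= 0:
                self._available.pop(0)
        if remaining > 0:
            raise ValueError("not enough booked SAF attributes to cover claim")

registry = BookAndClaimRegistry()
registry.book("SAF-2025-001", "AMS", tonnes_co2e_avoided=100.0)
registry.claim("Acme Freight", 40.0)  # claimed without physically using the fuel
```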
Learn from the leaders
What book and claim is and how companies can use it
Why SAF use is so important
How freight-forwarders and shippers can both potentially utilise and contribute to the benefits of SAF
Featured speakers
Raman Ojha, President, Shell Aviation. Raman is responsible for Shell’s global aviation business, which supplies fuels, lubricants, and lower-carbon solutions, and offers a range of technical services globally. During almost 20 years at Shell, Raman has held leadership positions across a variety of industry sectors, including energy, lubricants, construction, and fertilisers. He has broad experience across both mature markets in the Americas and Europe and developing markets including China, India, and Southeast Asia.
Bettina Paschke, VP ESG Accounting, Reporting & Controlling, DHL Express. Bettina leads ESG accounting, reporting, and controlling at DHL Express, a division of DHL Group. She is responsible for ESG reporting, including EU Taxonomy reporting and carbon accounting, and has more than 20 years’ experience in finance. She is driving the sustainable aviation fuel agenda at DHL Express and is engaged in various industry initiatives to enable reliable book and claim transactions.
Christoph Wolff, Chief Executive Officer at Smart Freight Centre. Christoph leads programs focused on sustainability in freight transport. Before this role, he served as Senior Advisor and Director at ACME Group, a global leader in green energy solutions. He has also held positions including Managing Director at the European Climate Foundation and Senior Board Advisor at Ferrostaal GmbH, has worked at Novatec, Solar Millennium AG, DB Schenker, and McKinsey & Company, and has served as an Assistant Professor at Northwestern University’s Kellogg School of Management. Christoph holds multiple degrees from RWTH Aachen University and ETH Zürich, along with ongoing executive education at the University of Michigan.
This discussion is presented by MIT Technology Review Insights in association with Avelia. Avelia is a Shell-owned solution and brand developed with support from Amex GBT, Accenture, and Energy Web Foundation. The views of individuals not affiliated with Shell are their own and not those of Shell PLC or its affiliates. Cautionary note | Shell Global
This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff. It was researched, designed, and written by human writers, editors, analysts, and illustrators. AI tools that may have been used were limited to secondary production processes that passed thorough human review.
Not all offerings are available in all jurisdictions. Depending on jurisdiction and local laws, Shell may offer the sale of Environmental Attributes (for which subject to applicable law and consultation with own advisors, buyers might be able to use such Environmental Attributes for their own emission reduction purposes) and/or Environmental Attribute Information (pursuant to which buyers are helping subsidize the use of SAF and lower overall aviation emissions at designated airports but no emission reduction claims may be made by buyers for their own emissions reduction purposes). Different offerings have different forms of contracts, and no assumptions should be made about a particular offering without reading the specific contractual language applicable to such offering.