The first huge renewable-energy revolution—the one that dotted the US with hydroelectric dams and ultimately made power ubiquitous in every American home—started at a bankruptcy sale. In 1877, Jacob Schoellkopf went to an auction for a waterway owned by the Niagara Falls Canal Company. A succession of entrepreneurs had tried and failed to harness the ferocious power of the falling water. That night he told his wife, “Momma, I bought the ditch.”
Two years later, Thomas Edison made a light bulb that glowed for 40 continuous hours in his lab. Three years after that, Schoellkopf installed a generator below the falls to power 16 electric lamps above it.
Those first lights wowed tourists and gave people a sense of the powerful waterfall’s potential. But they didn’t reveal how to generate power that could travel long distances, never mind how to make a profit on it. For the next 14 years investors tried to harness the falls (one engineer proposed building a long tunnel beneath them to feed 38 vertical shafts with turbines that could power factories above), but everyone failed. It took Nikola Tesla’s invention of an efficient polyphase generator to transmit those electrons—and the sale of his patents to Westinghouse—to make hydro viable. In 1896 the “Cathedral of Power” started sending watts to the towns of Niagara and Buffalo, right next door.
But this 17-year sprint from the lab to Buffalo was, in a sense, only a proof of concept, what we might now call a demonstration project. It would be another quarter century before even a third of US homes got electricity. In 1905 there was a political backlash against the idea of diverting the public beauty of the falls for the gain of private companies. “Shall We Make a Coal-Pile of Niagara?” asked the Ladies’ Home Journal, sparking one of the first examples of federal legislation focused on the environment. The politics of power began to shift, as people realized how important it was; in 1912 a federal report noted that 60 percent of hydropower in the US was controlled by just two companies. In 1931, New York governor Franklin Delano Roosevelt created a state power authority that could act as a check on private monopolies, announcing that he was giving “back to the people the waterpower which is theirs.” It would take FDR’s national power initiatives to eventually wire all of rural America. Today Niagara Falls creates enough electricity to power 3.8 million homes, and hydro plants provide 16 percent of the world’s electricity.
Niagara’s long timeline is worth remembering as we get serious about reducing carbon emissions fast enough to keep average global temperature increases below 2 degrees Celsius by 2100. To accomplish this we will need to push many techno-Niagaras from the light-bulb-in-the-lab stage to full deployment around the world—within just a few decades. These days we tend to think of such energy revolutions—with all of their attendant bankruptcies and political backlashes—as impossible tasks. Or only for dreamers. But this is not true. In fact the United States has led such sweeping technological revolutions before, and we could do it again. But we’ll need to dismantle some old myths and ideologies about who bankrolls innovation and who benefits.
Americans are, in general, complacent about innovation, assuming the solution to our energy problems is one brilliant new mind away. A few more Elon Musks and we’ll be saved. But it’s been obvious for nearly a decade that the private sector isn’t getting us where we need to go. In 2011 there were 1,256 patents filed for global-warming-related energy technologies; by 2018, only 285 were filed. And US venture capitalists, long seen as the drivers of global innovation, have been eschewing the cleantech sector since their investments peaked at over $7.5 billion in 2011. They invested less than $2.4 billion in 2019. Today’s VCs, with their focus on quick profitability, would see the transformative powers of Niagara Falls as nothing more than a bankrupt ditch.
Nor can we rely on the traditional high-carbon energy companies that sell oil, gas, and electricity to lead us into a clean energy transition because, in addition to bankrolling opposition to climate change, they are heavily vested in an infrastructure and business model that stands to be overturned by new technology.
It’s increasingly clear, then, that the kind of fast, transformative technology development and adoption we need will require the government to take the lead.
Right about now, people usually start to mouth the phrase moon shot—in homage to the taxpayer-funded innovation binge that started in 1961 and ended in 1972, organized around the discrete goal of putting a man on the moon and bringing him safely home. Whenever Americans pine for new ways to solve problems, that’s the go-to nomenclature: Google X wanted a moon shot; the NIH has a cancer moon shot; environmentalists and labor created an “Apollo Alliance” in 2003. Little wonder that the moon shot is so attractive in hindsight: It had a single, clearly stated goal; it united Americans during a decade of upheaval; it resulted in one giantly successful step; and it spun off other advances. But in a way, the cult of the moon shot actually understates what government can do. A decade, the lesson seems to be, is about as long as the American public can stand to bankroll its geeks and wizardesses to make gizmos.
To get to net-zero emissions, though, we need not one decade but many. And the task of truly reducing emissions while coping with a changing planet is vastly murkier than depositing a human on a rock. First we need to mightily improve the nascent technologies underlying electric vehicles, energy efficiency, and advanced renewable energy storage—and get them into general use. At the same time, we have to foster technology that is in earlier stages (like carbon capture, fuel cells, and sustainable biofuels) out of labs and into large-scale demonstration projects where they can be tested and tinkered with until they can be scaled up. Finally, we need to explore and develop tech that’s barely visible on the horizon, like new types of nuclear reactors and methods of capturing carbon directly from the air. And as we go along, each technology will bring its own challenges, while new crises arise, literally, from the atmosphere. We have to be ready for that.
We’re talking about at least 30 years of taxpayer-led investment in innovation, probably more. This is no moon shot; this is an entire cold war. In fact, the Cold War itself is a very useful and instructive precedent for anyone who wants to bring the full might of the US government to bear on a warming planet. “The planners who started to contend with the Cold War didn’t know what it was and how long it would take, and yet they committed resources to dealing with it,” says Daniel Sarewitz of the Consortium for Science, Policy & Outcomes at Arizona State University. “It’s similar to the emergent issue of the climate—where we’ll ultimately manage it with many technologies rather than solving it with a single one.”
An era of government-led technological innovation, modeled after the loose bipartisan consensus over the strategy of containment that guided us in the Cold War, would be equal to the task of cooling the planet. Not only that, but the complex federal machinery that delivered some of the greatest innovations of the mid-20th century is still lying around, waiting to be fired up and duly aimed.
Immediately after World War II ended, funding for military technology fell dramatically. Nuclear weapon and jet engine development slowed, while US troops in South Korea, outfitted with obsolete weapons, suffered defeats that inspired the military to get directly involved in research. Vannevar Bush, who had been the director of the wartime Office of Scientific Research and Development, argued in a 1945 report titled “Science—The Endless Frontier” that American peace and prosperity required significant government investment in innovation. Bush advocated heavy spending on curiosity-driven science in university labs as well as funding for federal laboratories like those that had been part of the Manhattan Project. Under the existential threat of nuclear war, US leaders embraced Bush’s vision of science, combined with military development of technology, as a path forward in an uncertain time.
The Cold War inspired the creation of several key publicly funded organizations, many of them military, that have reconfigured the nation’s economy, and the world’s, through a series of transformative technology booms. The Defense Advanced Research Projects Agency (Darpa), which was founded by President Eisenhower in 1958 as a response to Sputnik, has been credited with laying the groundwork for the internet, Wi-Fi, supercomputing, desktop computing, GPS, robotics, artificial intelligence, drones, and voice recognition. Through the ’50s and ’60s, the Department of Defense learned how to best use its position as a primary customer to spur industries to create better and more innovative technologies—a process that has brought to market three of the most important energy technologies of the past century: nuclear power, sophisticated and efficient turbines, and solar photovoltaic tech. (The depth of the military’s influence on the US economy is so profound that, to understand its role, I found myself reading an economics book titled Is War Necessary for Economic Growth? The answer was, with some qualifications, yes.)
As Arati Prabhakar, who led Darpa from 2012 to 2017, explained to me, “We are very good at innovating in this country for the things that we set out to innovate for in 1945: national security, which led to changes in information technology, and health, which became biomedicine. And I don’t think it’s an accident that that’s what we’re good at now—because those were precisely the things that we focused on.”
The military has been successful at creating tech for a few reasons: As Prabhakar suggested, it sets priorities for problems it wishes to solve and then pursues multiple technological pathways. What’s more, it perseveres without caring excessively about costs.
Take Darpa itself. According to MIT’s Bill Bonvillian, who has studied the agency’s role in innovation for more than two decades, Darpa’s greatest advantage is its uniquely nimble, collaborative, mission-driven culture, where managers move back and forth between research and application, creating communities among researchers and industry. “In most R&D agencies, the critical decision is awarding the grant,” he says. “In Darpa, the managers award the grants and then move into the researcher’s home.”
In addition to providing what economists call the “technology push” by funding foundational science through Darpa, the military also excels at creating a “demand pull” by partnering with industry to develop the products, mounting large-scale demonstration projects, and being an early-adopting customer with deep pockets. Many of these innovations have made their way into civilian life.
Every time you board a 737, for example, you are experiencing the result of the Army’s demand pull in the world economy. In the early ’60s, Army and NASA engineers set out on a program of basic and applied research to radically change the way they understood jet engines, in a bid to make them much more energy-efficient. As researcher John Alic has documented, they went deep into the physics of the machines, studying the way air flowed over the blades and how metals behave at high temperatures. They funded basic research on rare earth magnets at university labs and developed ceramic coatings that are now standard for high-temperature uses. With the Army spending billions of dollars on research and then purchasing expensive products that spun out of it—like Apache helicopter blades—not only did jet engines become more efficient and reliable, the private sector adopted and built off of the new technologies to create civilian products—like that passenger aircraft, the turbines in gas-fired power plants, and even the magnets that run the electric windows in your car.
The US has wallowed in the politics of climate despair since the late 1990s, so it may be hard to accept what I’m going to say next: We could fairly quickly adapt our existing federal technology innovation system to work on the tech we need to decarbonize energy at a scale that would have real impact. (What’s more, by shifting innovation from military applications to civilian ones, we’d be building a country where war is no longer necessary for economic growth. But that’s a different conversation.)
As it happens, we’ve already successfully cloned Darpa to create a civilian entity that works exclusively on energy and the climate. In 2009, Congress budgeted $400 million to the Advanced Research Projects Agency-Energy (Arpa-E) at the Department of Energy. It even staffed it with former employees of Darpa. Though it has a small budget (these days, one-tenth of Darpa’s), Arpa-E is widely considered a success. By 2018, the agency had funded 660 early-stage energy innovation projects, including innovative batteries that could be used to back up renewable energy on the grid, floating offshore wind technology, and new systems for maintaining advanced nuclear reactors.
It would not be hard to combine Arpa-E’s early-stage development work with the Department of Defense’s knack for scaling technology into practical uses. Dorothy Robyn, a former deputy undersecretary of defense who is now a senior fellow at Boston University’s Institute for Sustainable Energy, argues that we should significantly increase funding to Arpa-E and then have it work with Darpa and the Department of Defense to mount large-scale projects to develop things like microgrids, advanced solar photovoltaic cells, and energy storage facilities at military bases and other properties. “It’s low-hanging fruit,” she told me.
So how could this happen? First, the president or Congress would need to define carbon as an existential threat and make decarbonization a general mission. Then, task the military and the national labs—and many other government agencies—with committing resources to the rapid development and deployment of technology to accomplish the mission together.
Of course, getting government entities involved in bringing technology to market will require them to change the way they approach their work. Consider another venerable Cold War asset, the country’s network of 17 national labs, which are part of the DOE. While several of the national labs have programs that match scientists with money, mentors, and expertise to form startups, in general the labs now focus on basic research and try to stay above the fray of commerce. As Prabhakar observes, “If you wanted to make a caricature of it, you’d say people [at the labs] are afraid to actually have an impact. Over time the mission of a lot of public funding and basic research has been just to focus on publication, citations—which are important but don’t suffice to meet societal needs.”
Another candidate in need of change is American industry; though Cold War behemoths like IBM, McDonnell Douglas, and General Dynamics once developed everything from semiconductors to jet engines, profiting from the process, they are no longer at the forefront of innovation. According to Ilan Gur—former Arpa-E program manager and the current head of nonprofit Activate, which offers fellowships at federally funded labs to cleantech scientists to start businesses—“Today’s industry is not incentivized by Wall Street to do all the speculative work to develop those technologies themselves.” Gur supports dramatically increasing funding to Arpa-E, but he—and others—also point out that we’ll need to entice big manufacturers to jump in as well. “The force multipliers come from engaging industry—you’re not going to win a lot of these games by just sprinkling budget dust in at the early stages.”
As powerful as government capital can be in a time of international urgency, there are also two relatively new sources of “budget dust” that could help carry risky but necessary technology over the ditch and into the market.
The first is really a reinvention of another Cold War idea: venture capital. The original VC company, American Research and Development Corporation, was formed in 1946 to invest in “noble” technology created by the war effort. When that fund invested $200,000 in a firm that made machines to deliver radiation to cancerous tumors, one of the VC founders, MIT president Karl Compton, observed that they didn’t expect the company to make money, but the “ethics of the thing and the human qualities of treating cancer” made up for that. Then, almost accidentally, the company—High Voltage Engineering Company—turned out to be worth $1.8 million when it went public in 1955. The VC made even more money when another investment, Digital Equipment Corporation, went public in 1966. Soon what had been “noble” capital started to become moneymaking capital; tax laws were changed, pension funds jumped in, and venture funds became a giant profit-seeking asset class that proudly compared itself to a shark.
There is now a broad movement afoot to return the venture capital model to its philanthropic roots, specifically where climate change is concerned.
Bill Gates’ Breakthrough Energy Ventures and, more recently, Jeff Bezos’ Earth Fund are both multibillion-dollar philanthropic entities that act, essentially, like very risk-tolerant angel investors. There are others, too, including Arati Prabhakar’s Actuate, which plans to use philanthropic funds to do interdisciplinary research with a social payoff. The Cambridge, Massachusetts-based Prime Impact Fund, which draws from multiple sources of philanthropic wealth, issues long-term loans to startups that promise to launch “gigaton-scale emissions projects” like extracting lithium sustainably, pulling carbon dioxide from the atmosphere, and heating and cooling in environmentally friendly ways. If an investment yields returns, those can be reinvested or contributed to another philanthropic cause. If investments don’t work out (they are high-risk, so of course some go bust), the contribution will be viewed much the same as a traditional grant.
If the idea of giving billionaires tax breaks while they decide which climate technologies get angel funding makes you nervous, there is a more democratic option—green banks, which use public capital as seed money to make low-interest loans to companies with emissions-reducing technology. Green banks have some bipartisan support, and a recent House proposal suggested endowing a nonprofit national climate bank with $35 billion in federal funds. Reed Hundt, founder of the Coalition for Green Capital, says that such a public investment would be leveraged to borrow $350 billion, which could then be loaned to projects that have the potential to reduce carbon emissions significantly. By reinvesting this money as the loans are paid off, he says, the scheme could put $1 trillion into early-stage technology over the next 30 years.
Green banks could be coupled with other public initiatives like government-backed green bonds, or even something like war bonds, which would allow individual investors to put their retirement money to work supporting an environment they wouldn’t mind growing old in. Hundt sees green capital expansively: “The goal here is to have renewables provide cheap and clean power to 100 percent of humanity really, really quickly, while at the same time shoving the carbon industry into the past.”
This sounds wonderful, doesn’t it? We already have the tools, we have the people and the programs, we even have a decent amount of capital. So why aren’t we already making the future happen faster and shoving carbon into the past?
It’s ironic, but in many ways all these Cold War institutions and the relatively exotic new sources of philanthropic and green capital are more shovel-ready than the mind of the American voter. What’s wrong with us? The answer, I think, is that we have been conditioned to be passive about technological growth, and after years of arguing over whether climate change is occurring, we’ve also become resigned to the idea that tackling it in a robust way is politically impossible. It is time for us to reexamine these myths—and also to design a new innovation system that benefits more people more directly.
Blame a legacy of Cold War secrecy, as well as a much more recent dogma that relentlessly celebrates individual entrepreneurs. The economist Mariana Mazzucato, director of the Institute for Innovation and Public Purpose at University College London, has spent years studying the way the US government uses taxpayer funding for innovation. She points out that the system has long socialized the risks of bringing technology to market while privatizing the gains when entrepreneurs such as Steve Jobs applied that technology to consumer goods. In other words, a lot of innovative tech that has made some people rich was built on public investment, but taxpayers have no idea they underwrote the whole thing.
Mazzucato suggests that taxpayer-funded innovation should instead put us in control—by including ways for citizens to influence policy, transparency in funding, and ways for the funders—us—to profit. And politicians should start talking about taxpayer investments in technology as a source of pride. “You’re part of this massive shift in global capitalism, greening production, distribution, consumption patterns—it kind of makes you happy to be alive!”
But what about the politics? For the past 25 years, the challenge has been getting the political system to simply buy into the reality of climate change. Because that was a long and exhausting war to which many people dedicated their careers, it’s still the struggle that transfixes the people who write and worry about the environment. Meanwhile the climate itself has moved on, and soon the discussion will too. It’s already happening: Republicans have begun proposing carbon taxes on the floor of Congress. As the future unfolds with one Australian fire or Indonesian flood after another, magnified by social media, investing in climate technology will become a point of bipartisan agreement.
Anyway, as Niagara Falls showed, technology changes politics almost faster than it changes the world. Building a better, cheaper solar panel could accommodate any number of ideological positions, from support for a Green New Deal to a wonk’s preference for cap and trade, a Republican carbon tax, or a more libertarian turn toward local microgrids. Or, for that matter, a 21st-century FDR could reincarnate and entirely nationalize the electrical grid. We should anticipate these shifts by deploying technology in ways that give more power to the very people who have funded its development.
When we do begin to decarbonize our world, there will be new challenges: We’ll need to get used to the very weirdness and randomness of faster innovation—the notion that what starts with light bulbs over a waterfall winds up sparking an environmental movement and handheld computers filled with cat memes. This is what Activate’s Ilan Gur calls “the stochastic nature of innovation”—the sheer unpredictability of what happens when a technology hits the complex system that includes markets, global societies, and the planet’s climate. “But the one thing we know is that if you don’t define the horizon of change that you want to see, and you don’t plant those seeds of innovation, then you won’t ever get there.”
LISA MARGONELLI (@LisaMargonelli) is the author of, most recently, Underbug: An Obsessive Tale of Termites and Technology.
This article appears in the April issue.