
Podcast
Columbia Energy Exchange

Balancing AI’s Growing Energy Demands

Guests

Jason Bordoff and Jared Dunnmon

Transcript

Jared Dunnmon: If it’s going to be an option for the US, you actually have to invest in it because other countries are driving down the cost curve, and if you’re not taking advantage of the economic mass that’s pumping into building reactors, certainly domestically, but around the world as well, you’re going to lose from a cost perspective eventually. And the data centers are then going to go where the power is cheap. So there’s a long game here that’s becoming apparent.

Bill Loveless: The artificial intelligence boom is fueling a massive uptick in energy demand globally. A Goldman Sachs report from earlier this year claimed that processing a single ChatGPT query requires almost 10 times the amount of electricity as a single Google search. But it’s not just ChatGPT queries driving up demand. As we transition to more renewable energy sources, AI is becoming critical to managing and improving efficiency across our electric grid. So how are some of the biggest American tech companies securing the power they need to meet demand? They’re going nuclear.

Tech giant Microsoft recently secured a deal to restart the last functional reactor at Three Mile Island with access to 100% of the power generated. And Amazon announced a $500 million investment to develop small modular nuclear reactors. It’s a sign that large tech companies see data centers and the AI they enable as critical to their futures.

So how does this renewed interest in nuclear power factor into our overall transition to clean energy? What role should the US government play in managing this growth? And what should the next administration keep in mind as it begins to set policy around this? This is Columbia Energy Exchange, a weekly podcast from The Center on Global Energy Policy at Columbia University. I’m Bill Loveless.

Today on the show, Jason Bordoff and Jared Dunnmon. Besides being the other host of this podcast, Jason is the founding director of The Center on Global Energy Policy at Columbia University School of International and Public Affairs. He’s also a professor of professional practice in international and public affairs, the co-founding dean emeritus at the Columbia Climate School, and a former senior director on the staff of the US National Security Council. Jared is a non-resident fellow at The Center on Global Energy Policy. He’s also a former technical director for artificial intelligence at the US Defense Department’s Defense Innovation Unit.

I talked to them about their latest co-written column for Foreign Policy, titled “America’s AI Leadership Depends on Energy.” I hope you enjoy our conversation. Jared Dunnmon, welcome to Columbia Energy Exchange.

Jared Dunnmon: Thanks for having me.

Bill Loveless: And Jason, it’s not often that you and I get to be together on the air. I look forward to it.

Jason Bordoff: I’m a little intimidated being a guest. It’s much easier being a host.

Bill Loveless: It’s a much different position to be in, huh?

Jason Bordoff: Yeah.

Bill Loveless: I control the mic.

Jason Bordoff: Be nice to me. Fortunately, Jared’s here to answer the hard questions.

Bill Loveless: I’m sure it’s all going to work out fine. Jared, before we get going, tell us a little bit about yourself.

Jared Dunnmon: Sure thing. I’m an engineer by training. I spent most of my career on the academic side building energy systems of various types, so flexible fuel combustion, wind energy. I spent some time on nuclear, et cetera. I got very annoyed with a sensor I was using during my PhD. Around this time, the deep neural networks were starting to work pretty well. Ended up spending some time in the machine learning world and did a post-doc in the AI lab over at Stanford. One thing led to another, helped start a company out of that, and then spent about three years in the DOD running the AI portfolio at the Defense Innovation Unit in the Pentagon as its technical director. After that, since I’ve been back in the private sector, I’ve been working on a couple of companies, and then starting to spend some time on thinking about the intersection of some of the things that I’ve worked on in my career, which are AI, energy, and security.

Bill Loveless: Yeah, yeah. So this is a topic in which you are well versed.

Jared Dunnmon: It is a strange confluence of the things that I’ve spent my time on in life, yes.

Bill Loveless: Yeah. It’s also timely now. Well, Jason, you and Jared start your essay by recalling the movie The China Syndrome, a blockbuster that was released in 1979. Jason, this is before many of our listeners were born, and others may not recall the movie, so fill us in.

Jason Bordoff: It’s a great movie. See if I get this right, Jane Fonda, Michael Douglas, Jack Lemmon. You can correct me if I got that wrong, Bill. So just a great movie and it was super popular in Hollywood. And it was called The China Syndrome because it was a disaster at a nuclear power plant, and the idea was that the core could melt down all the way through the Earth to the other side, to China. That was sort of where the phrase came from. It was this kind of horror story of how nuclear energy could go wrong at a time in the late 1970s when nuclear was already controversial. I’m sitting here in my office, and in addition to the picture of President Obama on the wall, the only other person on the wall is Bruce Springsteen.

And I’m thinking of that because there was this famous concert, an anti-nukes concert here at Madison Square Garden in New York, which Springsteen gave one of his best performances ever. So this was the mood of the time, there was a lot of concern about nuclear power. And then of course, what happened was just a few weeks after China Syndrome came out, the Three Mile Island nuclear disaster happened. And in retrospect, we know that a lot of things went wrong that day, but in the end, the system worked to prevent large scale fallout of radiation that would’ve … Zero people died from Three Mile Island, and it was a disaster to be sure, but it was far from as horrible as say Chernobyl or something.

But nonetheless, I think it kind of all came to a head in a moment where a real world disaster happening just a few weeks after this Hollywood disaster really gave a huge amount of momentum to the anti-nuclear community, and nuclear was already facing challenges at the time. But in many ways, it really put the final nail in the coffin for new nuclear capacity, which there had been plans to build a lot of in this country, particularly after the Arab Oil Embargo. Remember, this is all just a couple of years after we said, “You know what, our dependence on other parts of the world, particularly the Middle East for oil, is a problem.”

At the time, we were actually producing some amount of electricity from oil. We don’t do that much anymore, so there was a huge push to find other ways to produce electricity. But that really put an end to the idea that we would expand nuclear, the use of nuclear power in this country for quite a while.

Bill Loveless: Right, right. Jason, I love how you managed to work Bruce Springsteen into your discussion [inaudible 00:07:27].

Jason Bordoff: In the first answer.

Bill Loveless: Have to play, we need some music perhaps in the background here. But that certainly was a big setback for nuclear. I recall it well. I was just beginning to cut my teeth at the time as an energy reporter. Now, 45 years later, nuclear energy is enjoying some of its most hopeful moments since Three Mile Island, with announcements by tech giants like Amazon, Google, Microsoft. They’re investing in nuclear power to support their burgeoning fleets of data centers. Let’s get to those nuclear stories in a moment. But Jared, first remind us of the impacts of these data centers on the electric grid.

Jared Dunnmon: I mean, at a high level, you’re just looking at a substantial amount of demand. And there are a couple of aspects of that demand that are interesting. First of all, it’s concentrated in a sense that you have a large facility that’s going in. And that facility is going to suck up power at not a constant rate, but a pretty reliable rate. And it’s big enough that there are arguments for having facilities from a power generation perspective, that just power those data centers because the demand is big enough. And so there’s an opportunity here for behind the meter or even off grid arrangements that power these facilities without actually having a huge impact on “the grid.”

On the other hand, what that means is that if you, for instance, have a deal that occurs that takes power that used to be on the grid and uses it to power one of these facilities, you are having a major impact on the grid. The second interesting piece about this demand is that it’s projected to grow, and depending on which projection you believe, that growth could be pretty substantial. You’re looking at single or arguably almost approaching double-digit percentage growth between now and 2030 by a number of projections.
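
To make the growth framing concrete, here is a small arithmetic sketch (an editorial illustration, not something from the conversation); the starting demand and the 5% and 9% annual rates are assumptions chosen only to bracket the “single to near double-digit” range Dunnmon mentions.

```python
# Illustrative compound-growth arithmetic for data center electricity demand.
# base_twh and the growth rates are assumptions, not figures from the episode.
base_twh = 100      # hypothetical starting demand, TWh per year
years = 6           # roughly "between now and 2030"

for rate in (0.05, 0.09):
    projected = base_twh * (1 + rate) ** years
    print(f"{rate:.0%}/yr over {years} years: {base_twh} TWh -> {projected:.0f} TWh "
          f"(+{projected / base_twh - 1:.0%} total)")
```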

And so the question becomes: If that is true, where do these data centers go? Where does that demand go? And you would think, well, we could put it in places where there’s a lot of potential for energy supply, but it’s not just that. The reality of computational loads is that unlike most electrical loads, you can send them at not quite the speed of light, but pretty fast over large distances. So I can route a query to a data center that is halfway across the world if I want to.

And that has some dynamics that are pretty interesting and arguably under-explored for having a load type that is a substantial contributor to what the electrical grid has to handle that is modulate-able in that way. And so there are a bunch of interesting dynamics from the standpoint of adding a bunch of demand, that demand to be met by both on grid and off grid sources. And oh, by the way, underneath the hood, that demand can actually be kind of pinged around the Earth at the speed of light. And so there’s a bunch of really interesting dynamics there that are not necessarily apparent when we just say, “Hey, we’re adding a data center.”
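
To illustrate the routing flexibility described above, here is a minimal sketch (an editorial illustration, not code from the piece) of a scheduler that keeps latency-sensitive queries close to the user and sends batch-style work wherever power is cheapest; the site names, latencies, and prices are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    round_trip_ms: float    # network latency from the user (hypothetical)
    price_usd_mwh: float    # current wholesale power price (hypothetical)

SITES = [
    Site("virginia", round_trip_ms=15, price_usd_mwh=55.0),
    Site("iceland", round_trip_ms=90, price_usd_mwh=30.0),
    Site("texas", round_trip_ms=40, price_usd_mwh=20.0),
]

def route(latency_budget_ms: float) -> Site:
    """Pick the cheapest site that still meets the query's latency budget."""
    eligible = [s for s in SITES if s.round_trip_ms <= latency_budget_ms]
    if not eligible:                         # nothing close enough: fall back to the nearest site
        return min(SITES, key=lambda s: s.round_trip_ms)
    return min(eligible, key=lambda s: s.price_usd_mwh)

print(route(latency_budget_ms=25).name)     # interactive query -> "virginia"
print(route(latency_budget_ms=1000).name)   # batch workload    -> "texas"
```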

Bill Loveless: Right, right. And yet, these things tend to accumulate in certain places. Right? I mean, I’m sitting in Northern Virginia as we speak, and it’s the world’s largest data center market. It’s grown some 30% annually since 2014 and accounts for 21% of the electricity sales through Dominion Energy, which is the state’s leading utility. I was just reading yesterday, one local official said that the intensification of energy use ushered in by AI is quote, “Unfathomable.” So you do have these big concentrations and with heavy demands on power.

Jason Bordoff: You do. I mean, to Jared’s point, I think part of what is interesting about data centers is that they to some … And I’ll let Jared speak to issues of latency and how close it needs to be to the usage, but it is a very different application use case. The IEA, in their World Energy Outlook, reminded us this week that as big as these numbers are, the projected growth in an old technology, not a new technology, called air conditioning, is going to be three times as much as data centers. But that’s kind of hard to move around; that load growth for, say, air conditioning is going to be where it’s going to be. What’s interesting about data centers I think is that it’s pretty flexible in where that demand can be located.

Jared Dunnmon: Yeah. I was going to say exactly that. Right? It has the potential to be large, but it’s also not … There are also many other things that are much larger, and so it should be considered in that context. What this really gets at though is that it’s less a matter of how much growth there’s going to be that I would argue is driving the current conversation about this. It’s how fast because if you had 20 years to deal with this, you would say, “Okay, I’ll figure this out.” If you look at the scale and the scope and the timing on some of these data center build outs, that’s actually what’s driving the conversation, I would argue, is folks are saying, “Okay, look, I need these data centers online tomorrow.”

Where am I in the next maybe 18 months? Where am I going to find the power? And so as you’ve seen, there’s interest in certainly nuclear power. There’s interest in having data centers installed in countries that have ample power resources. And then this gets to some of the points that we made in the piece, which is there are also a lot of ways to think about this. Right? There’s: Okay, well, how can I move power from one place to another? How do I fix that problem? How do I think about the demand management side? Because yes, if you were to use the largest machine learning model in existence on every single problem, yes, demand would get pretty high.

On the other hand, do you need the largest machine learning model in existence for most problems or queries that a user sends? Arguably not. And so there’s also a demand management kind of piece on the software side that’s interesting. And then you also start getting into, well, if power’s actually your limiting factor on engineering these systems, well, then you start having an incentive to really build out energy efficient chips for what’s called inference, so actually running a model versus training a model. And those chips may actually look different, and there’s an argument for really focusing on those versus not.

And then finally, Bill, to get to your point about Virginia, there’s a bunch of other stuff that has to be there. Right? It’s not just the power. You need interconnection. You need connectivity because if I can’t query your server, it’s not useful to me on the internet. And so for Virginia, there certainly have been favorable tax incentives. There’s obviously kind of rule of law and regulatory stability, but there’s also a substantial amount of fiber that allows me to install new data centers and be pretty confident that I can actually handle the connectivity load, whereas in other places where you might want to build power, maybe you don’t have that quite yet.

Jason Bordoff: And that point about speed that Jared made is really important because it is also one of the things that is different. Of course, we’ve got to move really fast to decarbonize as well. We’re nowhere close to being on track for our climate goals. But we made this point in the piece, when you look at the so-called hyper scalers, the huge tech companies, Google and Meta, Amazon, Microsoft, they have enormous capital budgets, I mean, tens of billions of dollars they’re spending in order to meet their power demand growth for data centers. They’re willing to pay a premium, at least to some extent, for that to be green power.

The only thing they can’t do is wait and move slow because this technological revolution of AI is really existential for the future of those companies. And we pointed out in the piece it’s true for competitiveness, America’s competitive position. We know China’s moving really fast to try to build strength and dominate as they have in other sectors: batteries, critical minerals refining and processing, solar panels, and now AI. And so you take an issue like green hydrogen, where there’s a whole debate in Washington about so-called additionality. Does all the new power to meet the needs of electrolyzers need to be additional zero carbon electricity added to the grid? And there’s a good argument maybe why the answer to that should be yes, other arguments on the other side.

But if the answer’s yes, the result of that might be green hydrogen is slowed down a little bit. That’s not an acceptable answer for these companies. They won’t take no for an answer because the future of these companies really is at stake. That’s how they view the investment in meeting the needs of AI, training these models in the data center capacity. They want to do it green, but if it has to be natural gas, it will be natural gas. They want to do it in the US, but if it needs to be somewhere else, that might happen as well. Think of the Gulf states, which have massive amounts of money and low carbon electricity, and say, “Come do it here instead.”

So this is why this whole issue provides a sense of urgency. I wish, frankly, we had that sense of urgency about de-carbonization, but we’re not quite there yet. But a sense of urgency to move much faster than we are today to make it easier to build clean electricity in this country than it is, that takes you to issues like permitting reform. How do we accelerate nuclear power so it doesn’t have to be a decade from now? How do we accelerate other clean firm sources of electricity, like maybe advanced geothermal, which is really exciting? And of course, renewables, renewables with storage will play a big role too. They probably can’t do everything, but they should be a big part of the solution as well. And then you need to make it easier to build transmission lines again, which is super hard right now.

Bill Loveless: Right. Interesting point you made, Jason, on the sense of urgency. I hadn’t really thought about it that way. But so often with the issues we discuss, de-carbonization, all the various options for addressing climate change, we often talk about: To what extent is there a sense of urgency? And there are certainly limitations that we see in discussing that topic. For better or worse, nuclear power is a big part of the story right now with AI and data centers. I take Jared’s point that there’s many more options and there’s a much bigger picture than just looking at, say, what nuclear can do. But that’s been the headlines lately. Tech companies see nuclear power as a reliable source of electricity for their data centers as well as a strategic option to meet their net carbon goals.

One of the recent headlines we’ve seen is the announcement by Constellation Energy that it will restart a reactor next to the one that failed at Three Mile Island to provide power for Microsoft data centers. Jason, TMI is back in the news.

Jason Bordoff: Yeah, it’s pretty remarkable. Again, when you think about how that disaster 45 years ago spurred concerns after the China Syndrome movie, the idea that it would come back online …

Bill Loveless: Not the exact reactor that failed, but rather, the one adjacent to it.

Jason Bordoff: Right. Right, exactly, correct. Yes. But that’s an example of the urgency. The issue, again, coming to Jared’s point, is speed. There’s not a lot of nuclear reactors sitting around waiting to be turned back on. I know that’s not exactly what happened here, but there’s just the idea that there’s an existing nuclear facility that can be repurposed. There are in some other places. I was just in Japan a few weeks ago. They have a lot of nuclear reactors that actually the national government would like to bring online. There’s still local opposition to that after Fukushima. But other than maybe Japan or Germany, there’s not a lot of places with existing nuclear capacity. And new nuclear takes time, and we can talk about whether that can be accelerated. We’re doing a lot of work here at The Center on Global Energy Policy, my colleague, Matt Bowen, and some others, on how to reform the Nuclear Regulatory Commission, how to accelerate the timeline for permitting.

I wrote a piece in Foreign Policy two years ago now, two and a half years ago, about basically three reasons why I was bullish on nuclear. And since I mentioned Bruce Springsteen, I think that was prompted by the Canadian singer Grimes going on social media to use her star power to advocate against the closure of the Diablo Canyon nuclear power plant in California, and actually, in the end, California did change course on that. But I sort of explained it for three reasons. So first was a sense that there’s a growing recognition that it is going to be much more difficult and much more costly to meet our net-zero goals if nuclear energy is not part of the mix of solutions. And all the modeling I’m aware of, the IEA, Princeton University, others, sort of backs that up, as well as renewables. They need to play a role.

The second was really exciting advances in nuclear technology, companies like Kairos, which signed a deal this week, TerraPower, X-Energy. When you look at the advanced reactor designs, they incorporate greater inherent safety dynamics and they can produce power more cheaply than past reactors. And the third was national security. I mean, we’re still in this country getting I think around a quarter of the enriched uranium for our nuclear power plants from Russia. Russia was, before the invasion of Ukraine, building more nuclear power plants around the world than any other country. I think that may still be true. China’s building more nuclear power plants in total, including what it’s doing domestically, than any country in the world. So there is a competitiveness and national security dimension to rebuilding America’s leadership in nuclear power as well.

Bill Loveless: Yeah. Maybe, Jared, maybe what’s happening here is that while there’s this bigger AI story, the prominence of the technology, its emergence, the requirements for electricity to support it, somehow in recent days, well, not somehow, these are impressive stories. These deals with Amazon, Microsoft, Google on nuclear power perhaps is more a story about hey, maybe there’s an option here. Maybe there’s some hope here for reviving the nuclear option in the United States. Maybe that’s more the story, more a nuclear story than it is: What can nuclear do for AI?

Jared Dunnmon: Yeah, I would say that the demand, the rapid growth of demand we discussed earlier combined with how these companies look at their futures and say, “Yes, we are existentially concerned,” as Jason pointed out, “about falling behind in building these systems out.” One of these executives recently said basically, “Look, I am much more concerned about the cost of under-building than the cost of overbuilding.”

Bill Loveless: You’re talking about AI.

Jared Dunnmon: Yeah, for AI. Correct. And in that context, you’re looking around for sources of power that in the long run, ideally, a lot of these companies don’t want to be spinning up huge amounts of dirty power. They want power that is reliable, that’s firm power, that’s not intermittent. And they want technology that is reasonably well-proven. And certainly, nuclear power, as we’ve discussed, has continued to provide a substantial amount of power, not just in the US but also abroad, for years. But we’ve just slowed our build-out substantially, particularly post Three Mile Island.

And so all of those factors now are coming together to make this argument that if you believe that AI leadership is important, and if you now believe that the engineering, one of the major engineering limiters to AI leadership is not just software, it’s not just chips, but it’s actually reliable power that is relatively inexpensive. And I say relatively in the sense of, okay, if there’s only so much power, the price is just going to skyrocket at the margins, so relatively inexpensive. There are only so many options you have. And it ties in with the national security piece because there’s kind of the whole interaction between AI leadership and national security leadership.

But then it becomes this question of, okay, well, if you believe that’s true, then you believe you actually have to make sure that you’re powered to be able to continue to lead in AI. And then you look around the world and say, “Well, okay, from a competition perspective, you look at the solar industry.” There are certainly kind of issues in the solar industry from an international competitive perspective. You look at other industries, there are their own issues there. Nuclear is one where the US certainly has the technology to continue to lead in that area. We’ve historically done that.

In this case, we’ve chosen not to continue doing that. And the interesting part, I would say, of what’s going on right now is that the combination of the commercial imperative for these companies and the national security imperatives underlying the AI revolution are kind of coming together to make this argument that in fact, if you have to power these systems in the long run, nuclear is actually a pretty good option for the US. And if it’s going to be an option for the US, you actually have to invest in it because other countries are driving down the cost curve, and if you’re not taking advantage of the economic mass that’s pumping into building reactors, certainly domestically, but around the world as well, you’re going to lose from a cost perspective eventually. And the data centers are then going to go where the power is cheap, so there’s a long game here that’s becoming apparent.

Bill Loveless: Yeah, right.

Jason Bordoff: Maybe I’ll just mention one other thing that I think is important to highlight for listeners, Bill, which is we have five recommendations in our piece. And the first of the five was first for a reason, and it was the easiest way to do this is to reduce how much more electricity we need in the first place. So yes, we are going to need more. We shouldn’t pretend otherwise, but maybe it’s not 10 times as much, it’s five, six, seven times, whatever it is. But Jared will know this better than I, so he should elaborate on it. But it takes 10 times as much electricity for a ChatGPT search as a Google search. But most things we’re using ChatGPT for don’t need ChatGPT in the first place. And you can maybe route those inquiries in ways that we’re not doing today to use energy more efficiently.
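
As a rough illustration of this point, here is a small sketch (editorial, not from the piece) that takes the tenfold multiple cited above and asks what happens if most queries could be routed to a lightweight path; the 0.3 Wh per conventional search and the 80% routing share are assumptions for illustration only.

```python
# Back-of-envelope arithmetic on the "10x a Google search" figure cited above.
search_wh = 0.3                 # assumed energy per conventional search, Wh
llm_wh = 10 * search_wh         # the tenfold multiple quoted in the episode

routed_fraction = 0.8           # assumed share of queries a lightweight path could serve
blended_wh = routed_fraction * search_wh + (1 - routed_fraction) * llm_wh

print(f"Every query on the big model: {llm_wh:.1f} Wh/query")
print(f"With routing:                 {blended_wh:.2f} Wh/query")
# -> 3.0 Wh/query vs 0.84 Wh/query under these assumptions
```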

The Nvidia chips and how much compute power you get from a given amount of electricity input that will continue, should continue to improve over time. We saw that with the internet revolution when in the late 1990s, there were lots of estimates about how the internet was going to cause electricity demand to spike, and it didn’t actually happen because the technology got more efficient. So I do think there are some things about AI that are different. We shouldn’t assume that will be the case here as well, but the famous … We were talking about the 1970s a moment ago, the famous Amory Lovins article in Foreign Affairs about a hard path and a soft path to meet our energy needs. The hard path is let’s break rocks and figure out how to get more energy out of the ground. And the soft path is nobody cares about energy. They care about cold beer and warm showers, as he put it. Can we use our energy more efficiently?

And I think there’s some amount of that we should be doing with AI as well. And again, Jared can give some details on exactly what some of those options look like from a technology standpoint.

Bill Loveless: I think that technology aspect’s important. Before we leave nuclear, I just have to note, Jason, you mentioned Matt Bowen, our colleague at the center, who does specialize in nuclear energy. He was quoted in Axios as saying that the news regarding tech companies had him, quote, “on the verge of being optimistic,” unquote, about the nuclear sector. But he went on to note the ease of building lower-cost natural gas plants and said it will take a lot more than big tech to drive a major expansion in nuclear.

Jason Bordoff: Yeah. And we talked about that in the piece. And if you want to move really fast, which many of these companies need to do, we should talk about permitting reform and whether there are ways in which it would make it easier to build transmission lines for renewables again in this country, accelerate the process of building new nuclear in this country. But if you want to move really fast right now, natural gas generation is one way to do it. Now there are EPA rules that say by 2035, you need to be capturing 95% of the CO2. Our argument was people need to adhere to that rule whether the Supreme Court upholds it or not. We should make sure that is the case. We can’t just go hog wild on natural gas.

But what we see happening in the world today to some extent are companies saying, “We don’t have other options right now,” and gas is getting a boost from this. That’s just the reality, so we have to make it easier to do everything that’s zero-carbon, and then if we’re going to do any amount of gas, make sure it’s meeting the standards that the EPA has set for capture of the CO2 associated with it.

Bill Loveless: Jared, when you look at these tech companies, do you get the impression that climate commitments at these companies may take a back seat to remaining competitive in the race for AI dominance? You suggest that I think in the piece, indeed you note that Google … You and Jason note that Google recently dropped its claim of becoming carbon neutral. How likely is this to happen?

Jared Dunnmon: Well, first of all, I actually just at a high level think that a lot of the tech companies have actually done a pretty good job of actually thinking about the carbon impact of their operations and thinking about de-carbonizing. These companies didn’t … Nobody forced them to commit to trying to de-carbonize or to be carbon neutral by a certain point. A number of them did it on their own because they thought they could. And there’s obviously some positive consumer impact there, but it wasn’t like they were forced to do it. So I mean just at a high level, the fact that those goals existed in the first place I would argue is a credit to a lot of folks in kind of some long-term thinking, so just to start there.

But the other reality is that if it turns out to be true, that’s a big if, if it turns out to be true that AI driven capabilities are what is going to drive the next generation of competitiveness for these companies, consumers are pretty clear. Consumers tend to go with the thing that works the best, and we’ve seen that. Jason was just talking about cold beers and warm showers. If I can’t get a cold beer and warm shower, a lot of consumers don’t necessarily care where it’s coming from. They care that they’re getting it. And so if you’re able to offer qualitatively better services by running these AI systems, if you’re the company that decides not to do it, you’re not going to have users. You’re not going to have revenue. Your management may well be changed by your shareholders, who you have a fiduciary responsibility to.

And in a capitalist system, that’s the way that this is going to work. So I think ultimately, while some of these commitments do exist and I think some folks should be lauded for those commitments to try to decrease the climate impact of these operations, at the end of the day, in a competitive environment, if you lose on product, you won’t have that much of an impact on the climate because you won’t be running very much. So for these companies, I don’t want to say it’s a perverse set of incentives, but it’s just the reality that they have to compete to be impactful anyway. And so you’re going to have to address these problems while you compete. It’s not an option to not compete in the market because someone else will, and so you’re not going to win that way.

Bill Loveless: Right.

Jason Bordoff: Yeah, many of these companies have actually taken these goals of de-carbonization, frankly, more seriously than many others. And what some like Microsoft have done with carbon removal or others, even Google, the example we gave in the piece, I mean, they also have a net-zero by 2030 goal, which includes, I think includes scope three emissions, so these are ambitious goals. There’s a reality to what they have to do to meet rapidly rising electricity use given the urgency of AI and data centers. But the opportunity we have, I don’t think we should come down too hard on them. The opportunity is these are companies with huge amounts of money to spend. They’re really committed to meeting the power needs of data centers. But they also are taking these goals reasonably seriously, and so that’s an opportunity to look to a sector that might be willing to, again, you can’t ask people to spend endlessly, but is willing to bear some premium in order to try to make sure that we’re doing this in a way that is lower carbon. And I think the fact that these tech companies have the money to spend means we don’t have to just say, “We just want to move fast”; they’re willing to do that in a way that’s green. We need to build power really quickly.

And when you bring that together, as we wrote in the piece with the economic competitiveness, issues around AI, the desire to maintain American leadership, the national security concerns with locating all of this elsewhere, or competition with China, and that these are the companies that are behind these investments, there’s a big opportunity there to build a broader set of stakeholders and a broader sense of urgency, and a broader sense of momentum to try to move faster to do what we should be doing anyway for the climate challenge, which is building clean energy in this country much faster than we are today. Now we have to do it for all these reasons as well.

Jared Dunnmon: What I have to really emphasize is that there is a confluence of market forces that are pushing companies to build out data centers and increase their electricity demand, alongside this question of: How are you going to physically accomplish that? And the direction that those things are pointing, certainly to some degree given recent weeks, is towards a version of new nuclear power in the United States. I would argue that’s pretty exciting from an optimistic perspective. I would argue that it’s pretty exciting if this giant bolus of electricity demand is going to be pointed towards building out power that looks like the power we need in the future because ultimately, someone is going to have to drive the cost curve down on those technologies because ideally when you build the first one, the second one costs less, the third one costs less, the fourth one costs less. And if right now, the tech companies, because of this imperative, are in a spot where they’re willing to pay that price to get it out quickly, and paying that price now makes it such that the next one is cheaper, and the next one is cheaper, that’s a very exciting prospect versus the previous world, both from potentially a risk perspective and a risk perception perspective.

But also, from a capital perspective, no one was willing to take on that risk because there wasn’t a reward. Here, there’s a reward, which is you’re not falling behind on AI. And there’s also, I would argue from an optimism perspective, there are fewer things than one would like that are bipartisan in the current environment. One of them I would say certainly is kind of national security and potentially the interaction of that with AI leadership. And if it becomes clear that in fact the thing that’s limiting us is not actually the capital, because companies are putting capital into this, and the thing that’s limiting us is actually permitting reform, siting reform.

Then you start to have an impetus to getting those things done that is not based in an argument around whether it’s the right thing to do for the climate. The argument is then based in: Is it the right thing to do for national security and economic leadership? And I would argue that’s a very different conversation from both a policy and politics perspective.

Bill Loveless: Right, but one that could have implications for climate concerns.

Jared Dunnmon: Absolutely. They all point in the same direction. It’s not often that that happens. In this case, it seems to be the case.

Bill Loveless: Jason, you and Jared offer recommendations in your piece. And among them is, you say the first and fastest option is for tech companies to reduce the AI energy needs in the first place. Maybe this is more of a question for Jared given his technical expertise there. You began to talk about that earlier. Tell us, explain to us the technology there. What can make this more efficient in the first place?

Jared Dunnmon: I would categorize the strategies that you could take into two buckets. One bucket is software and one bucket is hardware. Hardware, so let’s assume you’re running the same software: in hardware, there are tricks that you can play to make your operations more energy efficient. And folks have played a number of different tricks in chip design over the years to do that. You can do things like use the kind of current laws that you learn in intro electromagnetics and physics to do adds and multiplies, some of the core operations underneath neural networks, in a very energy efficient way, versus having to do them digitally, so you can use kind of the analog circuit laws to do that.

Number two, you can design chips in a fundamentally different way that doesn’t look like, say, your kind of Nvidia chip. It looks slightly different in the way that the computation, the data, is routed through the chip, and that has substantial energy implications for how efficient the computation is. At a high level in computation, there is a trade-off between programmability and efficiency. So for instance, your CPU in your computer, your central processing unit, executes many different types of instructions, and it executes them very fast, but it’s not necessarily doing all of them hyper efficiently. A GPU, the graphics processing unit, so something that would be, say, an Nvidia chip: each individual core executes many, many fewer instructions, but it has many of these smaller cores and you can execute many of those instructions in parallel, and so it can be much more efficient for certain things.

When you also start to get into what are called application specific integrated circuits, where you are designing a chip to do literally one or two things, it becomes very difficult to program because it only does one or two things, but it does them extremely efficiently. And so there’s this spectrum that we operate on between programmability and efficiency. And if we believe that these workloads are actually going to drive a huge amount of power consumption, it actually becomes economically viable and in fact favorable to say, “Actually, let’s not use the general purpose chips that are less efficient and can do all these other things we don’t actually need. Let’s actually just build chips that do exactly the thing that we need.” And that’s starting to happen, so that’s on the hardware side.
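
To put rough numbers on the programmability-versus-efficiency spectrum described above, here is an editorial back-of-envelope sketch; the operation count and the energy-per-operation values are order-of-magnitude assumptions, not measurements of any real chip.

```python
# Illustrative energy-per-query comparison across chip classes.
# All numbers below are assumed orders of magnitude for illustration only.
ops_per_query = 2 * 7e9 * 100          # ~2 ops per parameter, 7B-parameter model, 100 tokens

energy_per_op_joules = {
    "general-purpose CPU": 100e-12,    # assumed ~100 pJ per operation
    "GPU":                  10e-12,    # assumed ~10 pJ per operation
    "inference ASIC":        1e-12,    # assumed ~1 pJ per operation
}

for chip, j_per_op in energy_per_op_joules.items():
    watt_hours = ops_per_query * j_per_op / 3600    # joules -> watt-hours
    print(f"{chip:>20}: ~{watt_hours:.4f} Wh per 100-token query")
```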

On the software side, there is a whole host of strategies because right now, the motivating factor for a lot of companies that are running AI based services is getting and retaining users who may not even be paying for the AI system itself. For instance, if I’m really excited about using Microsoft or Google’s AI system, maybe I buy Google Workspace or Microsoft 365. Right? There are other reasons that even if it’s a loss leader formally, I still might do that. So because I care about that type of acquisition, that type of dynamic, and there are network effects to some of these systems, I care about getting users, so the thing that I care most about right now is a very positive user experience. So I’m going to send, if I ask you, “Hey, what color … “

If I ask ChatGPT, “What color is a strawberry?” You don’t want it to take a long time and you don’t want to get the answer wrong, but particularly, you don’t want to get the answer wrong. And so you’re probably going to err on the side of, well, I could send it to this model that’ll get it right 99% of the time and it’s a much smaller model. When I say smaller, usually you’re measuring these things in terms of parameters, like the number of numbers that have to be stored to define a model. So for instance, I could send it to a seven billion parameter model or I could send it to a 400 billion parameter model. And the 400 billion parameter model is much more expensive to run. I’m going to send it to the 400 billion parameter model because I don’t want you, the user, to have a bad experience, even though 99% of the time, the seven billion parameter model would’ve been fine and it would save you a lot of energy and a lot of cost.

So thinking about how we incentivize movement towards, hey, let’s use the smaller models when we can is an important thing to think about because right now, the market dynamics are such that because you really care about getting users and retention, you’re not really thinking about the systems implications outside of, well, ultimately because I’m running all these things, I need more power, so let’s get more power.
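
A minimal sketch of the routing idea described above (an editorial illustration, not a real production system): send queries that look easy to a small model and reserve the large one for the rest. The model names, per-query energy figures, and the difficulty heuristic are all hypothetical.

```python
# Hypothetical model router: prefer the small model unless the query looks hard.
# Model names and per-query energy values are illustrative assumptions.
MODELS = {
    "small-7b":   {"wh_per_query": 0.3},
    "large-400b": {"wh_per_query": 3.0},
}

HARD_HINTS = ("prove", "derive", "step by step", "write code", "legal analysis")

def looks_hard(query: str) -> bool:
    """Crude stand-in for a real difficulty or quality-risk classifier."""
    return len(query) > 500 or any(hint in query.lower() for hint in HARD_HINTS)

def route(query: str) -> str:
    return "large-400b" if looks_hard(query) else "small-7b"

queries = [
    "What color is a strawberry?",
    "Derive the closed form of this series and prove it converges.",
]
for q in queries:
    print(f"{route(q):>10} <- {q}")

total_wh = sum(MODELS[route(q)]["wh_per_query"] for q in queries)
print(f"Total energy under these assumptions: {total_wh:.1f} Wh")
```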

Bill Loveless: And you think that as time goes on, the tech companies are going to take this into consideration more and more, use their expertise, their technology more efficiently, and in fact, use less electricity.

Jared Dunnmon: Yeah. And it’s already happening. What you’re starting to see routinely is, for instance, from an enterprise perspective, what you see is folks will start by using one of these very, very large models, very capable models to perform a given task. And they’ll find out, actually, it’s costing me a reasonable amount, so let me see if I can take some very application specific data and train a smaller model to do the subset of things that I need to do, and at the same level of quality that big model was doing. And in fact, you can often do that. It turns out that’s a common thing to be able to do. So what you start seeing is folks actually ripping out these big models to run smaller models for an enterprise side.

Now, the dynamics that I think need to be figured out there, where it’s not obvious how it’s going to turn out, are that this can still incentivize more use. Right? It’s one of these situations where the cost of the good goes down, but because the cost of the good goes down, you’re using more of it. So it’s not clear how those dynamics are going to work with these sorts of models, for instance, on the enterprise side. But certainly from the consumer side, if I’m just at a set number of queries, if I can route those queries to the smallest model possible, then yes, you’re going to save some cost and power.
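
A final editorial sketch of the rebound dynamic flagged above: whether a distilled model actually saves energy depends on how much usage grows once each query gets cheaper. All numbers are assumptions for illustration.

```python
# Rebound ("it's cheaper, so we use more of it") arithmetic. All values are assumptions.
wh_large, wh_small = 3.0, 0.3        # assumed per-query energy, large vs distilled model
baseline_queries = 1_000_000         # hypothetical daily query volume on the large model
baseline_kwh = wh_large * baseline_queries / 1000

for usage_multiplier in (1, 5, 15):  # how much cheaper queries might grow usage
    new_kwh = wh_small * baseline_queries * usage_multiplier / 1000
    verdict = "saves energy" if new_kwh < baseline_kwh else "exceeds the baseline"
    print(f"usage x{usage_multiplier:>2}: {new_kwh:>6.0f} kWh vs {baseline_kwh:.0f} kWh baseline ({verdict})")
```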

Bill Loveless: Yeah, okay. So I shouldn’t be worrying so much about my own ChatGPT inquiries and the impact they’re having on the grid.

Jared Dunnmon: I don’t think that’s a problem a user should solve. I mean, yeah, sure, a user … You could say, “I’m going to use a traditional Google search versus whatever, ChatGPT.” But I think relying on users to do that, we’ve seen how well relying on users to reduce their energy demand works kind of in other parts of the economy. I don’t think that’s a user problem to solve. I think it’s a company problem to solve and a provider problem to solve. But I think they’re also incentivized to do that. Right? If you look at the numbers coming out right now, these systems are costing a ton of money to run, and in many cases, more money to run than they’re making. And so there is a market incentive to do this. That’s just the next step.

Bill Loveless: Jason, you and Jared also recommend regulatory reforms, or you say that regulatory reforms are essential to meeting this AI energy goal, such as the changes in the permitting requirements for energy infrastructure, as well as modifications in the utility business model. Certainly, anybody who’s following the regulatory front these days has read a lot about permitting reform in Congress. There’s a bill pending now that some hope may have some chance of going forward, but it’s been an issue that’s stumbled and failed for years now. But nevertheless, you cite that as being important as well as the changes in the utility business model. I mean, how confident are you that these will be addressed effectively any time soon?

Jason Bordoff: Well, it’s hard to have a high degree of confidence, for sure. And any time people talk about something happening during a lame duck session of Congress, history tells us there’s reason to be skeptical that anything will really get done. I think one reason to think the odds are maybe higher than zero is what Jared and I were talking about before, this kind of new sense of urgency. We were talking about permitting reform for years to make it easier to build renewable energy, for example, and transmission. But now there’s this new sense of urgency because of our competitive position in the world of AI because of the hyper scalers who were so focused on this, so there’s a broader set of actors that are suddenly focused on this because it matters to them, so that should make one a little more optimistic. It is still controversial.

Several hundred climate scientists just signed a letter in opposition to the compromise bill between Senator Manchin and Senator Barrasso. But I think, given the current makeup of Congress, most things are going to have to be compromises and bipartisan, so while it does have provisions to make it easier to export LNG or do some offshore oil and gas leasing, it also has very important provisions. We’re doing some analysis of that here at the center now, so I don’t want to prejudge that, but the initial sense is that those are significantly more consequential for the outlook for clean energy to make it easier, particularly, to build transmission, for things like cost allocation and national interest corridors, to make it easier to build the infrastructure we need to scale clean energy very substantially.

As you said, we also talked about reforming the current utility business model because today, that often rewards utilities for capital investments in new infrastructure. As we’ve been talking about, there are ways that power companies can deliver electricity efficiently, not only by building new infrastructure, but by using existing infrastructure more efficiently to deploy batteries, to store renewable energy, to rewire old transmission lines with advanced conductors that can put more power through existing infrastructure. And you want to make sure that there’s a return for them on doing that. You don’t just rate base capital investments.

And then we also lay out in the piece a set of principles that we think should guide US policy in the area of AI and electricity, which is three things: one, to hold existing rate payers harmless; to require that new power generation be clean; and to allow the tech firms to move quickly. And so that last piece is important for regulators as well, protecting existing rate payers from bearing the cost of additional power because again, typically, the cost of capital investments that utilities make is spread among all rate payers. We’re already seeing a backlash, like a not-in-my-backyard phenomenon, happening with people who see huge amounts of data centers being put in their communities, because of pollution, because of noise, or because they’re starting to see the cost of a lot more investment in electricity generation capacity being borne by everyone, while there’s just a couple of companies or data center owners who are benefiting from all of that increased electricity generation capacity.

Bill Loveless: Yeah. And there can be a political backlash there again, as I’ve seen here in Virginia.

Jason Bordoff: Exactly.

Bill Loveless: It does enter into politics in a way that’s not necessarily productive. Jared, before we go, I want to get back at that point you made earlier on these centers could be … These data centers, the location of them doesn’t necessarily matter in terms of how quickly and efficiently they can deliver for tech companies and for tech company customers. I mean, you could have centers abroad. On the one hand, your piece as I recall makes a point of saying, “Well, some of these centers could end up in places that are not necessarily so friendly to the United States and maybe that’s a problem.” On the other hand, you and Jason say there are maybe opportunities for engagement with partner nations to create opportunities abroad for US firms and AI. What do you mean by that? What do you have in mind?

Jared Dunnmon: I think there are a couple of different directions here that are interesting. And when I say you physically could send queries to different places, there are different types of queries, and when I say query, I mean, for instance, asking a model a question. Physically, yes, I could send that to a data center that’s halfway across the world. Now what are the problems with that? There are some queries that actually need to be answered very quickly, and the speed actually does matter, so it’s some percentage. It’s a double-digit percentage of queries where you don’t actually care about that, that run in what’s called batch mode.

So for instance, some large percentage of queries, you could just route physically somewhere else and you could route it to where power is cheap. You can route it to where the sun is shining. But then you run into a bunch of interesting questions, so things like laws on data protection and sovereignty. And does the government in the country that I’m sending this query to have the right to go in and inspect the data that’s being sent?

Bill Loveless: Yeah. And what if you’re the Defense Department? You have a defense background. I mean, if it’s a sensitive matter for the Pentagon.

Jared Dunnmon: Yeah, or what if you end up in a world where some of the undersea cables have a problem, and then you’re relying on being able to send information to data centers that you can’t actually get at? Or for instance, does that also mean that the weights for those models, the parameters, have to sit on a server that’s in a foreign jurisdiction, and do you have a problem with that? Can someone say, “I’m going to nationalize the data center”? And grab your frontier AI model. Right? There’s a bunch of different and interesting dynamics here that have to be addressed, and you saw that recently, particularly in the public eye, with the kind of deal between Microsoft and G42 that was kind of aimed at leveraging AI regionally in the Middle East, which would’ve involved some level of Microsoft technology going into the Middle East.

And so you’re seeing those things worked out in real time, and there’s some interesting dynamics there because it’s a matter of, okay, well, for instance if the US says, “Well, actually, we’re not going to let our technology be sent abroad like that,” then you have these other countries that are going to say, “Okay. I can’t get it from you, I’ll get it from someone else.” And that hurts US companies, that potentially hurts US partnerships and security interests in a world where allies and partners are really important. That has substantial security implications. And so there is this tension between wanting to protect US intellectual assets, capabilities, et cetera, but also being realistic and cognizant of the fact that it’s an important world to build partnerships in. In a globalized economy, and certainly in a globalized AI ecosystem, the US has some modicum of leadership. But if countries can’t get that technology from us, they are going to go get it from somewhere else. And that has other knock-on effects from a national security perspective that we may not like.

And so I’m not arguing that I have an easy answer to that question, but that’s the tension. And it’s a really important one to approach kind of with some intentionality and some nuance.

Bill Loveless: Jason, you again in the paper mentioned there are opportunities to partner with nations to create opportunities abroad for US firms. I think you mentioned Japan, for example.

Jason Bordoff: Well, it was an example of an interesting question to consider, and people listening and Jared as well may know whether this is a nonsensical idea or not. But I was just in Japan a few weeks ago, that’s why it was top of mind. And they’re very focused on things like what we were talking about a minute ago, which is the pause on US LNG exports. They’re trying to sign contracts with countries to import LNG into their country to meet power needs, including for AI, and they’re thinking a lot about AI, but even more generally. And that’s really expensive importing natural gas into your country. They’re a very import dependent country, other than the domestic nuclear that they may restart some more of, they’ve never gone back to what they had before.

But energy can be a lot cheaper in other places, including here in the US. And might it make more sense to locate data centers where the electricity is cheap and import the bits and the data than to import the electrons and the molecules halfway around the world to power data centers there? Now that, for the other country, would I imagine raise significant concerns. People might be concerned about privacy, data security, geopolitical tensions, if you start to locate a significant amount of the data centers that your country, your companies use. But it’s an interesting question to think about whether there can be allies working closely together to build partnerships to work together on where we locate the power and the data centers that the world needs for this kind of AI revolution.

Bill Loveless: Well, Jason, Jared, you bring to mind such an important topic here with AI. But I think also, for those of us who are concerned with energy security, you bring to mind some important thoughts, and perhaps different thoughts on approaches that arise in that context and ways in which perhaps the options are changing. So thanks for taking the time to join us today on Columbia Energy Exchange.

Jason Bordoff: Thanks for having me on our show, Bill.

Bill Loveless: And Jason, I think next time we’re on the show together, we should include a soundtrack from your musician, Bruce Springsteen.

Jason Bordoff: Oh, yes. That’s a good idea. I’d be very happy to do that.

Jared Dunnmon: Thanks for having us, Bill.

Bill Loveless: Thanks again. That’s it for this week’s episode of Columbia Energy Exchange. Thank you again, Jason Bordoff and Jared Dunnmon, and thank you for listening. The show is brought to you by The Center on Global Energy Policy at Columbia University School of International and Public Affairs. The show is hosted by Jason Bordoff and me, Bill Loveless. The show is produced by Tim Peterson from Latitude Studios. Additional support from Lily Lee, Caroline Pittman, and Q Lee. Sean Marquand is the sound engineer. For more information about the show or The Center on Global Energy Policy, visit us online at energypolicy.columbia.edu or follow us on social media @columbiauenergy. If you like this episode, leave us a rating on Spotify or Apple Podcasts. You can also share it with a friend or a colleague to help us reach more listeners. Either way, we appreciate your support. Thanks again for listening. See you next week.


This week, host Bill Loveless talks with Jason Bordoff and Jared Dunnmon about their latest co-written column for Foreign Policy, titled “America’s AI Leadership Depends on Energy.”

Jason is founding director of the Center on Global Energy Policy at Columbia University’s School of International and Public Affairs. He’s also a professor of professional practice in international and public affairs, the co-founding dean emeritus at the Columbia Climate School, and a former senior director on the staff of the U.S. National Security Council.

Jared is a nonresident fellow at the Center on Global Energy Policy. He’s also a former technical director for artificial intelligence at the U.S. Defense Department’s Defense Innovation Unit.
