For our new series, A Fordham Focus on AI, we’re speaking with faculty experts from a range of disciplines about the impact of this rapidly evolving technology and what to expect next.
In this installment, we sat down with Marc Conte, PhD, an economics professor whose research has examined the economic impact of climate change. His writing has been cited by the White House, and he currently serves on the New York City Panel on Climate Change, an independent advisory body.
Conte is now studying the impact of data centers—roughly 3,900 of which are operating in the United States today, with an additional 1,200 in the planning or construction stages. New construction of these centers has accelerated dramatically in recent years to meet the energy demands of AI, with a typical AI-focused data center consuming as much electricity in a year as 100,000 households.
What are your concerns about AI data centers and how they might impede our efforts to mitigate climate change?
Data centers built to support AI use significant energy because AI models are constantly running, and the computations they conduct to respond to billions of daily queries require enormous processing power. The result is a dramatically increased demand for electricity at a time when we’re facing significant environmental and health costs associated with greenhouse gas emissions from our use of fossil fuels for electricity.
At the same time, we’ve been working over the last 15 years to shift the mix of electricity sources toward renewable energy. These projects are getting built, but there’s a really long wait to connect them to the nation’s electrical grid when they’re finished.
Now tech companies are going to places across the country, saying they’re investing in the area by building data centers. That creates significant pressure on local utilities and agencies to allow them to connect to the grid, so there’s a chance these companies could skip the queue, jumping ahead of renewable energy projects and further slowing our movement away from fossil fuels.
You’re working on a research project aimed at making AI less energy-intensive. Can you tell me more about that?
I’m working with a partner at IBM and a graduate student at Yale to identify the electricity required by an individual computer chip and convert that into the amount needed to respond to a single query in a large language model like ChatGPT. The idea is to show that the carbon emissions associated with a specific query depend on where the data center is located and when the query is answered. Companies could then optimize how they route queries to specific times and places to reduce the associated emissions.
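To make the routing idea concrete, here is a minimal sketch in Python. The carbon-intensity and per-query energy figures are hypothetical, invented for illustration; the research described above works from chip-level power measurements, which this sketch reduces to a single per-query energy number.

```python
from dataclasses import dataclass

# Hypothetical figures for illustration only; real grid carbon
# intensity varies hour by hour and comes from utility data.
@dataclass
class DataCenter:
    name: str
    grid_carbon_intensity: float  # grams of CO2 per kWh, at query time
    energy_per_query_kwh: float   # energy one query consumes at this site

def emissions_per_query(dc: DataCenter) -> float:
    """Grams of CO2 attributable to answering one query at this site."""
    return dc.grid_carbon_intensity * dc.energy_per_query_kwh

def route_query(centers: list[DataCenter]) -> DataCenter:
    """Send the query wherever it produces the fewest emissions right now."""
    return min(centers, key=emissions_per_query)

if __name__ == "__main__":
    centers = [
        DataCenter("coal-heavy region", 900.0, 0.003),
        DataCenter("hydro-heavy region", 50.0, 0.003),
        DataCenter("gas-heavy region", 400.0, 0.0025),
    ]
    best = route_query(centers)
    print(f"Route to: {best.name} "
          f"({emissions_per_query(best):.2f} g CO2 per query)")
```

In practice the same query costs roughly the same energy wherever it runs, so the savings come from answering it where and when the grid is cleanest.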
Why do AI queries consume so much energy in the first place?
The objective of the engineers who built the models used by the likes of ChatGPT and Gemini was to create something that could perform tasks well. They weren’t told to create something that could do tasks pretty well without using too much energy.
Large language models are often described in terms of their number of parameters, the values that determine how the model processes and generates text. Some of the largest models, like those behind ChatGPT, may have trillions of parameters. Parameters are similar to neurons in your brain; they enable these models to learn more complex patterns and respond to a wider range of queries. The more parameters in a large language model, the bigger it is and the more energy it uses.
So some tend to think, if performance increases when we go from a billion to a trillion parameters, why not just do it? But how much does performance go up? Is it worth the energy used?
The unanswered question right now is, what is the trade-off between model size and model performance? Tech companies might not have such massive energy demands if they adopted smaller models with fewer parameters.
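As a rough illustration of why that trade-off matters, here is a back-of-envelope sketch. It assumes a common rule of thumb of roughly two floating-point operations per parameter per generated token, plus hypothetical figures for hardware efficiency and query length; none of these numbers come from the interview.

```python
# Back-of-envelope: how energy per query scales with parameter count.
# Assumptions (illustrative, not from the interview): ~2 FLOPs per
# parameter per generated token, a hypothetical hardware efficiency of
# 1e12 FLOPs per joule, and 500 generated tokens per query. Real
# figures vary widely with hardware, batching, and model architecture.

FLOPS_PER_PARAM_PER_TOKEN = 2
HARDWARE_EFFICIENCY_FLOPS_PER_JOULE = 1e12  # hypothetical
TOKENS_PER_QUERY = 500

def joules_per_query(num_parameters: float) -> float:
    """Estimate the energy one query consumes for a model of this size."""
    flops = FLOPS_PER_PARAM_PER_TOKEN * num_parameters * TOKENS_PER_QUERY
    return flops / HARDWARE_EFFICIENCY_FLOPS_PER_JOULE

for params in (1e9, 1e12):  # one billion vs. one trillion parameters
    print(f"{params:.0e} parameters -> {joules_per_query(params):,.0f} J per query")
```

Under these toy numbers, a thousandfold increase in parameters means a thousandfold increase in energy per query, so the question becomes whether performance improves enough to justify that cost.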
What is one thing you want people to know about AI and energy costs?
The costs of building and operating the models powering things like ChatGPT are higher than the average consumer knows. Data centers affect local communities’ land-use decisions, water quality, and electricity pricing. If we stay on this development path, we are headed toward higher electricity prices and more volatility in electricity markets.
One way we could address this is by having data centers pay the full cost of grid connection, or by designating them as a special customer class that could be charged higher rates to cover the burden they impose on other customers.
Learn more about AI for the greater good at Fordham.
