The power demands of AI computing can put great strain on the electrical grid

Computing has taken a leap forward with the advent of artificial intelligence, but that progress brings challenges. Some believe the rapid rise of AI will create a major energy problem in the years to come.

When most people surf the Internet, they don’t think much about what’s actually happening. But Dr. Jonathan Koomey does. The founder of Koomey Analytics has been researching electricity use in computers since the 1990s.

“There are servers – in other words, computers – that are in a big data center somewhere. And you send a message to them and you say, do this task for me. And when you send them that message and it performs the task, it uses electricity,” he said.

Now, with the rise of AI, those massive data centers are getting bigger, and so are their energy bills.

A Goldman Sachs analysis predicts a 160% increase in data center electricity demand, stating: “These kinds of spikes in energy demand haven’t been seen in the U.S. since the turn of the century. It will be fueled in part by electrification and industrial reshoring, but also by AI. Data centers will consume 8% of U.S. power in 2030, up from 3% in 2022.”

“If you have giant computers whose job is to create the AI models, and then you have millions and millions of users sending these questions to get answers, that also uses a lot of electricity,” said Dr. Koomey.

The problem is that answering a question with ChatGPT takes roughly ten times as much energy as running the same query through a normal Google search. That gap has fueled plenty of dire predictions about what people searching for cat videos with AI will do to the country’s power supply.
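
For a rough sense of what that tenfold gap means at scale, here is a minimal back-of-the-envelope sketch. The per-query energy figure and the daily query volume are illustrative assumptions, not numbers from the article; only the 10x multiplier comes from the text above.

```python
# Illustrative back-of-the-envelope comparison. SEARCH_WH and QUERIES_PER_DAY
# are assumed values for scale only; the 10x multiplier is the article's figure.
SEARCH_WH = 0.3        # assumed watt-hours per conventional web search
AI_MULTIPLIER = 10     # article's estimate: an AI answer uses ~10x the energy
QUERIES_PER_DAY = 1e9  # assumed daily query volume

def annual_energy_gwh(wh_per_query: float, queries_per_day: float) -> float:
    """Convert a per-query energy figure into gigawatt-hours per year."""
    return wh_per_query * queries_per_day * 365 / 1e9  # Wh -> GWh

search_gwh = annual_energy_gwh(SEARCH_WH, QUERIES_PER_DAY)
ai_gwh = annual_energy_gwh(SEARCH_WH * AI_MULTIPLIER, QUERIES_PER_DAY)
print(f"Conventional search: ~{search_gwh:,.0f} GWh/year")
print(f"AI-assisted answers: ~{ai_gwh:,.0f} GWh/year")
```

Under those assumptions, the same query volume goes from roughly 110 GWh to roughly 1,100 GWh per year, which is why the per-query efficiency Dr. Koomey describes matters so much.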

But Dr. Koomey isn’t worried. The field may still be wide open at this point, but he said those paying for the expensive servers will likely focus AI on applications where they can make money.

“Ultimately they have to pay for it themselves,” said Dr. Koomey. “I can tell you that ChatGPT is great for entertaining 15-year-olds – our kids love it – but I don’t know if that’s a sustainable business model, right? They’re probably not high-margin customers.”

Dr. Koomey said new technologies are always a bit rough, but it usually doesn’t take long for them to become more efficient.

“That to me is the most important thing for people to understand: Electricity consumption can increase, but it can also stabilize, as has happened in the past, if we get smarter about the way we deliver those computing services,” he said.

And getting smarter is what it’s all about.

It turns out that AI is already coming up with ways to solve some of its own problems. Google used machine learning to reduce cooling costs in one of its data centers by 30-40%.
