There’s a big problem with generative AI, says Sasha Luccioni at Hugging Face, a machine-learning company: it’s an energy hog.
“Every time you query the model, the whole thing gets activated, so it’s wildly inefficient from a computational perspective,” she says.
Take the large language models (LLMs) at the heart of many generative AI systems. They have been trained on vast stores of written information, which helps them to churn out text in response to practically any query.
“When you use generative AI… it’s generating content from scratch, it’s essentially making up answers,” Dr Luccioni explains. That means the computer has to work pretty hard.
A generative AI system might use around 33 times more energy than machines running task-specific software, according to a recent study by Dr Luccioni and colleagues. The work has been peer-reviewed but is yet to be published in a journal.
It’s not your personal computer that uses all this energy, though. Or your smartphone. The computations we increasingly rely on happen in giant data centres that are, for most people, out of sight and out of mind.
“The cloud,” says Dr Luccioni. “You don’t think about these huge boxes of metal that heat up and use so much energy.”
The world’s data centres are using ever more electricity. In 2022, they gobbled up 460 terawatt hours of electricity, and the International Energy Agency (IEA) expects this to double in just four years. Data centres could be using a total of 1,000 terawatt hours annually by 2026. “This demand is roughly equivalent to the electricity consumption of Japan,” says the IEA. Japan has a population of 125 million people.
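As a rough sanity check on those figures (a back-of-envelope sketch using only the numbers quoted above, not anything from the IEA report itself), going from 460 terawatt hours in 2022 to about 1,000 terawatt hours in 2026 implies data-centre electricity demand compounding at roughly 21% a year:

```python
# Back-of-envelope check of the figures quoted above: 460 TWh in 2022,
# roughly 1,000 TWh projected for 2026. Numbers come from the article's
# IEA citation; the calculation is illustrative, not from the report.
start_twh = 460      # data-centre electricity use in 2022, terawatt hours
end_twh = 1_000      # projected use in 2026, terawatt hours
years = 2026 - 2022

growth_factor = end_twh / start_twh                 # ~2.17x over four years
annual_rate = growth_factor ** (1 / years) - 1      # implied compound annual growth

print(f"overall growth: {growth_factor:.2f}x")      # 2.17x
print(f"implied annual growth: {annual_rate:.1%}")  # ~21.4% per year
```

At that pace, consumption a little more than doubles over the four years, which matches the IEA’s framing.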