Does the brain really consume less energy than a computer?
There’s been a lot of buzz lately about the energy consumption of artificial intelligence and machine learning, but if we dig deeper and measure energy in terms of output, the story isn’t as clear-cut as it first appears. It turns out that the human brain, while incredibly efficient, doesn’t always come out on top in an energy comparison with a computer.
Experts and researchers have voiced concerns about the massive amounts of power needed to train and run modern AI systems, particularly large-scale models like those behind today’s language processing tools. A widely cited study from the University of Massachusetts Amherst reported that training a single AI model could emit as much carbon as five cars over their entire lifetimes. Numbers like these have sparked a debate: do computers, especially those powering AI, really consume more energy than the human brain?
Some notable figures in the tech world are concerned. Kate Crawford, who works with Microsoft and authored Atlas of AI, has pointed to the immense power demands of AI as an issue we need to address. Andrew Ng, a leading AI researcher, has been calling for more sustainable AI practices. Companies like Google are taking steps in this direction, developing specialized hardware like Tensor Processing Units (TPUs) to curb energy costs—claiming up to 30 times more efficiency than conventional processors. Yet, even with these advancements, the growing energy needs of AI remain a contentious issue, particularly as models become increasingly complex and data-intensive.
The Brain: A Model of Biological Efficiency
The average human brain, weighing about 1.4 kilograms, operates on roughly 20 watts of power—continuously. Whether you’re reading, writing, sleeping, or solving complex problems, the brain’s power consumption remains surprisingly steady. At 20 watts around the clock, this works out to roughly 400 kilocalories a day, around 20% of the body’s total energy consumption. This constant power draw supports a vast range of activities, from controlling basic motor functions to engaging in advanced cognitive processes like reasoning, planning, and imagining.
When we compare this to computers, the numbers at first seem quite different. A standard laptop typically uses between 60 and 100 watts, and a high-performance desktop might consume 400 to 600 watts or more, depending on its workload. Servers running complex AI models can require thousands of watts, a seemingly huge disparity in energy consumption between computers and the human brain. But the context in which we make these comparisons matters greatly.
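As a rough back-of-the-envelope check, here is how those continuous power draws translate into daily energy. This is a minimal sketch using the illustrative wattages above, not measured values:

```python
# Back-of-the-envelope: daily energy at a constant power draw.
# Wattages are the illustrative figures from the text, not measurements.

JOULES_PER_KCAL = 4184          # 1 kilocalorie = 4,184 joules
SECONDS_PER_DAY = 24 * 3600

def daily_energy(watts: float) -> tuple[float, float]:
    """Return (kWh/day, kcal/day) for a constant draw of `watts`."""
    joules = watts * SECONDS_PER_DAY
    return joules / 3.6e6, joules / JOULES_PER_KCAL

for label, watts in [("brain", 20), ("laptop", 80), ("desktop", 500)]:
    kwh, kcal = daily_energy(watts)
    print(f"{label:8s} {watts:4d} W -> {kwh:5.1f} kWh/day ({kcal:,.0f} kcal/day)")

# brain      20 W ->   0.5 kWh/day (413 kcal/day)
# laptop     80 W ->   1.9 kWh/day (1,652 kcal/day)
# desktop   500 W ->  12.0 kWh/day (10,325 kcal/day)
```

Note that the brain’s 20 watts comes out to about 400 kilocalories a day, which is how the daily figure above is derived.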
Task by Task: Brains vs. Computers
Let’s consider a simple task: adding a thousand three-digit numbers. A human being might spend around three hours on this task, using approximately 60 watt-hours of energy (216 kJ) at the brain’s steady 20 watts. Meanwhile, even a basic computer can perform the same operation in microseconds, consuming a tiny fraction of a watt-hour. In this context, the computer is far more energy-efficient. As a quick test, my 65-watt-rated laptop added 1,000 randomly generated three-digit numbers in approximately 0.002 seconds; even charging the machine’s full 65-watt rating to that entire run, it consumed at most about 0.13 joules, or roughly 3.6 × 10^-5 watt-hours. That starkly contrasts with the human figure.
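For anyone who wants to try this at home, here is a minimal sketch of that experiment. The 65 W figure is the laptop’s nameplate rating; charging the full rating to the whole run is a deliberate overestimate, since the actual draw during a trivial computation is far lower:

```python
# A minimal version of the timing experiment described above.
# Energy is estimated by assuming the laptop's full 65 W rating for the
# entire run -- a deliberate overestimate of the real draw.
import random
import time

RATED_WATTS = 65  # nameplate rating of the laptop in the text

numbers = [random.randint(100, 999) for _ in range(1000)]  # three-digit numbers

start = time.perf_counter()
total = sum(numbers)
elapsed = time.perf_counter() - start

joules = RATED_WATTS * elapsed            # upper bound on energy used
watt_hours = joules / 3600

print(f"sum = {total}, time = {elapsed:.7f} s")
print(f"<= {joules:.2e} J ({watt_hours:.2e} Wh) at the full {RATED_WATTS} W rating")

# For comparison: a human at ~20 W for ~3 hours uses
#   20 W * 3 h = 60 Wh = 216 kJ -- about six orders of magnitude more.
```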
The human brain, however, is not designed for single-purpose tasks. It is a general-purpose organ constantly juggling multiple functions—managing sensory inputs, regulating emotions, maintaining homeostasis, and solving problems all at once. The energy consumption remains relatively stable across different tasks because the brain is designed to be versatile, handling numerous processes simultaneously rather than excelling in any one domain.
Complex Tasks: Writing and Language Generation
Now, let’s take a look at more complex tasks, like generating a piece of writing. Training a large language model like GPT-3 consumed about 1,287 megawatt-hours of electricity—equivalent to the power usage of around 120 U.S. homes for a year. On the surface, this figure sounds staggering and lends weight to concerns over AI’s environmental impact. However, it is important to consider what the model accomplishes once trained.
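As a quick sanity check on that household equivalence (the roughly 10,700 kWh-per-year figure for an average U.S. home is an assumption drawn from EIA averages, not a number given in the study):

```python
# Sanity check: training energy expressed in U.S. household-years.
TRAINING_MWH = 1_287
HOME_MWH_PER_YEAR = 10.7   # assumed average U.S. household use (~10,700 kWh/yr, EIA)

print(f"{TRAINING_MWH / HOME_MWH_PER_YEAR:.0f} household-years")  # -> 120 household-years
```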
A single instance of ChatGPT can write a 1,000-word blog post in about 30 seconds, and the service can do this for millions of users at the same time. In contrast, a human might need around 10 hours to craft the same blog post, using approximately 200 watt-hours of energy (10 hours at the brain’s 20 watts). While the human brain’s total energy use appears lower for this single task, it cannot compete in terms of scale. LLMs are designed for massive parallelism; they can handle millions of writing requests simultaneously. When you calculate the energy consumption per task across all of these outputs, computers can be remarkably efficient.
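A toy amortization makes this concrete. The training figure is the 1,287 MWh reported above; the per-request inference energy is a hypothetical placeholder, since no per-request number is given here:

```python
# Amortizing GPT-3's reported ~1,287 MWh training cost over N served requests.
# PER_REQUEST_WH is a hypothetical inference-energy figure for illustration;
# the text gives no per-request number.

TRAINING_WH = 1_287e6        # 1,287 MWh expressed in watt-hours
PER_REQUEST_WH = 3.0         # assumed inference energy per 1,000-word response
HUMAN_WH = 200.0             # ~10 h of writing at the brain's ~20 W

for n_requests in (1e6, 1e8, 1e10):
    amortized = TRAINING_WH / n_requests + PER_REQUEST_WH
    print(f"{n_requests:.0e} requests -> {amortized:10.2f} Wh/task "
          f"(human: {HUMAN_WH:.0f} Wh)")

# 1e+06 requests ->    1290.00 Wh/task (human: 200 Wh)
# 1e+08 requests ->      15.87 Wh/task (human: 200 Wh)
# 1e+10 requests ->       3.13 Wh/task (human: 200 Wh)
```

The fixed training cost dominates when requests number in the millions, but once they reach the billions, the per-task figure is governed almost entirely by the inference energy, whatever that turns out to be.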
Understanding Energy Efficiency: Output Scope Matters
The key to understanding why computers might not consume more energy than the human brain is to measure efficiency by the scope of output. The brain is a marvel of biological engineering, incredibly efficient across a broad spectrum of tasks. However, it is fundamentally limited in how many tasks it can handle concurrently.
Computers, especially those optimized for specific functions, excel in scale. They can complete a specific operation, like generating text or running complex simulations, millions of times over, with relatively little additional energy cost per task. While computers may require more power upfront, they compensate by producing far more output in a given time frame.
Different Measures for Different Systems
It’s essential to evaluate the human brain and computers differently. The brain is a multi-purpose organ designed for a wide array of functions, all handled with a consistent level of energy use. Computers, on the other hand, are specialized machines built for speed and efficiency in narrowly defined domains. While their total energy use can be significant, especially when running data-heavy AI models, they achieve a level of scale and concurrency the human brain simply cannot match.
So, does the brain really consume less energy than a computer? It depends on how you measure it. In raw wattage, yes: the brain is far more energy-efficient. But when you consider the scope and scale of tasks that computers, particularly AI models, can perform, they start to look surprisingly efficient.
Computers can handle vast amounts of work simultaneously, driving their energy cost per task down. As we continue to rely more heavily on AI and computing technologies, it’s crucial to find ways to balance their performance with sustainability. Both the human brain and computers have their unique strengths, and understanding these differences helps us appreciate their complementary roles in our ever-evolving technological landscape.