
Could the human brain hold the key to sustainable AI?

Could the secret to sustainable AI lie in the workings of the human mind? Researchers are exploring this possibility by studying how the brain’s efficiency can be applied to AI algorithms.

Understanding the Challenge

Large language models (LLMs) are revolutionizing artificial intelligence with their impressive capabilities. However, this progress comes at a considerable energy cost. Data centers, which are essential for running these models, contribute significantly to global energy consumption: reports indicate they account for approximately 2% of total energy use in the United States and 1% in Australia, with projections suggesting this could rise to 8% by 2030. Advanced LLMs such as OpenAI’s ChatGPT can consume the electricity of up to 17,000 households, and future models may require even more energy.

The human brain offers a compelling contrast to the energy demands of artificial intelligence. It operates on just 20 watts of power, demonstrating remarkable efficiency by selectively activating only the necessary neurons. This selective activation stands in sharp contrast to AI systems, which often engage large portions of their network for each task, leading to excessive energy use.

The Algorithm

Researchers at the University of Sydney’s Net Zero Institute, led by Associate Professor Chang Xu, are developing ground-breaking algorithms inspired by the brain’s energy-efficient operation. These algorithms aim to mimic the brain’s selective activation, engaging only the parts of a model needed for a specific task. This approach could significantly reduce computational overhead and energy consumption.
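The article does not spell out the team’s implementation, but selective activation is commonly realized through conditional computation, for example mixture-of-experts layers in which a small router decides which sub-networks run for each input. The PyTorch sketch below is an illustrative stand-in under that assumption, not the Sydney team’s actual algorithm; the class name, layer sizes and routing rule are all invented for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelectiveLayer(nn.Module):
    """Toy mixture-of-experts layer: a router picks top_k experts per input.
    Illustrative only; not the University of Sydney algorithm."""

    def __init__(self, dim=512, num_experts=8, top_k=2):
        super().__init__()
        # Each "expert" is a small feed-forward block; only a few run per input.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 2 * dim), nn.ReLU(), nn.Linear(2 * dim, dim))
            for _ in range(num_experts)
        )
        self.router = nn.Linear(dim, num_experts)  # scores how relevant each expert is
        self.top_k = top_k

    def forward(self, x):  # x: (batch, dim)
        scores = F.softmax(self.router(x), dim=-1)            # (batch, num_experts)
        top_vals, top_idx = scores.topk(self.top_k, dim=-1)   # keep only the best experts
        out = torch.zeros_like(x)
        for expert_id in top_idx.unique().tolist():
            # Rows that routed to this expert; all other experts stay idle for them.
            row_mask = (top_idx == expert_id).any(dim=-1)
            weight = top_vals[row_mask][top_idx[row_mask] == expert_id].unsqueeze(-1)
            out[row_mask] += weight * self.experts[expert_id](x[row_mask])
        return out

layer = SelectiveLayer()
tokens = torch.randn(4, 512)
print(layer(tokens).shape)  # torch.Size([4, 512]); only 2 of 8 experts ran per row
```

Because only top_k of the num_experts blocks execute for any given input, most of the layer’s parameters sit idle on each query, which is where the potential compute and energy savings come from.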

Key Components of the Algorithm

  1. Task-Based Activation: The algorithm identifies the specific parts of the model required for a given task, removing the need to activate the full network.
  2. Dynamic Resource Allocation: Computational resources are allocated only to the active components, improving efficiency and reducing waste.
  3. Redundancy Reduction: By identifying and eliminating redundant computations, the algorithm further optimizes energy use and performance (a brief sketch after this list illustrates one simple form of this idea).

Beyond the algorithm itself, the researchers point to several potential benefits and next steps:

  1. Significant Energy Savings: Cutting unnecessary computation could substantially reduce energy consumption, addressing the environmental impact of data centers.
  2. Improved Sustainability: These reductions align with broader sustainability goals, making AI development more environmentally responsible.
  3. Hardware Efficiency: The algorithm could drive advances in hardware design, leading to more energy-efficient systems that support its operational needs.
  4. Scalability: Researchers are working to ensure the algorithm can handle increasingly complex AI models while maintaining efficiency.
  5. Real-World Applications: Integration into practical AI applications will be crucial for demonstrating the algorithm’s effectiveness and benefits.
  6. Collaboration: Partnerships with hardware manufacturers are being sought to optimize hardware for the algorithm’s requirements.
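As flagged in the list above, one simple way to picture redundancy reduction is memoization: identical requests are served from a cache instead of being recomputed. The snippet below is a toy Python illustration only; run_model and answer are hypothetical names, not part of the researchers’ code.

```python
from functools import lru_cache

def run_model(prompt: str) -> str:
    """Stand-in for an expensive LLM forward pass."""
    print(f"(expensive computation for: {prompt!r})")
    return prompt.upper()

@lru_cache(maxsize=4096)
def answer(prompt: str) -> str:
    # Identical prompts are answered from the cache, so the costly call runs once.
    return run_model(prompt)

answer("what is net zero?")   # triggers the expensive computation
answer("what is net zero?")   # served from the cache; no extra compute spent
print(answer.cache_info())    # CacheInfo(hits=1, misses=1, maxsize=4096, currsize=1)
```

Real serving stacks apply the same principle at other levels, such as reusing intermediate results during inference; the point is simply that work already done should not be paid for twice.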

Associate Professor Chang Xu, from the University’s Net Zero Institute, emphasizes the need for energy-efficient computing in light of the rising energy demands of large language models. “We’re meant to be scaling back our energy use, but the advent of large language models has been a shot in the arm for energy consumption,” Xu said.

He highlights that current AI models use extensive energy even when queried for simple tasks, because their activation is non-selective. Professor Deanna D’Alessandro, Director of the Net Zero Institute, underscores the importance of addressing AI’s energy impact in the broader context of climate change. “While AI helps in understanding and solving climate change issues, we must ensure that new technologies do not exacerbate the problem by becoming significant sources of emissions,” she noted.

The Net Zero Institute at the University of Sydney is dedicated to accelerating research and solutions towards achieving net zero carbon emissions by 2050. With over 150 researchers, the institute focuses on a range of disciplines, including green computing, greenhouse gas removals, and critical mineral extraction from waste.

By drawing inspiration from the human brain’s energy-efficient operations, researchers are poised to make significant strides in developing algorithms that enhance the sustainability of AI. These advancements have the potential to reshape the energy landscape of artificial intelligence, making it both more powerful and environmentally responsible.


Yajush Gupta

Yajush is a journalist at Dynamic Business. He previously worked with Reuters as a business correspondent and holds a postgrad degree in print journalism.