Forget AI Models, the Real Battlefield Is Energy

AI Study Tech

In my view, it is not AI models but environmentally friendly energy that may become the deciding factor in who leads the world in AI. One may have the best AI model, but if its energy consumption is not economical, it is not a practical solution.

Some people argue that, because of ethical issues such as lack of transparency (for example, ChatGPT is proprietary software, so users do not know how it processes their inputs) or embedded bias (for example, models learn from older datasets that are sometimes biased), chatbots like ChatGPT will not do certain jobs as fairly as humans. But over time, better technologies, models, datasets, algorithms and methods will be developed, and these ethical issues may well be resolved.

However, in my view, the real problem is the energy consumed, and the footprint created, by each ChatGPT query. At present, processing even a simple general query with ChatGPT is not economical compared to asking a human. The cost per query is estimated at roughly 0.36 cents for subscribers, and for free users it is likely much higher. That is why many researchers are working on reducing the energy cost per query at the microchip level. Chips, however, are material dependent and may hit physical limits at some point in the future. Innovation in the computer chip segment is certainly a current need as well; there is no question about that.

You might be wondering whether we also use energy and create an energy footprint when searching on Google. Yes, we do. However, by one estimate, a single ChatGPT query consumes on average 15 times more energy than a Google Search query. A single complex ChatGPT query may consume up to 0.01 kilowatt-hours of energy. To put that in context, a 60 W incandescent light bulb consumes 0.06 kilowatt-hours of energy in one hour, so one such query is roughly equivalent to running that bulb for 10 minutes. In a few years, powering AI could use as much electricity as a small country.
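To make that arithmetic concrete, here is a minimal Python sketch that converts the rough per-query figures quoted above (0.01 kWh per complex ChatGPT query, a 15x ratio versus a Google Search query) into minutes of running a 60 W bulb. These inputs are the article's estimates, not measured values.

```python
# Back-of-the-envelope conversion of the per-query energy estimates quoted above.
COMPLEX_CHATGPT_QUERY_KWH = 0.01   # estimated energy per complex ChatGPT query (kWh)
CHATGPT_TO_GOOGLE_RATIO = 15       # estimated ChatGPT-vs-Google energy ratio
BULB_POWER_W = 60                  # incandescent bulb power (watts)

def bulb_minutes(energy_kwh: float, bulb_power_w: float = BULB_POWER_W) -> float:
    """Minutes a bulb of the given power could run on the given energy."""
    hours = energy_kwh / (bulb_power_w / 1000)  # kWh divided by kW gives hours
    return hours * 60

google_query_kwh = COMPLEX_CHATGPT_QUERY_KWH / CHATGPT_TO_GOOGLE_RATIO

print(f"Complex ChatGPT query: {bulb_minutes(COMPLEX_CHATGPT_QUERY_KWH):.1f} bulb-minutes")
print(f"Google Search query:   {bulb_minutes(google_query_kwh):.1f} bulb-minutes")
# Prints roughly 10.0 bulb-minutes for the ChatGPT query and 0.7 for the Google query.
```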

At present, China is the largest producer of AI research papers in the world and has been pouring billions into AI research and development, with the goal of becoming the global leader in AI by 2030. It is likely that Chinese researchers identified the energy problem of AI software early on. To address it, China is building an 'Artificial Sun'. China's Experimental Advanced Superconducting Tokamak (EAST), established in the mid 2000s, mimics the energy-generating process of the Sun, which is how it earned the nickname 'Artificial Sun' in the late 2010s. The project is based on nuclear fusion, which could give China a practically unlimited energy source without long-lived radioactive waste. EAST has already sustained a plasma for about 17 minutes, a first achieved in late December 2021. The next stage of the project is expected to come online for experiments around 2025 and to become operational in the 2030s.

On the other hand, India also aspires to be the global AI lab for emerging economies, and it has shown the highest growth rate in AI patent applications by country. But at present we are still lagging behind both China and the US, not only in research publications but also in talent and data processing. For example, in one reported case, an Indian dataset of voice samples was sold to agencies in China for use and analysis without permission. To counter such activities, new acts have been introduced and some older ones amended in India too. We are also focusing on improving the skill sets of our developers. Though India has been a member of the International Thermonuclear Experimental Reactor (ITER) project since 2005, we also need an 'Artificial Sun'-like facility of our own on an urgent basis; otherwise it will be difficult to become one of the AI leaders.