Sometimes I test a local LLM on my M2 MacBook Air, which isn’t well suited to the task. After a few prompts it gets quite hot; a MacBook Pro, with active cooling, would handle the load better. Now scale that up: millions of people prompting far more capable models in data centers worldwide, consuming enormous amounts of energy in the process. No wonder power grids in some parts of the world are being strained by AI demand.