AI goes small, GLP-1s go big(ger), fungus for dinner, and more

[Image: An illustration of a data center. As the use of generative AI goes up, the need for more data centers and electricity also rises. Credit: Quantic69/iStock/Getty Images]

AI goes small

📸 The big picture: compute

Generative artificial intelligence (AI), for all its promises and rewards, is a notorious energy hog. The massive models that understand and generate human language inside market frontrunners ChatGPT, Gemini, Claude, and Chinese upstart DeepSeek require tremendous computational power (a.k.a. “compute”). All those millions of queries to servers tax energy grids, suck up tons of water, and pump out climate-warming carbon emissions. What can be done?

🧪 Scientists respond

Lauren Leffer reports for SN on efforts to mitigate the energy impact. Big tech is retooling its data centers and mobilizing AI developers to cut carbon emissions and resource use. Adjusting day-to-day operations could also trim energy demand: models could be trained only when there’s ample carbon-free power on the grid (say, on sunny days when solar panels produce abundant energy), and servers could be cooled with systems that recycle water or use alternative coolants.
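Here’s a minimal sketch of what that kind of carbon-aware scheduling might look like, assuming access to a feed of grid carbon-intensity data. The threshold, the simulated intensity values, and all function names below are illustrative stand-ins, not details from the reporting.

```python
import random
import time

# Grams of CO2 per kWh below which the grid is treated as "clean enough"
# to run training. The number is illustrative, not an industry standard.
CLEAN_GRID_THRESHOLD = 200.0

def current_carbon_intensity() -> float:
    """Stand-in for a real grid-carbon data feed.

    A production scheduler would fetch live carbon-intensity numbers for
    the data center's grid region; here we just simulate a reading.
    """
    return random.uniform(50.0, 600.0)

def run_training_step() -> None:
    """Placeholder for one resumable unit of training work."""
    print("training...")

def carbon_aware_training(total_steps: int, poll_seconds: int = 60) -> None:
    """Run training steps only while grid carbon intensity is low.

    When the grid is dirty (little solar or wind online), the job pauses
    and re-checks later instead of drawing fossil-heavy power.
    """
    done = 0
    while done < total_steps:
        intensity = current_carbon_intensity()
        if intensity <= CLEAN_GRID_THRESHOLD:
            run_training_step()
            done += 1
        else:
            print(f"grid at {intensity:.0f} gCO2/kWh; pausing")
            time.sleep(poll_seconds)

if __name__ == "__main__":
    carbon_aware_training(total_steps=5, poll_seconds=1)
```

The key design choice is making the job preemptible: work happens in small, resumable steps, so waiting out a dirty grid costs time but wastes no computation.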

💡 What else matters: going small

But there’s another hopeful conversation around the future of AI: the rise of small language models (SLMs). Like the large language models (LLMs) in established chatbots, SLMs process, understand, and generate human language. Trained on smaller datasets and built with fewer parameters than LLMs, they are lighter and more efficient, and they consume less compute as a result.
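A rough back-of-the-envelope calculation shows why parameter count matters so much. The model sizes below are illustrative round numbers, not the actual sizes of any named product.

```python
# Memory footprint of model weights stored at 16-bit precision
# (2 bytes per parameter). Sizes are illustrative round numbers.
BYTES_PER_PARAM_FP16 = 2

models = {
    "large language model (~70B params)": 70e9,
    "small language model (~3B params)": 3e9,
}

for name, params in models.items():
    gib = params * BYTES_PER_PARAM_FP16 / 2**30
    print(f"{name}: ~{gib:.0f} GiB of weights")
```

At 16 bits per weight, the footprint drops from well over a hundred gibibytes to a handful, small enough to serve from a single consumer GPU rather than a rack of accelerators, which is where much of the energy saving comes from.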
