Tensor networks and the quantum bargain: startups to watch

A 3-D illustration of multi-colored balls connected by small, cylindrical tubes.

Illustration by Hawaii

As large language models (LLMs) like OpenAI’s GPT, Google’s Gemini, and Anthropic’s Claude continue to balloon in size, the infrastructure costs of training and operating them are becoming economically unsustainable. LLMs and the data centers they run on are also notorious energy hogs. From the deepest-pocketed tech giants to municipalities to home consumers, we are all shouldering the financial load. Now, physicists are experimenting with quantum-inspired algorithms to fine-tune neural networks and compress their data footprints, which may just help keep the AI revolution from collapsing under its own weight. Science News’s Emily Conover brings the full technical breakdown.

⚖️ Squishing the matrix: The art of tensor networks

Scientists are working to scrunch AI models with tensor networks, a mathematical framework borrowed from quantum physics that describes how complex systems interact. Applied to a neural network, this logic essentially zips the data, keeping only the most critical correlations and discarding the redundant noise. In a telling test on one LLM, whose billions of parameters (the numbers in the code that determine how a chatbot processes prompts) were compressed to strip out redundancies, energy usage dropped 30 to 40% without sacrificing performance.
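Conceptually, the compression works like a lossy zip for weight matrices. The sketch below is a minimal, hypothetical illustration of that low-rank idea using plain NumPy and a truncated singular value decomposition; it is not the method from the article (production approaches use tensor-train and matrix-product-operator decompositions across a whole model), and the function name, toy matrix, and 95% retention threshold are assumptions for demonstration only.

```python
import numpy as np

def compress_layer(weights: np.ndarray, energy_keep: float = 0.95):
    """Compress one weight matrix by truncated SVD.

    A stand-in for tensor-network compression: keep the strongest
    correlations (largest singular values), discard the rest.
    """
    u, s, vt = np.linalg.svd(weights, full_matrices=False)
    # Smallest rank whose singular values retain `energy_keep` of the
    # total "energy" (sum of squared singular values).
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    rank = int(np.searchsorted(cumulative, energy_keep)) + 1
    # Two thin factors replace the original dense matrix.
    return u[:, :rank] * s[:rank], vt[:rank, :]

# Toy stand-in for a trained layer: approximately low-rank plus noise,
# a structure real LLM weight matrices often exhibit.
rng = np.random.default_rng(0)
w = rng.normal(size=(1024, 64)) @ rng.normal(size=(64, 1024))
w += 0.01 * rng.normal(size=w.shape)

a, b = compress_layer(w)
kept = a.size + b.size
print(f"rank kept: {a.shape[1]}")
print(f"parameters: {w.size:,} -> {kept:,} "
      f"({100 * (1 - kept / w.size):.0f}% fewer)")
print(f"relative reconstruction error: "
      f"{np.linalg.norm(w - a @ b) / np.linalg.norm(w):.4f}")
```

Replacing an m×n matrix with an m×r and an r×n factor pays off whenever r(m+n) < mn, which is why discarding weak correlations translates directly into fewer multiplications, and thus less energy, at inference time.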

😇 Computing with a conscience: Green AI models

The current trajectory of AI energy consumption is a major sustainability bottleneck, with data centers, by some projections, on track to account for roughly 3% of global electricity use by 2030.
