The Shift from Models to Compound AI Systems
AI captured the spotlight in 2023 with the rise of Large Language Models (LLMs) capable of performing a wide variety of tasks from just a prompt. That put models at the center of AI development, but a shift is now underway: state-of-the-art AI results are increasingly achieved by compound systems built from multiple interacting components, not by monolithic models alone.
For instance, Google’s AlphaCode 2 achieved breakthrough results in competitive programming by combining LLMs with components that generate and then filter candidate solutions. The same trend toward compound systems is evident across enterprise applications and research projects, signaling a move away from relying on a single model to reach new AI milestones.
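To make the generate-and-filter pattern concrete, here is a minimal sketch in Python. It is only an illustration under assumed interfaces: `call_llm`, `run_program`, and `passes_public_tests` are hypothetical placeholders, and AlphaCode 2’s real pipeline is far larger and more sophisticated.

```python
# Minimal sketch of the generate-then-filter pattern (not AlphaCode 2's actual code).
# `call_llm` and the execution harness below are hypothetical placeholders.
import random

def call_llm(prompt: str, temperature: float = 0.8) -> str:
    """Placeholder for an LLM call that returns one candidate program."""
    return f"# candidate program (T={temperature}, seed={random.random():.3f})"

def run_program(program: str, stdin: str) -> str:
    """Placeholder for sandboxed execution of a candidate program."""
    return ""

def passes_public_tests(program: str, tests: list[tuple[str, str]]) -> bool:
    """Run the candidate on each (input, expected_output) pair and check the results."""
    return all(run_program(program, inp) == out for inp, out in tests)

def solve(problem: str, tests: list[tuple[str, str]], n_samples: int = 100) -> list[str]:
    # 1) Generate: sample many diverse candidate solutions from the LLM.
    candidates = [call_llm(f"Solve this problem:\n{problem}") for _ in range(n_samples)]
    # 2) Filter: keep only the candidates that pass the problem's public tests.
    return [c for c in candidates if passes_public_tests(c, tests)]
```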
This post dives into the trend of compound AI systems: why developers are gravitating toward this approach, whether it will remain relevant as the underlying models improve, and the emerging tools for building and optimizing such systems. Compound AI systems look set to be one of the most effective ways to maximize AI results, making this a crucial trend for 2024.
Increasingly, many new AI results come from compound systems rather than from single models.
The AI System Design Space
Recent compound AI systems showcase the variety in design choices:
| AI System | Components | Design | Results |
|---|---|---|---|
| AlphaCode 2 | LLMs that generate candidate code, plus modules that filter and rank those candidates | Generates and filters solutions to coding problems | Matches human performance on coding contests |
| AlphaGeometry | LLM paired with a traditional symbolic solver | Combines an LLM with a traditional solver for olympiad geometry problems | Competitive with International Math Olympiad medalists |
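The AlphaGeometry row hints at a second common pattern: a symbolic engine handles the purely deductive steps, and the LLM is consulted only when the solver gets stuck. The sketch below illustrates that loop with hypothetical interfaces; it is not the actual AlphaGeometry implementation.

```python
# Sketch of an LLM-plus-symbolic-solver loop in the spirit of AlphaGeometry
# (hypothetical interfaces, not the real AlphaGeometry code).

def symbolic_solver(state: dict) -> bool:
    """Placeholder: run deductive closure; return True if the goal is proved."""
    return False

def llm_propose_construction(state: dict) -> str:
    """Placeholder: ask an LLM to suggest an auxiliary construction (e.g. a new point)."""
    return "add the midpoint of segment AB"

def apply_construction(state: dict, construction: str) -> dict:
    """Placeholder: add the proposed construction to the problem state."""
    return {**state, "constructions": state.get("constructions", []) + [construction]}

def prove(state: dict, max_rounds: int = 10) -> bool:
    for _ in range(max_rounds):
        # The traditional solver handles every step it can deduce on its own.
        if symbolic_solver(state):
            return True
        # When deduction stalls, the LLM proposes a creative auxiliary step.
        state = apply_construction(state, llm_propose_construction(state))
    return False
```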
Key Challenges in Compound AI Systems
Design Space
Building compound AI systems means searching a vast design space (which components to use and how to connect them) while allocating limited resources, such as latency and cost budgets, efficiently across those components.
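As a toy illustration of this resource-allocation problem, the sketch below compares two configurations of a hypothetical retrieval-augmented pipeline that spend a similar token budget in very different ways; the components and the cost model are assumptions for illustration, not measurements.

```python
# Two points in the design space of a hypothetical retrieval-augmented pipeline,
# trading retrieval depth against LLM sampling under a similar token budget.
from dataclasses import dataclass

@dataclass
class PipelineConfig:
    n_retrieved_docs: int        # how many documents the retriever returns
    n_llm_samples: int           # how many answers the LLM drafts before voting
    tokens_per_doc: int = 500    # assumed prompt tokens per retrieved document
    tokens_per_sample: int = 800 # assumed output tokens per LLM sample

    def token_budget(self) -> int:
        # Rough cost model: the prompt grows with retrieved docs, the output with samples.
        return self.n_llm_samples * (self.n_retrieved_docs * self.tokens_per_doc
                                     + self.tokens_per_sample)

retrieval_heavy = PipelineConfig(n_retrieved_docs=20, n_llm_samples=1)
sampling_heavy  = PipelineConfig(n_retrieved_docs=4,  n_llm_samples=4)
print(retrieval_heavy.token_budget(), sampling_heavy.token_budget())  # 10800 vs. 11200
```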
Optimization
Maximizing the quality of compound systems requires co-optimizing the components to work seamlessly together.
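One simple, if naive, way to co-optimize components is to search their settings jointly while scoring the whole pipeline end to end on a small development set, as in the sketch below. The components and metric are hypothetical placeholders; tools such as DSPy aim to automate this kind of end-to-end optimization much more effectively.

```python
# Sketch of naive co-optimization: jointly search a retriever setting and a prompt
# variant, scoring the *whole* pipeline on a dev set (all components are placeholders).
from itertools import product

def run_pipeline(question: str, n_docs: int, prompt_template: str) -> str:
    """Placeholder: retrieve n_docs documents, fill prompt_template, call the LLM."""
    return "answer"

def score(prediction: str, gold: str) -> float:
    """Placeholder metric, e.g. exact match."""
    return float(prediction.strip().lower() == gold.strip().lower())

dev_set = [("Who wrote Hamlet?", "William Shakespeare")]  # toy development example
prompt_variants = ["Answer concisely: {q}", "Think step by step, then answer: {q}"]

# Evaluate every (n_docs, prompt) combination end to end and keep the best one.
best = max(
    product([2, 5, 10], prompt_variants),
    key=lambda cfg: sum(score(run_pipeline(q, cfg[0], cfg[1]), gold) for q, gold in dev_set),
)
print("best (n_docs, prompt):", best)
```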
Operation
Operating compound AI systems raises harder monitoring, data-serving, and security challenges than traditional ML models, because quality and cost now depend on many interacting components.
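For example, a first step toward operating such a system is tracing every component call so that latency, cost, and quality regressions can be attributed to a specific step. The sketch below shows minimal per-component tracing with hypothetical retriever and LLM stubs.

```python
# Minimal per-component tracing for a compound pipeline (components are placeholders).
import json
import time

def traced(name: str, fn, *args, **kwargs):
    """Run a component and emit a structured log line with its latency."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    print(json.dumps({"component": name,
                      "latency_ms": round((time.perf_counter() - start) * 1000, 2)}))
    return result

def retrieve(query: str) -> list[str]:             # placeholder retriever
    return ["doc1", "doc2"]

def generate(query: str, docs: list[str]) -> str:  # placeholder LLM call
    return "answer"

def answer(query: str) -> str:
    docs = traced("retriever", retrieve, query)
    return traced("llm", generate, query, docs)

print(answer("What is a compound AI system?"))
```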
Emerging Paradigms
Several new approaches are emerging to address the challenges of building and optimizing compound AI systems:
- Designing AI Systems: Composition Frameworks and Strategies
- Automatically Optimizing Quality: DSPy
- Optimizing Cost: FrugalGPT and AI Gateways (a cost-cascade sketch follows this list)
- Operation: LLMOps and DataOps
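As an example of the cost-optimization paradigm, the sketch below shows a FrugalGPT-style cascade: a cheap model answers first, and a more expensive model is invoked only when a scorer judges the cheap answer unreliable. The models and scorer here are hypothetical placeholders, not the actual FrugalGPT implementation.

```python
# Sketch of a FrugalGPT-style model cascade (hypothetical models and scorer).

def cheap_model(prompt: str) -> str:
    return "short answer"              # placeholder for a small, inexpensive model

def expensive_model(prompt: str) -> str:
    return "careful, detailed answer"  # placeholder for a large, costly model

def confidence(prompt: str, answer: str) -> float:
    """Placeholder scorer estimating whether the cheap answer is good enough."""
    return 0.4

def cascade(prompt: str, threshold: float = 0.8) -> str:
    answer = cheap_model(prompt)
    if confidence(prompt, answer) >= threshold:
        return answer                  # accept the cheap answer and save cost
    return expensive_model(prompt)     # escalate only when confidence is low

print(cascade("Summarize the key idea of compound AI systems."))
```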
As the shift towards compound AI systems continues, developers will need to embrace these emerging paradigms to maximize the quality and reliability of their AI applications. It’s an exciting time for AI development, with compound systems poised to be at the forefront of innovation in 2024.
BibTeX for this post:
@misc{compound-ai-blog,
title={The Shift from Models to Compound AI Systems},
author={Matei Zaharia and Omar Khattab and Lingjiao Chen and Jared Quincy Davis
and Heather Miller and Chris Potts and James Zou and Michael Carbin
and Jonathan Frankle and Naveen Rao and Ali Ghodsi},
howpublished={\url{https://bair.berkeley.edu/blog/2024/02/18/compound-ai-systems/}},
year={2024}
}