BentoML + Modular: Acquisition Roundup & Resources

Welcome to the BentoML community on the Modular forum! As you may have seen in the news, BentoML has joined Modular, so we've put together a quick resource roundup for anyone catching up or looking to dive deeper.

The announcement

Read the full story on why this combination makes sense and what it means for the future of AI inference and deployment: https://www.modular.com/blog/bentoml-joins-modular

The open source project

BentoML remains Apache 2.0 licensed and fully open source. Whether you’re building inference APIs, LLM apps, multi-model pipelines, or job queues, the project is right where it’s always been: https://github.com/bentoml/BentoML

The AMA

Chris Lattner and BentoML founder Chaoyu Yang hosted an Ask Us Anything session on February 17th. Topics covered included the open source roadmap, BentoML + MAX/Mojo integration plans, BentoCloud’s future, and hardware support across NVIDIA, AMD, and more. Lots of great Q&A in there: https://forum.modular.com/t/modular-has-acquired-bentoml-ask-us-anything/2706

We’re excited to have the BentoML community here. Feel free to use this space to ask questions, share what you’re building, and connect with the team. More to come!