MAX can now be used in Google Colab

As we just announced, Modular’s platform is now available via pip install modular. This new deployment mechanism unlocks an exciting capability: using MAX in Google Colab.

The stable 25.3 release of modular works with the L4 and A100 GPUs in Colab Pro. Thanks to the fantastic work of @ssslakter, initial support for the T4 GPUs in the free tier of Colab is present in the latest nightlies. Full models don’t yet run on T4 GPUs, but you can run MAX graphs in the free tier of Colab.

This is a sample Jupyter notebook that you can load into Colab and try for yourself. It uses the Python interfaces in MAX to construct and run a simple graph, and then performs inference with a supported large language model. The notebook targets the latest nightly modular package, so the first section will run on T4 GPUs in the free tier of Colab. If you have Colab Pro, you can run the LLM portion at the end of the notebook on an L4 or A100 GPU.
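If you want a feel for the graph-building portion before opening the notebook, here is a minimal sketch using the max.graph and max.engine Python APIs. Treat it as a rough outline rather than a copy of the notebook's code: exact constructor signatures (for example, whether TensorType and InferenceSession need explicit device arguments) vary between the stable release and the nightlies.

import numpy as np

from max import engine
from max.dtype import DType
from max.graph import Graph, TensorType, ops

# Build a tiny graph that adds two 2x2 float32 tensors.
# (Recent nightlies may also expect a device argument on TensorType.)
input_type = TensorType(DType.float32, shape=(2, 2))
with Graph("simple_add", input_types=(input_type, input_type)) as graph:
    lhs, rhs = graph.inputs
    graph.output(ops.add(lhs, rhs))

# Compile the graph with the MAX engine and run it on two sample inputs.
session = engine.InferenceSession()
model = session.load(graph)
result = model.execute(
    np.ones((2, 2), dtype=np.float32),
    np.ones((2, 2), dtype=np.float32),
)
print(result)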


Google Colab support is still experimental, so if you have any questions about the process, feel free to ask them below.


Through @stef's hacking on an IPython magic command, here’s a prototype Colab notebook that can run Mojo code in a cell on a GPU. We’ll keep working to improve the experience here, but we thought that would be fun to share.



This support for building and running Mojo code in a Jupyter notebook cell has been added to the max package, so in Colab you can now

import max.support.notebook

and compile and run Mojo code in a cell using

%%mojo

# Mojo code here

This even extends to building Mojo packages from a notebook cell, using the syntax

%%mojo package -o custom_ops.mojopkg

# Mojo code here

The newly simplified Mojo notebook example now uses this import.
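The usual reason to build something like custom_ops.mojopkg from a notebook is to register Mojo custom ops with a MAX graph from Python. Below is a rough, hedged sketch of how such a package might then be picked up on the Python side; the op name add_one is hypothetical, and the exact keyword arguments for ops.custom and custom_extensions have shifted between releases, so check the current custom ops example for the authoritative form.

from pathlib import Path

import numpy as np

from max import engine
from max.dtype import DType
from max.graph import Graph, TensorType, ops

# Hypothetical: assumes custom_ops.mojopkg (built by the %%mojo package cell
# above) registers a Mojo custom op named "add_one".
input_type = TensorType(DType.float32, shape=(4,))
with Graph("use_custom_op", input_types=(input_type,)) as graph:
    (x,) = graph.inputs
    result = ops.custom(name="add_one", values=[x], out_types=[input_type])
    graph.output(*result)

# Hand the compiled Mojo package to the engine so it can resolve the custom op.
session = engine.InferenceSession()
model = session.load(graph, custom_extensions=[Path("custom_ops.mojopkg")])
print(model.execute(np.arange(4, dtype=np.float32)))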
