When trying open-clip, I get a "SmallVector unable to grow" internal error

Hi,

I am trying to run an open-clip model in ONNX format and am running into an internal error.

The code is just this:

from pathlib import Path
import torch
from datasets import load_dataset

from max import engine
from max.dtype import DType

print(engine.__version__)

MODEL_PATH = Path("../downloader/clip_vit_l14_image.onnx")


def main():
    session = engine.InferenceSession()
    model = session.load(MODEL_PATH)

if __name__ == "__main__":
    main()

The error message is:

 % python run.py
25.2.0.dev2025021805
LLVM ERROR: SmallVector unable to grow. Requested capacity (5072621376) is larger than maximum value for size type (4294967295)

The environment was configured with:

magic add "max~=25.1" "pytorch==2.4.0" "numpy<2.0" "onnx==1.16.0" \
  "transformers==4.40.1" "datasets==2.18" "pillow"

Finally, the ONNX file was produced as follows:

import torch
import open_clip
import torch.onnx

model, _, preprocess = open_clip.create_model_and_transforms("ViT-L-14", pretrained="openai", force_quick_gelu=True)

model.eval()

dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(model.visual,
                  dummy_input,
                  "clip_vit_l14_image.onnx",
                  input_names=["image"],
                  output_names=["image_features"],
                  #opset_version=17,
                  )
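
One way to sanity-check the exported file before handing it to a runtime is the ONNX checker (a minimal sketch, assuming the onnx==1.16.0 package pinned above):

import onnx

# Load the exported graph and run the structural validator; this raises if the model is malformed.
onnx_model = onnx.load("clip_vit_l14_image.onnx")
onnx.checker.check_model(onnx_model)
print(onnx_model.opset_import)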

Note also that I was able to serve the same ONNX file with Triton; it loads and returns responses.

Am I doing something wrong, or is there a bug?

This does look like a bug; this should work. Can you submit a bug report on the MAX GitHub issue tracker?

This turned out to be a torch ONNX exporter bug. It was resolved by exporting with optimum-cli instead; see [BUG]: "SmallVector unable to grow" internal error load ONNX model · Issue #298 · modular/max · GitHub.
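
For anyone who hits the same error, the optimum-cli export looks roughly like this (a sketch, not the exact command from the issue: the Hugging Face model ID openai/clip-vit-large-patch14 and the --task value are assumptions, so check optimum-cli export onnx --help for your optimum version):

pip install "optimum[exporters]"
optimum-cli export onnx \
  --model openai/clip-vit-large-patch14 \
  --task zero-shot-image-classification \
  clip_vit_l14_onnx/

The exported .onnx file is written into the output directory and can then be loaded with session.load() as in the script above.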

