Error running offline_inference.py

MAX

zetwhite
May 29, 2025, 12:17am
Ah, I found this post (Quickstart Docs), and giving the setting it describes to LLM fixed my problem!