Modular Forum: MAX
Error running offline_inference.py
zetwhite (May 29, 2025, 12:17am)
Ah, I found this post (Quickstart Docs), and giving the setting to LLM fixed my problem!