Error running offline_inference.py
MAX
zetwhite
May 29, 2025, 12:17am
Ah, I found this post (Quickstart Docs), and giving the settings to `LLM` fixed my problem!