Here's a how-to guide for setting up a local code assistant by combining MAX Serve with the Continue VSCode extension.
Overview
- Deploy a local endpoint using MAX Serve
- Configure OpenAI-compatible endpoint in Continue
- Leverage Continue's code assistant features with MAX!
I outlined the steps in depth in this blog post.
Please share your feedback! While this is experimental, I'm excited about this potential use case for MAX.
tl;dr
Start MAX Serve
```sh
magic run serve --huggingface-repo-id=Qwen/Qwen2.5-1.5B-Instruct
```
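Before wiring up Continue, you can smoke-test the endpoint directly. This is a minimal sketch assuming MAX Serve exposes the standard OpenAI-compatible chat completions route on the default port 8000 (matching the `apiBase` used below); adjust if you changed it:

```sh
# Smoke test: request a completion from the local OpenAI-compatible endpoint.
# The prompt and route are illustrative; any OpenAI-style client should work.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Qwen/Qwen2.5-1.5B-Instruct",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}]
  }'
```

If you get a JSON response back with a `choices` array, the server is up and ready for Continue.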
Add MAX endpoint to Continue
Open `~/.continue/config.json` and add the MAX OpenAI-compatible endpoint to the `models` section:
```json
{
  "models": [
    {
      "model": "Qwen/Qwen2.5-1.5B-Instruct",
      "title": "MAX - Qwen/Qwen2.5-1.5B-Instruct",
      "apiBase": "http://localhost:8000/v1/",
      "apiKey": "",
      "provider": "openai"
    }
  ]
}
```
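If you also want the local model to power tab autocomplete, Continue has a separate `tabAutocompleteModel` entry in the same file. The snippet below is a sketch assuming that entry accepts the same provider fields as `models`; I haven't tuned it, so treat it as a starting point rather than a recommended setup:

```json
{
  "tabAutocompleteModel": {
    "model": "Qwen/Qwen2.5-1.5B-Instruct",
    "title": "MAX autocomplete",
    "apiBase": "http://localhost:8000/v1/",
    "apiKey": "",
    "provider": "openai"
  }
}
```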
Once MAX is configured in Continue, it will appear as an option in the model dropdown.