I used huggingface-cli to download the openai/gpt-oss-20b model and then ran `max serve --model-path openai/gpt-oss-20b`, but it fails with the following error:
ValueError: The checkpoint you are trying to load has model type gpt_oss but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
I tried upgrading transformers, but the error is the same. Do we need to wait for a newer version of Modular MAX?
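For reference, these are roughly the steps that reproduce the issue; the Transformers upgrade shown is the plain pip command and is my assumption of the equivalent invocation, so adjust it for your own environment manager if needed:

```shell
# Download the checkpoint from the Hugging Face Hub
huggingface-cli download openai/gpt-oss-20b

# Upgrade Transformers (this did not resolve the error for me)
pip install --upgrade transformers

# Start the MAX server pointing at the downloaded model -- this is
# where the ValueError above is raised
max serve --model-path openai/gpt-oss-20b
```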