Note there are two formats: the original repo, which ships the new 4-bit (MXFP4) tensors: https://huggingface.co/openai/gpt-oss-20b (a mess?), and the fine-tune format, e.g. Unsloth's conversion to BF16: https://huggingface.co/unsloth/gpt-oss-20b-BF16. NOTE: Both have odd layer dimensions. Suggest the latter, as the "tuning" versions are already converted to BF16. As of this writing there are 70+ fine-tunes; the 120B model appears to use the same format, judging by quantizing it.
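For a quick check, here is a minimal sketch (Python; assumes the huggingface_hub package is installed) that reads the safetensors headers of both repos remotely and tallies tensor dtypes and shapes, so you can see the 4-bit vs. BF16 difference and spot the odd layer dimensions without downloading the weights:

```python
# Minimal sketch: compare tensor dtypes/shapes between the original repo and
# the BF16 conversion by parsing safetensors headers remotely (no full download).
# Assumes: pip install huggingface_hub
from huggingface_hub import get_safetensors_metadata

for repo_id in ("openai/gpt-oss-20b", "unsloth/gpt-oss-20b-BF16"):
    meta = get_safetensors_metadata(repo_id)  # fetches only the headers
    by_dtype = {}
    for file_meta in meta.files_metadata.values():
        for name, info in file_meta.tensors.items():
            by_dtype.setdefault(info.dtype, []).append((name, info.shape))
    print(repo_id)
    for dtype, tensors in sorted(by_dtype.items()):
        name, shape = tensors[0]
        print(f"  {dtype}: {len(tensors)} tensors, e.g. {name} {shape}")
```

The dtype tally should differ between the two repos (the packed 4-bit tensors are stored under a different dtype than BF16), and the example shapes printed per dtype are where the odd layer dimensions show up.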