Hardware class: 16-24 GB VRAM (e.g. RTX 4090 / M2 Max)

Alibaba Qwen models:
- Qwen3.5 397B A17B
- Qwen3.5 27B
- Qwen3.5 122B A10B
- Qwen3.5 397B A17B (Non-reasoning)
- Qwen: Qwen3 Max Thinking
- Qwen3.5 27B (Non-reasoning)