
Any hardware recommendations? How much memory do we need for this?


You'll effectively want a 48 GB card or more even for quantized versions; otherwise you won't have meaningful space left for the KV cache. Blackwell or newer is generally a good idea for faster native 4-bit hardware support (support for some recent models took a while to land on older architectures, gpt-oss IIRC).
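To see why a 48 GB card is the rough floor, a back-of-envelope weight-memory calculation helps. The parameter count and bit width below are illustrative assumptions, not the specs of any particular model:

```python
# Approximate bytes needed to hold quantized model weights alone
# (ignores quantization scales/zero-points and runtime overhead).
def weight_bytes(n_params: int, bits_per_weight: float) -> int:
    return int(n_params * bits_per_weight / 8)

# Hypothetical 30B-parameter model at 4-bit quantization.
gib = weight_bytes(30_000_000_000, 4) / 2**30
print(f"~{gib:.1f} GiB for weights")  # roughly 14 GiB
```

On a 48 GB card, weights at that size leave on the order of 30 GB for the KV cache and activations; on a 24 GB card there would be little headroom for long contexts.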


This is a Mixture of Experts model with only 3B activated parameters. But I agree that for the intended usage scenario, VRAM for the KV cache is the real limitation.
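The KV-cache point above can be quantified with the standard sizing formula: two tensors (K and V) per layer, each holding n_kv_heads * head_dim values per token. All config numbers below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Rough KV-cache sizing for a transformer with grouped-query attention.
def kv_cache_bytes(n_layers: int, n_kv_heads: int, head_dim: int,
                   seq_len: int, bytes_per_elem: int = 2) -> int:
    # 2x for the separate K and V tensors; fp16/bf16 = 2 bytes per element.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_elem

# Hypothetical config: 48 layers, 8 KV heads, head_dim 128, 32k context.
b = kv_cache_bytes(n_layers=48, n_kv_heads=8, head_dim=128, seq_len=32_768)
print(f"~{b / 2**30:.1f} GiB of KV cache at 32k context")  # 6.0 GiB
```

Note this is per sequence: serving several long-context requests concurrently multiplies the figure, which is why cache VRAM, not the 3B activated parameters, ends up being the bottleneck.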



