Easy Setup Self-host Mixtral-8x7B across devices with a 2M inference app (secondstate.io)
2 points by 3Sophons on Jan 2, 2024 | 1 comment


Run the open source large language model Mixtral-8x7B locally. This MoE model is released under the open source Apache 2.0 license and is one of the most powerful open-weight models currently available. It can be easily deployed on a wide range of devices with WasmEdge: whether it's a laptop or an edge device, you can get it running with just a few command lines. The fully portable inference app that runs this model is only 2 MB! Don't believe it? Then take a look for yourself and witness its power with your own eyes! https://www.secondstate.io/articles/mixtral-8-7b/
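For the curious, the "few command lines" follow the usual WasmEdge/LlamaEdge pattern: install WasmEdge with its GGML plugin, fetch a quantized GGUF build of the model, fetch the small Wasm chat app, and run. The exact model filename, download URLs, and prompt-template flag below are assumptions based on that pattern, not copied from the linked article, so check the article for the authoritative commands:

```shell
# Install WasmEdge with the WASI-NN GGML (llama.cpp) plugin
curl -sSf https://raw.githubusercontent.com/WasmEdge/WasmEdge/master/utils/install.sh \
  | bash -s -- --plugin wasi_nn-ggml

# Download a quantized Mixtral-8x7B model in GGUF format (several GB; filename assumed)
curl -LO https://huggingface.co/TheBloke/Mixtral-8x7B-Instruct-v0.1-GGUF/resolve/main/mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf

# Download the ~2 MB portable chat app, compiled to Wasm (release URL assumed)
curl -LO https://github.com/LlamaEdge/LlamaEdge/releases/latest/download/llama-chat.wasm

# Run the chat app; WasmEdge loads the model via the WASI-NN interface.
# --dir .:. grants the sandboxed app access to the current directory.
wasmedge --dir .:. \
  --nn-preload default:GGML:AUTO:mixtral-8x7b-instruct-v0.1.Q5_K_M.gguf \
  llama-chat.wasm --prompt-template mistral-instruct
```

Because the app is a Wasm binary rather than a native executable, the same 2 MB file runs unmodified on any OS and CPU architecture that WasmEdge supports, which is what makes the cross-device portability claim possible.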



