
https://github.com/ml-explore/mlx-examples/tree/main/mixtral

Run the Mixtral 8x7B mixture-of-experts (MoE) model in MLX on Apple silicon.
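For a sense of what running the model can look like, here is a minimal sketch using the `mlx-lm` package (a related MLX project, not the linked example script itself); the model repo id and prompt below are placeholder assumptions:

```python
# Hypothetical sketch: load an MLX-converted Mixtral 8x7B and generate text.
# Requires `pip install mlx-lm`; the model id below is a placeholder, not
# taken from the linked repo.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mixtral-8x7B-Instruct-v0.1")  # placeholder repo id
response = generate(
    model,
    tokenizer,
    prompt="Explain mixture-of-experts in one sentence.",
    max_tokens=100,
)
print(response)
```

The linked mlx-examples directory contains the standalone example script and weight-conversion instructions.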

