mixtral-8x7b-32kseqlen
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts model.