Chat: mistralai/Mixtral-8x7B-Instruct-v0.1
Info: See the Hugging Face model page for more model details.
About this model
Model name to use in API calls:
mistralai/Mixtral-8x7B-Instruct-v0.1
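For example, you can call the model through an OpenAI-compatible chat completions client. The following is a minimal sketch, not a definitive integration: the base URL and the ANYSCALE_API_KEY environment variable name are assumptions, so substitute the values for your own deployment.

```python
# Minimal sketch of an OpenAI-compatible chat completions call.
# The base URL and environment variable name below are assumptions;
# replace them with your deployment's actual values.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.endpoints.anyscale.com/v1",  # assumed endpoint URL
    api_key=os.environ["ANYSCALE_API_KEY"],            # assumed env var name
)

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain what a mixture-of-experts model is."},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)
```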
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative sparse Mixture-of-Experts model; the sketch after the model details below illustrates the routing idea.
Model Developers: Mistral AI
Input: models input text only.
Output: models generate text only.
Context Length: 32,768 tokens
License: Apache 2.0
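To make the "sparse Mixture of Experts" description concrete, here is an illustrative sketch of top-2 routing over 8 experts for a single token. It is not Mixtral's actual implementation; all array shapes and the router design here are simplified assumptions chosen for readability.

```python
# Illustrative sketch of sparse mixture-of-experts routing (NOT Mixtral's
# real implementation): a learned router scores 8 experts per token, keeps
# the top 2, and mixes their outputs with softmax-normalized weights.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 8, 2          # toy sizes, assumed for the demo

x = rng.standard_normal(d_model)                      # one token's hidden state
router_w = rng.standard_normal((d_model, n_experts))  # router projection
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

logits = x @ router_w                          # one score per expert
top = np.argsort(logits)[-top_k:]              # indices of the top-2 experts
weights = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over top-k

# Only the selected experts run, so per-token compute scales with top_k
# rather than with the total number of experts.
y = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
print(y.shape)  # (16,)
```

The design point this illustrates is why the model is "sparse": although all experts' parameters exist, each token activates only a small subset of them, so inference cost is much lower than a dense model of the same total parameter count.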