Chat: mistralai/Mixtral-8x22B-Instruct-v0.1
info: See the Hugging Face model page for more model details.
About this model
Model name to use in API calls:
mistralai/Mixtral-8x22B-Instruct-v0.1
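As a sketch of how this model name is used, the snippet below builds a single-turn chat-completions request body. The request shape follows the common OpenAI-compatible convention; the helper name and prompt are illustrative assumptions, not taken from this page.

```python
# Minimal sketch: an OpenAI-style chat-completions request body for this
# model. The field layout is the common OpenAI-compatible convention
# (an assumption, not specified on this page).
def build_chat_request(prompt: str) -> dict:
    """Return a single-turn chat request body for Mixtral-8x22B-Instruct."""
    return {
        "model": "mistralai/Mixtral-8x22B-Instruct-v0.1",
        "messages": [{"role": "user", "content": prompt}],
    }

request = build_chat_request("Explain mixture-of-experts in one sentence.")
```

A body like this would typically be sent as JSON to the `/chat/completions` route of an OpenAI-compatible server, with the model name passed unchanged in the `model` field.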
The Mixtral-8x22B Large Language Model (LLM) is a pretrained generative sparse Mixture-of-Experts model. It natively supports function calling: given a set of function definitions, the model can emit structured calls to those functions in its output, which your application then executes.
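To make the function-calling flow concrete, the sketch below assembles an OpenAI-style request body that declares one tool. The `get_current_weather` tool, its schema, and the `tools`/`tool_choice` field names follow the widely used OpenAI-compatible convention and are illustrative assumptions, not details taken from this page.

```python
import json

# Hypothetical example tool definition in the OpenAI-style "tools" format.
# The weather function is invented for illustration only.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Request body asking the model to decide whether to call the tool.
request_body = {
    "model": "mistralai/Mixtral-8x22B-Instruct-v0.1",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
    "tool_choice": "auto",
}

payload = json.dumps(request_body)
```

When the model chooses to call the tool, the response contains a structured tool call (function name plus JSON arguments) rather than plain text; your code runs the function and sends the result back in a follow-up message.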
Model Developers: Mistral
Input Models: input text only.
Output Models: generate text only.
Context Length: 65,536 tokens
License: Apache 2.0