
Embedding: thenlper/gte-large



See the Hugging Face model page for more model details.

See Generate an embedding for how to use the embedding model with Anyscale Endpoints.

About this model

Model name to use in API calls:

thenlper/gte-large
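
Anyscale Endpoints exposes an OpenAI-compatible API, so one common way to request embeddings is through the `openai` Python client. The following is a minimal sketch, not the authoritative usage: the base URL and the `ANYSCALE_API_KEY` environment variable are assumptions here; see Generate an embedding for the exact setup.

```python
import os

from openai import OpenAI

# Sketch: point the OpenAI-compatible client at Anyscale Endpoints.
# The base URL and ANYSCALE_API_KEY env var are assumptions; confirm
# them against the "Generate an embedding" guide.
client = OpenAI(
    base_url="https://api.endpoints.anyscale.com/v1",
    api_key=os.environ["ANYSCALE_API_KEY"],
)

response = client.embeddings.create(
    model="thenlper/gte-large",
    input="Anyscale Endpoints supports text embedding models.",
)

embedding = response.data[0].embedding
print(len(embedding))  # gte-large returns 1024-dimensional vectors
```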

The GTE models are trained on a large-scale corpus of relevance text pairs covering a wide range of domains and scenarios, which makes them well suited to a variety of downstream text embedding tasks, including information retrieval, semantic textual similarity, and text reranking.
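
One of those downstream tasks, semantic textual similarity, reduces to comparing embedding vectors. The sketch below scores sentence pairs by cosine similarity, reusing the `client` from the previous sketch; the example sentences are illustrative only.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Embed a string with gte-large via the `client` defined above."""
    resp = client.embeddings.create(model="thenlper/gte-large", input=text)
    return np.asarray(resp.data[0].embedding)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity in [-1, 1]; higher means more semantically similar."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embed("How do I reset my password?")
related = embed("Steps to recover account access")
unrelated = embed("The weather is sunny today")

# Related sentences should score noticeably higher than unrelated ones.
print(cosine_similarity(query, related))
print(cosine_similarity(query, unrelated))
```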

Model Developers: Alibaba DAMO Academy

Input Models: input text only.

Output Models: text embeddings only.

Maximum Input Length: 512 tokens (see the truncation sketch after this list).

License: MIT
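
Inputs longer than the 512-token limit can't be embedded in full, so it may help to count or truncate tokens before calling the API. A minimal sketch, assuming the model's tokenizer is fetched from the Hugging Face Hub via the `transformers` package:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("thenlper/gte-large")

MAX_TOKENS = 512  # maximum input length for gte-large

def truncate_to_limit(text: str) -> str:
    """Trim text so it fits in the model's 512-token input window."""
    ids = tokenizer(text, truncation=True, max_length=MAX_TOKENS)["input_ids"]
    return tokenizer.decode(ids, skip_special_tokens=True)

long_text = "embedding " * 2000
print(len(tokenizer(long_text)["input_ids"]))  # far over the limit
print(len(tokenizer(truncate_to_limit(long_text))["input_ids"]))  # within 512
```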