RayLLM is being deprecated: The Ray Team is consolidating around open source online inference solutions. Ray Serve LLM provides an LLM serving solution that makes it easy to deploy and manage a variety of open source LLMs. See the migration guide for transitioning your workflows.
Migrate from OpenAI to open models
Introduction
RayLLM provides an OpenAI-compatible REST API that can be used to query open-weight models served on Anyscale. This guide covers the features that can be migrated from OpenAI to RayLLM-deployed models with minimal changes. RayLLM also supports additional features that OpenAI does not.
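Because the API is OpenAI-compatible, you can call it with a plain HTTP request. The sketch below builds a chat-completions request with the standard library; the base URL, API key, and model ID are placeholders for your own deployment's values.

```python
import json
import urllib.request

# Placeholders: substitute the base URL and API key of your RayLLM deployment.
BASE_URL = "https://my-rayllm-endpoint.example.com/v1"
API_KEY = "my-rayllm-api-key"

# The request body follows the OpenAI chat-completions schema; the model ID
# here is an assumption -- use the ID of the model you actually deployed.
payload = {
    "model": "meta-llama/Meta-Llama-3-8B-Instruct",
    "messages": [{"role": "user", "content": "Say hello."}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# response = urllib.request.urlopen(request)  # uncomment with a real endpoint
```

The same request shape works against api.openai.com, which is what makes the migration mostly a matter of changing the URL, key, and model name.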
For basic use cases, migrating your application should be as simple as the following steps:
- Setting the OPENAI_BASE_URL environment variable
- Setting the OPENAI_API_KEY environment variable
- Changing the model name in the code
- Adjusting any parameters to the API calls
With these four steps, migration should take you just a few minutes.