Combining Anyscale and OpenAI endpoints
Check your docs version
These docs are for the new Anyscale design. If you started using Anyscale before April 2024, use Version 1.0.0 of the docs. If you're transitioning to Anyscale Preview, see the guide for how to migrate.
Anyscale Endpoints works alongside the OpenAI endpoint. To run this example, have both an Anyscale Endpoints token and an OpenAI API key ready so that you can call each endpoint in turn.
Call the OpenAI endpoint with an OpenAI API key
Execute unset OPENAI_BASE_URL and unset OPENAI_API_KEY before running the code below to avoid calling the wrong APIs.
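If you prefer to clear these variables from inside Python rather than the shell, a minimal standard-library sketch looks like this:

```python
import os

# Remove any OpenAI-related environment variables from this process
# (the in-Python equivalent of the shell's `unset`); the default argument
# to pop() avoids a KeyError when a variable isn't set.
for var in ("OPENAI_BASE_URL", "OPENAI_API_KEY"):
    os.environ.pop(var, None)
```

Note this only affects the current process, not your shell session.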
Install openai>=1.0.0 for this example.
import openai

system_content = "You will be provided with a product description and seed words. Your task is to generate potential product names."
user_content = "Product description: A home milkshake maker. Seed words: fast, healthy, compact."

# Create a client that talks to the default OpenAI endpoint.
client = openai.OpenAI(api_key="YOUR_OPENAI_API_KEY")

chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo-0301",
    messages=[{"role": "system", "content": system_content},
              {"role": "user", "content": user_content}],
    temperature=0.7,
)

product_names = chat_completion.choices[0].message.content
print("Results from the OpenAI endpoint:\n", product_names)
Call Anyscale Endpoints
You can also call Anyscale Endpoints with the base_url and api_key parameters set for the OpenAI client.
# Point the same OpenAI client at the Anyscale Endpoints API instead.
client = openai.OpenAI(
    base_url="https://api.endpoints.anyscale.com/v1",
    api_key="YOUR_ANYSCALE_ENDPOINT_TOKEN",
)

chat_completion = client.chat.completions.create(
    model="meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "system", "content": system_content},
              {"role": "user", "content": user_content}],
    temperature=0.7,
)

product_names = chat_completion.choices[0].message.content
print("Results from the Anyscale endpoint:\n", product_names)