LLM Model API Reference (0.26.46)
This is archived documentation for version 0.26.46. For the latest version, see the current documentation.
Customer-hosted cloud features
Some features are only available on customer-hosted clouds. Reach out to support@anyscale.com for info.
LLM Model CLI
anyscale llm model get (Alpha)
This command is in early development and its interface may change in future releases.
Usage
anyscale llm model get [OPTIONS]
Gets the model card for the given model ID or corresponding job ID.
Example usage:
anyscale llm model get --model-id my-model-id
anyscale llm model get --job-id job_123
Options
--model-id: ID for the model of interest
--job-id: ID for the Anyscale job corresponding to the fine-tuning run
Examples
- CLI
$ anyscale llm model get --model-id my-model-id
Output
{
'id': 'my-model-id',
'base_model_id': 'meta-llama/Meta-Llama-3-8B',
'storage_uri': 'gs://my_bucket/my_folder',
'ft_type': 'LORA',
'cloud_id': 'cld_tffbxe9ia5phqr1unxhz4f7e1e',
'project_id': 'prj_dqb6ha67zubz3gdlvn2tmmglb8',
'created_at': 1725563985,
'creator': 'test@anyscale.com',
'job_id': 'N/A',
'workspace_id': 'expwrk_yje3t8twim18iuta9r45gwcgcn',
'generation_config': {
'prompt_format': {
'system': '<|start_header_id|>system<|end_header_id|>\n\n{instruction}<|eot_id|>',
'assistant': '<|start_header_id|>assistant<|end_header_id|>\n\n{instruction}<|eot_id|>',
'trailing_assistant': '<|start_header_id|>assistant<|end_header_id|>\n\n',
'user': '<|start_header_id|>user<|end_header_id|>\n\n{instruction}<|eot_id|>',
'bos': '<|begin_of_text|>',
'default_system_message': '',
'add_system_tags_even_if_message_is_empty': False,
'system_in_user': False,
'system_in_last_user': False,
'strip_whitespace': True
},
'stopping_sequences': None
}
}
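The model card above is printed with Python-literal formatting (single quotes, `None`, `False`) rather than strict JSON, so `json.loads` would reject it. Assuming that format holds, a minimal sketch of consuming the output in Python; the `raw` string is an abridged copy of the example output above:

```python
import ast
from datetime import datetime, timezone

# Abridged copy of the `anyscale llm model get` output shown above.
# It uses Python literals (single quotes), not strict JSON, so parse
# it with ast.literal_eval rather than json.loads.
raw = """
{
    'id': 'my-model-id',
    'base_model_id': 'meta-llama/Meta-Llama-3-8B',
    'ft_type': 'LORA',
    'created_at': 1725563985,
}
"""

card = ast.literal_eval(raw)
# created_at appears to be a Unix timestamp in seconds.
created = datetime.fromtimestamp(card["created_at"], tz=timezone.utc)
print(card["base_model_id"], card["ft_type"], created.isoformat())
```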
anyscale llm model list (Alpha)
This command is in early development and its interface may change in future releases.
Usage
anyscale llm model list [OPTIONS]
Lists fine-tuned models available to the user.
By default, all models in all clouds and projects visible to the user are listed. Optionally filter the results by project_id and/or cloud_id.
Example usage:
anyscale llm model list
anyscale llm model list --max-items 50
anyscale llm model list --cloud-id cld_123
anyscale llm model list --project-id prj_123
anyscale llm model list --cloud-id cld_123 --project-id prj_123
NOTE:
If you run this command from within an Anyscale workspace and provide neither cloud_id nor project_id, the cloud and project of the workspace are used.
Options
--cloud-id: Cloud ID to filter by. If not specified, all models from all visible clouds (optionally filtered by project_id) are listed.
--project-id: Project ID to filter by. If not specified, all models from all visible projects (optionally filtered by cloud_id) are listed.
--max-items: Maximum number of items to show in the list. By default, the 20 most recently created models are fetched.
Examples
- CLI
$ anyscale llm model list --cloud-id cld_1j41ls4gwkga4pwp8nbql6f239 --project-id prj_i4wy1t442cbe2sthxp61dmtkbh --max-items 2
Output
[
{
'id': 'meta-llama/Meta-Llama-3-8B-Instruct:test:bnkve',
'base_model_id': 'meta-llama/Meta-Llama-3-8B-Instruct',
'storage_uri': 's3://anyscale-production-data-cld-1j41ls4gwkga4pwp...',
'ft_type': 'LORA',
'cloud_id': 'cld_1j41ls4gwkga4pwp8nbql6f239',
'project_id': 'prj_i4wy1t442cbe2sthxp61dmtkbh',
'created_at': 1725572462,
'creator': 'test@anyscale.com',
'job_id': 'N/A',
'workspace_id': 'expwrk_bqld1y579g3clukr49rsnd7i5m',
'generation_config': '{"prompt_format": {"system": "<|start_header_id|>s...'
},
{
'id': 'neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8:test:czcal',
'base_model_id': 'neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8',
'storage_uri': 'gs://storage-bucket-cld-tffbxe9ia5phqr1unxhz4f7e1e...',
'ft_type': 'LORA',
'cloud_id': 'cld_1j41ls4gwkga4pwp8nbql6f239',
'project_id': 'prj_i4wy1t442cbe2sthxp61dmtkbh',
'created_at': 1725563985,
'creator': 'test@anyscale.com',
'job_id': 'N/A',
'workspace_id': 'expwrk_yje3t8twim18iuta9r45gwcgcn',
'generation_config': '{"prompt_format": {"system": "<|start_header_id|>s...'
}
]
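Because `created_at` is a Unix timestamp, the records above can be sorted or filtered client-side once parsed. A short sketch, assuming the same Python-literal output format as above; the `raw` string is an abridged copy of the two example records:

```python
import ast

# Abridged copy of the two `anyscale llm model list` entries shown above.
raw = """
[
    {'id': 'meta-llama/Meta-Llama-3-8B-Instruct:test:bnkve',
     'ft_type': 'LORA', 'created_at': 1725572462},
    {'id': 'neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8:test:czcal',
     'ft_type': 'LORA', 'created_at': 1725563985},
]
"""

models = ast.literal_eval(raw)
# Sort most recently created first, mirroring the CLI's default ordering.
models.sort(key=lambda m: m["created_at"], reverse=True)
newest = models[0]
print(newest["id"])
```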
anyscale llm model delete (Alpha)
This command is in early development and its interface may change in future releases.
Usage
anyscale llm model delete [OPTIONS] MODEL_ID
Deletes the model for the given model ID. Requires owner permission for the corresponding Anyscale project.
MODEL_ID = ID for the model of interest
Example usage:
anyscale llm model delete my-model-id
Examples
- CLI
$ anyscale llm model delete my-model-id
Output
{'id': 'my-model-id', 'deleted_at': 1725572462}
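Since MODEL_ID is a positional argument (delete takes no --model-id flag), a script wrapping the command builds the argument list accordingly. A hedged sketch; the `delete_model` helper and its `dry_run` parameter are illustration devices, not part of the Anyscale CLI:

```python
import subprocess

def delete_model(model_id: str, dry_run: bool = False) -> list:
    # MODEL_ID is a positional argument, per the usage line above.
    cmd = ["anyscale", "llm", "model", "delete", model_id]
    if dry_run:
        # Illustration only: return the command instead of executing it.
        return cmd
    # Deletion requires owner permission on the corresponding project.
    subprocess.run(cmd, check=True)
    return cmd

print(delete_model("my-model-id", dry_run=True))
```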