
OCI Data Science Model Deployment Endpoint

OCI Data Science is a fully managed and serverless platform for data science teams to build, train, and manage machine learning models on Oracle Cloud Infrastructure.

For the latest updates, examples and experimental features, please see ADS LangChain Integration.

This notebook goes over how to use a large language model hosted on an OCI Data Science Model Deployment.

For authentication, the oracle-ads library is used to automatically load the credentials required to invoke the endpoint.

!pip3 install oracle-ads

Prerequisites

Deploy model

You can easily deploy, fine-tune, and evaluate foundation models using the AI Quick Actions on OCI Data Science Model Deployment. For additional deployment examples, please visit the Oracle GitHub samples repository.

Policies

Make sure to have the required policies to access the OCI Data Science Model Deployment endpoint.

Setup

After deploying the model, you need to set up the following required parameter for the call:

  • endpoint: The model HTTP endpoint of the deployed model, e.g. https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict

Authentication

You can set authentication through either ads or environment variables. When you are working in an OCI Data Science notebook session, you can leverage the resource principal to access other OCI resources. Check out here to learn more.
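If you are working from a local workstation instead, a minimal sketch of configuring API key based authentication through ads could look like the following; the config location and profile name below are common defaults, not values required by this integration:

import ads

# Assumption: credentials live in a standard OCI API key
# config file; adjust the location and profile to your setup.
ads.set_auth(
    auth="api_key",
    oci_config_location="~/.oci/config",
    profile="DEFAULT",
)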

Examples

import ads
from langchain_community.llms import OCIModelDeploymentLLM

# Set authentication through ads
# Use resource principal when operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")

# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using generic class as entry point, you will be able
# to pass model parameters through model_kwargs during
# instantiation.
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
)

# Run the LLM
llm.invoke("Who is the first president of United States?")
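
The comment above mentions model_kwargs; a minimal sketch of passing inference parameters that way might look like this. The specific keys (temperature, max_tokens) are assumptions that depend on the model served behind the endpoint:

# Hypothetical example: forward inference parameters to the
# deployed model through model_kwargs; the supported keys
# depend on the model behind the endpoint.
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
    model_kwargs={"temperature": 0.2, "max_tokens": 512},
)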
import ads
from langchain_community.llms import OCIModelDeploymentVLLM

# Set authentication through ads
# Use resource principal when operating within an
# OCI service that has resource principal based
# authentication configured
ads.set_auth("resource_principal")

# Create an instance of OCI Model Deployment Endpoint
# Replace the endpoint uri and model name with your own
# Using framework specific class as entry point, you will
# be able to pass model parameters in constructor.
llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)

# Run the LLM
llm.invoke("Who is the first president of United States?")
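
Because the framework specific class takes model parameters in its constructor, a sketch with explicit sampling parameters could look like the following; the exact set of supported constructor arguments is an assumption, so check the class's API reference:

# Hypothetical example: vLLM-style sampling parameters passed
# directly in the constructor instead of through model_kwargs.
llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    temperature=0.2,
    top_p=0.9,
)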
import os

from langchain_community.llms import OCIModelDeploymentTGI

# Set authentication through environment variables
# Use API Key setup when you are working from a local
# workstation or on a platform that does not support
# resource principals.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"

# Set endpoint through environment variables
# Replace the endpoint uri with your own
os.environ["OCI_LLM_ENDPOINT"] = (
"https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict"
)

# Create an instance of OCI Model Deployment Endpoint
# Using framework specific class as entry point, you will
# be able to pass model parameters in constructor.
llm = OCIModelDeploymentTGI()

# Run the LLM
llm.invoke("Who is the first president of United States?")
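
The environment variable is optional; as with the other classes, you can pass the endpoint directly to the constructor instead, as in this minimal sketch:

# Equivalent setup without the OCI_LLM_ENDPOINT variable:
# pass the endpoint explicitly in the constructor.
llm = OCIModelDeploymentTGI(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)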

Asynchronous calls

await llm.ainvoke("Tell me a joke.")
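
The await syntax above works as-is in a notebook cell; in a plain Python script you would need to drive it with an event loop, for example:

import asyncio

# In a script, wrap the async call in an event loop;
# in a notebook you can simply await it directly.
async def main():
    print(await llm.ainvoke("Tell me a joke."))

asyncio.run(main())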

Streaming calls

for chunk in llm.stream("Tell me a joke."):
    print(chunk, end="", flush=True)
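
Streaming also has an asynchronous counterpart via astream, which combines streaming with the async usage shown above:

# Asynchronously stream chunks as they are produced.
async for chunk in llm.astream("Tell me a joke."):
    print(chunk, end="", flush=True)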

API reference

For comprehensive details on all features and configurations, please refer to the API reference documentation of each class:

  • OCIModelDeploymentLLM
  • OCIModelDeploymentVLLM
  • OCIModelDeploymentTGI