
Predibase

Learn how to use LangChain with models hosted on Predibase.

Setup

  • Create a Predibase account and obtain an API key
  • Install the Predibase Python client with pip install predibase
  • Use your API key to authenticate
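The setup steps above can be sketched as the following shell commands; the token value is a placeholder to replace with your own key:

```shell
# Install the Predibase Python client
pip install predibase

# Authenticate by exporting your API key (placeholder value shown)
export PREDIBASE_API_TOKEN="{PREDIBASE_API_TOKEN}"
```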

LLM

Predibase integrates with LangChain by implementing the LLM module. You can see a short example below, or the full notebook under LLM > Integrations > Predibase.

import os
os.environ["PREDIBASE_API_TOKEN"] = "{PREDIBASE_API_TOKEN}"

from langchain_community.llms import Predibase

# Optionally use `model_kwargs` to set new default "generate()" settings. For example:
model_kwargs = {
    "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
    "max_new_tokens": 5,  # default is 256
}

model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,  # optional parameter (defaults to the latest Predibase SDK version if omitted)
    **model_kwargs,
)

# Optionally pass keyword arguments to dynamically overwrite "generate()" settings. For example:
kwargs = {
    "temperature": 0.5,  # default is the value in model_kwargs or 0.1 (initialization default)
    "max_new_tokens": 1024,  # default is the value in model_kwargs or 256 (initialization default)
}
response = model.invoke("Can you recommend me a nice dry wine?", **kwargs)
print(response)
API Reference: Predibase
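As the comments in the example note, per-call keyword arguments take precedence over `model_kwargs`, which in turn override the initialization defaults (`temperature=0.1`, `max_new_tokens=256`). That precedence can be sketched as plain dict merging; `resolve_generate_settings` is a hypothetical illustration, not part of the Predibase SDK:

```python
def resolve_generate_settings(model_kwargs=None, call_kwargs=None):
    # Later dicts win: initialization defaults < model_kwargs < per-call kwargs.
    defaults = {"temperature": 0.1, "max_new_tokens": 256}
    return {**defaults, **(model_kwargs or {}), **(call_kwargs or {})}

settings = resolve_generate_settings(
    model_kwargs={"max_new_tokens": 5},
    call_kwargs={"temperature": 0.5, "max_new_tokens": 1024},
)
print(settings)  # {'temperature': 0.5, 'max_new_tokens': 1024}
```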

Predibase also supports Predibase-hosted and HuggingFace-hosted adapters that are fine-tuned on the base model given by the model argument:

import os
os.environ["PREDIBASE_API_TOKEN"] = "{PREDIBASE_API_TOKEN}"

from langchain_community.llms import Predibase

# Optionally use `model_kwargs` to set new default "generate()" settings. For example:
model_kwargs = {
    "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
    "max_new_tokens": 5,  # default is 256
}

# The fine-tuned adapter is hosted at Predibase (adapter_version must be specified).
model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,  # optional parameter (defaults to the latest Predibase SDK version if omitted)
    adapter_id="e2e_nlg",
    adapter_version=1,
    **model_kwargs,
)

# Optionally pass keyword arguments to dynamically overwrite "generate()" settings. For example:
kwargs = {
    "temperature": 0.5,  # default is the value in model_kwargs or 0.1 (initialization default)
    "max_new_tokens": 1024,  # default is the value in model_kwargs or 256 (initialization default)
}
response = model.invoke("Can you recommend me a nice dry wine?", **kwargs)
print(response)
API Reference: Predibase

Predibase also supports adapters that are fine-tuned on the base model given by the model argument:

import os
os.environ["PREDIBASE_API_TOKEN"] = "{PREDIBASE_API_TOKEN}"

from langchain_community.llms import Predibase

# Optionally use `model_kwargs` to set new default "generate()" settings. For example:
model_kwargs = {
    "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
    "max_new_tokens": 5,  # default is 256
}

# The fine-tuned adapter is hosted at HuggingFace (adapter_version does not apply and will be ignored).
model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,  # optional parameter (defaults to the latest Predibase SDK version if omitted)
    adapter_id="predibase/e2e_nlg",
    **model_kwargs,
)

# Optionally pass keyword arguments to dynamically overwrite "generate()" settings. For example:
kwargs = {
    "temperature": 0.5,  # default is the value in model_kwargs or 0.1 (initialization default)
    "max_new_tokens": 1024,  # default is the value in model_kwargs or 256 (initialization default)
}
response = model.invoke("Can you recommend me a nice dry wine?", **kwargs)
print(response)
API Reference: Predibase
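The two adapter examples above differ only in how `adapter_id` is written: a bare adapter name with an explicit `adapter_version` for Predibase-hosted adapters, versus a `namespace/name` HuggingFace repo id (for which `adapter_version` is ignored). A hypothetical helper, not part of the SDK, that captures this naming convention:

```python
def adapter_host(adapter_id: str) -> str:
    # Predibase-hosted adapters use a bare name ("e2e_nlg") plus adapter_version,
    # while HuggingFace-hosted adapters use a "namespace/name" repo id.
    return "huggingface" if "/" in adapter_id else "predibase"

print(adapter_host("e2e_nlg"))            # predibase
print(adapter_host("predibase/e2e_nlg"))  # huggingface
```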