
MLflow

MLflow is a versatile, open-source platform for managing workflows and artifacts across the machine learning and generative AI lifecycle. It has built-in integrations with many popular AI and ML libraries, but can be used with any library, algorithm, or deployment tool.

MLflow's LangChain integration provides the following capabilities:

  • Tracing: Visualize how data flows through your LangChain components with a single line of code (mlflow.langchain.autolog())
  • Experiment tracking: Log artifacts, code, and metrics from your LangChain runs
  • Model management: Version and deploy LangChain applications with dependency tracking
  • Evaluation: Measure the performance of your LangChain applications

Note: The MLflow Tracing feature is available in MLflow 2.14.0 and later.

This short guide focuses on MLflow's tracing capability for LangChain and LangGraph applications. You will learn how to enable tracing with a single line of code and view the execution flow of your applications. For other MLflow features and additional tutorials, see the MLflow documentation for LangChain. If you are new to MLflow, check out the Getting Started with MLflow guide.

Setup

To get started with MLflow tracing for LangChain, install the MLflow Python package. We will also use the langchain-openai package.

%pip install mlflow langchain-openai langgraph -qU

Next, set the MLflow tracking URI and your OpenAI API key.

import os

# Set MLflow tracking URI if you have MLflow Tracking Server running
os.environ["MLFLOW_TRACKING_URI"] = ""
os.environ["OPENAI_API_KEY"] = ""

MLflow Tracing

MLflow's tracing capability helps you visualize the execution flow of your LangChain applications. Here's how to enable it.

import mlflow

# Optional: Set an experiment to organize your traces
mlflow.set_experiment("LangChain MLflow Integration")

# Enable tracing
mlflow.langchain.autolog()

Example: Tracing a LangChain Application

Here is a complete example of MLflow tracing with LangChain:

import mlflow
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# Enable MLflow tracing
mlflow.langchain.autolog()

# Create a simple chain
llm = ChatOpenAI(model="gpt-4o")

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm | StrOutputParser()

# Run the chain
result = chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)

To view the traces, run mlflow ui in your terminal, then navigate to the Traces tab in the MLflow UI.

Example: Tracing a LangGraph Application

MLflow also supports tracing LangGraph applications:

import mlflow
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

# Enable MLflow tracing
mlflow.langchain.autolog()


# Define a tool
@tool
def count_words(text: str) -> str:
    """Counts the number of words in a text."""
    word_count = len(text.split())
    return f"This text contains {word_count} words."


# Create a LangGraph agent
llm = ChatOpenAI(model="gpt-4o")
tools = [count_words]
graph = create_react_agent(llm, tools)

# Run the agent
result = graph.invoke(
    {"messages": [{"role": "user", "content": "Write me a 71-word story about a cat."}]}
)

To view the traces, run mlflow ui in your terminal, then navigate to the Traces tab in the MLflow UI.

Resources

For more information on using MLflow with LangChain, visit: