
Memcached

Memcached is a free & open source, high-performance, distributed memory object caching system, generic in nature, but intended for use in speeding up dynamic web applications by alleviating database load.

This page covers how to use Memcached with langchain, using pymemcache as the client to connect to an already-running Memcached instance.

Installation and Setup

pip install pymemcache
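The cache expects a reachable Memcached server. If you don't already have one, a quick way to start a local instance (assuming Docker is installed; the container name is arbitrary) is:

```shell
# Start a Memcached server on the default port 11211
docker run -d --name memcached -p 11211:11211 memcached
```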

LLM Cache

To integrate a Memcached cache into your application:

from langchain.globals import set_llm_cache
from langchain_openai import OpenAI

from langchain_community.cache import MemcachedCache
from pymemcache.client.base import Client

llm = OpenAI(model="gpt-3.5-turbo-instruct", n=2, best_of=2)
set_llm_cache(MemcachedCache(Client('localhost')))

# The first time, it is not yet in cache, so it should take longer
llm.invoke("Which city is the most crowded city in the USA?")

# The second time it is, so it goes faster
llm.invoke("Which city is the most crowded city in the USA?")
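The speedup on the second call comes from keying cached completions by the prompt together with the model configuration, so identical prompts to the same model reuse the stored result. A minimal self-contained sketch of that idea (using a hypothetical in-memory stand-in for the Memcached client and a fake LLM, so no server or API key is needed):

```python
import hashlib

class FakeClient:
    """Hypothetical in-memory stand-in for pymemcache's Client."""
    def __init__(self):
        self._store = {}
    def set(self, key, value):
        self._store[key] = value
    def get(self, key):
        return self._store.get(key)

def cache_key(prompt: str, llm_string: str) -> str:
    # Hash prompt + model configuration so different models
    # don't share cache entries for the same prompt.
    return hashlib.sha256(f"{prompt}|{llm_string}".encode()).hexdigest()

client = FakeClient()

def cached_invoke(prompt, llm_string, generate):
    key = cache_key(prompt, llm_string)
    hit = client.get(key)
    if hit is not None:
        return hit               # cache hit: skip the LLM call
    result = generate(prompt)    # cache miss: call the LLM
    client.set(key, result)
    return result

calls = []
def fake_llm(prompt):
    calls.append(prompt)         # record how often the "LLM" is invoked
    return f"answer to: {prompt}"

first = cached_invoke("Which city is the most crowded city in the USA?",
                      "openai|gpt-3.5-turbo-instruct", fake_llm)
second = cached_invoke("Which city is the most crowded city in the USA?",
                       "openai|gpt-3.5-turbo-instruct", fake_llm)
# The underlying LLM ran only once; the second call was served from cache.
```

This is only an illustration of the caching pattern, not MemcachedCache's actual key scheme.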

Learn more in the example notebook.