Using DeepSeek-V3 in LangChain
DeepSeek-V3, an open-source model claiming performance comparable to GPT-4 and Claude-3.5-Sonnet, has been released. DeepSeek is a Chinese AI company known for open-sourcing all of its AI models over the past year and a half. This model, with 671 billion parameters (671B), was trained on a total of 14.8 trillion tokens and can process 60 tokens per second. Until February 8, 2025, DeepSeek-V3 is available at a significantly discounted price, making it an opportunity too good to miss. In this post, we will briefly introduce how to use the DeepSeek-V3 model in LangChain.
Setup
Install langchain and langchain-openai.
pip install langchain langchain-openai
Obtain an API key from the DeepSeek platform. Please use the link below:
https://platform.deepseek.com/api_keys
Save the issued API key for later use. I prefer storing API keys in environment variables, so I have saved mine in the DEEPSEEK_API_KEY environment variable.
export DEEPSEEK_API_KEY="sk-...."
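If you would rather keep the key in a .env file than export it in your shell, here is a minimal sketch using python-dotenv (an optional extra package, not something LangChain requires) to load it:
# Optional: load DEEPSEEK_API_KEY from a local .env file
# pip install python-dotenv
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the process environment
assert os.getenv("DEEPSEEK_API_KEY"), "DEEPSEEK_API_KEY is not set"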
Using DeepSeek-V3 in LangChain
The DeepSeek-V3 model's API is compatible with the OpenAI API, making it easy to use within LangChain with minimal configuration. Please check out the simple code example below. It's important to set the model value to "deepseek-chat" and the openai_api_base value to "https://api.deepseek.com". The os.getenv("DEEPSEEK_API_KEY") part can be replaced with the API key you obtained earlier.
import os
from langchain_openai import ChatOpenAI

# Point ChatOpenAI at DeepSeek's OpenAI-compatible endpoint
llm = ChatOpenAI(
    model="deepseek-chat",
    openai_api_key=os.getenv("DEEPSEEK_API_KEY"),
    openai_api_base="https://api.deepseek.com",
    max_tokens=1024,
)

print(llm.invoke("Hello!"))
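Because the model behaves like any other chat model in LangChain, it can also be dropped into a chain. The sketch below (the prompt wording and the question are illustrative assumptions, not part of the DeepSeek docs) combines a prompt template with a string output parser:
import os
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="deepseek-chat",
    openai_api_key=os.getenv("DEEPSEEK_API_KEY"),
    openai_api_base="https://api.deepseek.com",
    max_tokens=1024,
)

# prompt -> model -> plain-string output
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a concise assistant."),
    ("human", "{question}"),
])
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"question": "What is DeepSeek-V3 in one sentence?"}))
The same llm object also supports llm.stream(...) for token-by-token output, just as with OpenAI models in LangChain.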
Concluding remarks
In this post, we briefly explored how to use the latest open-source LLM, DeepSeek-V3, with LangChain. Given that this model offers performance comparable to GPT-4o and Claude-3.5-Sonnet at nearly one-tenth the price until February 8, 2025, it seems likely that DeepSeek will attract quite a few users early in the year. I’m curious to see how OpenAI and Anthropic will respond. Will we soon be able to use GPT-4o at a more affordable price? 🙂