Introduction

LangChain is a popular framework for developing applications powered by language models. Because the Mixroute API is OpenAI-compatible, you can use a variety of AI models in LangChain simply by pointing the OpenAI client at the Mixroute endpoint.

Quick Start

1. Install Dependencies

pip install langchain langchain-openai

2. Basic Configuration

import os
from langchain_openai import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "YOUR_MIXROUTE_API_KEY"
os.environ["OPENAI_BASE_URL"] = "https://console.mixroute.io/v1"

llm = ChatOpenAI(
    model="gpt-3.5-turbo",
    temperature=0.7
)

Core Features

Basic Chat

from langchain_core.messages import HumanMessage, SystemMessage

messages = [
    SystemMessage(content="You are a helpful assistant"),
    HumanMessage(content="Explain Python's main features")
]

response = llm.invoke(messages)
print(response.content)

Conversation Chain (with Memory)

from langchain.memory import ConversationBufferMemory
from langchain.chains import ConversationChain

memory = ConversationBufferMemory()
conversation = ConversationChain(llm=llm, memory=memory)

conversation.predict(input="I want to learn machine learning")
conversation.predict(input="Recommend some resources")

Streaming Output

from langchain_core.callbacks import StreamingStdOutCallbackHandler

streaming_llm = ChatOpenAI(
    model="gpt-3.5-turbo",
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()]
)

streaming_llm.invoke("Write a poem about spring")

Model Selection

| Task              | Model                      | Reason         |
|-------------------|----------------------------|----------------|
| Simple chat       | gpt-3.5-turbo              | Fast, low cost |
| Complex reasoning | gpt-4                      | High accuracy  |
| Long text         | claude-3-opus              | Long context   |
| Creative writing  | claude-sonnet-4-5-20250929 | Smooth output  |
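
Since every model above is served through the same endpoint, switching models is just a matter of changing the `model` string. The table could be sketched as a small lookup helper; the task keys and the `pick_model` function are illustrative names, not part of the LangChain or Mixroute APIs:

```python
# Map task categories (from the table above) to model names.
# These keys are illustrative, not an official API.
MODEL_BY_TASK = {
    "simple_chat": "gpt-3.5-turbo",                    # fast, low cost
    "complex_reasoning": "gpt-4",                      # high accuracy
    "long_text": "claude-3-opus",                      # long context
    "creative_writing": "claude-sonnet-4-5-20250929",  # smooth output
}

def pick_model(task: str) -> str:
    """Return a model name for the task, falling back to gpt-3.5-turbo."""
    return MODEL_BY_TASK.get(task, "gpt-3.5-turbo")
```

You could then build an LLM for a given task with, for example, `ChatOpenAI(model=pick_model("long_text"))`.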