LangChain Setup

Connect a LangChain or LangGraph agent to graph8’s MCP server. Two transports are supported: hosted (remote OAuth) and self-hosted (local stdio).

Prerequisites

  • Python 3.10+
  • langchain-mcp-adapters, langgraph, langchain-anthropic (or your provider of choice)
  • A graph8 account with a personal API key from Profile -> Developer (for stdio), or any active session (for OAuth)
```sh
pip install langchain-mcp-adapters langgraph langchain-anthropic
```

Hosted MCP (remote OAuth)

Use this when you want graph8 to handle auth via OAuth and you do not want to install the MCP server locally. The agent connects to https://be.graph8.com/mcp/ over Streamable HTTP and prompts you to sign in once.

```python
import asyncio
import os

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_anthropic import ChatAnthropic


async def main() -> None:
    client = MultiServerMCPClient(
        {
            "graph8": {
                "transport": "streamable_http",
                "url": "https://be.graph8.com/mcp/",
            }
        }
    )
    tools = await client.get_tools()
    model = ChatAnthropic(
        model="claude-sonnet-4",
        api_key=os.environ["ANTHROPIC_API_KEY"],
    )
    agent = create_react_agent(model, tools)
    result = await agent.ainvoke(
        {
            "messages": [
                (
                    "user",
                    "Find 25 VP Engineering at Series B SaaS companies "
                    "in the United States. Return name, company, and title.",
                )
            ]
        }
    )
    print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

The first run opens a browser window for OAuth. Subsequent runs reuse the token cached by the MCP client.
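Before handing everything from get_tools() to the agent, you may want to pass only a subset. A minimal client-side filter (a sketch; it relies only on the .name attribute every LangChain tool exposes, and the prefix is illustrative):

```python
def select_tools(tools, allowed_prefixes=("g8_",)):
    """Keep only tools whose name starts with an allowed prefix.

    `tools` is the list returned by client.get_tools(); each LangChain
    tool object exposes a .name attribute.
    """
    return [t for t in tools if t.name.startswith(tuple(allowed_prefixes))]
```

Pass the filtered list to create_react_agent in place of the full one.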

Self-hosted MCP (stdio)

Use this when you cannot rely on browser-based OAuth (CI, headless servers, air-gapped dev). Requires uv or Python on the host and a personal API key.

```sh
export G8_API_KEY="g8_..."
```
```python
import asyncio
import os

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_anthropic import ChatAnthropic


async def main() -> None:
    client = MultiServerMCPClient(
        {
            "graph8": {
                "transport": "stdio",
                "command": "uvx",
                "args": ["g8-mcp-server"],
                "env": {
                    "G8_API_KEY": os.environ["G8_API_KEY"],
                    "G8_MCP_MODE": "gtm",
                },
            }
        }
    )
    tools = await client.get_tools()
    model = ChatAnthropic(model="claude-sonnet-4")
    agent = create_react_agent(model, tools)
    result = await agent.ainvoke(
        {"messages": [("user", "Search my CRM for contacts at Stripe.")]}
    )
    print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

Set G8_MCP_MODE to dev, gtm, or all to scope the tool set. See Modes.
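A small guard can catch a mistyped mode before the stdio server is launched. This is a sketch: the valid values come from the line above, and the normalization (lowercasing, default fallback) is an assumption, not graph8 behavior:

```python
VALID_MODES = {"dev", "gtm", "all"}  # the modes this guide documents


def resolve_mode(raw, default="gtm"):
    """Normalize a G8_MCP_MODE value, falling back to a default when unset."""
    if not raw:
        return default
    mode = raw.strip().lower()
    if mode not in VALID_MODES:
        raise ValueError(
            f"G8_MCP_MODE must be one of {sorted(VALID_MODES)}, got {raw!r}"
        )
    return mode
```

Call it with os.environ.get("G8_MCP_MODE") when building the stdio config's env dict.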

Worked example: prospect, qualify, enroll

The prompt below drives an end-to-end LangGraph agent that finds prospects, saves them to a list, and enrolls the list in a sequence. Confirm credit-charging steps before running.

```python
prompt = (
    "1. Use g8_find_contacts to preview 50 VP Engineering at Series B SaaS in the US.\n"
    "2. Show me the top 10 by company size and ask me to confirm.\n"
    "3. On confirmation, use g8_build_contact_list to save them to a new list "
    "named 'Series B SaaS VP Eng'.\n"
    "4. Then use g8_list_sequences to find a sequence named 'New SaaS Outreach'.\n"
    "5. Use g8_add_to_sequence to enroll the saved contacts. Ask me to confirm "
    "before enrolling."
)
```

graph8 enforces a confirmation rule for credit-charging tools (g8_build_contact_list, g8_add_to_sequence). The agent will pause and ask before any of these run.
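You can mirror the same rule client-side as a belt-and-suspenders check. A minimal sketch: the two tool names come from this guide, but the gating helper itself is illustrative, not graph8's implementation:

```python
# Tools this guide names as credit-charging; pause before these run.
CREDIT_CHARGING = {"g8_build_contact_list", "g8_add_to_sequence"}


def needs_confirmation(tool_name: str, confirmed: bool) -> bool:
    """Return True when a tool call should pause for explicit user approval."""
    return tool_name in CREDIT_CHARGING and not confirmed
```

In a LangGraph app this check would typically live in a node that inspects pending tool calls and interrupts the run until the user approves.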