LlamaIndex Setup

Connect a LlamaIndex agent to graph8’s MCP server. LlamaIndex’s llama-index-tools-mcp package wraps MCP tools as native LlamaIndex FunctionTools your agents can use.

Prerequisites

  • Python 3.10+
  • llama-index, llama-index-tools-mcp, llama-index-llms-anthropic (or your model provider)
Terminal window
pip install llama-index llama-index-tools-mcp llama-index-llms-anthropic

Hosted MCP (remote OAuth)

import asyncio
import os

from llama_index.core.agent import ReActAgent
from llama_index.llms.anthropic import Anthropic
from llama_index.tools.mcp import McpToolSpec, BasicMCPClient


async def main() -> None:
    client = BasicMCPClient("https://be.graph8.com/mcp/")
    tool_spec = McpToolSpec(client=client)
    tools = await tool_spec.to_tool_list_async()

    llm = Anthropic(
        model="claude-sonnet-4",
        api_key=os.environ["ANTHROPIC_API_KEY"],
    )
    agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

    response = await agent.achat(
        "Use g8_find_contacts to preview 10 VP Sales at fintech startups "
        "in New York. Return name, company, title."
    )
    print(response)


if __name__ == "__main__":
    asyncio.run(main())

The first call opens a browser for OAuth. Subsequent calls reuse the cached token.

Self-hosted MCP (stdio)

For headless or CI use, run the MCP server locally with stdio and a personal API key.

Terminal window
export G8_API_KEY="g8_..."
import asyncio
import os

from llama_index.core.agent import ReActAgent
from llama_index.llms.anthropic import Anthropic
from llama_index.tools.mcp import McpToolSpec, BasicMCPClient


async def main() -> None:
    client = BasicMCPClient(
        command_or_url="uvx",
        args=["g8-mcp-server"],
        env={
            "G8_API_KEY": os.environ["G8_API_KEY"],
            "G8_MCP_MODE": "gtm",
        },
    )
    tool_spec = McpToolSpec(client=client)
    tools = await tool_spec.to_tool_list_async()

    llm = Anthropic(model="claude-sonnet-4")
    agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

    response = await agent.achat("Search my CRM for contacts at Stripe.")
    print(response)


if __name__ == "__main__":
    asyncio.run(main())
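In CI, it helps to fail fast when the key is missing rather than letting the agent die mid-session with an opaque auth error. A minimal guard you can run before constructing the client (`require_env` is a hypothetical helper, not part of any graph8 SDK):

```python
import os


def require_env(name: str) -> str:
    """Return a required environment variable, or fail with an actionable message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before starting the agent")
    return value


# Demo only: set a value in-process, then read it back through the guard.
os.environ["G8_MCP_MODE"] = "gtm"
print(require_env("G8_MCP_MODE"))  # → gtm
```

Call `require_env("G8_API_KEY")` at startup so a misconfigured pipeline stops immediately instead of after the agent's first tool call.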

Worked example: SaaS prospecting workflow

prompt = (
    "Find 25 VP Engineering at Series B SaaS in the US using g8_find_contacts. "
    "Pick the 10 with the highest signal score. Ask me to confirm before "
    "saving them to a new list via g8_build_contact_list."
)
response = await agent.achat(prompt)

Credit-charging tools require explicit confirmation. The MCP server returns a structured prompt that the agent surfaces to the user before any save.
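In agent code, that handshake reduces to: inspect the tool result, surface the prompt, and only proceed with the save on an explicit yes. A plain-Python sketch of the flow (the response keys used here are illustrative, not the server's actual schema):

```python
def handle_tool_result(result: dict, confirm) -> str:
    """Surface a server-side confirmation prompt before a credit-charging save.

    `result` mimics a structured MCP tool response; the keys
    "confirmation_required" and "prompt" are hypothetical stand-ins.
    `confirm` is any callable that shows the prompt and returns True/False.
    """
    if result.get("confirmation_required"):
        if not confirm(result.get("prompt", "Proceed with this action?")):
            return "cancelled"
    return "saved"


# Simulated response from a credit-charging tool such as g8_build_contact_list.
resp = {"confirmation_required": True, "prompt": "Save 10 contacts (10 credits)?"}
print(handle_tool_result(resp, confirm=lambda p: True))   # → saved
print(handle_tool_result(resp, confirm=lambda p: False))  # → cancelled
```

In an interactive session `confirm` would prompt the user; in a batch job you would log the prompt and cancel, since there is no one to approve the charge.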