Unified MCP Client Library: The Open-Source Bridge Between LLMs and Tools

In the fast-evolving world of artificial intelligence, large language models (LLMs) such as OpenAI’s GPT series and Anthropic’s Claude are transforming how developers build smart applications. To unlock their full potential, integrating these models with external tools—like web browsing, file management, or 3D modeling—is often essential. However, this process can be complex and time-intensive. That’s where the Unified MCP Client Library (MCP-Use) comes in—a powerful, open-source Python library designed to make this integration seamless.

MCP-Use enables developers to connect tool-calling LLMs to MCP (Model Context Protocol) servers and create custom agents with ease. In this article, we’ll explore its features, benefits, and real-world applications to help you get started. Whether you’re new to AI or an experienced developer, MCP-Use offers a straightforward way to enhance your projects.

What is MCP-Use?

MCP-Use is an open-source Python library that simplifies connecting LLMs to external tools via MCP servers. Its key goals include:

  • Seamless LLM integration: Works with tool-calling LLMs like OpenAI, Anthropic, and Groq.
  • Custom agent creation: Build agents tailored to specific tasks using MCP server tools.
  • Dynamic server selection: Automatically picks the best server for each task.
  • Multi-server support: Combines multiple servers in one agent for advanced workflows.
  • Security controls: Limits tool access to ensure safe operations.

With a focus on simplicity and versatility, MCP-Use lets you set up sophisticated agents in just a few lines of code. Let’s dive into its standout features.

Key Features of MCP-Use

MCP-Use offers a robust set of capabilities that set it apart:

1. Easy Setup

Create a tool-enabled agent in as few as six lines of code—perfect for beginners and pros alike.

2. Broad LLM Compatibility

Thanks to LangChain integration, MCP-Use supports models like GPT-4o and Claude, as long as they handle tool calls.

3. HTTP Connectivity

Connect directly to MCP servers over HTTP, ideal for web-based integrations.
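As a sketch of what this looks like, a server that is already running over HTTP can be referenced by URL in the config instead of a local launch command. The endpoint address and server name below are placeholders, not values from the library's documentation:

```python
# Hypothetical HTTP config: point the client at a running MCP server's
# HTTP/SSE endpoint instead of spawning a local process via "command".
# The URL and the "http_server" name are placeholders.
http_config = {
    "mcpServers": {
        "http_server": {
            "url": "http://localhost:8931/sse"
        }
    }
}
# client = MCPClient.from_dict(http_config)  # same pattern as the basic example
```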

4. Smart Server Selection

Agents dynamically choose the best MCP server for the task at hand.

5. Multi-Server Workflows

Run complex tasks by leveraging multiple servers simultaneously in one agent.
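A minimal sketch of a multi-server setup: list several servers under one config and let the agent pick between them. The `use_server_manager` flag and the `@openbnb/mcp-server-airbnb` package name are assumptions based on the library's README, so verify them against the current docs:

```python
# Hypothetical two-server config: browser automation plus Airbnb search
# available to a single agent. Package names here are assumptions.
multi_config = {
    "mcpServers": {
        "playwright": {"command": "npx", "args": ["@playwright/mcp@latest"]},
        "airbnb": {"command": "npx", "args": ["-y", "@openbnb/mcp-server-airbnb"]},
    }
}
# client = MCPClient.from_dict(multi_config)
# agent = MCPAgent(llm=llm, client=client, use_server_manager=True)
```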

6. Tool Restrictions

Enhance security by restricting access to sensitive tools, such as file or network operations.
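A sketch of what a restriction might look like: pass a deny-list of tool names when constructing the agent. The `disallowed_tools` parameter name and the tool names below are assumptions drawn from the library's README; check the documentation for the exact names your servers expose:

```python
# Sketch: blocking sensitive tools by name before the agent can call them.
# "disallowed_tools" and the tool names are assumptions, not verified API.
agent_kwargs = {
    "max_steps": 30,
    "disallowed_tools": ["file_system", "network"],  # keep the agent sandboxed
}
# agent = MCPAgent(llm=llm, client=client, **agent_kwargs)
```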

7. Customizable Agents

Use the LangChain adapter to design agents for specific needs.

These features make MCP-Use a go-to solution for tool integration. Next, we’ll show you how to start using it.

How to Get Started with MCP-Use

Before you begin, ensure you have:

  • Python 3.11+
  • An MCP server (e.g., Playwright MCP)
  • LangChain and LLM libraries (e.g., langchain-openai)

Installation

Install MCP-Use with pip:

pip install mcp-use

Or from source:

git clone https://github.com/pietrozullo/mcp-use.git
cd mcp-use
pip install -e .

Add LLM providers:

pip install langchain-openai  # For OpenAI
pip install langchain-anthropic  # For Anthropic

Set up API keys in a .env file:

OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key

Basic Agent Example

Here’s how to create an agent to find the best restaurant in San Francisco:

import asyncio
import os
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    load_dotenv()
    config = {
        "mcpServers": {
            "playwright": {
                "command""npx",
                "args": ["@playwright/mcp@latest"],
                "env": {"DISPLAY"":1"}
            }
        }
    }
    client = MCPClient.from_dict(config)
    llm = ChatOpenAI(model="gpt-4o")
    agent = MCPAgent(llm=llm, client=client, max_steps=30)
    result = await agent.run("Find the best restaurant in San Francisco")
    print(f"\nResult: {result}")

if __name__ == "__main__":
    asyncio.run(main())

This script sets up an agent with Playwright MCP and uses GPT-4o to complete the task.

Practical Applications of MCP-Use

MCP-Use shines in real-world scenarios. Here are some examples:

1. Web Browsing

Search the web with Playwright MCP:

import asyncio
from dotenv import load_dotenv
from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    load_dotenv()
    client = MCPClient.from_config_file("browser_mcp.json")
    llm = ChatOpenAI(model="gpt-4o")
    agent = MCPAgent(llm=llm, client=client, max_steps=30)
    result = await agent.run("Find the best restaurant in San Francisco USING GOOGLE SEARCH")
    print(f"\nResult: {result}")

asyncio.run(main())

2. Airbnb Search

Find accommodations with an Airbnb MCP server:

import asyncio
from dotenv import load_dotenv
from langchain_anthropic import ChatAnthropic
from mcp_use import MCPAgent, MCPClient

async def run_airbnb_example():
    load_dotenv()
    client = MCPClient.from_config_file("airbnb_mcp.json")
    llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")
    agent = MCPAgent(llm=llm, client=client, max_steps=30)
    result = await agent.run(
        "Find me a nice place to stay in Barcelona for 2 adults for a week in August. "
        "I prefer places with a pool and good reviews. Show me the top 3 options."
    )
    print(f"\nResult: {result}")

asyncio.run(run_airbnb_example())

3. 3D Modeling with Blender

Create a 3D model using Blender MCP:

import asyncio
from dotenv import load_dotenv
from langchain_anthropic import ChatAnthropic
from mcp_use import MCPAgent, MCPClient

async def run_blender_example():
    load_dotenv()
    config = {"mcpServers": {"blender": {"command": "uvx", "args": ["blender-mcp"]}}}
    client = MCPClient.from_dict(config)
    llm = ChatAnthropic(model="claude-3-5-sonnet-20240620")
    agent = MCPAgent(llm=llm, client=client, max_steps=30)
    result = await agent.run("Create an inflatable cube with soft material and a plane as ground.")
    print(f"\nResult: {result}")

asyncio.run(run_blender_example())

Why Choose MCP-Use?

MCP-Use combines ease of use, flexibility, and security, making it ideal for developers looking to integrate LLMs with tools efficiently. From simple searches to complex multi-server tasks, it streamlines the process.

Ready to try it? Check out the GitHub repository for more details and start building smarter agents today!