Building with the SDK
Build MCP-powered agents with the Hugging Face agentic SDKs. The huggingface_hub (Python) and @huggingface/tiny-agents (JavaScript) libraries provide everything you need to connect LLMs to MCP tools.
Installation
Python
pip install "huggingface_hub[mcp]"
Quick Start: Run an Agent
The fastest way to get started is with the tiny-agents CLI:
Python
tiny-agents run julien-c/flux-schnell-generator
This loads an agent from the tiny-agents collection, connects to its MCP servers, and starts an interactive chat.
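The CLI can also run a local agent definition: instead of a Hub agent ID, point it at a folder containing an agent.json file (see Share Your Agent below). The folder path here is illustrative:

tiny-agents run ./my-agent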
Using the Agent Class
The Agent class manages the chat loop and MCP tool execution. It uses Inference Providers to run the LLM.
Python
import asyncio

from huggingface_hub import Agent

agent = Agent(
    model="Qwen/Qwen2.5-72B-Instruct",
    provider="nebius",
    servers=[
        {
            "type": "sse",
            "url": "https://evalstate-flux1-schnell.hf.space/gradio_api/mcp/sse",
        }
    ],
)

async def main():
    # Stream the response; the Agent handles MCP tool calls for you
    async for chunk in agent.run("Generate an image of a sunset"):
        if hasattr(chunk, "choices"):
            delta = chunk.choices[0].delta
            if delta.content:
                print(delta.content, end="")

asyncio.run(main())

See the Agent reference for all options.
Using MCPClient Directly
For more control, use MCPClient to manage MCP servers and tool calls directly.
Python
import asyncio

from huggingface_hub import MCPClient

async def main():
    async with MCPClient(
        model="Qwen/Qwen2.5-72B-Instruct",
        provider="nebius",
    ) as client:
        # Connect to an MCP server
        await client.add_mcp_server(
            type="sse",
            url="https://evalstate-flux1-schnell.hf.space/gradio_api/mcp/sse",
        )

        # Process a request with tools
        messages = [{"role": "user", "content": "Generate an image of a sunset"}]
        async for chunk in client.process_single_turn_with_tools(messages):
            if hasattr(chunk, "choices"):
                delta = chunk.choices[0].delta
                if delta.content:
                    print(delta.content, end="")

asyncio.run(main())

See the MCPClient reference for all options.
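MCP servers do not have to be remote: the stdio transport lets MCPClient launch one as a local subprocess. The sketch below is an illustration, not part of the guide above; it assumes Node.js/npx is installed and uses the community @modelcontextprotocol/server-filesystem package as the example server, with command and args following the standard MCP stdio server configuration.

import asyncio

from huggingface_hub import MCPClient

async def main():
    async with MCPClient(
        model="Qwen/Qwen2.5-72B-Instruct",
        provider="nebius",
    ) as client:
        # Launch a local MCP server over stdio (example server, not part of these SDKs)
        await client.add_mcp_server(
            type="stdio",
            command="npx",
            args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        )

        messages = [{"role": "user", "content": "List the files in /tmp"}]
        async for chunk in client.process_single_turn_with_tools(messages):
            if hasattr(chunk, "choices"):
                delta = chunk.choices[0].delta
                if delta.content:
                    print(delta.content, end="")

asyncio.run(main())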
Share Your Agent
Contribute agents to the tiny-agents collection on the Hub. Include:
- agent.json - Agent configuration (required)
- PROMPT.md or AGENTS.md - System prompt (optional)
- EXAMPLES.md - Sample prompts and use cases (optional)
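For reference, a minimal agent.json sketch is shown below. The fields mirror the Agent arguments used earlier in this guide (model, provider, and the servers list); browse existing agents in the collection for the exact schema in use.

{
  "model": "Qwen/Qwen2.5-72B-Instruct",
  "provider": "nebius",
  "servers": [
    {
      "type": "sse",
      "url": "https://evalstate-flux1-schnell.hf.space/gradio_api/mcp/sse"
    }
  ]
}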
Learn More
- huggingface_hub MCP Reference - Python API reference
- tiny-agents Documentation - JavaScript API reference
- Inference Providers - Available LLM providers
- tiny-agents Collection - Browse community agents
- MCP Server Guide - Connect to the Hugging Face MCP Server