How to Build an MCP Server in Rust
An MCP server built in Rust provides memory-safe, high-performance tool serving for AI models. It works well for deployments that need low latency or run on limited hardware. Python works for rapid prototyping, but Rust offers better throughput and reliability for production servers.
Why Choose Rust for MCP Servers?
Model Context Protocol (MCP) servers connect LLMs to external data or tools. Python is popular for its ease of use and fast prototyping, but Rust offers clear benefits for production environments where stability and efficiency matter.
Performance and Concurrency
Rust's ownership model and zero-cost abstractions allow for efficient memory use. When handling multiple concurrent tool requests from an AI agent, Rust's async runtime (Tokio) can manage thousands of connections with little overhead, whereas Python may struggle with the Global Interpreter Lock (GIL) in similar situations. This efficiency translates to lower latency for the end user, which matters when an LLM is waiting on tool output to continue.
Reliability and Safety
Rust's type system prevents entire classes of bugs at compile time. For an MCP server that exposes sensitive database operations or API actions to an AI, this safety matters: if safe Rust code compiles, it is free of data races and null pointer dereferences. That confidence makes Rust a strong choice for production agent workflows.
Ecosystem and Tooling
The Rust ecosystem has matured, with solid libraries for almost any integration. Whether you need to connect to a PostgreSQL database, interface with a legacy C++ library, or perform complex data transformations, the combination of Cargo and a large crate ecosystem means you rarely have to build from scratch.
Prerequisites
Before building, ensure you have the Rust toolchain installed. You also need a basic understanding of async Rust, as the MCP protocol uses non-blocking I/O to stay responsive.
1. Install Rust
If you haven't already, install Rust using rustup, the recommended installer for the Rust programming language. This will install rustc, cargo, and rustup itself.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
2. Create a New Project
Initialize a new binary project using Cargo. We recommend a descriptive name that reflects the tools your server will provide.
cargo new my-mcp-server
cd my-mcp-server
3. Add Dependencies
You will need tokio for the async runtime, serde for JSON serialization (critical for the MCP JSON-RPC protocol), and an MCP implementation crate. The ecosystem is still evolving, but crates such as mcp-sdk-rs implement the JSON-RPC 2.0 plumbing for you, so you can focus on the business logic of your tools.
[dependencies]
tokio = { version = "1", features = ["full"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
anyhow = "1.0"
# Example MCP SDK dependency (check crates.io for the current version):
# mcp-sdk-rs
Implementing the MCP Server
The core of an MCP server involves setting up the transport layer (usually Stdio or SSE) and defining the capabilities (Resources, Prompts, or Tools) you want to expose. The MCP specification defines how the host application discovers and calls these capabilities.
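For context, a tool invocation on the wire is a plain JSON-RPC 2.0 exchange. A `tools/call` request and response for a hypothetical `add` tool look roughly like this (the exact `result` shape depends on what your tool returns):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "add",
    "arguments": { "a": 2, "b": 3 }
  }
}
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [{ "type": "text", "text": "5" }]
  }
}
```

An SDK crate handles this framing for you; your code only sees the typed arguments.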
Basic Server Structure
Here is a simple example of how you might structure a server that exposes a calculator tool. In a real-world scenario, you would split your tool definitions into separate modules.
use serde::Deserialize;

// Note: `McpServer` and the method names below are illustrative;
// the exact API depends on the SDK crate and version you choose.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize the server with basic metadata
    let server = McpServer::new("my-rust-server", "1.0.0");

    // Register a tool with a unique name and its logic
    server.register_tool("add", |args: AddArgs| {
        Box::pin(async move { Ok(args.a + args.b) })
    });

    // Start the transport (Stdio is the most common choice
    // for local desktop agents)
    server.listen_stdio().await?;
    Ok(())
}

#[derive(Deserialize)]
struct AddArgs {
    a: i32,
    b: i32,
}
Give Your AI Agents Persistent Storage
Skip the build process. Fastio provides a free, hosted MCP server with 251 built-in tools for file management and searching.
Handling Asynchronous Requests
Rust handles async tasks well. When an LLM requests a tool execution that involves slow work, such as fetching data from an external API, the server can keep handling other incoming requests (like list_tools or ping) without blocking. This concurrency is essential for multi-agent systems, where multiple requests may be in flight simultaneously.
Using Tokio
Ensure your tool implementations are non-blocking. If a tool performs heavy computation that would block the executor thread, use tokio::task::spawn_blocking to move that work to a dedicated thread pool. For I/O bound tasks, always prefer async libraries like reqwest for HTTP calls or sqlx for database queries to keep the server responsive.
// Async tool example (illustrative API, as above).
// `UrlArgs` is a Deserialize struct with a `url: String` field.
server.register_tool("fetch_data", |args: UrlArgs| {
    Box::pin(async move {
        let client = reqwest::Client::new();
        let resp = client.get(&args.url).send().await?.text().await?;
        Ok(resp)
    })
});
Testing and Debugging
Debugging MCP servers can be hard since they often run as subprocesses of an LLM application (like Claude Desktop). This means you cannot use println! for debugging, as that would break the JSON-RPC stream on stdout.
Logging
Since standard output (stdout) is reserved for the MCP protocol messages, you should log debug information to stderr or a dedicated log file. The tracing crate is great for this, providing a structured way to capture events and logs that can be filtered by level or module.
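As a minimal, dependency-free sketch of the principle (in practice you would initialize `tracing` with a stderr writer), the helper below formats a log line and writes it to stderr, leaving stdout untouched; the function name is illustrative:

```rust
use std::io::Write;
use std::time::{SystemTime, UNIX_EPOCH};

// Illustrative helper: format a log line and send it to stderr so
// stdout stays reserved for JSON-RPC protocol messages.
fn log_to_stderr(level: &str, msg: &str) -> String {
    let secs = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .map(|d| d.as_secs())
        .unwrap_or(0);
    let line = format!("[{secs}] {level} {msg}");
    // Writing to stderr never touches the protocol stream on stdout.
    let _ = writeln!(std::io::stderr(), "{line}");
    line
}

fn main() {
    let line = log_to_stderr("INFO", "server started");
    assert!(line.contains("INFO") && line.contains("server started"));
}
```

With the `tracing` crate, the equivalent setup routes all log output the same way while adding levels, filtering, and structured fields.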
MCP Inspector
Use the official MCP Inspector (launched via npx @modelcontextprotocol/inspector) to test your server in isolation before connecting it to an agent. It lets you manually send JSON-RPC requests and view the raw responses, making it much easier to spot serialization errors or logic bugs in your tool implementations.
Deployment Strategies
Once your Rust MCP server is built, you have two main deployment paths depending on whether your agent is running locally or in the cloud:
- Local Stdio: Compile the binary in release mode (cargo build --release) for best performance. Then configure your agent (e.g., in claude_desktop_config.json) to run the executable directly. This provides the lowest possible latency and is the simplest way to get started.
Example Configuration:
{
  "mcpServers": {
    "my-rust-server": {
      "command": "/path/to/your/bin/my-mcp-server",
      "args": []
    }
  }
}
- Remote SSE: For cloud-based agents or distributed systems, wrap your MCP server in a lightweight web framework like Axum or Actix-web. This lets you expose the MCP protocol via Server-Sent Events (SSE) over HTTP. This approach takes more setup, but it allows you to share your tools across multiple agents and environments without requiring a local installation.
Frequently Asked Questions
How do I build an MCP server in Rust?
To build an MCP server in Rust, create a new Cargo project, add dependencies like `tokio` and `serde`, and use an MCP library to implement the protocol. You define tools as async functions and expose them via Stdio or SSE transport.
Is there a Rust SDK for MCP?
Yes, several community crates exist, such as `mcp-sdk-rs` or `rust-mcp`. These libraries provide the types and traits needed to implement the Model Context Protocol without manually handling JSON-RPC messages.
Why use Rust instead of Python for MCP?
Rust provides better performance, memory safety, and concurrency compared to Python. It is ideal for high-throughput servers or environments where resource usage is a concern, such as edge devices.
What crates support MCP in Rust?
Key crates include `serde` for serialization, `tokio` for async runtime, and specific protocol implementations like `mcp-sdk-rs`. You may also use `axum` if deploying your MCP server over HTTP/SSE.