Top Tools and Plugins for Semantic Kernel
Microsoft Semantic Kernel's plugin ecosystem lets developers connect AI services to existing codebases without rebuilding infrastructure.
What Are Semantic Kernel Plugins?
Plugins in Semantic Kernel are modular components that expose external capabilities to large language models through function calling. A plugin can be a collection of functions written in native code (C#, Python, or TypeScript), an OpenAPI specification that describes REST endpoints, or a Model Context Protocol (MCP) server that provides standardized tool access.
Plugins give language models the ability to act beyond text generation: they can read files, make HTTP requests, query databases, execute shell commands, and interact with third-party APIs. The function calling pattern means the LLM decides when and how to use these tools based on user intent.
Built-In Plugin Packages
Microsoft provides production-ready plugins as part of the Semantic Kernel SDK. These cover common enterprise needs and serve as reference implementations for building custom plugins.
Plugins.Core Package
The Core package includes foundational plugins that ship with the SDK:
ConversationSummaryPlugin summarizes long chat histories to fit within context windows. It uses configurable summarization strategies and can preserve key facts while reducing token count.
FileIOPlugin enables reading and writing files from the local filesystem or cloud storage. It includes methods for text files, JSON, and binary data with streaming support for large files.
HttpPlugin makes HTTP requests (GET, POST, PUT, DELETE) with automatic retry logic and configurable timeouts. It supports authentication headers and multipart form data for file uploads.
MathPlugin performs mathematical operations and unit conversions. It handles floating-point precision issues and supports scientific notation for large numbers.
TextPlugin provides string manipulation functions including trimming, splitting, regex matching, and case transformations. It's useful for data cleanup before feeding text to LLMs.
TimePlugin handles date and time operations including parsing, formatting, timezone conversion, and duration calculations. It supports ISO 8601 and other common date formats.
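As a rough stdlib sketch of the kind of helpers TimePlugin and TextPlugin expose (function names here are illustrative, not the SDK's actual method names):

```python
from datetime import datetime, timedelta, timezone

def parse_iso8601(s):
    # ISO 8601 parsing, e.g. "2024-06-01T12:00:00+00:00"
    return datetime.fromisoformat(s)

def to_timezone(dt, offset_hours):
    # Timezone conversion via a fixed UTC offset
    return dt.astimezone(timezone(timedelta(hours=offset_hours)))

def duration_between(start, end):
    # Duration calculation as a timedelta
    return end - start

def normalize_whitespace(text):
    # TextPlugin-style cleanup before feeding text to an LLM
    return " ".join(text.split())
```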
Database and Storage Plugins
Production AI applications need persistent storage for data, files, and conversation history. These plugins connect Semantic Kernel agents to databases and cloud storage services. Cloud storage architecture matters more than most people realize. Sync-based platforms require local copies of every file, consuming disk space and creating version conflicts. Cloud-native platforms stream files on demand, so your team accesses what they need without downloading entire folder trees.
PostgreSQL Plugin
Connects to PostgreSQL databases for structured data access. Supports parameterized queries, transactions, and connection pooling. Best for applications that need ACID guarantees and complex joins.
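The parameterized-query discipline such a plugin enforces can be shown with stdlib sqlite3 as a stand-in (the real plugin targets PostgreSQL drivers; the table and column names here are made up):

```python
import sqlite3

# In-memory database standing in for a real PostgreSQL connection
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.execute("INSERT INTO orders VALUES (?, ?, ?)", (1, "acme", 99.5))
conn.commit()

def find_orders(customer):
    # Placeholders keep model-supplied strings out of the SQL text,
    # which matters when an LLM is generating the arguments.
    cur = conn.execute(
        "SELECT id, total FROM orders WHERE customer = ?", (customer,)
    )
    return cur.fetchall()
```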
MongoDB Plugin
Provides NoSQL document storage for flexible schema needs. Handles aggregation pipelines and supports GridFS for storing large files. Works well for unstructured data and rapid prototyping.
Fast.io Storage Plugin (MCP)
The Fast.io MCP server provides 251 tools for file operations, workspace management, and RAG-powered search. Unlike generic object storage, it includes a built-in Intelligence Mode that auto-indexes files for semantic search and provides AI chat with citations. Agents can upload files, create workspaces, share branded client portals, and transfer ownership to humans. The free agent tier includes 50GB of storage and 5,000 monthly credits with no credit card required.
Key advantages over S3 or generic blob storage:
- Built-in RAG: Toggle Intelligence Mode to auto-index workspace files for natural language search
- MCP-native: 251 tools via Streamable HTTP and SSE transport
- Ownership transfer: Agents build workspaces and hand them off to human users
- Persistent storage: Files don't expire, workspaces remain organized
- No infrastructure: No vector DB setup, no embedding pipeline management
Learn more at Fast.io Agent Storage.
Web and Data Collection Plugins
AI agents often need real-time information beyond their training data. Web search and scraping plugins provide access to current information.
Web Search Plugins
Multiple providers offer search capabilities:
Tavily Search provides AI-optimized search results with automatic source credibility scoring. It returns cleaned content without ads or navigation elements.
Google Search Plugin uses the Custom Search API for traditional web results. Supports advanced operators and site-specific searches.
SerpAPI Plugin aggregates results from multiple search engines including Google, Bing, and DuckDuckGo. Provides structured data extraction from result pages.
Browser Automation Plugin
For tasks requiring JavaScript rendering or form interactions, browser automation plugins use Playwright or Puppeteer to control headless browsers. They can fill forms, take screenshots, and extract data from dynamic web pages.
Code Execution and Shell Plugins
These plugins allow AI agents to write and execute code, making them powerful for data analysis and automation tasks.
Python Code Generator
Generates and executes Python code in a sandboxed environment. Includes libraries like pandas, numpy, and matplotlib for data science workflows. It validates code safety before execution and limits resource usage to prevent runaway processes.
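The safety check such a plugin might run before executing model-generated code can be sketched with the stdlib: reject imports and dunder access via the AST, then execute with an empty builtins table. Real sandboxes go much further (process isolation, CPU and memory limits); this only shows the idea:

```python
import ast

# Node types rejected outright (illustrative policy, not the plugin's actual one)
BANNED = (ast.Import, ast.ImportFrom)

def is_safe(source):
    """Static check: no imports, no dunder attribute access."""
    try:
        tree = ast.parse(source)
    except SyntaxError:
        return False
    for node in ast.walk(tree):
        if isinstance(node, BANNED):
            return False
        if isinstance(node, ast.Attribute) and node.attr.startswith("__"):
            return False
    return True

def run_sandboxed(source):
    """Validate, then execute with no builtins and return `result`."""
    if not is_safe(source):
        raise ValueError("rejected by safety check")
    scope = {"__builtins__": {}}
    exec(source, scope)  # illustrative only; never do this with untrusted code alone
    return scope.get("result")
```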
Shell Plugin
Executes system commands with configurable permissions. Supports command chaining with pipes and environment variable injection. Use with caution in production environments as it can modify the host system.
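The permission gating described above might look like the following stdlib sketch: only commands on an explicit allowlist run, with a timeout. The real plugin's configuration surface differs; these names are illustrative:

```python
import shlex
import subprocess

# Explicit allowlist: anything else raises before a process is spawned
ALLOWED = {"echo", "ls", "date"}

def run_command(command, timeout=10):
    parts = shlex.split(command)
    if not parts or parts[0] not in ALLOWED:
        raise PermissionError(f"command not allowed: {parts[0] if parts else ''}")
    # No shell=True: arguments are passed as a list, so no shell injection
    result = subprocess.run(parts, capture_output=True, text=True, timeout=timeout)
    return result.stdout.strip()
```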
Give Your AI Agents Persistent Storage
Fast.io provides 251 MCP tools, built-in RAG, and 50GB free storage for AI agents. No credit card required.
Enterprise Integration Plugins
Connect Semantic Kernel to enterprise systems and APIs.
Microsoft Graph Plugin
Works with Office 365 services including Outlook, Teams, SharePoint, and OneDrive. It enables reading email, scheduling meetings, and accessing organization documents. Requires Azure AD authentication with delegated permissions.
OpenAPI Plugin Loader
Automatically generates plugins from OpenAPI specifications. Point it at a Swagger file and it creates typed function definitions for every endpoint. This eliminates manual integration code for REST APIs.
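What "point it at a Swagger file" amounts to is walking the spec's paths and emitting one function definition per operation. A tiny inline spec stands in for a real download below; the field names follow OpenAPI 3.x, but the output shape is illustrative, not the loader's actual format:

```python
import json

# Minimal OpenAPI document with two operations (hypothetical API)
SPEC = json.loads("""
{
  "paths": {
    "/users/{id}": {
      "get": {
        "operationId": "getUser",
        "summary": "Fetch a user by id",
        "parameters": [{"name": "id", "in": "path", "required": true}]
      }
    },
    "/users": {
      "post": {"operationId": "createUser", "summary": "Create a user"}
    }
  }
}
""")

def functions_from_spec(spec):
    """One typed function definition per path + HTTP method."""
    out = []
    for path, ops in spec["paths"].items():
        for method, op in ops.items():
            out.append({
                "name": op.get("operationId", f"{method}_{path}"),
                "description": op.get("summary", ""),
                "method": method.upper(),
                "path": path,
                "parameters": [p["name"] for p in op.get("parameters", [])],
            })
    return out
```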
Teams Bot Integration
The teams-bot-semantic-kernel package provides a complete Teams bot framework with conversation state management and adaptive cards support. It handles authentication flows and message formatting automatically.
Memory and Vector Plugins
Semantic Kernel's memory system requires vector database connectors for semantic search and retrieval-augmented generation.
Azure AI Search
Microsoft's native vector database with hybrid search combining semantic and keyword matching. Includes built-in data chunking and embedding generation.
Elasticsearch Plugin
Open-source option with kNN vector search. Supports custom analyzers and aggregations. Works well for organizations already using Elastic for logging or search.
Chroma Plugin
Lightweight vector database designed for LLM applications. Runs in-process or as a separate service. Clean API with automatic embedding generation.
Natural Language Database Plugins
These plugins let users query databases in plain English without writing SQL.
NL2EF (Natural Language to Entity Framework)
Translates natural language questions into Entity Framework queries. It inspects your database schema and generates LINQ expressions that retrieve the requested data. Works with SQL Server, PostgreSQL, and MySQL.
SK Kernel Memory
A specialized library for indexing large datasets and performing semantic search. It chunks documents, generates embeddings, and handles retrieval with source citations. Think of it as a pre-built RAG pipeline.
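The chunk, embed, retrieve loop described above can be sketched with a bag-of-words vector standing in for a real embedding model. Kernel Memory's actual pipeline uses learned embeddings and a vector store; this shows the shape only:

```python
import math
import re
from collections import Counter

def chunk(text, size=40):
    """Split a document into fixed-size word windows."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text):
    """Toy embedding: word-count vector (stand-in for a real model)."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, top_k=1):
    """Rank chunks by similarity to the query and return the best."""
    q = embed(query)
    scored = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return scored[:top_k]
```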
How to Choose the Right Plugins
Selecting plugins depends on your application's requirements:
For agent file storage and RAG, Fast.io's MCP server provides the most complete solution with 251 tools, built-in indexing, and ownership transfer. It eliminates the need to manage separate storage and vector database infrastructure.
For enterprise Microsoft ecosystems, Microsoft Graph Plugin integrates deeply with Office 365 and Azure services.
For real-time web data, web search plugins from Tavily or SerpAPI provide fresh information beyond the LLM's training cutoff.
For code execution, the Python Code Generator enables data analysis and visualization workflows within the agent.
For custom business logic, native code plugins (C#, Python, TypeScript) give you full control with minimal latency.
The key is matching the plugin to the task. Start with built-in plugins for common needs, add MCP servers for standardized integrations, and write custom native code plugins only when necessary.
Installing and Using Plugins
Adding plugins to Semantic Kernel follows a consistent pattern regardless of the plugin type.
Native Code Plugins
// C# example: register built-in core plugins with the kernel
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.Core;

var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey)
    .Build();

kernel.ImportPluginFromObject(new TimePlugin(), "time");
kernel.ImportPluginFromObject(new HttpPlugin(), "http");
The kernel automatically exposes these functions to the LLM through function calling.
OpenAPI Plugins
// Import from an OpenAPI spec URL
await kernel.ImportPluginFromOpenApiAsync(
    "MyAPI",
    new Uri("https://api.example.com/openapi.json"));
This generates function definitions for every endpoint in the specification.
MCP Server Plugins
MCP servers like Fast.io's storage server connect via Streamable HTTP or SSE transport. Configuration happens through MCP client settings with automatic session management. The MCP protocol standardizes how AI assistants discover and invoke tools across different providers. It's becoming the preferred integration method for agent platforms.
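The MCP wire format is JSON-RPC 2.0, and `tools/list` is the method a client sends to discover what a server offers. The sketch below builds that request and reads a canned response in place of a real network call (the tool names in the response are made up):

```python
import json

def tools_list_request(request_id=1):
    """JSON-RPC 2.0 request for MCP tool discovery."""
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/list"}

# Canned server reply standing in for an actual MCP server over HTTP/SSE
CANNED_RESPONSE = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [
        {"name": "upload_file", "description": "Upload a file to a workspace"},
        {"name": "search", "description": "Semantic search across indexed files"},
    ]},
})

def tool_names(raw_response):
    """Extract discovered tool names from a tools/list response."""
    payload = json.loads(raw_response)
    return [t["name"] for t in payload["result"]["tools"]]
```

Once discovered, each tool is invoked with a `tools/call` request carrying the tool name and its arguments; the client library handles session management around these messages.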
Frequently Asked Questions
What plugins are available for Semantic Kernel?
Semantic Kernel includes built-in plugins like ConversationSummaryPlugin, FileIOPlugin, HttpPlugin, MathPlugin, TextPlugin, and TimePlugin. The community ecosystem provides database plugins (PostgreSQL, MongoDB), web search tools (Tavily, Google, SerpAPI), code execution engines, Microsoft Graph integration, and MCP servers like Fast.io's 251-tool storage platform. You can also build custom plugins from native code or OpenAPI specifications.
How do I write a custom Semantic Kernel plugin?
Create a class with public methods marked with [KernelFunction] attributes. Each method becomes a callable function for the LLM. Include [Description] attributes on the class, methods, and parameters to help the AI understand when to use each function. Then import the plugin with kernel.ImportPluginFromObject(). The kernel automatically exposes these functions through function calling to supported LLMs.
What is the difference between native plugins and MCP plugins?
Native plugins run in-process as compiled code (C#, Python, TypeScript) with minimal latency. They're ideal for business logic and performance-critical operations. MCP (Model Context Protocol) plugins are remote services accessed via HTTP or SSE, offering standardized tool discovery and session management. MCP plugins like Fast.io's storage server provide complex capabilities (file storage, RAG, workspace management) without embedding all logic in your application code.
Can Semantic Kernel plugins work with any LLM?
Plugins work with any LLM that supports function calling, including OpenAI's GPT series, Azure OpenAI, Anthropic's Claude, Google's Gemini, and open-source models like Llama when hosted through compatible inference servers. The plugin system is LLM-agnostic because it uses a standardized function schema that gets translated to each provider's specific format.
What is the Fast.io MCP server for Semantic Kernel?
Fast.io provides an MCP server with 251 tools for file storage, workspace management, and RAG-powered search. It includes built-in Intelligence Mode that auto-indexes files for semantic search, ownership transfer to hand projects from agents to humans, and a free agent tier with 50GB storage. Unlike S3 or generic blob storage, it's designed specifically for AI agent workflows with no infrastructure setup required.
How many plugins can I use in a single Semantic Kernel instance?
There's no hard limit on plugin count, but practical limits exist around LLM context windows. Each plugin's function definitions consume tokens in the system prompt. With large context windows, you can comfortably use dozens of plugins before function descriptions crowd out conversation history. For large plugin sets, use planning strategies to select relevant plugins dynamically based on user intent.
What is the difference between Plugins.Core and community plugins?
Plugins.Core ships with the Semantic Kernel SDK and includes foundational capabilities like file I/O, HTTP requests, math operations, and text manipulation. These are officially maintained by Microsoft with stability guarantees. Community plugins are third-party packages for specialized needs like database access, web scraping, or API integrations. They offer broader functionality but may have varying support and maintenance levels.
Can I use Semantic Kernel plugins for production applications?
Yes. Semantic Kernel is production-ready and widely used for .NET AI workloads. For production deployments, combine built-in Plugins.Core with enterprise-grade integrations like Microsoft Graph for Office 365 access, database plugins for persistent storage, and MCP servers like Fast.io for file management and RAG capabilities. Always implement error handling, rate limiting, and security controls around plugin execution.