Artificial intelligence is changing the way we handle everyday work. AI assistants are remarkably capable: they can write code, explain complex concepts, and answer almost any question you throw at them. However, they can't execute actions on their own.
If you ask your AI assistant to “create a backup of my database,” it may give you clear instructions, run the CLI commands directly, or, in some cases, even trigger actions through connected agent workflows. But for most developers today, these tasks still mean switching to your terminal or platform tooling to review, validate, and run commands manually. This constant back-and-forth between your AI assistant and your actual tools wastes time and breaks your focus.
The Model Context Protocol (MCP) closes this gap between AI assistants and external systems. MCP provides a standardized way for AI to access tools and real-time data. The Upsun MCP server applies this to cloud infrastructure management, connecting your AI assistant directly to the Upsun API.
AI assistants have a fundamental limitation: they can describe how to do something, but they can't actually do it. They lack connections to the tools and systems you use daily.
Before MCP, every connection between an AI model and an external tool required custom code. Want Claude to read your Google Drive files? Someone builds that specific integration. Want ChatGPT to query your database? That's another custom integration. The number of integrations needed grows with every combination of AI model and tool you add. This fragmented approach created significant problems, such as:
MCP addresses this by standardizing how AI assistants interface with external systems.
Anthropic, the developers of MCP, describe it as "the USB-C for AI": a single, universal standard for connecting AI assistants to external tools and data.
MCP is an open standard for connecting AI applications to external systems. If Large Language Models are the brain, MCP is the nervous system that connects that brain to the hands that get things done. An MCP server is the component that exposes real capabilities to AI assistants: reading files, querying APIs, fetching logs, and, in some cases, modifying infrastructure.
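For instance, an MCP server advertises its capabilities as a machine-readable list of tools. A response to the protocol's tools/list request might look roughly like the sketch below; the read_file tool and its schema are illustrative, not taken from any particular server.
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "read_file",
        "description": "Read the contents of a file from the project",
        "inputSchema": {
          "type": "object",
          "properties": { "path": { "type": "string" } },
          "required": ["path"]
        }
      }
    ]
  }
}
The AI model never sees your API or file system directly; it only sees these tool descriptions and decides when to call them.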
When your AI assistant needs to perform an action:
1. The AI decides which tool to use.
2. The MCP client sends a standardized request.
3. The MCP server executes the action.
4. The server returns the result.
5. Your AI uses that result in its response.
This interaction model works identically across all MCP-compatible tools and assistants. Instead of building a custom integration for every combination of AI and tool, you build once using MCP. Any AI assistant that speaks MCP can then connect to any tool that speaks MCP.
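Under the hood, steps 2 through 4 are plain JSON-RPC messages. The sketch below shows roughly what a tool call and its reply look like on the wire; the list_environments tool name and its arguments are hypothetical, chosen only to illustrate the message shape.
{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "tools/call",
  "params": {
    "name": "list_environments",
    "arguments": { "project": "my-project" }
  }
}

{
  "jsonrpc": "2.0",
  "id": 42,
  "result": {
    "content": [
      { "type": "text", "text": "main (active), staging (active), feature-x (inactive)" }
    ]
  }
}
Because every server answers in this same shape, the assistant can feed the result straight back into its reply without any tool-specific parsing.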
To understand how MCP servers work, it helps to look at the components that work together. MCP operates as a three-part system:
1. MCP host
The application where your AI assistant runs. This is the interface you interact with when you type a question or request. It can be Claude Desktop, an IDE such as VS Code, Cursor, or a JetBrains IDE, or any other tool with an AI assistant. The host application contains the AI model and handles your requests.
2. MCP client
The MCP client resides within the host and serves as a translator. It translates the AI model's requests into a format the MCP server understands and converts the server's responses back into a format the AI can use. Unlike the MCP host, the client isn't something you interact with directly; it operates behind the scenes to keep everything running smoothly.
3. MCP server
The MCP server is an external service that provides context, data, or capabilities via an MCP interface, such as accessing databases, calling APIs, managing file systems, or, in our case, managing cloud infrastructure.
To connect to multiple services, a single host manages multiple MCP clients, each of which connects to a different MCP server. The AI assistant can then request actions across all connected services using natural language.
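In practice, this fan-out is just configuration: the host reads a list of servers and starts one client per entry. As a rough sketch, a host configuration that connects to two servers could look like the following; the github entry and its URL are placeholders for illustration only.
{
  "mcpServers": {
    "upsun": {
      "url": "https://mcp.upsun.com/mcp"
    },
    "github": {
      "url": "https://example.com/github-mcp"
    }
  }
}
Each entry becomes its own client connection, and the assistant can route a request to whichever server exposes the right tool.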
With AI assistants becoming standard in development tools, MCP shifts them from passive helpers to active operators. MCP enables new workflows for infrastructure management:
The Upsun MCP server connects AI assistants to your Upsun cloud platform. You can manage your Upsun projects and environments through conversations with your AI assistant, right from your code editor, without switching to terminals, web consoles, or memorizing CLI commands.
The Upsun MCP server gives your AI assistant access to the Upsun API, the same secure API used by the CLI and Console. All operations follow the same authentication, authorization, and security validation as direct API calls, ensuring your organization's permissions and access controls are respected. While you can give the Upsun MCP server the ability to make changes to your projects, it defaults to read-only access.
The Upsun MCP server provides three core capabilities:
These capabilities enable specific operations such as:
Note: The Upsun MCP server focuses on infrastructure and deployment operations. Administrative functions such as user management, billing, and account settings are managed through the Upsun Console.
Security is a fundamental aspect of the Upsun MCP server. Understanding how it protects your infrastructure helps you use it confidently.
As mentioned earlier, MCP operations follow the existing Upsun security model. When your AI assistant makes a request to the MCP server, it goes through the same authentication and authorization checks as a CLI command or a direct API call. The MCP server doesn't bypass or weaken any existing security controls; it uses them. Authentication is handled via API tokens, which you generate in your Upsun account settings with the appropriate permissions for your projects.
Read-only by default
The Upsun MCP server operates in read-only mode by default, allowing you to safely explore and understand your infrastructure without risking unintended changes. In this mode, you can view organizations, projects, and environments and read configuration files like .upsun/config.yaml, but you cannot make changes until you explicitly enable write operations. This default ensures that no deployments, environment changes, or configuration updates occur unless you intentionally allow them.
Enabling write operations
On the Upsun MCP server, write operations are disabled by default, but you can enable them by setting "enable-write":"true" in your MCP configuration. This setting should be enabled only after you understand its implications and are comfortable with your AI assistant making infrastructure changes on your behalf. Even with write mode enabled, all actions remain bound by your account’s permissions. The AI assistant cannot perform operations beyond what your API token is authorized to do.
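For example, using the Cursor configuration shown later in this article, enabling write operations is a one-line change to the request headers:
"headers": {
  "upsun-api-token": "YOUR_API_TOKEN",
  "enable-write": "true"
}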
API Token management
API tokens follow Upsun's standard security practices:
The MCP server respects the boundaries of your token's permissions. If your token doesn't have permission to modify production environments, the MCP server can't modify production environments, regardless of what your AI assistant requests.
Getting started with the Upsun MCP server requires two steps: obtaining an API token and configuring your MCP client.
Obtaining your Upsun API token
Navigate to the Upsun Console and access your account settings. Generate a new API token with appropriate permissions for your projects. The token scope determines which operations the MCP server can perform on your behalf. For initial exploration, read-only permissions provide a safe starting point.
Remember to store your API token securely. This token provides access to your infrastructure, so treat it with the same care as any other credential.
Configuring your MCP client
The Upsun MCP server works with all major AI development environments. Configuration follows a consistent pattern across different clients, though the specific syntax varies by platform.
For example, on Cursor, paste the following configuration into your ~/.cursor/mcp.json file. You can also install it in a specific project by creating a .cursor/mcp.json file in your project folder. See Cursor MCP docs for more information.
{
  "mcpServers": {
    "upsun": {
      "url": "https://mcp.upsun.com/mcp",
      "headers": {
        "upsun-api-token": "YOUR_API_TOKEN",
        "enable-write": "false"
      }
    }
  }
}
For Claude Code, run this command in your terminal. See Claude Code MCP docs for more information.
claude mcp add --transport http upsun https://mcp.upsun.com/mcp --header "upsun-api-token: YOUR_API_TOKEN" --header "enable-write: false"
For VS Code, add this to your VS Code MCP config file. See VS Code MCP docs for more information.
"upsun": {
"type": "http",
"url": "https://mcp.upsun.com/mcp",
"headers": {
"upsun-api-token": "YOUR_API_TOKEN",
"enable-write": "false"
}
}
The Upsun MCP server works with 20+ AI development environments. For complete setup instructions for all supported clients, see the Upsun MCP documentation.
Test your connection
After configuration, verify your connection by asking your AI assistant simple questions about your Upsun infrastructure. You can start with read-only operations, such as checking project status or listing environments.
Being specific in your requests yields better results: instead of "check my stuff," try "show me the status of all environments in the production project." As you gain confidence, explore more complex operations and ask your AI assistant to explain what it plans to do before executing write operations.
Note: Results depend on the capabilities of your AI assistant. Different AI models interpret requests differently and may vary in their ability to handle complex infrastructure tasks. Use clear instructions and verify the AI's understanding before executing write operations.
The Upsun MCP server integrates with other AI features to create a unified development experience:
The Model Context Protocol changes how you interact with tools and infrastructure. By standardizing the connection between AI assistants and external systems, MCP eliminates the fragmented landscape of custom integrations and enables truly connected workflows. The Upsun MCP server brings this vision to cloud infrastructure management. It can eliminate interruptions, reduce manual work, and let you focus on building rather than managing. This is not about replacing your expertise. It is about augmenting your capabilities and removing friction from your daily workflows.
As MCP adoption grows across the industry, expect more services to expose their capabilities through standardized servers. The ecosystem is moving toward a future where your AI assistant can seamlessly interact with every tool and platform you use, all through natural language.
For comprehensive information about the Model Context Protocol standard, visit the official MCP documentation maintained by Anthropic.