Managing cloud infrastructure with AI assistant and Upsun MCP server

AIinfrastructure automationdeveloper workflowMCP server
02 December 2025
Anita Okem-Achu
Technical Writer

Artificial intelligence is changing the way we work. AI assistants are remarkably capable; they can write code, explain complex concepts, and answer almost any question you throw at them. However, they can't execute actions on their own.

If you ask your AI assistant to “create a backup of my database,” it may provide you with clear instructions, run the CLI commands directly, or, in some cases, even trigger actions through connected agent workflows. But for most developers today, these tasks still require switching to your terminal or platform tooling to review, validate, and run commands manually. This constant back-and-forth between your AI assistant and your actual tools wastes time and breaks your focus.

The Model Context Protocol (MCP) closes this gap between AI assistants and external systems. MCP provides a standardized way for AI to access tools and real-time data. The Upsun MCP server applies this to cloud infrastructure management, connecting your AI assistant directly to the Upsun API.

The problem MCP solves

AI assistants have a fundamental limitation: they can describe how to do something, but they can't actually do it. They lack connections to the tools and systems you use daily.

Before MCP, every connection between an AI model and an external tool required custom code. Want Claude to read your Google Drive files? Someone builds that specific integration. Want ChatGPT to query your database? That's another custom integration. The number of integrations needed multiplies with each new AI model and tool you add. This fragmented approach created significant problems, such as:

  • No standardization: Each AI assistant had its own method for connecting to external systems, making integrations difficult to reuse or maintain across platforms. A tool built for Claude wouldn't work with ChatGPT or other AI assistants.
  • Duplicate effort for developers: Organizations building AI integrations had to write and maintain separate code for each AI platform they wanted to support. For example, a tool for managing cloud resources would be rebuilt for every AI assistant. Work done for one provider couldn’t easily transfer to another.
  • Limited access to real systems and data: Without a standardized way to expose tools or data sources, AI assistants operate with incomplete context. They could generate helpful outputs, but lacked the secure access needed to interact with external systems.

MCP addresses this by standardizing how AI assistants interface with external systems.

What is an MCP server?

Anthropic, the developer of MCP, describes it as "the USB-C for AI": a single, universal standard for connecting AI assistants to external tools and data.

MCP is an open standard for connecting AI applications to external systems. If Large Language Models are the brain, MCP is the nervous system that connects the brain to the hands so it can act. An MCP server is the component that exposes real capabilities to AI assistants: reading files, querying APIs, fetching logs, and, in some cases, modifying infrastructure.

When your AI assistant needs to perform an action:

 1. The AI decides which tool to use.
 2. The MCP client sends a standardized request.
 3. The MCP server executes the action.
 4. The server returns the result.
 5. Your AI uses that result in its response.

This interaction model works identically across all MCP-compatible tools and assistants. Instead of building a custom integration for every combination of AI and tool, you build once using MCP. Any AI assistant that speaks MCP can then connect to any tool that speaks MCP.
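Under the hood, MCP exchanges these requests as JSON-RPC 2.0 messages. The following sketch shows what the client's "standardized request" for a tool call looks like; the tool name list-environments and its argument are hypothetical, chosen only to illustrate the shape of the message.

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool calls."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Step 1: the AI decides which tool to use (tool name here is hypothetical).
request = build_tool_call(1, "list-environments", {"project_id": "abc123"})

# Step 2: the MCP client serializes the request and sends it over the
# transport (stdio or HTTP); shown here as the JSON payload only.
payload = json.dumps(request)

# Steps 3-5: the server executes the action and replies with a JSON-RPC
# response carrying the result, which the AI then uses in its answer:
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "..."}]},
}
```

Because every MCP server speaks this same message format, the client code never changes when you swap in a different server.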

How MCP servers work

To understand how MCP servers work, you first need to know the components involved. MCP servers operate within a three-part system:

1. MCP host

The application where your AI assistant runs. This is the interface you interact with when you type a question or request. It can be Claude Desktop, an IDE such as VS Code, Cursor, or a JetBrains IDE, or any tool with an AI assistant. The host application contains the AI model and handles your requests.

2. MCP client

The MCP client resides within the host and serves as a translator. It translates the AI model's requests into a format the MCP server understands and converts the server's responses back into a format the AI can use. Unlike the MCP host, you don't see the client in action because it operates behind the scenes to keep everything running smoothly.

3. MCP server

The MCP server is an external service that provides context, data, or capabilities via an MCP interface, such as accessing databases, calling APIs, managing file systems, or, in our case, managing cloud infrastructure.

To connect to multiple services, a single host manages multiple MCP clients, each of which connects to a different MCP server. The AI assistant can then request actions across all connected services using natural language.
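In configuration terms, this one-host-many-servers setup is simply multiple entries in the host's MCP config. A minimal sketch, where the second server ("docs") and its URL are hypothetical placeholders:

```json
{
  "mcpServers": {
    "upsun": {
      "url": "https://mcp.upsun.com/mcp",
      "headers": { "upsun-api-token": "YOUR_API_TOKEN" }
    },
    "docs": {
      "url": "https://example.com/mcp"
    }
  }
}
```

The host spins up one MCP client per entry, and the AI assistant can route requests to whichever server provides the capability it needs.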

Why MCP matters for cloud infrastructure management

With AI assistants becoming standard in development tools, MCP shifts them from passive helpers into active operators. MCP enables new workflows for infrastructure management:

  • Multi-step automation: Your AI assistant can check prerequisites, execute actions in sequence, verify results, and adjust its approach based on outcomes, adapting rather than following rigid scripts.
  • Natural language becomes a valid interface for technical operations: Describe what you want in plain language, and the AI translates that intent into proper actions through MCP. "Create a new environment from this branch" becomes a complete, executable instruction.
  • Context switching disappears from your workflow: Instead of jumping between your IDE, terminal, cloud console, and documentation, you work in one place. Your AI assistant becomes the bridge to all your tools. You stay focused, maintain flow, and work more efficiently.
  • One integration works across multiple AI platforms: MCP provides a universal, open standard for connecting AI systems with data sources, replacing fragmented integrations with a single protocol. Build an MCP server once, and it works with any MCP-compatible AI assistant. You're not locked into a single vendor's ecosystem.
  • Access real-time data from your systems: Rather than relying on outdated training data or requiring manual input, AI can query your databases, check your deployment status, or retrieve your project configuration directly. The information is current, accurate, and contextual.

Upsun MCP server

The Upsun MCP server connects AI assistants to your Upsun cloud platform. You can manage your Upsun projects and environments through conversations with your AI assistant, right from your code editor, without switching to terminals, web consoles, or memorizing CLI commands.

Why use the Upsun MCP server

The Upsun MCP server gives your AI assistant access to the Upsun API, the same secure API used by the CLI and Console. All operations follow the same authentication, authorization, and security validation as direct API calls, ensuring that your organization's permissions and access controls are respected. While you can give the Upsun MCP server the ability to make changes to your projects, it defaults to read-only mode.

The Upsun MCP server provides three core capabilities:

  • Natural language infrastructure management: Describe what you need in plain language instead of memorizing CLI commands or navigating dashboards. "List all environments in this project and highlight which ones are active" becomes an executable instruction, not just a request for help.
  • CI/CD integration: Trigger deployments, monitor build progress, and check pipeline status directly from your IDE or documentation tools. Your AI assistant connects to your existing development workflows without requiring you to switch contexts.
  • AI-assisted operations: Using environment details, activities, logs, routing data, and configuration retrieved through MCP, your AI assistant can help diagnose issues, summarize system state, and guide you toward the next steps. As Upsun expands its MCP support for metrics and performance insights, this guidance will become even more powerful, offering adaptive recommendations rather than rigid, predefined scripts.

These capabilities enable specific operations such as:

  • Project and environment management: View project information and metadata, list projects in your organization, create and delete projects, retrieve environment details, and manage environment configurations.
  • Environment creation and management: Create new environments from branches, delete environments when they are no longer needed, retrieve detailed environment information, including status and deployments, list all environments within a project, and inspect environment variables and settings. Every operation maintains consistency with Upsun's Git-driven infrastructure model.
  • Operational visibility: Access detailed information about your infrastructure, including environment state, recent deployment activities, application logs, routing configuration, project domains, available backups, and active certificates. By pulling this live operational data directly from the MCP server, your AI assistant can understand your system's current health and behavior, identify issues, trace deployment history, and provide informed guidance during troubleshooting or operational planning.

Note: The Upsun MCP server focuses on infrastructure and deployment operations. Administrative functions such as user management, billing, and account settings are managed through the Upsun Console.

Security and operational safety

Security is a fundamental aspect of the Upsun MCP server. Understanding how it protects your infrastructure helps you use it confidently. 

As mentioned earlier, MCP operations follow the existing Upsun security model. When your AI assistant makes a request to the MCP server, it goes through the same authentication and authorization checks as if you used the CLI or made a direct API call. The MCP server doesn't bypass or weaken any existing security controls; it uses them. Authentication is handled via API tokens, which you generate in your Upsun account settings with the appropriate permissions for your projects.

Read-only by default

The Upsun MCP server operates in read-only mode by default, allowing you to safely explore and understand your infrastructure without risking unintended changes. In this mode, you can view organizations, projects, and environments, read configuration files like .upsun/config.yaml, but you cannot make changes until you explicitly enable write operations. This default behavior ensures that no deployments, environment changes, or configuration updates occur unless you intentionally allow them.

Enabling write operations

On the Upsun MCP server, write operations are disabled by default, but you can enable them by setting "enable-write":"true" in your MCP configuration. This setting should be enabled only after you understand its implications and are comfortable with your AI assistant making infrastructure changes on your behalf. Even with write mode enabled, all actions remain bound by your account’s permissions. The AI assistant cannot perform operations beyond what your API token is authorized to do.
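In a client that uses JSON configuration (such as Cursor's mcp.json), enabling write mode is a one-line change to the headers:

```json
{
  "mcpServers": {
    "upsun": {
      "url": "https://mcp.upsun.com/mcp",
      "headers": {
        "upsun-api-token": "YOUR_API_TOKEN",
        "enable-write": "true"
      }
    }
  }
}
```

Consider keeping a separate, read-only configuration for day-to-day exploration and enabling write mode only when you have a specific change in mind.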

API Token management

API tokens follow Upsun's standard security practices:

  • Generate tokens through your account settings with permissions scoped to specific projects.
  • Use separate tokens for different environments or teams to maintain fine-grained access control.
  • Rotate tokens regularly following your organization's security policies.
  • Store tokens securely.

The MCP server respects the boundaries of your token's permissions. If your token doesn't have permission to modify production environments, the MCP server can't modify production environments, regardless of what your AI assistant requests.

Setting up the Upsun MCP server

Getting started with the Upsun MCP server requires two steps: obtaining an API token and configuring your MCP client.

Obtaining your Upsun API token

Navigate to the Upsun Console and access your account settings. Generate a new API token with appropriate permissions for your projects. The token scope determines which operations the MCP server can perform on your behalf. For initial exploration, read-only permissions provide a safe starting point.

Remember to store your API token securely. This token provides access to your infrastructure, so treat it with the same care as any other credential.

Configuring your MCP client

The Upsun MCP server works with all major AI development environments. Configuration follows a consistent pattern across different clients, though the specific syntax varies by platform.

For example, on Cursor, paste the following configuration into your ~/.cursor/mcp.json file. You can also install it in a specific project by creating a .cursor/mcp.json file in your project folder. See Cursor MCP docs for more information.

{
  "mcpServers": {
    "upsun": {
      "url": "https://mcp.upsun.com/mcp",
      "headers": {
        "upsun-api-token": "YOUR_API_TOKEN",
        "enable-write": "false"
      }
    }
  }
}


For Claude Code, run this command in your terminal. See Claude Code MCP docs for more information.

claude mcp add --transport http upsun https://mcp.upsun.com/mcp --header "upsun-api-token: YOUR_API_TOKEN" --header "enable-write: false"

For VS Code, add this to your VS Code MCP config file. See VS Code MCP docs for more information.
 

"upsun": { 
  "type": "http",
  "url": "https://mcp.upsun.com/mcp",
  "headers": {
    "upsun-api-token": "YOUR_API_TOKEN",
    "enable-write": "false"

  }

}

The Upsun MCP server works with 20+ AI development environments. For complete setup instructions for all supported clients, see the Upsun MCP documentation.

Test your connection

After configuration, verify your connection by asking your AI assistant simple questions about your Upsun infrastructure. You can start with read-only operations, such as checking project status or listing environments. 

Being specific in your requests yields better results: instead of "check my stuff," try "show me the status of all environments in the production project." As you gain confidence, explore more complex operations and ask your AI assistant to explain what it plans to do before executing write operations.

Note: Results depend on the capabilities of your AI assistant. Different AI models interpret requests differently and may vary in their ability to handle complex infrastructure tasks. Use clear instructions, and verify the AI's understanding before executing write operations.


Extend your workflow

The Upsun MCP server integrates with other AI features to create a unified development experience:

  • AI-generated configuration: Run 'upsun init' to analyze your repository and automatically generate an initial Upsun configuration file.
  • Context7 integration: Access Upsun documentation directly from your AI assistant for troubleshooting guides and configuration examples without leaving your IDE. See the Upsun and Context7 guide for setup instructions.
  • PostgreSQL MCP: Query your Upsun databases using natural language. See Using PostgreSQL MCP with Upsun Remote Database for details.

 

Start building with Upsun today

The Model Context Protocol changes how you interact with tools and infrastructure. By standardizing the connection between AI assistants and external systems, MCP eliminates the fragmented landscape of custom integrations and enables truly connected workflows. The Upsun MCP server brings this vision to cloud infrastructure management. It can eliminate interruptions, reduce manual work, and let you focus on building rather than managing. This is not about replacing your expertise. It is about augmenting your capabilities and removing friction from your daily workflows.

As MCP adoption grows across the industry, expect more services to expose their capabilities through standardized servers. The ecosystem is moving toward a future where your AI assistant can seamlessly interact with every tool and platform you use, all through natural language.

For comprehensive information about the Model Context Protocol standard, visit the official MCP documentation maintained by Anthropic.
