
Four Months in the Making: SwiftMCP 1.0 is Here

After four months of intensive development, I’m thrilled to announce that SwiftMCP 1.0 is feature-complete and ready for you to use.

For those just joining, SwiftMCP is a native Swift implementation of the Model Context Protocol (MCP). The goal is to provide a dead-simple way for any developer to make their app, or parts of it, available as a powerful server for AI agents and Large Language Models. You can read the official specification at modelcontextprotocol.io.

If you prefer video, I also did a SwiftMCP 1.0 Feature Speed Run on YouTube.

The Core Idea: Your Documentation is the API

Before diving into features, it’s crucial to understand the philosophy of SwiftMCP. The framework is built on the principle that your existing documentation should be the primary source of truth for an AI. By using standard Swift documentation comments, you provide all the context an AI needs to understand and use your server’s capabilities.

/**
 Adds two numbers and returns their sum.

 - Parameter a: The first number to add
 - Parameter b: The second number to add
 - Returns: The sum of a and b
 */
@MCPTool
func add(a: Int, b: Int) -> Int {
    a + b
}

This code shows the simplest use case. The @MCPTool macro inspects the add function and its documentation comment. It automatically extracts the main description (“Adds two numbers…”), the descriptions for parameters a and b, and the description of the return value, making all of this information available to an AI client without any extra work.

Server Features: Exposing Your App’s Logic

These are the capabilities your Swift application (the server) exposes to a client.
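
All of the tool, resource, and prompt functions shown below are methods on a single server type. As a minimal sketch of that setup (the @MCPServer macro name and its parameters are assumptions here; DemoServer is the same type extended later in the resources section):

import SwiftMCP

// Sketch only: the macro name and its labels are assumptions, not confirmed API.
@MCPServer(name: "SwiftMCP Demo", version: "1.0")
class DemoServer {
    // The @MCPTool, @MCPResource and @MCPPrompt functions from the
    // following sections would live here as members.
}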

Tools: The Foundation of Action

Tools are the primary way to expose your app’s functionality. By decorating any function with @MCPTool, you make it a callable action for an AI. A good tool is well-documented, handles potential errors, and provides clear functionality.

// Define a simple error and enum for the tool
enum TaskError: Error { case invalidName }
enum Priority: String, Codable, CaseIterable { case low, medium, high }

/**
 Schedules a task with a given priority.
 - Parameter name: The name of the task. Cannot be empty.
 - Parameter priority: The execution priority.
 - Parameter delay: The delay in seconds before the task runs. Defaults to 0.
 - Returns: A confirmation message.
 - Throws: `TaskError.invalidName` if the name is empty.
 */
@MCPTool
func scheduleTask(name: String, priority: Priority, delay: Double = 0) async throws -> String {
    guard !name.isEmpty else {
        throw TaskError.invalidName
    }

    // Simulate async work
    try await Task.sleep(for: .seconds(delay))

    return "Task '\(name)' scheduled with \(priority.rawValue) priority."
}

This example demonstrates several key features at once. The function is async to perform work that takes time, and it throws a custom TaskError for invalid input. It uses a CaseIterable enum, Priority, as a parameter, which SwiftMCP can use to offer auto-completion to clients. Finally, the delay parameter has a default value, making it optional for the caller.

Resources: Publishing Read-Only Data

Resources allow you to publish data that clients can query by URI. SwiftMCP offers a flexible system for this, which can be broken down into two main categories: function-backed resources and provider-based resources.

Function-Backed Resources

These resources are defined by individual functions decorated with the @MCPResource macro. If a function has no parameters, it acts as a static endpoint. If it has parameters, they must be represented as placeholders in the URI template.

/// Static Resource: Returns a server info string
@MCPResource("server://info")
func getServerInfo() -> String {
    "SwiftMCP Demo Server v1.0"
}

/// Dynamic Resource: Returns a greeting for a user by ID
/// - Parameter user_id: The user's unique identifier
@MCPResource("users://{user_id}/greeting")
func getUserGreeting(user_id: Int) -> String {
    "Hello, user #\(user_id)!"
}

The getServerInfo function is a static resource; a client can request the URI server://info and will always get the same string back. The getUserGreeting function is dynamic; the {user_id} placeholder in the URI tells SwiftMCP to expect a value. When a client requests users://123/greeting, the framework automatically extracts “123”, converts it to an Int, and passes it to the user_id parameter.

Provider-Based Resources (like files)

For exposing a dynamic collection of resources, like files in a directory, you can conform your server to MCPResourceProviding. This requires implementing a property to discover the resources and a function to provide their content on request.

extension DemoServer: MCPResourceProviding {
    // Announce available file resources
    var mcpResources: [any MCPResource] {
        let docURL = URL(fileURLWithPath: "/Users/Shared/document.pdf")
        return [FileResource(uri: docURL, name: "Shared Document")]
    }

    // Provide the file's content when its URI is requested
    func getNonTemplateResource(uri: URL) async throws -> [MCPResourceContent] {
        guard FileManager.default.fileExists(atPath: uri.path) else {
            return []
        }

        return try [FileResourceContent.from(fileURL: uri)]
    }
}

This code shows the two-part mechanism. First, the mcpResources property is called by the framework to discover what resources are available. Here, we announce a single PDF file. Second, when a client actually requests the content of that file’s URI, the getNonTemplateResource(uri:) function is called. It verifies the file exists and then returns its contents.

Prompts: Reusable Templates for LLMs

For reusable prompt templates, the @MCPPrompt macro works just like @MCPTool. It exposes a function that returns a string or PromptMessage objects, making its parameters available for the AI to fill in.

/// A prompt for saying Hello
@MCPPrompt()
func helloPrompt(name: String) -> [PromptMessage] {
    let message = PromptMessage(role: .assistant, content: .init(text: "Hello \(name)!"))
    return [message]
}

This example defines a simple prompt template. An AI client can discover this prompt and see that it requires a name parameter. The client can then call the prompt with a specific name, and the server will execute the function to construct and return the fully formed prompt message, ready to be sent to an LLM.

Progress Reporting: Handling Long-Running Tasks

For tasks that take time, you can report progress back to the client using RequestContext.current, which prevents the client from being left in the dark.

@MCPTool
func countdown() async -> String {
    for i in (0...30).reversed() {
        let done = Double(30 - i) / 30
        await RequestContext.current?.reportProgress(done, total: 1.0, message: "\(i)s left")
        try? await Task.sleep(nanoseconds: 1_000_000_000)
    }
    return "Countdown completed!"
}

In this function, the server loops for 30 seconds. Inside the loop, reportProgress is called on the RequestContext.current. This sends a notification back to the original client that made the request, which can then use the progress value and message to update a UI element like a progress bar.

Client Features: The Client is in Control

While SwiftMCP is a server framework, it fully supports the powerful capabilities a client can offer. The client holds a great deal of control, and your server can adapt its behavior by checking Session.current?.clientCapabilities.
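
As a tiny illustration of that pattern, a tool can adapt its behavior, or bail out early, depending on what the client advertises. This is only a sketch; the function name is made up, and it checks the same sampling capability used in the sampling example further below:

@MCPTool
func describeClientCapabilities() async -> String {
    // Adapt behavior to what the connected client supports.
    if await Session.current?.clientCapabilities?.sampling != nil {
        return "Client can run sampling requests for this server."
    }
    return "Client has no sampling support; using a local fallback."
}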

Roots: Managing File Access

The client is in complete control of what local data the server can see. When a client adds or removes a root directory, your server is notified and can react by implementing handleRootsListChanged().

func handleRootsListChanged() async {
    guard let session = Session.current else { return }
    do {
        let updatedRoots = try await session.listRoots()
        await session.sendLogNotification(LogMessage(
            level: .info,
            data: [ "message": "Roots list updated", "roots": updatedRoots ]
        ))
    } catch {
        // Handle error...
    }
}

This function is a notification handler. When a client modifies its list of shared directories (or “roots”), it sends a notification to the server. SwiftMCP automatically calls this function, which can then use session.listRoots() to get the updated list and react accordingly, for example, by refreshing its own list of available files.

Cancellation: Stopping Tasks Gracefully

If the client is showing a progress bar for that countdown, it should also have a cancel button. The client can send a cancellation notification, and your server code must be a good citizen and check for it with try Task.checkCancellation().
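
Here is a rough sketch of what that looks like in practice, reusing the countdown from the progress example (the function name is illustrative; the cancellation check is the only real addition):

@MCPTool
func cancellableCountdown() async throws -> String {
    for i in (0...30).reversed() {
        // If the client has sent a cancellation notification, this throws
        // CancellationError and the tool call ends early.
        try Task.checkCancellation()

        await RequestContext.current?.reportProgress(Double(30 - i) / 30, total: 1.0, message: "\(i)s left")
        try await Task.sleep(for: .seconds(1))
    }
    return "Countdown completed!"
}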

Elicitation: Asking the User for Input

Elicitation is a powerful interaction where the server determines it needs specific, structured information. It sends a JSON schema to the client, and the client is responsible for rendering a form to “elicit” that data.

@MCPTool
func requestContactInfo() async throws -> String {
    // Define the data you need with a JSON schema
    let schema = JSONSchema.object(JSONSchema.Object(
        properties: [
            "name": .string(description: "Your full name"),
            "email": .string(description: "Your email address", 
            format: "email")
        ],
        required: ["name", "email"]
    ))

    // Elicit the information from the client
    let response = try await RequestContext.current?.elicit(
        message: "Please provide your contact information",
        schema: schema
    )

    // Handle the user's response
    switch response?.action {
    case .accept:
        let name = response?.content?["name"]?.value as? String ?? "User"
        return "Thank you, \(name)!"
    case .decline:
        return "User declined to provide information."
    case .cancel, .none:
        return "User cancelled the request."
    }
}

This tool demonstrates the three steps of elicitation. First, it defines a JSONSchema that specifies the required fields (name and email). Second, it calls elicit on the current request context, sending the schema and a message to the client. Third, it waits for the user’s response and uses a switch statement to handle the different outcomes: the user accepting, declining, or canceling the request.

Sampling: Using the Client’s LLM

Perhaps the most fascinating feature is Sampling, which flips the script. The server can request that the client perform a generative task using its own LLM. This allows your server to be lightweight and delegate AI-heavy lifting.

@MCPTool
func sampleFromClient(prompt: String) async throws -> String {
    // Check if the client supports sampling
    guard await Session.current?.clientCapabilities?.sampling != nil else {
        throw MCPServerError.clientHasNoSamplingSupport
    }

    // Request the generation
    return try await RequestContext.current?.sample(prompt: prompt) ?? "No response from client"
}

This code shows how a server can leverage a client’s own generative capabilities. It first checks if the client has advertised support for sampling. If so, it calls sample(prompt:), which sends the prompt to the client. The client is then responsible for running the prompt through its own LLM and returning the generated text, which the server receives as the result of the await call.

What’s Next?

My vision is for developers to integrate MCP servers directly into their Mac apps. My API.me private app does exactly this, exposing a user’s local emails, contacts, and calendar through a local server that an LLM can securely interact with. I’m pondering whether I should put it on the App Store or possibly open-source it. What do you think?

It has been a lot of work, and it’s finally ready. SwiftMCP 1.0 is here.

I am very much looking forward to your feedback. Please give it a try, check out the examples on GitHub, and let me know what you think. I hope to see you build some amazing things with it.

Oh, and if you haven’t watched it yet, I really recommend watching my demonstration of all the new features on YouTube.

