Spring AI Agent Integrates Local File Data Through MCP

Introduction to Model Context Protocol (MCP)


The Model Context Protocol (MCP) is an open protocol that specifies how applications provide context to large language models (LLMs). It defines a unified, standardized way to connect AI models to various data sources and tools. When developing agents, we often need to integrate them with such data and tools; MCP standardizes this integration, making it easier to build agents and complex workflows on top of LLMs.


Many services already provide MCP server implementations, and the ecosystem is growing at a very fast pace.

Introduction to Spring AI MCP


Spring AI MCP provides Java and Spring framework integration for the MCP. It enables Spring AI applications to interact with different data sources and tools through standardized interfaces, supporting both synchronous and asynchronous communication patterns.


Spring AI MCP uses a modular architecture that includes the following components:


• Spring AI applications: Generative AI applications built with the Spring AI framework that access data through MCP.


• Spring MCP clients: The Spring AI implementation of the MCP client, maintaining a 1:1 connection with MCP servers.


• MCP servers: Lightweight programs, each of which exposes specific functionality through the standardized Model Context Protocol.


• Local data sources: Computer files, databases, and services that the MCP server can securely access.


• Remote services: External systems that the MCP server can connect to through the Internet (such as through APIs).


Quickly Experience Spring AI MCP with an Example


Here, we provide an example agent application that can query or update the local file system through MCP and interact with the model using the data in the file system as context. This example demonstrates how to integrate Spring AI with the local file system using the MCP.


Complete source code of the example: https://github.com/springaialibaba/spring-ai-alibaba-examples/spring-ai-alibaba-mcp-example


Sample Architecture (Description of Source Code)


In the previous sections, we explained the infrastructure for integrating Spring AI with MCP.

In the following example, we will use the following key components:


• MCP client: The key to integrating with MCP, which interacts with the local file system.


• Function callbacks: The function-calling declaration mechanism of Spring AI MCP.


• Chat client: A key Spring AI component, used to interact with the LLM and act as the agent's proxy.


Declare a ChatClient


// List<McpFunctionCallback> functionCallbacks;
var chatClient = chatClientBuilder
        .defaultFunctions(functionCallbacks.toArray(new McpFunctionCallback[0]))
        .build();


As in other Spring AI applications, we first define a ChatClient bean, which serves as the proxy for interacting with the LLM. Note that the functions injected into the ChatClient are created by the MCP component (McpFunctionCallback).
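

For orientation, here is a minimal sketch of how such a ChatClient might be wired into a startup runner and asked a question. The bean name, the prompt text, and the file it refers to are illustrative assumptions, not the exact code from the sample repository.


@Bean
public CommandLineRunner predefinedQuestions(ChatClient.Builder chatClientBuilder,
        List<McpFunctionCallback> functionCallbacks) {
    return args -> {
        // Register the MCP-backed callbacks as default functions of the ChatClient.
        var chatClient = chatClientBuilder
                .defaultFunctions(functionCallbacks.toArray(new McpFunctionCallback[0]))
                .build();

        // Ask a question that requires reading a local file (file name is illustrative).
        String answer = chatClient.prompt()
                .user("Can you summarize the contents of the README.md file in this directory?")
                .call()
                .content();
        System.out.println("ASSISTANT: " + answer);
    };
}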


Let's take a look at how the McpFunctionCallback is used.


Declare MCP Function Callbacks


The following code snippet interacts with the MCP server through mcpClient and adapts the MCP tools to standard Spring AI function callbacks through McpFunctionCallback:


1.  Discover the list of available tools in the MCP server (referred to as functions in Spring AI).

2.  Convert each tool into a Spring AI function callback.

3.  Register these McpFunctionCallbacks with the ChatClient for use.


@Bean
public List<McpFunctionCallback> functionCallbacks(McpSyncClient mcpClient) {
    // Discover the tools exposed by the MCP server and wrap each one
    // as a Spring AI function callback.
    return mcpClient.listTools(null)
            .tools()
            .stream()
            .map(tool -> new McpFunctionCallback(mcpClient, tool))
            .toList();
}


As you can see, the interaction between the ChatClient and the model remains unchanged: the model still tells the ChatClient when a function call is needed. However, Spring AI delegates the actual function call to MCP via McpFunctionCallback, which reaches the local file system through the standardized MCP protocol:


• During the interaction with the large model, ChatClient handles the relevant function call requests.


• ChatClient calls the MCP tools (through McpClient).


• McpClient interacts with the MCP server (here, the filesystem server).
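

To make this delegation concrete, the sketch below calls one of the filesystem server's tools directly through the MCP client, bypassing the model. It is an illustration only: the runner bean is not part of the sample, and the read_file tool name, its path argument, and the McpSchema.CallToolRequest type are assumptions about the filesystem server and the MCP Java SDK in use.


@Bean
public CommandLineRunner manualToolCall(McpSyncClient mcpClient) {
    return args -> {
        // Invoke the filesystem server's read_file tool directly with an argument map.
        var result = mcpClient.callTool(
                new McpSchema.CallToolRequest("read_file", Map.of("path", "pom.xml")));
        System.out.println("Tool result: " + result.content());
    };
}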


Initialize the McpClient


This agent application uses a synchronous MCP client to communicate with a filesystem MCP server running locally:


@Bean(destroyMethod = "close")
public McpSyncClient mcpClient() {
    var stdioParams = ServerParameters.builder("npx")
            .args("-y", "@modelcontextprotocol/server-filesystem", "path")
            .build(); // 1

    var mcpClient = McpClient.sync(new StdioServerTransport(stdioParams),
            Duration.ofSeconds(10), new ObjectMapper()); //2

    var init = mcpClient.initialize(); // 3
    System.out.println("MCP Initialized: " + init);

    return mcpClient;
}


In the above code:


1.  Configure the startup command and arguments for the local MCP server.

2.  Create the synchronous McpClient, specifying the stdio transport, a request timeout, and a JSON object mapper.

3.  Initialize the connection to the MCP server.


Spring AI uses npx -y @modelcontextprotocol/server-filesystem "/path/to/file" to start a standalone subprocess on the local machine, which acts as the local MCP server. The McpClient communicates with this server over standard input/output (the stdio transport configured above), and the server in turn accesses the local file system on the client's behalf.
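

As a quick sanity check, one might list the tools the filesystem server advertises once the client is initialized. The runner below is a sketch for illustration rather than part of the sample source, and the name() and description() accessors are assumptions about the tool type returned by the MCP Java SDK.


@Bean
public CommandLineRunner listFilesystemTools(McpSyncClient mcpClient) {
    // Print every tool the filesystem MCP server exposes, confirming that the
    // npx subprocess started and the stdio connection is working.
    return args -> mcpClient.listTools(null)
            .tools()
            .forEach(tool -> System.out.println(tool.name() + " - " + tool.description()));
}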


Example


Prerequisites


1.  Install npx (Node Package eXecute)

First, make sure that npm is installed on the local machine, and then run the following command:

npm install -g npx


2.  Download the sample source code

git clone https://github.com/springaialibaba/spring-ai-alibaba-examples.git
cd spring-ai-alibaba-examples/spring-ai-alibaba-mcp-example/filesystem


3.  Set environment variables

# Set the Dashscope API-KEY for the Tongyi LLM
export AI_DASHSCOPE_API_KEY=${your-api-key-here}


4.  Build the example

./mvnw clean install


Run the Sample Application


Run the following command to start the example. The agent sends predefined questions to the model (you can review them in the source code), and the output is printed to the console.

./mvnw spring-boot:run


If you run the example in an IDE and the filesystem MCP server returns file access permission errors, make sure that the working directory of the current process is set to the spring-ai-alibaba-mcp-example/filesystem directory.
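

One hedged way to sidestep this in code, rather than relying on the IDE run configuration, is to pass the process working directory to the filesystem server explicitly. This is a sketch based on the mcpClient() bean shown earlier, not the sample's actual configuration.


@Bean(destroyMethod = "close")
public McpSyncClient mcpClient() {
    // Expose the process working directory to the filesystem server, so the allowed
    // path always matches the directory the application was actually started from.
    var workingDir = System.getProperty("user.dir");

    var stdioParams = ServerParameters.builder("npx")
            .args("-y", "@modelcontextprotocol/server-filesystem", workingDir)
            .build();

    var mcpClient = McpClient.sync(new StdioServerTransport(stdioParams),
            Duration.ofSeconds(10), new ObjectMapper());
    mcpClient.initialize();
    return mcpClient;
}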


Summary


MCP, as an open protocol, standardizes how applications provide context to LLMs. MCP is like a USB-C port for AI applications: just as USB-C provides a standardized way to connect devices to various peripherals and accessories, MCP offers a standardized method to connect AI models to different data sources and tools.


For Spring AI Alibaba, we will rapidly advance integration with the MCP ecosystem in two directions:


• On the client side, agents developed with Spring AI can quickly access the many server implementations in the MCP ecosystem.


• On the server side, Spring AI Alibaba helps quickly convert existing Java services into MCP servers, so that monolithic and microservice applications built with traditional Spring Boot, Spring Cloud, and Dubbo can be published as MCP servers.
