Connecting to a Model Context Protocol (MCP) Server from Java using LangChain4j

2025-03-10 in Artificial Intelligence / Java tagged Java / LLM / AI Agent / Model Context Protocol (MCP) / LangChain4j / Testing by Marc Nuri | Last updated: 2025-05-19

Introduction

Integrating your Java applications with a Model Context Protocol (MCP) server opens up a world of possibilities for implementing artificial intelligence (AI) agents that can perform complex tasks and interact with external tools and services.

In this post, I'll show you how to connect your Java applications to an MCP server using LangChain4j, enabling your AI assistants to perform real-world actions like managing Kubernetes containers. I'll cover both the STDIO and SSE transports, and provide some insights on how to use LangChain4j for testing your MCP server implementations.

Setting up LangChain4j MCP clients

LangChain4j provides a flexible MCP client API for connecting to MCP servers through different transport mechanisms. The client acts as a bridge between your Java application and the MCP server, allowing you to send requests and receive responses with ease.

There are two main transports available: Standard Input/Output (STDIO) and Server-Sent Events (SSE). Let's check out how to set up each one.

Setting up an STDIO transport client

The STDIO transport method launches an MCP server as a subprocess and communicates with it through standard input/output streams. It is best suited for local execution, where the agent performs its tasks on the same machine that runs the server. Managing local resources, executing shell commands, or interacting with local services are common use cases for the STDIO transport.

The following code snippet demonstrates how to set up an STDIO transport client in LangChain4j:

LangChainMcpClient.java
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.stdio.StdioMcpTransport;

import java.time.Duration;
import java.util.Arrays;

public final class LangChainMcpClient {
  private static McpClient initStdioClient(String... command) {
    return new DefaultMcpClient.Builder()
      // Optional client name to identify with the server, defaults to "langchain4j"
      .clientName("blog.marcnuri.com")
      // Optional MCP Protocol version, defaults to 2024-11-05
      .protocolVersion("2024-11-05")
      // Optional timeout for each individual tool execution, defaults to 60 seconds
      .toolExecutionTimeout(Duration.ofSeconds(10))
      .transport(new StdioMcpTransport.Builder()
        // The command to execute the MCP server
        .command(Arrays.asList(command))
        // Optional, should the MCP server communication events be logged to the logger
        .logEvents(true)
        .build())
      .build();
  }
}

These are the key components of the STDIO transport client setup:

  • clientName: An optional parameter to identify the client with the server. If not specified, the client name defaults to langchain4j.
  • protocolVersion: An optional parameter to specify the MCP protocol version. If not specified, the protocol version defaults to 2024-11-05.
  • toolExecutionTimeout: An optional parameter to set the timeout for each tool execution. If not specified, the timeout defaults to 60 seconds.
  • StdioMcpTransport.command: The command to execute the MCP server. This should be a List of strings representing the command and its arguments.
  • StdioMcpTransport.logEvents: An optional parameter to enable logging of MCP server communication message events.
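
Once the client is built, you can exercise it directly, which is a handy sanity check before wiring it into an agent. The following is a minimal sketch, assuming McpClient's listTools and executeTool methods and that the client can be closed via try-with-resources; the tool name is a placeholder to replace with one your server actually advertises.

LangChainMcpClient.java
import dev.langchain4j.agent.tool.ToolExecutionRequest;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.mcp.client.McpClient;

public final class LangChainMcpClient {
  private static void stdioUsageSketch() throws Exception {
    // Spawns the server as a subprocess (same command used later in this post)
    try (McpClient client = initStdioClient("npx", "-y", "kubernetes-mcp-server@latest")) {
      // Discover the tools the server advertises
      for (ToolSpecification tool : client.listTools()) {
        System.out.println(tool.name() + ": " + tool.description());
      }
      // Invoke a tool directly; name and JSON arguments depend on the server
      System.out.println(client.executeTool(ToolExecutionRequest.builder()
        .name("tool_name_from_the_list") // placeholder tool name
        .arguments("{}")
        .build()));
    }
  }
}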

Setting up an SSE transport client

The Server-Sent Events (SSE) transport method establishes a persistent connection with the MCP server over HTTP using SSE for streaming communication. This method is suitable for remote execution where the server is running on a different machine or in a cloud environment. Interacting with web services, managing cloud resources (such as Kubernetes), or executing long-running tasks are common use cases for SSE transport.

The following code snippet demonstrates how to set up an SSE transport client in LangChain4j:

LangChainMcpClient.java
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.http.HttpMcpTransport;

import java.time.Duration;

public final class LangChainMcpClient {
  private static McpClient initSseClient(String sseUrl) {
    return new DefaultMcpClient.Builder()
      .clientName("blog.marcnuri.com")
      .protocolVersion("2024-11-05")
      .toolExecutionTimeout(Duration.ofSeconds(10))
      .transport(new HttpMcpTransport.Builder()
        // The URL to connect to the MCP server
        .sseUrl(sseUrl)
        // Optional HTTP connect, read, and write timeouts, defaults to 60 seconds
        .timeout(Duration.ofSeconds(10))
        // Optional, should the MCP server requests be logged to the logger
        .logRequests(true)
        // Optional, should the MCP server responses be logged to the logger
        .logResponses(true)
        .build())
      .build();
  }
}

These are the key components of the SSE transport client setup:

  • HttpMcpTransport.sseUrl: The URL to connect to the MCP server using SSE.
  • HttpMcpTransport.timeout: An optional parameter to set the HTTP connect, read, and write timeouts. If not specified, the timeout defaults to 60 seconds.
  • HttpMcpTransport.logRequests: An optional parameter to enable logging of MCP server request messages.
  • HttpMcpTransport.logResponses: An optional parameter to enable logging of MCP server response messages.
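
Both initStdioClient and initSseClient return the same McpClient interface, so the rest of your code doesn't need to know which transport is in use. As a small illustrative sketch (the helper and its argument convention are hypothetical):

LangChainMcpClient.java
public final class LangChainMcpClient {
  private static McpClient initClient(String... args) {
    // A single HTTP(S) URL selects the SSE transport; anything else is treated
    // as the command line of a local MCP server to spawn over STDIO
    if (args.length == 1 && args[0].startsWith("http")) {
      return initSseClient(args[0]);
    }
    return initStdioClient(args);
  }
}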

Integrating MCP with Java-based AI agents

Once you have your MCP client set up, you can integrate it with LangChain4j's AI services to create powerful assistants that can interact with the real world.

Here's how to create a very simple assistant that can manage a Kubernetes cluster by using the kubernetes-mcp-server:

LangChainMcpClient.java
import dev.langchain4j.mcp.McpToolProvider;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.model.github.GitHubModelsChatModel;
import dev.langchain4j.service.AiServices;

public final class LangChainMcpClient {
  private interface Assistant {
    String chat(String userMessage);
  }
  private static Assistant assistantIntegrationExample(McpClient client) {
    return AiServices.builder(Assistant.class)
      .chatModel(GitHubModelsChatModel.builder()
        .gitHubToken(System.getenv("GITHUB_TOKEN"))
        .modelName("gpt-4o-mini")
        .build())
      .toolProvider(McpToolProvider.builder().mcpClients(client).build())
      .build();
  }
  public static void main(String[] args) {
    final var assistant = assistantIntegrationExample(initStdioClient("npx", "-y", "kubernetes-mcp-server@latest"));
    System.out.println(assistant.chat("Run a Pod with the image marcnuri/chuck-norris and expose port 8080"));
    System.out.println(assistant.chat("List the Pods running in my cluster as a markdown table"));
  }
}

In this example, we create a very simple assistant that connects the kubernetes-mcp-server tool with the gpt-4o-mini language model provided by GitHub Models. As you can see, LangChain4j's API makes it straightforward to integrate your Java applications with MCP servers and AI models.

These are the key components of the assistant integration example:

  • Assistant:
    The simplest assistant interface that provides a chat method to interact with the AI model. LangChain4j's AI services use this interface to automatically create the assistant implementation.
  • GitHubModelsChatModel:
    A chat model that uses the GitHub Models API to interact with the gpt-4o-mini language model. We could use any other model and ChatModel implementation provided by LangChain4j.
    • gitHubToken: The GitHub token to authenticate with the GitHub API.
    • modelName: The name of the language model provided by GitHub to use.
  • McpToolProvider:
    A tool provider that uses the MCP client to execute tools on the MCP server. The mcpClients method accepts one or more MCP clients to use for tool execution.
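
Since mcpClients accepts several clients, a single assistant can aggregate the tools of multiple MCP servers. Here's a minimal sketch, assuming the builder call behaves with multiple clients just like the single-client call shown above (the helper method is hypothetical):

LangChainMcpClient.java
import dev.langchain4j.mcp.McpToolProvider;
import dev.langchain4j.mcp.client.McpClient;

public final class LangChainMcpClient {
  private static McpToolProvider combinedToolProvider(McpClient kubernetesClient, McpClient otherClient) {
    // Tools from both servers are exposed to the model through a single provider
    return McpToolProvider.builder()
      .mcpClients(kubernetesClient, otherClient)
      .build();
  }
}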

In the main(String[]) method, we create an assistant instance using the assistantIntegrationExample method. Then we use this assistant to chat with the AI model by sending messages and printing the responses.

For the first message, we ask the assistant to run a Pod with the image marcnuri/chuck-norris and expose port 8080. This is the output of the assistant's response:

The Pod with the image `marcnuri/chuck-norris` has been successfully created and is exposing port 8080.
Here are the details:

### Pod Information
- **Name**: kubernetes-mcp-server-run-lt6r8
- **Namespace**: mnurisan-dev
- **Status**: Pending
- **Container Image**: marcnuri/chuck-norris
- **Port**: 8080 (TCP)

### Service Information
- **Name**: kubernetes-mcp-server-run-lt6r8
- **Type**: ClusterIP
- **Cluster IP**: 172.30.90.85
- **Port**: 8080 (Target Port: 8080)

You can access the service via the provided service once the Pod is in a running state.
If you need further assistance or checks, let me know!

For the second message, we ask the assistant to list the Pods running in the cluster as a Markdown table. This is the output of the assistant's response:

| Name                                   | Namespace       | Status   | Container Image       |
|----------------------------------------|-----------------|----------|-----------------------|
| kubernetes-mcp-server-run-lt6r8        | namespace-dev   | Pending  | marcnuri/chuck-norris |

Testing MCP servers with LangChain4j

Beyond everyday use cases, I also leverage this setup for testing my MCP server implementations in Java. By configuring an MCP client that connects to the server, I can simulate real-world interactions and verify the MCP server's behavior.

I use these techniques at three levels of testing:

1. Unit testing

For testing individual units of logic in a Quarkus application, you can set up a client with the following code:

MCPServerKubernetesTest.java
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.http.HttpMcpTransport;
import io.quarkus.test.common.http.TestHTTPResource;
import io.quarkus.test.junit.QuarkusTest;
import org.junit.jupiter.api.BeforeEach;

import java.net.URL;
import java.time.Duration;

@QuarkusTest
class MCPServerKubernetesTest {
  @TestHTTPResource
  URL url;
  private McpClient mcpClient;

  @BeforeEach
  void setUpMcpClient() {
    mcpClient = new DefaultMcpClient.Builder()
      .clientName("test-mcp-client-kubernetes")
      .toolExecutionTimeout(Duration.ofSeconds(10))
      .transport(new HttpMcpTransport.Builder().sseUrl(url.toString() + "mcp/sse").build())
      .build();
  }
}

Since Quarkus injects the application's base URL as a test resource, it's easy to point an MCP client at the server's SSE endpoint for testing.

By combining the MCP client with a Kubernetes Mock Server like the one provided by the Fabric8 Java Kubernetes Client, you can then perform individual requests using the client and verify the behavior of specific units of logic.

MCPServerKubernetesTest.java
mcpClient.executeTool(
  ToolExecutionRequest.builder().name("name_of_the_tool_in_test").arguments("{}").build());
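
A complete test method then asserts on the string returned by executeTool. The following sketch assumes plain JUnit 5 assertions and keeps the placeholder tool name from the snippet above; the expected content depends entirely on the state you configured in the mock server.

MCPServerKubernetesTest.java
import static org.junit.jupiter.api.Assertions.assertTrue;

import dev.langchain4j.agent.tool.ToolExecutionRequest;
import org.junit.jupiter.api.Test;

// ... inside MCPServerKubernetesTest
@Test
void toolReportsMockServerState() {
  final String result = mcpClient.executeTool(ToolExecutionRequest.builder()
    .name("name_of_the_tool_in_test").arguments("{}").build());
  // The expected value depends on the resources configured in the Kubernetes Mock Server
  assertTrue(result.contains("expected-resource-name"));
}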

2. Integration testing

Implementing larger integration tests for your application is also straightforward with LangChain4j and Quarkus. Given a packaged application, you can start the real MCP server as a subprocess and connect to it using the STDIO transport method.

In the following code snippet, I show you how to achieve this in Quarkus:

MCPServerKubernetesIT.java
import dev.langchain4j.mcp.client.DefaultMcpClient;
import dev.langchain4j.mcp.client.McpClient;
import dev.langchain4j.mcp.client.transport.stdio.StdioMcpTransport;
import org.junit.jupiter.api.BeforeAll;

import java.time.Duration;
import java.util.List;

public class MCPServerKubernetesIT {
  private static McpClient mcpClient;
  @BeforeAll
  static void setUpMcpClient() throws Exception {
    mcpClient = new DefaultMcpClient.Builder()
      .clientName("test-mcp-client-kubernetes")
      .toolExecutionTimeout(Duration.ofSeconds(10))
      .transport(new StdioMcpTransport.Builder().command(
          List.of(
            // Retrieve the java binary location:
            ProcessHandle.current().info().command().orElseThrow(),
            "-jar",
            // Location of the built application (provided by Quarkus):
            System.getProperty("java.jar.path")
        )).logEvents(true).build())
      .build();
  }
}

In this example, we set up an MCP client using the STDIO transport method to connect to the packaged Quarkus application. This instructs the MCP client to spawn a real subprocess with the compiled application and communicate with it through standard input/output streams.

Just like in the unit testing example, you can then execute specific tools on the server and verify the behavior of the application as a whole.
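
Since the STDIO transport spawns a real subprocess, it's also worth shutting it down once the test class finishes. A minimal sketch, assuming McpClient exposes a close() method that tears down the transport:

MCPServerKubernetesIT.java
import org.junit.jupiter.api.AfterAll;

// ... inside MCPServerKubernetesIT
@AfterAll
static void tearDownMcpClient() throws Exception {
  // Closing the client also shuts down the spawned server subprocess
  mcpClient.close();
}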

3. End-to-end testing

For end-to-end testing, you can use the same setup as in integration testing but connect it to a real model instead. By following the same setup I described in the Integrating MCP with Java-based AI agents section, you can create a full end-to-end test that simulates real-world interactions with your application.

By automating chat queries and comparing the responses against expected outputs, you can ensure that your MCP server implementation is robust and reliable. However, note the non-deterministic nature of AI models, which may lead to different responses for the same input. You should consider this when writing your end-to-end tests and allow for some flexibility in the expected outputs.
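
As a minimal sketch of such a test, reusing the assistantIntegrationExample and initStdioClient helpers from earlier in this post (the test class name and keyword assertion are illustrative assumptions):

MCPServerKubernetesE2ETest.java
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

// Hypothetical test class; assumes the helper methods from LangChainMcpClient are accessible
class MCPServerKubernetesE2ETest {
  @Test
  void assistantCanListPods() {
    final var assistant = assistantIntegrationExample(
      initStdioClient("npx", "-y", "kubernetes-mcp-server@latest"));
    final String answer = assistant.chat("List the Pods running in my cluster as a markdown table");
    // Responses are non-deterministic, so check for stable keywords instead of exact output
    assertTrue(answer.toLowerCase().contains("pod"));
  }
}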

Conclusion

LangChain4j's MCP client opens up exciting possibilities for Java developers working with AI. Whether you're using a third-party MCP server or developing your own, LangChain4j provides a flexible and powerful API for integration. By connecting your Java applications to MCP servers and Large Language Models (LLMs), you can create AI assistants that perform complex tasks and interact with external tools and services.

In this post, I've shown you how easy it is to set up an MCP client with LangChain4j and integrate it with your Java applications. I've also provided some insights on how to use LangChain4j to test your MCP server implementations, from unit tests to end-to-end tests. By leveraging these techniques, you can now streamline your development process, improve reliability, and create AI assistants that can perform real-world actions.

You can find the source code for this post on GitHub.
