Connecting to an MCP Server from JavaScript using LangChain.js

2025-04-01 in Artificial Intelligence / JavaScript tagged JavaScript / LLM / AI Agent / Model Context Protocol (MCP) / LangChain.js by Marc Nuri | Last updated: 2025-04-01

Introduction

Model Context Protocol (MCP) is a standardized way to expose data and functionality to Large Language Model (LLM) applications in a secure, consistent manner. Initially developed by Anthropic, MCP has gained significant traction in the artificial intelligence (AI) ecosystem, with hundreds of tool servers already published. The protocol allows LLMs to interact with external systems through well-defined interfaces, enabling them to access real-time data, perform calculations, and take actions in response to user queries.

In March 2025, the LangChain community released MCP adapters for LangChain.js, a JavaScript library for building applications powered by LLMs. The adapters convert MCP tools into LangChain.js and LangGraph.js compatible tools, allowing developers to seamlessly incorporate them into their existing agent workflows.

In this post, I'll show you how to connect your JavaScript applications to an MCP server using LangChain.js and LangGraph.js. I'll cover both the STDIO and SSE transports, and provide a simple example that creates an agent to interact with a Kubernetes cluster.

Configuring the MCP clients

Before we can use MCP tools with LangChain.js, we need to set up MCP clients that can communicate with MCP servers. The TypeScript SDK for Model Context Protocol provides client implementations for different transport methods. Let's learn how to configure clients for both STDIO and SSE transports.

Setting up an STDIO transport client

The STDIO transport is ideal for command-line tools and direct integrations. It allows communication with an MCP server running as a subprocess of your application. This approach is particularly useful for local execution, where the agent performs tasks on the same machine the MCP server runs on.

The following code snippet demonstrates how to set up an STDIO transport client:

stdio-client.js
import {Client} from '@modelcontextprotocol/sdk/client/index.js';
import {StdioClientTransport} from '@modelcontextprotocol/sdk/client/stdio.js';

export const initStdioClient = async () => {
  // Client metadata used to identify this client to the MCP server
  const stdioClient = new Client({
    name: 'blog.marcnuri.com',
    version: '1.0.0'
  });
  // Spawn the MCP server as a subprocess and talk to it over stdin/stdout
  const transport = new StdioClientTransport({
    command: 'npx',
    args: ['-y', 'kubernetes-mcp-server@latest']
  });
  await stdioClient.connect(transport);
  return stdioClient;
};

In this example, we're creating a new MCP client with a name identifier and connecting it to a transport that runs the kubernetes-mcp-server package using npx. The STDIO transport spawns a child process and communicates with it through standard input and output streams.

These are the key components of the STDIO transport client setup:

  • We create a new Client instance with a name and version that identify the client to the server.
  • We create a new StdioClientTransport instance with the command to execute the MCP server.
    • command: The command to execute the MCP server.
    • args: The array of arguments to pass to the command.
  • We call the connect method on the client instance, passing the transport instance to establish the connection. A quick way to verify the connection is shown in the sketch after this list.
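
Once connected, you can verify that the server is reachable by asking it for its tool list directly through the MCP client. Here's a minimal sketch (the file name is illustrative) that reuses the initStdioClient helper from above:

verify-stdio-client.js
import {initStdioClient} from './stdio-client.js';

// Connect to the MCP server and print the tools it advertises
const stdioClient = await initStdioClient();
const {tools} = await stdioClient.listTools();
for (const tool of tools) {
  console.log(`${tool.name}: ${tool.description}`);
}
await stdioClient.close();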

Setting up an SSE transport client

The Server-Sent Events (SSE) transport is suitable for MCP servers running in a remote environment. It establishes a persistent connection with the MCP server over HTTP using SSE for streaming communication.

The following code snippet demonstrates how to set up an SSE transport client:

sse-client.js
import {Client} from '@modelcontextprotocol/sdk/client/index.js';
import {SSEClientTransport} from '@modelcontextprotocol/sdk/client/sse.js';

export const initSseClient = async () => {
  // Client metadata used to identify this client to the MCP server
  const sseClient = new Client({
    name: 'blog.marcnuri.com',
    version: '1.0.0'
  });
  // Connect to a remote MCP server exposing an SSE endpoint
  const transport = new SSEClientTransport(new URL('https://localhost:8080/sse'));
  await sseClient.connect(transport);
  return sseClient;
};

In this example, we're creating a new MCP client and connecting it to an SSE transport that points to a local server on port 8080. This assumes that you have an MCP server running at that URL with an SSE endpoint. When the client connects, it establishes a long-lived connection to the server using the Server-Sent Events protocol, allowing the server to push messages to the client in real time.

These are the key components of the SSE transport client setup:

  • We create a new Client instance with a name and version that identify the client to the server.
  • We create a new SSEClientTransport instance with the URL of the MCP server.
  • We call the connect method on the client instance, passing the transport instance to establish the connection (a sketch for choosing between the two transports at runtime follows this list).
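
Because both helper functions return a connected client, you can also decide at runtime which transport to use. Here's a minimal sketch; the MCP_TRANSPORT environment variable and the file name are my own convention for this example, not part of the SDK:

init-mcp-client.js
import {initStdioClient} from './stdio-client.js';
import {initSseClient} from './sse-client.js';

// Use the SSE transport for a remote server, STDIO for local execution
export const initMcpClient = async () =>
  process.env['MCP_TRANSPORT'] === 'sse' ? initSseClient() : initStdioClient();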

Setting up LangChain.js MCP adapters

Now that we have our MCP clients set up, we can use the LangChain.js MCP adapters to integrate MCP tools with LangChain.js. The @langchain/mcp-adapters package provides a simple way to load MCP tools and use them with LangChain agents.

The MCP adapters package provides a simple loadMcpTools function that wraps MCP tools and makes them compatible with LangChain.js.

The following code snippet demonstrates how to load the MCP tools:

load-mcp-tools.js
import {loadMcpTools} from '@langchain/mcp-adapters';
// initStdioClient is the helper defined in stdio-client.js above
import {initStdioClient} from './stdio-client.js';

const stdioClient = await initStdioClient();
const tools = await loadMcpTools('kubernetes-mcp-server', stdioClient);

The loadMcpTools function takes two arguments:

  1. A name for the tools (this can be any string you choose).
  2. The MCP client instance.

It returns an array of LangChain-compatible tools you can use with LangGraph agents. Behind the scenes, this function discovers the available tools from the MCP server and creates corresponding LangChain tool objects for each.
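
Since each returned tool implements the standard LangChain tool interface, you can inspect what the server offers before wiring the tools into an agent. Continuing from the load-mcp-tools.js snippet above:

// List the name and description of every tool discovered on the MCP server
for (const tool of tools) {
  console.log(`${tool.name}: ${tool.description}`);
}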

The adapter handles all the protocol-specific details, allowing you to focus on building your application logic.

It also provides features like:

  • 🔌 Multiple transport options with reconnection strategies.
  • 🔄 Connecting to multiple MCP servers simultaneously (see the sketch after this list).
  • 🧩 Seamless integration with LangChain agents.
  • 🛠️ Graceful error handling when servers aren't available.
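
For the multi-server case, the package also ships a MultiServerMCPClient that manages the connections for you. The following is a minimal sketch; the configuration keys shown here reflect my understanding of the package, so check the @langchain/mcp-adapters documentation for the exact option names:

multi-server.js
import {MultiServerMCPClient} from '@langchain/mcp-adapters';

// A single client managing several MCP servers at once
const client = new MultiServerMCPClient({
  mcpServers: {
    kubernetes: {
      transport: 'stdio',
      command: 'npx',
      args: ['-y', 'kubernetes-mcp-server@latest']
    },
    remote: {
      transport: 'sse',
      url: 'https://localhost:8080/sse'
    }
  }
});
const tools = await client.getTools();
console.log(tools.map(tool => tool.name));
await client.close();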

Using LangGraph.js to create an agent

We now have all the pieces in place to create an agent using LangGraph.js that can leverage the MCP tools. LangGraph.js provides a simple way to create complex agent workflows, and the prebuilt agents make it even easier to get started.

Here's a complete example that shows how to create an agent that can interact with a Kubernetes cluster using the kubernetes-mcp-server:

kubernetes-agent.js
import {createReactAgent} from '@langchain/langgraph/prebuilt';
import {ChatOpenAI} from '@langchain/openai';
import {loadMcpTools} from '@langchain/mcp-adapters';
// initStdioClient is the helper defined in stdio-client.js above
import {initStdioClient} from './stdio-client.js';

const assistant = async () => {
  // Chat model served by GitHub Models (OpenAI-compatible endpoint)
  const model = new ChatOpenAI({
    configuration: {
      apiKey: process.env['GITHUB_TOKEN'],
      baseURL: 'https://models.inference.ai.azure.com'
    },
    model: 'gpt-4o-mini'
  });
  // Start the MCP server as a subprocess and load its tools
  const stdioClient = await initStdioClient();
  const tools = await loadMcpTools('kubernetes-mcp-server', stdioClient);
  // ReAct agent that can call the Kubernetes MCP tools
  const agent = createReactAgent({
    llm: model,
    tools
  });
  const listPods = await agent.invoke({
    messages: [{
      role: 'user',
      content: 'List all pods in my cluster and output as markdown table'
    }]
  });
  // The last message contains the agent's final answer
  console.log(listPods.messages.slice(-1)[0].content);
  await stdioClient.close();
};

assistant()
  .then(() => {
    console.log('done');
  })
  .catch(err => {
    console.error('Error:', err);
  });

In this example, we create a very simple assistant that connects the kubernetes-mcp-server tools with the gpt-4o-mini language model provided by GitHub Models. As you can see, the LangChain ecosystem makes it straightforward to create agents that can interact with MCP servers.

These are the key components of the agent example:

  • ChatOpenAI:
    Creating a new ChatOpenAI instance configured to use the gpt-4o-mini model from Azure AI/GitHub Marketplace.
    • configuration: The configuration object for the OpenAI API, including the API key and base URL.
    • model: The name of the language model to use.
  • loadMcpTools:
    Loads the MCP tools from the STDIO client as described in the previous section.
  • createReactAgent:
    Creates a LangGraph.js ReAct (Reasoning and Acting) agent using the model and tools we loaded.
  • invoke:
    Invokes the agent with a user request to list all pods in the Kubernetes cluster. The agent will use the tools to perform the action and return the result.
  • listPods.messages:
    The agent's response is stored in the listPods variable. We print the last message from the response, which contains the action result.

The agent follows a ReAct (Reasoning and Acting) pattern, alternating between reasoning about what to do next and taking action using the available tools. In this case, the agent can use the tools provided by the Kubernetes MCP server to interact with your Kubernetes cluster.
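
Because the loop alternates between model calls and tool calls, it's good practice to bound the number of iterations. Here's a minimal sketch of a drop-in variant of the invoke call from kubernetes-agent.js, assuming LangGraph's standard recursionLimit option:

// Cap the number of reasoning/acting steps the agent may take
const listPods = await agent.invoke({
  messages: [{
    role: 'user',
    content: 'List all pods in my cluster and output as markdown table'
  }]
}, {
  recursionLimit: 10
});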

The output of the kubernetes-agent.js example will be similar to this:

Here is the list of pods in your cluster presented as a markdown table:

```markdown
| Name                     | Namespace | Status  | Container Image       | Pod IP      |
|--------------------------|-----------|---------|-----------------------|-------------|
| kubernetes-mcp-run-nxvvq | default   | Running | marcnuri/chuck-norris | 10.133.7.0  |
| kubernetes-mcp-run-rhqzb | default   | Running | marcnuri/chuck-norris | 10.133.7.0  |
```

You can copy and paste this markdown into any markdown viewer to see it formatted correctly.

However, the listPods.messages variable will contain the complete list of messages exchanged between the agent and the LLM. In this particular case, the invocation involved two messages.
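
If you want to see the whole exchange rather than just the final answer, you can log every message; the constructor name tells you whether each one came from the user, the model, or a tool. Continuing from kubernetes-agent.js:

// Print every message exchanged during the invocation
for (const message of listPods.messages) {
  console.log(`${message.constructor.name}: ${message.content}`);
}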

The beauty of this approach is that the agent can decide which tools to use based on the user's request. For example, if the user asks to list pods, the agent will identify that it needs to use the appropriate Kubernetes tool to fetch pod information, and then format the result as a Markdown table.
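
As a quick sketch of this flexibility, you can send the same agent a different request and let it pick the matching tool. The pod name below is taken from the example output above:

// Reuse the agent from kubernetes-agent.js with a different request
const podDetails = await agent.invoke({
  messages: [{
    role: 'user',
    content: 'Describe the pod kubernetes-mcp-run-nxvvq in the default namespace'
  }]
});
console.log(podDetails.messages.slice(-1)[0].content);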

Conclusion

In this post, I've shown you how to connect your JavaScript applications to an MCP server using LangChain.js and LangGraph.js. I've covered how to set up both STDIO and SSE transport clients, how to use the MCP adapters to load tools from the server, and how to bind everything together using LangGraph.js to create an agent. The combination of MCP, LangChain.js, and LangGraph.js provides a powerful platform for building LLM-powered applications that interact with various tools and resources. With the MCP adapters, you can seamlessly integrate the growing ecosystem of MCP tool servers into your LangChain and LangGraph agents.

You can find the source code for this post on GitHub.

