Connecting to an MCP Server from JavaScript using AI SDK

2025-04-11 in Artificial Intelligence / JavaScript tagged JavaScript / LLM / AI Agent / Model Context Protocol (MCP) / AI SDK by Marc Nuri | Last updated: 2025-04-12

Introduction

Model Context Protocol (MCP) provides a standardized way for artificial intelligence (AI) models to interact with external tools and systems. If you're new to this revolutionary protocol, check my introduction to MCP to learn more about it.

By connecting to an MCP server, your JavaScript applications can leverage Large Language Models (LLMs) with the added ability to execute commands, retrieve data, and interact with various systems. Vercel AI SDK has a growing list of supported model providers that can connect to an MCP server, making it an excellent alternative to the LangChain.js library.

In this post, I'll show you how to connect your JavaScript applications to an MCP server using Vercel AI SDK. I'll describe how to set up the MCP client, configure the AI SDK provider, and prompt the model using the generateText function. By the end of this post, you'll be able to integrate your JavaScript applications with an MCP server and leverage the power of LLMs to interact with the real world.

Creating the MCP clients

To interact with an MCP server, we first need to create an MCP client. AI SDK provides a convenient function called experimental_createMCPClient that allows us to establish this connection.

Let's learn how to create two different types of MCP clients: one using the STDIO transport and another using the SSE transport.

Setting up an STDIO transport client

The STDIO transport method allows us to spawn a local process and communicate with it through standard input and output streams. This is useful for running local MCP servers or processes that expose the local system's capabilities.

The following code snippet demonstrates how to set up an STDIO transport client for the Kubernetes MCP server:

stdio-client.js
import {
  experimental_createMCPClient as createMcpClient
} from 'ai';
import {
  Experimental_StdioMCPTransport as StdioClientTransport
} from 'ai/mcp-stdio';

const initStdioClient = async () => {
  const transport = new StdioClientTransport({
    command: 'npx',
    args: ['-y', 'kubernetes-mcp-server@latest']
  });
  return createMcpClient({name: 'blog.marcnuri.com', transport});
};

In this code, we're creating an MCP client instance that runs the Kubernetes MCP server using the npx command.

These are the key steps to set up the STDIO transport client:

  1. Import the necessary modules from the ai and ai/mcp-stdio packages.
  2. Create a new instance of StdioClientTransport, passing the command and arguments to run the MCP server.
  3. Call createMcpClient with the transport instance to create the MCP client and establish the connection.
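
To try the client on its own, here's a minimal usage sketch. It assumes the initStdioClient helper above and the experimental MCP client API from the AI SDK (the tools() and close() methods); the exact shape of the returned tools object may vary between SDK versions:

const listAvailableTools = async () => {
  // Spawn the Kubernetes MCP server and connect to it over STDIO
  const client = await initStdioClient();
  try {
    // Ask the server for its tool definitions (an object keyed by tool name)
    const tools = await client.tools();
    console.log('Available tools:', Object.keys(tools));
  } finally {
    // Always close the client so the spawned server process is terminated
    await client.close();
  }
};

listAvailableTools().catch(console.error);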

Setting up an SSE transport client

The Server-Sent Events (SSE) transport method allows us to connect to a remote MCP server over HTTP. This is useful when you have an MCP server already running as a service that you want to connect to.

The following code snippet demonstrates how to set up an SSE transport client:

sse-client.js
import {
  experimental_createMCPClient as createMcpClient
} from 'ai';

const initSseClient = async () => {
  return createMcpClient({
    name: 'blog.marcnuri.com',
    transport: {
      type: 'sse',
      url: `http://localhost:8080/sse`
    }
  });
};

In this fragment, we're creating an MCP client that connects to a local server running on port 8080 using the SSE protocol. This approach is more suitable for production environments or when you have a dedicated MCP server running.

Note

In production environments, your URL should point to the actual address of your MCP server (which should also be HTTPS-enabled).

These are the key steps to set up the SSE transport client:

  1. Import the necessary module from the ai package.
  2. Call createMcpClient with the transport configuration, specifying the type as sse and providing the URL of the MCP server.
  3. The function returns an MCP client instance that can be used to interact with the remote server.

Notice how the SSE-based transport is much simpler to set up than the STDIO-based transport.
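
Since both helpers return the same kind of client, you can pick the transport at runtime with a small factory. The following sketch is only an illustration: MCP_SSE_URL is a hypothetical environment variable name, and initStdioClient and initSseClient are the functions defined above (in practice you would also pass the configured URL through to initSseClient):

const initClient = async () => {
  // Hypothetical convention: if MCP_SSE_URL is set, connect to the remote SSE server,
  // otherwise spawn the local kubernetes-mcp-server over STDIO
  if (process.env['MCP_SSE_URL']) {
    return initSseClient();
  }
  return initStdioClient();
};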

Setting up the AI SDK provider and model

Once we have the MCP client set up, we need to configure the AI SDK provider and model that will leverage the MCP server's tools. The AI SDK has a growing list of supported providers; in this case, we'll use the Google Generative AI provider.

The following code snippet demonstrates how to set up the Google Generative AI provider with the Gemini model:

google-generative-ai.js
import {createGoogleGenerativeAI} from '@ai-sdk/google';

const google = createGoogleGenerativeAI({
  apiKey: process.env['GOOGLE_API_KEY']
});
const model = google('gemini-2.0-flash');

In this code, we're creating an instance of the Google Generative AI provider and specifying the model we want to use:

  • createGoogleGenerativeAI is a function that initializes the provider with the necessary API key.
  • The google function is then called with the model name to create a model instance that can be used for generating text.

Note

The API key should be set in your environment variables for security reasons, rather than hardcoded in your application. In this example, we're using the GOOGLE_API_KEY environment variable to store the API key.
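
Since a missing key only surfaces later as an authentication error from the provider, a small guard can make the failure more obvious. This is a minimal sketch, not part of the original example:

import {createGoogleGenerativeAI} from '@ai-sdk/google';

const apiKey = process.env['GOOGLE_API_KEY'];
if (!apiKey) {
  // Fail fast with a clear message instead of an opaque authentication error later on
  throw new Error('The GOOGLE_API_KEY environment variable is not set');
}
const google = createGoogleGenerativeAI({apiKey});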

Prompting the model with generateText

Now that we've set up all the necessary components, we can put the pieces together to prompt the AI model and let it interact with the MCP server's tools.

In this example, we are going to generate text by asking the model to list all Kubernetes pods in a Markdown table format.

The following code snippet shows the relevant parts of the pipeline:

assistant.js
import {
  generateText,
} from 'ai';
import {createGoogleGenerativeAI} from '@ai-sdk/google';

const assistant = async () => {
  console.log('Starting kubernetes-mcp-server in STDIO mode');
  const stdioClient = await initStdioClient();
  const google = createGoogleGenerativeAI({
    apiKey: process.env['GOOGLE_API_KEY']
  });
  const model = google('gemini-2.0-flash');
  // Retrieve the tool definitions exposed by the Kubernetes MCP server
  const tools = await stdioClient.tools();
  const listPods = await generateText({
    model,
    tools,
    maxSteps: 10,
    messages: [{
      role: 'user',
      content: 'List all pods in my cluster and output as Markdown table'
    }]
  });
  console.log(listPods.text);
  await stdioClient.close();
};

assistant()
  .then(() => {
    console.log('done');
  })
  .catch(err => {
    console.error('Error:', err);
  });

In this code, we create an assistant that connects the gemini-2.0-flash model to the Kubernetes MCP server tools and performs a simple prompt.

These are the key components of this example:

  • We first initialize the STDIO client and the Google Generative AI provider.
  • We then call the generateText function with the following parameters:
    • model: The model instance we created earlier.
    • tools: The tools available from the MCP server.
    • maxSteps: The maximum number of steps the model can take to generate the response.
      Setting the value to more than 1 lets the model process the initial prompt, call the tools, and then issue a follow-up request that includes the tool execution results.
    • messages: An array of messages that represent the conversation history. In this case, we provide a single user message asking the model to list all pods in the cluster and format the output as a Markdown table.
  • Finally, we log the generated text to the console.

The output of this invocation will be similar to:

| Name | Namespace |
|---|---|
| vibe-code-game-65c6fdd6d7-lp47m | default |
| yakd-dashboard-66cf44d6db-qv4gz | yakd-dashboard |

Beyond the final text, the listPods variable contains the entire conversation history, including the model's intermediate steps and the tool execution results.
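
If you want to see how the model arrived at that answer, you can inspect the result object. The following sketch assumes the result shape of the AI SDK at the time of writing, where generateText returns the intermediate steps together with their tool calls and tool results:

// Walk through the intermediate steps of the multi-step generation
for (const step of listPods.steps) {
  for (const toolCall of step.toolCalls) {
    console.log('Tool called:', toolCall.toolName);
  }
  for (const toolResult of step.toolResults) {
    console.log('Tool result:', toolResult.result);
  }
}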

Conclusion

In this post, I showed you how to connect to an MCP server from JavaScript using the Vercel AI SDK. We learned how to set up both STDIO and SSE transport clients, configure the AI SDK provider and model from Google, and prompt the model using the generateText function to generate useful output.

This modular architecture not only illustrates the flexibility of the SDK but also shows how simple code fragments can be combined to build complex, real-world applications. By following these steps, you can easily integrate your JavaScript applications with an MCP server and leverage the power of LLMs to interact with external tools and systems.

As the MCP ecosystem matures, JavaScript developers are well-positioned to leverage the growing catalog of standardized AI tools.

You can find the source code for this post on GitHub.
