Introducing Goose, the on-machine AI agent
Introduction
In January 2025, Block introduced Goose, an open-source, extensible AI agent distributed as a command-line interface (CLI) and a desktop application. Goose runs locally, can be connected to different large language model (LLM) providers, and is extensible through Model Context Protocol (MCP) servers.
In this post, I'll guide you through setting up the Goose CLI on Linux, connecting it to Groq, and extending its capabilities with an MCP server.
Setting up Goose CLI
The Goose CLI is a lightweight tool that lets you interact with Goose directly from your terminal. I prefer the CLI because it offers a simple, efficient way to work with the model without needing a graphical user interface.
First, download and install the Goose CLI. The getting started guide provides installation instructions for various operating systems.
For Linux, install the latest version of Goose using:
curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash
This script fetches and installs the latest Goose version for your distribution.
Important
Consider verifying the script's source and checksum before execution for security best practices.
Alternatively, visit the GitHub releases page to download the appropriate binary for your operating system.
Configuring the LLM Provider (Groq)
Next, configure Goose to use an LLM provider. In this case, I'll use Groq, an inference provider that hosts a range of models and is known for its speed and efficiency.
Groq's unique Language Processing Unit (LPU) architecture delivers high throughput while consuming minimal power, making it ideal for real-time AI tasks.
Additionally, Groq offers a free tier with generous rate limits, allowing developers to experiment without incurring costs.
To connect Goose to Groq, you need to create an account on the Groq website and obtain an API key. You can generate this key in the Groq cloud console:
We'll use this API key to set Groq as the Goose provider. You can do this by running the following command:
goose configure
The goose configure command guides you through setting up the provider, including entering your API key and selecting the default model.
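If you prefer not to paste the key into the interactive prompt, you can expose it to the shell environment first. GROQ_API_KEY is the conventional variable name for Groq tooling, but confirm it against the Goose provider documentation:

```shell
# Make the Groq API key available to the current shell session.
# (GROQ_API_KEY is the conventional variable name; confirm it in the Goose docs.)
export GROQ_API_KEY="gsk_your_key_here"

# Optionally persist it for future sessions:
echo 'export GROQ_API_KEY="gsk_your_key_here"' >> ~/.bashrc
```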
If the configuration is successful, you'll see a confirmation message.
If you encounter issues, check your API key and internet connection, or inspect the logs in your ~/.config/goose directory for more information.
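For example, a quick look at that directory shows the generated config file and any logs; the exact files present vary by Goose version:

```shell
# List the Goose config directory mentioned above; falls back to a message
# if `goose configure` has not created it yet.
ls -la ~/.config/goose 2>/dev/null || echo "config directory not created yet"
```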
With Groq configured as our LLM provider, we can now explore Goose's capabilities by running a new chat session.
Running Goose for the first time
Start a new session by running the following command:
goose session
This command launches a terminal-based chat session where you can interact with the LLM.
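Besides the interactive session, the CLI can also execute a single instruction non-interactively, which is handy for scripting. The goose run -t form below is an assumption drawn from the CLI's help output, so verify it with goose run --help on your install:

```shell
# Run a one-off instruction without opening an interactive session.
# (`goose run -t` is assumed from the CLI help; verify with `goose run --help`.)
goose run -t "List the files in the current directory and summarize them"
```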
In the image above, you can see a Goose session running in the terminal. Here, I asked the model to list its capabilities, and it responded with a list of supported commands.
Setting up Puppeteer MCP server
To demonstrate Goose's extensibility, let's set up the reference Puppeteer MCP server.
You can do this through the command line or by editing your ~/.config/goose/config.yaml file.
For this example, we will use the command line:
goose configure
The CLI will prompt:
This will update your existing config file
if you prefer, you can edit it directly at /home/user/.config/goose/config.yaml
┌ goose-configure
│
◆ What would you like to configure?
│ ○ Configure Providers (Change provider or update credentials)
│ ○ Toggle Extensions
│ ● Add Extension
└
Select Add Extension, choose Command-line Extension, and enter the following details for the Puppeteer MCP server:
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension
│
◇ What type of extension would you like to add?
│ Command-line Extension
│
◇ What would you like to call this extension?
│ puppeteer
│
◇ What command should be run?
│ npx -y @modelcontextprotocol/server-puppeteer
│
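The interactive flow writes this extension into the config file shown earlier. If you'd rather edit that file directly, the entry looks roughly like the sketch below; the field names are an approximation and may differ between Goose versions, so compare against a file generated by goose configure:

```yaml
# Approximate shape of an extension entry in ~/.config/goose/config.yaml;
# field names may differ between Goose versions.
extensions:
  puppeteer:
    enabled: true
    type: stdio
    cmd: npx
    args:
      - "-y"
      - "@modelcontextprotocol/server-puppeteer"
```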