Introducing Goose, the on-machine AI agent
Introduction
In January 2025, Block introduced Goose, an open-source, extensible AI agent distributed as a command-line interface (CLI) and a desktop application. Goose runs locally, can connect to different large language model (LLM) providers, and is extensible through Model Context Protocol (MCP) servers.

In this post, I'll guide you through setting up the Goose CLI on Linux, connecting it to Google Gemini, and extending its capabilities with an MCP server.
Setting up Goose CLI
The Goose CLI is a lightweight tool for interacting with Goose directly from your terminal. I prefer the CLI because it offers a simple, efficient way to work with the AI model without needing a graphical user interface.
First, download and install the Goose CLI. The getting started guide provides installation instructions for various operating systems.
For Linux, install the latest version of Goose using:
curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash
This script fetches and installs the latest Goose version for your distribution.
Important
As a security best practice, verify the script's source and checksum before executing it.
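If you'd rather not pipe a remote script straight into bash, a cautious variant is to download it first, read it, and record its checksum before running it. This is a sketch using the same URL as above; the project may not publish an official checksum to compare against, so treat the hash step as a way to pin what you actually ran:

```shell
# Download the installer to a file instead of piping it straight to bash.
curl -fsSL -o download_cli.sh \
  https://github.com/block/goose/releases/download/stable/download_cli.sh

# Inspect the script and record its checksum before executing it.
less download_cli.sh
sha256sum download_cli.sh

# Run it only once you are satisfied with what it does.
bash download_cli.sh
```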
Alternatively, visit the GitHub releases page to download the appropriate binary for your operating system.
Configuring the LLM Provider (Google Gemini)
Next, configure Goose to use an LLM provider. In this case, I'll use the Gemini API, Google's LLM offering. I chose this option because it has a free tier with generous rate limits, allowing for experimentation without incurring costs.
This is the fastest way to get started with Goose. However, you may need to switch providers or upgrade to a paid plan if you hit the rate limits.
To connect Goose to Gemini, you need to create a Google account and obtain an API key. You can generate this key in the Google AI Studio:

Click the "Create API Key" button to generate a new API key.

We will use this API key to connect Goose to Google Gemini by setting it as the Goose provider. You can do this by running the following command:
goose configure
The goose configure command will guide you through setting up the provider, including entering your API key and selecting the default model.
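You can also make the key available to Goose through an environment variable instead of pasting it interactively. The `GOOGLE_API_KEY` variable name is an assumption based on Goose's Gemini provider conventions, so verify it against the provider documentation for your version; the value below is a placeholder:

```shell
# Placeholder value: substitute the key you generated in Google AI Studio,
# then run `goose configure` as shown above.
export GOOGLE_API_KEY="your-api-key-here"
```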

If the configuration is successful, you'll see a confirmation message.
If you encounter issues, check your API key, your internet connection, or the logs in your ~/.config/goose directory for more information.
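To dig into a failure, you can list the configuration directory mentioned above and skim any log files it contains. The exact log file names vary by Goose version, so the glob below is a guess:

```shell
# Show what Goose has written under its config directory.
ls -la ~/.config/goose

# Print the tail of any log files found there, if present.
tail -n 50 ~/.config/goose/*.log 2>/dev/null || echo "no log files found"
```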
Now that Google Gemini is configured as our LLM provider, let's explore Goose’s capabilities by starting a new chat session.
Running Goose for the first time
Start a new session by running the following command:
goose session
This command launches a terminal-based chat session where you can interact with the LLM.

In the image above, you can see the Goose session running in the terminal. In this case, I asked the model to list its capabilities, and it responded with a list of supported commands.
Setting up Puppeteer MCP server
To demonstrate Goose's extensibility, let's set up the reference Puppeteer MCP server.
You can do this through the command line or by editing your ~/.config/goose/config.yaml file.
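If you prefer editing the file directly, an extension entry for the reference Puppeteer server looks roughly like the following sketch. The exact schema may differ between Goose versions, so compare it against an entry that goose configure generates; the npx command and package name are those of the reference Puppeteer MCP server:

```yaml
# Sketch of an extension entry in ~/.config/goose/config.yaml (schema may vary).
extensions:
  puppeteer:
    name: puppeteer
    type: stdio        # runs as a local stdio MCP server
    enabled: true
    cmd: npx
    args:
      - "-y"
      - "@modelcontextprotocol/server-puppeteer"
```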
For this example, we will use the command line:
goose configure
The CLI will prompt:
This will update your existing config file
if you prefer, you can edit it directly at /home/user/.config/goose/config.yaml
┌ goose-configure
│
◆ What would you like to configure?
│ ○ Configure Providers (Change provider or update credentials)
│ ○ Toggle Extensions
│ ● Add Extension