Introducing Goose, the on-machine AI agent
Introduction
In January 2025, Block introduced Goose, an open-source, extensible AI agent distributed as a command-line interface (CLI) and a desktop application. Goose runs locally, can connect to different large language model (LLM) providers, and is extensible through Model Context Protocol (MCP) servers.
Since its initial release, Goose has evolved significantly. The project now supports over 25 LLM providers, including commercial services, cloud platforms, and local models. In December 2025, Block contributed Goose to the Linux Foundation's Agentic AI Foundation (AAIF), alongside Anthropic's MCP and OpenAI's AGENTS.md, ensuring the project's future is shaped by the community under neutral governance.

In this post, I'll guide you through setting up Goose CLI in Linux and macOS, connecting it to Google Gemini, and extending its capabilities using an MCP server.
Setting up Goose CLI
The Goose CLI is a lightweight tool that lets you interact with Goose directly from your terminal. I prefer the CLI because it offers a simple, efficient way to work with the AI model without needing a graphical user interface.
First, download and install the Goose CLI. The getting started guide provides installation instructions for various operating systems.
For Linux, install the latest version of Goose using:
```shell
curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash
```

This script fetches and installs the latest Goose version for your distribution.
Important
Consider verifying the script's source and checksum before execution for security best practices.
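One cautious pattern is to download the script to a file, read it, and record its checksum before running it. The sketch below simulates that flow with a placeholder script standing in for the real installer; in practice you would fetch download_cli.sh from the URL shown above:

```shell
# Placeholder standing in for the real installer; in practice, download it with:
#   curl -fsSL -o download_cli.sh https://github.com/block/goose/releases/download/stable/download_cli.sh
printf '#!/bin/sh\necho "installer would run here"\n' > download_cli.sh

# Read the script before executing anything
cat download_cli.sh

# Compute and note its SHA-256 digest for your records
sha256sum download_cli.sh

# Run it only once you are satisfied with its contents
sh download_cli.sh
```

This trades one extra step for the ability to see exactly what will run on your machine, rather than piping an unseen script straight into bash.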
Alternatively, visit the GitHub releases page to download the appropriate binary for your operating system.
For macOS users, Goose is also available via Homebrew:
```shell
brew install block-goose-cli
```

Once installed, you can update Goose at any time by running:

```shell
goose update
```

Configuring the LLM Provider (Google Gemini)
Next, configure Goose to use an LLM provider. Goose now supports over 25 different providers across four main categories:
- API-based providers: Anthropic (Claude), OpenAI, Google Gemini, xAI (Grok), Mistral AI, and more
- Cloud platforms: Amazon Bedrock, GCP Vertex AI, Azure OpenAI, Databricks, Snowflake
- Local providers: Ollama, Ramalama, Docker Model Runner (completely free, runs on your machine)
- CLI pass-through: Claude Code, OpenAI Codex, Cursor Agent, Gemini CLI (uses your existing subscriptions)
For this example, I'll use the Gemini API, Google's LLM provider. I chose it because its free tier has generous rate limits, allowing experimentation without incurring costs.
This is the fastest way to get started with Goose. However, you may need to switch providers or upgrade to a paid plan if you hit the rate limits.
To connect Goose to Gemini, you need to create a Google account and obtain an API key. You can generate this key in the Google AI Studio:

Click the "Create API Key" button to generate a new API key.

We will use this API key to connect Goose to Google Gemini. Set the provider by running the following command:
```shell
goose configure
```

The `goose configure` command will guide you through setting up the provider, including entering your API key and selecting the default model.

If the configuration is successful, you'll see a confirmation message.
If you encounter issues, check your API key, internet connection, or the logs in your ~/.config/goose directory for more information.
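For reference, a successful run persists the provider settings to that same directory. A minimal sketch of what config.yaml might contain afterwards (the key names and model identifier below are illustrative and may differ by Goose version; the API key itself is typically stored separately, such as in your system keyring or a GOOGLE_API_KEY environment variable, not in this file):

```yaml
# Illustrative sketch of ~/.config/goose/config.yaml after configuration;
# exact keys and model names depend on your Goose version.
GOOSE_PROVIDER: google
GOOSE_MODEL: gemini-2.0-flash
```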
Now that Google Gemini is configured as our LLM provider, let's explore Goose’s capabilities by starting a new chat session.
Running Goose for the first time
Start a new session by running the following command:
```shell
goose session
```

This command launches a terminal-based chat session where you can interact with the LLM.

In the image above, you can see the Goose session running in the terminal. In this case, I asked the model to list its capabilities, and it responded with a list of supported commands.
Goose also includes several built-in slash commands that you can use during a session:
- `/prompts`: List available prompts
- `/prompt`: Use a specific prompt
- `/compact`: Compact the conversation history
- `/clear`: Clear the conversation
Setting up Playwright MCP server
The MCP ecosystem has grown significantly since Goose's initial release, with over 3,000 MCP servers now available covering developer tools, productivity suites, and specialized services.
To demonstrate Goose's extensibility, let's set up the Playwright MCP server from Microsoft.
This server enables browser automation using Playwright's accessibility tree, making it fast and LLM-friendly without requiring vision models.
You can configure it through the command line or by editing your ~/.config/goose/config.yaml file.
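If you go the config-file route, the entry might look something like the sketch below. This assumes the stdio extension format and the `@playwright/mcp` npm package run via npx; field names can vary between Goose versions, so treat it as a starting point rather than a definitive schema:

```yaml
# Illustrative sketch of a Playwright MCP entry in ~/.config/goose/config.yaml;
# field names and defaults may differ by Goose version.
extensions:
  playwright:
    enabled: true
    type: stdio
    cmd: npx
    args:
      - "@playwright/mcp@latest"
```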
For this example, we will use the command line:
```shell
goose configure
```

The CLI will prompt:

```
This will update your existing config file
if you prefer, you can edit it directly at /home/user/.config/goose/config.yaml
┌ goose-configure
│
◆ What would you like to configure?
│ ○ Configure Providers (Change provider or update credentials)
│ ○ Toggle Extensions
│ ● Add Extension
└
```

Select Add Extension, choose Command-line Extension, and enter the following details for the Playwright MCP server:
```
┌ goose-configure
│
◇ What would you like to configure?
│ Add Extension
│
◇ What type of extension would you like to add?
```
