2025 Year in Review: The Year of AI

2025 was a turning point. It was the year artificial intelligence fundamentally transformed how I work, think, and build software.
This is my sixth Red Hat anniversary post, but I'm reframing the series. Rather than focusing solely on my work anniversary in late October, I want to reflect on the entire year and the profound changes it brought.
You can catch up on my previous years as a Red Hatter in these posts:
- My first year at Red Hat
- Two years of Open Source at Red Hat
- Reflecting on three years at Red Hat
- Four years of Free Software at Red Hat
- Five years at Red Hat
📷 Featured photo by Julia (El Arte de Julieta)
A pivotal year
If I had to choose one word to describe 2025, it would be pivotal.
On one hand, this was the year I fully embraced AI-assisted development. I restructured my workflows around AI agents, achieved significant productivity gains, and created tooling that bridges the gap between AI systems and cloud-native infrastructure.
On the other hand, it was also a year of uncertainty. In February 2025, Red Hat announced changes to its middleware strategy, transferring key teams to IBM. These were the teams that relied most heavily on Fabric8 Kubernetes Client and Eclipse JKube, the projects I've poured years of effort into.
Despite my respect for IBM, it weighed on me. Being a Red Hatter isn't just a job title for me; it's an identity I've cherished since my teenage years when I first discovered Red Hat Linux.
But here I am, still doing what I love: building free open source tools that help developers. And this year, that mission expanded in unexpected directions.
This year's highlights
Here's a look at what stood out in 2025.
Projects
Kubernetes MCP Server
If there's one project that defined 2025 for me, it's the Kubernetes MCP Server.
It started as a proof of concept in late January 2025, originally built around the Fabric8 Kubernetes Client. When I first ran the MCP server, the results surprised me; it was the first time I truly grasped the potential of AI. Until then, AI had seemed cool, but not much of a game changer. Seeing an AI agent autonomously deploy and manage applications on Kubernetes was a revelation.
I sent a few demos to internal mailing lists at Red Hat, and they were a great success. Seeing the potential, I decided to port the project to Go. Most of the Kubernetes ecosystem is built in Go, and Kubernetes developers are far more familiar with it than Java. This turned out to be the right call. Adoption started to increase, and so did interest from across Red Hat. People from other business units reached out to learn more and explore how they could use it.
I put a lot of effort into promoting the project and making it stable enough to be embedded in real products. The Model Context Protocol (MCP) provided the perfect bridge between AI agents and Kubernetes clusters, allowing AI systems to deploy, manage, and troubleshoot applications autonomously.
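To give a concrete sense of how that bridge is wired up: an MCP-capable agent is usually pointed at the server through a small configuration entry. The snippet below is a minimal sketch, assuming the `mcpServers` configuration convention used by several MCP clients and the npm distribution of the server; the project's README documents the exact invocation and options for each client, so treat this as illustrative rather than canonical.

```json
{
  "mcpServers": {
    "kubernetes": {
      "command": "npx",
      "args": ["-y", "kubernetes-mcp-server@latest"]
    }
  }
}
```

Once the client is configured, the agent discovers the server's tools and can act on whatever cluster the local kubeconfig points to, which is where the "deploy, manage, and troubleshoot" part comes from.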
The project's growth speaks for itself: over 1,050 stars and 232 forks on GitHub, reflecting both its adoption by the AI developer community and the growing number of contributors.
The peak came when the project joined the Containers organization on GitHub, alongside industry-standard tools like Podman and Buildah. This gave it a neutral space for collaboration and validated its importance. OpenShift teams are now actively contributing features on top of it.
This is one of the biggest successes of my career at Red Hat. Taking a proof of concept from my laptop to an organization-level project with cross-team contributions in less than a year is something I'm incredibly proud of.
Along the way, I documented my learnings on this blog. From giving superpowers to small language models to connecting MCP servers with various AI frameworks, these posts capture the evolution of my thinking about the future of developer tools and how they need to evolve for AI-augmented development.
Fabric8 Kubernetes Client
The Fabric8 Kubernetes Client saw seven releases this year: versions 7.1.0 through 7.4.0 on the new 7.x line, plus maintenance releases for the 6.x branch.
Community contributions remain strong, and the project continues to be a foundation for countless Kubernetes tools in the Java ecosystem. While my personal focus shifted toward MCP tooling, the project remains healthy and actively maintained.
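For readers who haven't used the client, this is the kind of code it enables. A minimal sketch, assuming a local kubeconfig and the `KubernetesClientBuilder` API available in recent releases:

```java
import io.fabric8.kubernetes.api.model.Pod;
import io.fabric8.kubernetes.client.KubernetesClient;
import io.fabric8.kubernetes.client.KubernetesClientBuilder;

public class ListPods {
  public static void main(String[] args) {
    // Build a client from the local kubeconfig (or in-cluster config).
    try (KubernetesClient client = new KubernetesClientBuilder().build()) {
      // List the pods in the default namespace and print their names.
      for (Pod pod : client.pods().inNamespace("default").list().getItems()) {
        System.out.println(pod.getMetadata().getName());
      }
    }
  }
}
```

The same fluent style extends to creating, watching, and patching any Kubernetes resource, which is why so many Java tools build on top of it.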
I'm happy that Ashish Thakur joined the team this year to help with the maintenance of both Fabric8 and JKube. His involvement ensures these projects continue to thrive.
Eclipse JKube
Eclipse JKube had three releases this year (1.18.0, 1.18.1, and 1.18.2). The project now sits at 846 stars and 551 forks on GitHub.
With the middleware team changes, the immediate future of JKube felt uncertain at times. But with Ashish on board and continued community growth, the project remains valuable for Java developers deploying to Kubernetes.
The AI Transformation
Beyond building MCP servers, 2025 was the year I transformed my own development practices.
After my summer break, I went all-in on AI tooling. I wrote about this extensively in Boosting My Developer Productivity with AI in 2025, but the short version is: my productivity more than doubled.
The key insight isn't about any single tool, but about parallelism. Using CLI agents like Claude Code and GitHub's Copilot Coding Agent, I can orchestrate multiple AI systems working on different tasks simultaneously. My role has shifted from implementer to orchestrator.
This shift had uncomfortable implications too. The nature of the job is changing. Coding is no longer the job; orchestrating AI agents is. As someone who genuinely loves the craft of coding, this realization was bittersweet.
I'm confident that software engineers will still be needed, even if our roles evolve toward orchestration and architecture. This shift doesn't diminish the importance of engineering fundamentals; if anything, it amplifies them. What worries me is that decision-makers who don't understand the nuances might believe that engineering teams can be fully replaced by AI agents. That's a dangerous misconception, and one I hope the industry navigates carefully.
Public Speaking
This year I spoke at DevBcn 2025 in July, delivering a talk on "Model Context Protocol Servers 101: Unlocking the Power of AI". It was a comprehensive introduction to MCP and its implications for the future of developer tooling.
To my surprise, I received the "Most Original Speaker" award. Once again, I'm humbled by this recognition. There were many great talks at the conference, and in my opinion, several deserved the award more than I did. However, for an introvert like me, this is huge. It's a testament to the effort I put into each of my conference talks, and validation that stepping outside my comfort zone has been worth it.
Unfortunately, the talk wasn't officially recorded. I did record a version from home later that week, which you can watch below. It's not as engaging as the live presentation, but if you're curious about MCP, it covers the essentials:
Looking Forward
As I write this in early 2026, the trajectory is clear: AI will continue reshaping how we build software. The question isn't whether to embrace these changes, but how to do so thoughtfully.
I remain committed to building free open source tools that empower developers, whether those developers are humans, AI agents, or some combination of both. The Kubernetes MCP Server, Fabric8 Kubernetes Client, and Eclipse JKube will continue to evolve.
Despite the organizational uncertainties, I'm grateful to be doing what I love at Red Hat. I want to thank my managers, who have reassured me about my value during these times. Their support has meant a lot when things felt uncertain.
Six years in, and the mission remains the same: make developers' lives easier through free and open source software.
If you've followed my journey or used any of the tools I've worked on, I'd love to hear your thoughts.
At the end of the day, what drives me hasn't changed: the joy of building something useful, sharing it with the world, and seeing others build on top of it. That's the magic of open source, and it's why I'll keep doing this for as long as I can.
