Building AI-Powered GitHub Workflows: A Complete Guide to LLM Action

blog cover

In the AI era, integrating Large Language Models into CI/CD pipelines has become crucial for improving development efficiency. However, existing solutions are often tied to specific service providers, and LLM outputs are typically unstructured free-form text that is difficult to parse and use reliably in automated workflows. LLM Action was created to solve these pain points.

The core feature is support for Tool Schema structured output—you can predefine a JSON Schema to force LLM responses to conform to a specified format. This means AI no longer just returns a block of text, but produces predictable, parseable structured data. Each field is automatically converted into GitHub Actions output variables, allowing subsequent steps to use them directly without additional string parsing or regex processing. This completely solves the problem of unstable LLM output that is difficult to integrate into automated workflows.
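To make this concrete, here is a minimal workflow sketch. The action reference, input names (`prompt`, `schema`), and version tag below are illustrative assumptions, not the documented interface — consult the action's README for the real inputs.

```yaml
# Hypothetical workflow sketch: a schema with "score" and "summary"
# fields, each surfaced as a step output for later steps.
jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - id: llm
        uses: appleboy/llm-action@v1   # placeholder reference
        with:
          prompt: "Review this pull request diff."
          schema: |
            {
              "type": "object",
              "properties": {
                "score": { "type": "number" },
                "summary": { "type": "string" }
              },
              "required": ["score", "summary"]
            }
      - name: Use the structured output
        run: echo "Score was ${{ steps.llm.outputs.score }}"
```

The key point is the last step: `steps.llm.outputs.score` is consumed directly, with no string parsing in between.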

Additionally, LLM Action provides a unified interface to connect to any OpenAI-compatible service, whether it’s cloud-based OpenAI, Azure OpenAI, or locally deployed self-hosted solutions like Ollama, LocalAI, LM Studio, or vLLM—all can be seamlessly switched.
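"OpenAI-compatible" means every provider exposes the same `/chat/completions` request shape, so switching providers reduces to changing a base URL and model name. The sketch below illustrates that idea in Go with a hand-rolled request builder; the Ollama URL and model name are illustrative examples, not a prescribed configuration.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest is the minimal shape of an OpenAI-compatible
// /chat/completions request body.
type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// newChatRequest builds an HTTP request against any OpenAI-compatible
// base URL; switching providers is just a matter of changing baseURL
// and model, e.g. "http://localhost:11434/v1" for a local Ollama.
func newChatRequest(baseURL, apiKey, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(chatRequest{
		Model:    model,
		Messages: []message{{Role: "user", Content: prompt}},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+apiKey)
	return req, nil
}

func main() {
	req, _ := newChatRequest("http://localhost:11434/v1", "ollama", "llama3", "hello")
	fmt.Println(req.URL.String())
}
```

Because the request shape never changes, only the two string arguments differ between cloud and self-hosted deployments.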

Practical use cases include:

  • Automated Code Review: Define a Schema with fields such as score, issues, and suggestions, then use them directly to decide whether the review passes
  • PR Summary Generation: Emit title, summary, and breaking_changes as structured output for automatically updating the PR description
  • Issue Classification: Output category, priority, and labels to tag Issues automatically
  • Release Notes: Generate features, bugfixes, and breaking arrays to compose formatted release notes automatically
  • Multi-language Translation: Output multiple language fields in one batch, completing multi-language translation in a single API call
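For the code-review case, the schema might look like the following sketch. The field names come from the bullet above; the constraints (score range, array item types) are illustrative assumptions.

```json
{
  "type": "object",
  "properties": {
    "score": { "type": "integer", "minimum": 0, "maximum": 10 },
    "issues": { "type": "array", "items": { "type": "string" } },
    "suggestions": { "type": "array", "items": { "type": "string" } }
  },
  "required": ["score", "issues", "suggestions"]
}
```

Each top-level property then becomes its own step output, so a later step can gate the pipeline on the score alone.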

Through Schema definition, LLM Action transforms AI output from “unpredictable text” to “programmable data,” truly enabling end-to-end AI automated workflows.

[Read More]


From Natural Language to K8s Operations: The MCP Architecture and Practice of kubectl-ai

blog cover

kubectl-ai is an open-source project that seamlessly integrates Large Language Models (LLMs) with Kubernetes operations, letting users interact with K8s clusters in natural language. This article explores how it addresses the complexity of traditional kubectl commands and significantly lowers the barrier to entry for Kubernetes users.

[Read More]

Building a Unified API Gateway for Cross-Cloud AI Services: A Secure and Scalable Enterprise Solution

blog logo

In today’s enterprise environment, Generative AI has become key to improving business efficiency and driving innovation. However, as AI services grow more diverse and fragmented, managing and invoking them in a unified way has become a challenge. This article introduces how to build a unified API Gateway for cross-cloud AI services in Golang, delivering a secure and scalable enterprise-grade solution.

This was my second public talk at the 2025 iThome CloudSummit Taiwan. Below is the outline of the talk.

  1. Authentication and Authorization
  2. Multi-Cloud AI Backend Integration (Azure OpenAI, AWS Bedrock, Google Gemini AI, etc.)
  3. Traffic Control and Resource Management
  4. Monitoring and Metrics Aggregation
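The core idea behind items 2-4 can be sketched as a path-based reverse proxy: the gateway maps a provider segment in the URL to an upstream endpoint, and the auth, rate-limiting, and metrics layers wrap that proxy. This is a minimal sketch, not the talk's actual implementation; the upstream URLs are placeholders.

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strings"
)

// backends maps a provider name (taken from the request path) to its
// upstream endpoint. These URLs are placeholders, not real deployments.
var backends = map[string]string{
	"azure":   "https://example-azure-openai.local",
	"bedrock": "https://example-bedrock.local",
	"gemini":  "https://example-gemini.local",
}

// pickBackend resolves a path like /v1/azure/chat/completions to the
// upstream for that provider, so clients talk to one gateway
// regardless of which cloud serves the request.
func pickBackend(path string) (string, bool) {
	parts := strings.SplitN(strings.TrimPrefix(path, "/v1/"), "/", 2)
	target, ok := backends[parts[0]]
	return target, ok
}

func gatewayHandler(w http.ResponseWriter, r *http.Request) {
	// Authentication and authorization checks would go here (item 1).
	target, ok := pickBackend(r.URL.Path)
	if !ok {
		http.Error(w, "unknown provider", http.StatusNotFound)
		return
	}
	u, _ := url.Parse(target)
	// Traffic control and metrics (items 3-4) would wrap this proxy.
	httputil.NewSingleHostReverseProxy(u).ServeHTTP(w, r)
}

func main() {
	// Demonstrate routing without starting a server:
	target, _ := pickBackend("/v1/azure/chat/completions")
	fmt.Println("azure requests proxy to:", target)
	// In a real gateway: http.HandleFunc("/v1/", gatewayHandler)
	// followed by http.ListenAndServe(":8080", nil).
}
```

Keeping provider selection in one small function makes the cross-cutting concerns (auth, quotas, metrics) composable middleware around a single handler.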
[Read More]


Step-by-Step Guide to Building MCP Server and Client with Golang (Model Context Protocol)

blog logo

In 2025, I delivered a workshop at the iThome Taiwan Cloud Summit in Taipei, titled “Step-by-Step Guide to Building MCP Server and Client with Golang (Model Context Protocol)”. The goal of this workshop was to help developers understand how to implement the MCP protocol using Golang, providing practical code examples and hands-on guidance. I have organized all workshop materials into a GitHub repository, which you can find at go-training/mcp-workshop. For detailed workshop content, please refer to this link.

[Read More]


Integrating Gitea with Jira Software Development Workflow

blog logo

Before diving in, let’s familiarize ourselves with Gitea and Jira. For better context, I recommend reading “Git Software Development Guide: Key to Improving Team Collaboration” first.

Gitea is a lightweight self-hosted Git server written in Go, providing teams with an easily deployable code management solution. It supports multiple operating systems including Linux, Windows, and macOS, while offering comprehensive features for code review, issue tracking, and Wiki management—all essential tools for enhancing team collaboration.

Jira is Atlassian’s professional project management and issue tracking system. Widely adopted by software development teams worldwide, Jira excels in issue tracking, supports agile methodologies (including Scrum and Kanban), and provides robust data analytics capabilities to optimize project management and team collaboration.
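One common pattern for this kind of integration (an assumption here, not necessarily the article's approach) is to reference Jira issue keys such as PROJ-123 in Gitea commit messages, then have a webhook handler link commits to issues. Extracting the keys is a one-liner:

```go
package main

import (
	"fmt"
	"regexp"
)

// jiraKey matches Jira issue keys such as "PROJ-123": an uppercase
// project key followed by a dash and an issue number.
var jiraKey = regexp.MustCompile(`\b[A-Z][A-Z0-9]+-\d+\b`)

// extractIssueKeys pulls every Jira issue key out of a commit message,
// so a webhook handler could link the commit to those issues.
func extractIssueKeys(message string) []string {
	return jiraKey.FindAllString(message, -1)
}

func main() {
	keys := extractIssueKeys("PROJ-42: fix login redirect (relates to OPS-7)")
	fmt.Println(keys) // [PROJ-42 OPS-7]
}
```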

[Read More]
