
Building an Open-Source AI Gateway with SUSE Rancher and digiRunner

樊博文 Anthony Fan 2025/05/08 11:48:45


OpenTPI | December 27, 2024

 

In today’s fast-paced world of artificial intelligence and containerized application development, creating a robust and scalable platform for deploying GenAI services is critical. This blog explores how to leverage SUSE Rancher Kubernetes and digiRunner Open-source to build a secure, efficient, and scalable AI Services Gateway. This platform supports the development and deployment of GenAI applications, such as those powered by Ollama’s GenAI services, while integrating seamlessly into the DevOps process.

 

The Vision: Unified GenAI Application Development and Deployment

 

As AI technology continues to evolve, developers and organizations face challenges such as:

 

  • Managing containerized environments for GenAI services.

 

  • Ensuring secure communication and API traffic management.

 

  • Scaling deployments across multi-cluster Kubernetes environments.

 

  • Streamlining updates and configurations through a GitOps workflow.

 

This blog illustrates how SUSE Rancher Kubernetes, digiRunner Open-source, and DevOps best practices come together to address these challenges. By integrating these tools, you can create an API Gateway tailored for GenAI services, enabling a seamless pipeline for development, testing, and production deployment.

 

Core Technologies in Focus

 

1. SUSE Rancher Kubernetes with K3s

 

Rancher Desktop, powered by K3s, simplifies the management of local Kubernetes clusters. K3s is a lightweight, fully conformant Kubernetes distribution, making Rancher Desktop well suited to local development and testing environments that mirror production.

 

  • Use Case: Setting up a development environment for containerized GenAI apps.

 

  • Key Features:
    • Integrated Kubernetes and container management.
    • Simulates production environments for robust testing.
    • Supports extensions for resource monitoring and management.

 

2. digiRunner Open-source API Gateway

 

digiRunner Open-source, an API Gateway from TPI Software, ensures secure and efficient communication between GenAI applications and their APIs.

 

  • Use Case: Acting as a mediator for GenAI services like Ollama, providing streamlined and secure API traffic management.

 

  • Key Features:
    • API discovery, registration, and management.
    • Traffic security and authorization.
    • Simplified testing and debugging interfaces.

 

3. Rancher Fleet for GitOps-Driven Deployments

 

Rancher Fleet enables continuous delivery of applications across Kubernetes clusters. By adopting a GitOps approach, this tool automates large-scale deployments while ensuring consistency.

 

  • Use Case: Scaling GenAI applications across multi-cluster environments.

 

  • Key Features:
    • Multi-cluster orchestration.
    • Configuration versioning through Git.
    • Automatic updates based on repository changes.
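
Fleet reads its deployment instructions from a `fleet.yaml` file committed next to the application's chart or manifests. A minimal sketch is shown below; the chart path, namespace, replica counts, and cluster labels are illustrative assumptions, not values from this guide:

```yaml
# fleet.yaml — lives in the Git repository alongside the chart it describes.
defaultNamespace: genai
helm:
  chart: ./charts/openwebui   # hypothetical local chart path
  values:
    replicaCount: 2
targetCustomizations:
  # Per-cluster-group overrides, matched by cluster labels.
  - name: staging
    clusterSelector:
      matchLabels:
        env: staging
    helm:
      values:
        replicaCount: 1
```

Because the file is versioned in Git, every change to it is an auditable, revertible deployment event, which is the core of the GitOps model Fleet implements.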

 

4. Ollama with OpenWebUI

 

Ollama, integrated with OpenWebUI, delivers GenAI capabilities through APIs and an intuitive interface.

 

  • Use Case: Developing and deploying intelligent applications powered by the Mistral LLM and other models.

  • Key Features:
    • Local deployment of LLMs with low resource requirements.
    • API integration for seamless app development.

 

Key Steps to Build Your AI Services Gateway

 

1. Setting Up the Development Environment

 

  • Install and configure Rancher Desktop with K3s.
  • Configure Kubernetes, container engines, and resource monitoring.
  • Deploy OpenWebUI with Ollama on a local K3s cluster.
  • Verify the deployment and test LLM functionality via APIs.
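
At the manifest level, step 1 can be sketched as follows. The namespace, image tag, and resource choices are illustrative assumptions to adapt; 11434 is Ollama's default API port:

```yaml
# A minimal sketch of running Ollama on the local K3s cluster.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama
  namespace: genai
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ollama
  template:
    metadata:
      labels:
        app: ollama
    spec:
      containers:
        - name: ollama
          image: ollama/ollama:latest
          ports:
            - containerPort: 11434   # Ollama's default API port
---
# ClusterIP Service so OpenWebUI (and later the gateway) can reach Ollama.
apiVersion: v1
kind: Service
metadata:
  name: ollama
  namespace: genai
spec:
  selector:
    app: ollama
  ports:
    - port: 11434
      targetPort: 11434
```

After applying the manifests with `kubectl apply -f`, the LLM can be smoke-tested by port-forwarding the Service and sending a request to Ollama's `/api/generate` endpoint.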

 

2. Securing GenAI Apps with digiRunner Open-source

 

  • Deploy the digiRunner Open-source API Gateway within the local K3s cluster.
  • Register and secure APIs for Ollama’s GenAI services.
  • Perform API testing via digiRunner’s management dashboard.
  • Enable secure traffic flow for GenAI applications.
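
One way to force all external API traffic through the gateway is to expose only digiRunner via an Ingress, keeping the Ollama Service internal. This is a hypothetical sketch: the Service name, port, and hostname depend entirely on how digiRunner was deployed in your cluster:

```yaml
# Hypothetical Ingress routing external traffic through the gateway.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: ai-gateway
  namespace: genai
  annotations:
    # K3s ships with Traefik as its default ingress controller.
    traefik.ingress.kubernetes.io/router.entrypoints: web
spec:
  rules:
    - host: gateway.local        # placeholder hostname
      http:
        paths:
          - path: /
            pathType: Prefix
            backend:
              service:
                name: digirunner # assumed Service name for the gateway
                port:
                  number: 8080   # assumed gateway port
```

With this layout, clients never reach Ollama directly; every request passes through digiRunner's registration, authorization, and traffic-management layer first.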

 

3. Multi-Cluster Deployment with Rancher Fleet

 

  • Set up and manage cluster groups using Rancher Fleet.
  • Configure a Git repository for continuous delivery of updates.
  • Deploy applications to multiple clusters using GitOps workflows.
  • Test and monitor the scalability of the GenAI applications.
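
Wiring the Git repository into Fleet is done with a `GitRepo` custom resource on the Rancher management cluster. A sketch, with a placeholder repository URL, path, and cluster labels:

```yaml
# GitRepo resource telling Fleet which repository and paths to deploy.
apiVersion: fleet.cattle.io/v1alpha1
kind: GitRepo
metadata:
  name: genai-apps
  namespace: fleet-default
spec:
  repo: https://github.com/example/genai-fleet   # placeholder URL
  branch: main
  paths:
    - deployments/openwebui
  targets:
    # Deploy only to clusters labeled env=dev; labels are assumptions.
    - name: dev-clusters
      clusterSelector:
        matchLabels:
          env: dev
```

Once this resource exists, Fleet polls the repository and reconciles every targeted cluster against it, so pushing a commit is all it takes to roll out an update across the fleet.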

 

What You’ll Achieve

 

By following this guide, you will:

 

  1. Gain expertise in setting up a local Kubernetes environment using Rancher Desktop.
  2. Learn to deploy and test GenAI applications, such as those powered by Ollama and OpenWebUI.
  3. Understand how to secure and manage APIs with digiRunner Open-source, creating a robust AI Services Gateway.
  4. Explore multi-cluster deployment strategies using Rancher Fleet, enabling scalability and reliability.
  5. Master the GitOps workflow for seamless updates and deployments.

 

System Requirements

 

To replicate the setup described here, you’ll need:

 

  • A laptop with:
    • Quad-core processor.
    • 16GB RAM.
    • 50GB free SSD space.

 

  • Supported operating systems:
    • Windows 10.
    • macOS (Apple Silicon M1 or later).
    • Linux (e.g., OpenSUSE Leap).

 

  • Stable and fast internet connectivity.

 

Conclusion: Embrace the Power of Open-Source for GenAI

 

This blog demonstrates the synergy of open-source tools in creating a scalable, secure, and developer-friendly platform for GenAI applications. By integrating SUSE Rancher Kubernetes, digiRunner Open-source, and Ollama, you can unlock new possibilities for building and managing intelligent services that align with modern DevOps practices.

 

Simplify development, enhance collaboration, and ensure robust deployments of your GenAI applications. The future of AI-powered innovation starts here!

 

Reference

 

For the complete setup and additional details, visit the GitHub project: Rancher Desktop K3s GenAI.
