DevConf.CZ 2025

Get Up and Running with Llamastack to Create AI Applications!
2025-06-13, D105 (capacity 300)

Llamastack is a framework that standardizes the core building blocks needed to build AI-powered applications. In this hands-on workshop, we’ll show you how to run Llamastack locally as a container, with either an Ollama or a vLLM backend. Once it’s running, we’ll also show how to use the Llamastack server to build AI applications.
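The local setup described above can be sketched with Podman. This is a hedged example: the image name `docker.io/llamastack/distribution-ollama`, the default port 8321, and the model tag are assumptions — check the current Llamastack documentation before running it.

```shell
# Sketch: run Llamastack as a container against a local Ollama backend.
# Prerequisite (assumption): Ollama is running on the host with the model
# already pulled, e.g. `ollama run llama3.2:3b`.
INFERENCE_MODEL="llama3.2:3b"   # model tag Ollama serves (assumption)
LLAMA_STACK_PORT=8321           # Llamastack's default HTTP port (assumption)

if command -v podman >/dev/null 2>&1; then
  # Start the Llamastack server container, pointing it at Ollama on the host.
  podman run -d --name llamastack \
    -p "${LLAMA_STACK_PORT}:${LLAMA_STACK_PORT}" \
    -v ~/.llama:/root/.llama \
    docker.io/llamastack/distribution-ollama \
    --port "${LLAMA_STACK_PORT}" \
    --env INFERENCE_MODEL="${INFERENCE_MODEL}" \
    --env OLLAMA_URL="http://host.containers.internal:11434"
fi
```

Once the container is up, the server answers on `http://localhost:8321`.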

What you’ll learn:
* Containerizing Llamastack with a remote vLLM backend or Ollama.
* Deploying vLLM in a container for efficient AI inference and tool calling.
* Upgrading containers to Quadlet & systemd for better automation and management.
* Building an AI application using Llamastack, with real-world deployment strategies.
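The Quadlet step above can be sketched as a container unit file that systemd turns into a service — for example `~/.config/containers/systemd/llamastack.container`. The image name, port, and environment values are assumptions carried over from the Ollama setup; adjust them to your deployment.

```ini
# llamastack.container -- Quadlet unit; systemd generates a service from it.
# Image name, port, and model tag are assumptions; adjust to your setup.
[Unit]
Description=Llamastack server (Ollama backend)
Wants=network-online.target
After=network-online.target

[Container]
Image=docker.io/llamastack/distribution-ollama
PublishPort=8321:8321
Volume=%h/.llama:/root/.llama
Environment=INFERENCE_MODEL=llama3.2:3b
Environment=OLLAMA_URL=http://host.containers.internal:11434

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After a `systemctl --user daemon-reload`, the service can be started with `systemctl --user start llamastack.service` and survives reboots like any other systemd unit.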

Participants will get guided, practical experience running AI workloads in containers locally. Whether you're an AI developer, MLOps engineer, or container enthusiast, this workshop will equip you with the skills to efficiently deploy AI models in a modern, containerized environment. Bring your laptop and be ready to get hands-on!
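As a taste of the application-building portion, here is a minimal sketch that talks to the Llamastack server over HTTP using only the Python standard library. The endpoint path `/v1/inference/chat-completion`, the port 8321, and the response shape are assumptions (the official `llama-stack-client` package is the more typical choice); the request is only sent if a server is actually listening.

```python
import json
import urllib.error
import urllib.request

BASE_URL = "http://localhost:8321"  # default Llamastack port (assumption)


def build_chat_request(model_id: str, prompt: str) -> dict:
    """Build the JSON body for a chat-completion call (shape is an assumption)."""
    return {
        "model_id": model_id,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(model_id: str, prompt: str) -> str:
    """POST the request to the server; endpoint path and response keys are assumptions."""
    body = json.dumps(build_chat_request(model_id, prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/v1/inference/chat-completion",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["completion_message"]["content"]


if __name__ == "__main__":
    try:
        print(chat("llama3.2:3b", "Say hello in one sentence."))
    except (urllib.error.URLError, OSError) as exc:
        print(f"No Llamastack server reachable: {exc}")
```

Swapping the backend from Ollama to vLLM changes nothing here: the application only sees the Llamastack server's API.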


What level of experience should the audience have to best understand your session?

Intermediate - attendees should be familiar with the subject

Urvashi is a Principal Software Engineer on the OpenShift MCO Team at Red Hat. She has spent the last few years developing Open Source container tools including Podman, Buildah, CRI-O, Skopeo, Kubernetes, and OpenShift. She is passionate about sharing her work and has given talks at various conferences including KubeCon, DevConf, and SCaLE. Urvashi is also a co-organizer of DevConf.US and an instructor at Boston University.

Sally Ann O'Malley is a Principal Software Engineer at Red Hat. She has worked on various teams within OpenShift. Currently, she is with the Emerging Technologies group. She enjoys combining open-source tools in creative ways to solve complex problems. Occasionally, she creates something new!