2025-02-28, Workshops | School of Design, Floor 7/8 (capacity 50)
The rise of large language models (LLMs) has opened up exciting possibilities for developers looking to build intelligent applications. However, the process of adapting these models to specific use cases can be difficult, requiring deep expertise and substantial resources. In this talk, we'll introduce you to InstructLab, an open-source project that aims to make LLM tuning accessible to developers and engineers of all skill levels, on consumer-grade hardware.
We'll explore how InstructLab's innovative approach combines collaborative knowledge curation, efficient data generation, and instruction training to enable developers to refine foundation models for specific use cases. In this workshop, you'll be provided with a RHEL VM and learn how to enhance an LLM with new knowledge and capabilities for targeted applications, without needing data science expertise. Join us to explore how LLM tuning can become more accessible and democratized, empowering developers to build on the power of AI in their projects.
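As a preview of the tuning loop we will walk through, here is a minimal sketch that drives the InstructLab CLI from Python. Treat it as illustrative only: the exact `ilab` subcommand names and flags vary between InstructLab releases, and in the workshop we will run the equivalent commands directly in the terminal on the provided RHEL VM.

```python
# Illustrative sketch of the InstructLab tuning loop, driven from Python.
# NOTE: the exact `ilab` subcommands and flags differ between releases;
# verify them with `ilab --help` on your own install.
import subprocess

def run(cmd):
    """Run one CLI step and stop on the first failure."""
    print(f"==> {' '.join(cmd)}")
    subprocess.run(cmd, check=True)

# 1. Initialize a local InstructLab config and taxonomy checkout.
run(["ilab", "config", "init"])

# 2. Edit a qna.yaml file in the taxonomy to add seed question-answer
#    examples for the new knowledge or skill (done by hand, not here).

# 3. Generate synthetic training data from the seed examples.
run(["ilab", "data", "generate"])

# 4. Fine-tune the local model on the generated data.
run(["ilab", "model", "train"])

# 5. Chat with the tuned model to check the new behaviour.
run(["ilab", "model", "chat"])
```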
Pre-requisites for the workshop:
- Bring a laptop running Fedora, macOS, or any Linux variant. If you are running Windows, you can run a Linux virtual machine inside a hypervisor (please have a hypervisor of your choice installed, with a Linux system set up as a virtual machine).
- Download the 3 models locally from the Google Drive folder below:
  https://drive.google.com/drive/folders/1NZOk46RU4l4XrYkXI3KkXXk07PHaYtPm?usp=drive_link
  There will be no Wi-Fi at the venue, so please download these models beforehand (a quick pre-flight check sketch follows this list).
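Since there is no Wi-Fi at the venue, it is worth confirming the downloads before the day. Below is a minimal pre-flight check; the directory and file names are placeholders (the real ones depend on what the Google Drive folder above contains), so adjust them to match your downloads.

```python
# Quick pre-flight check that the three workshop models are on disk.
# The directory and file names below are placeholders; replace them with
# the actual files you downloaded from the Google Drive folder above.
from pathlib import Path

MODEL_DIR = Path.home() / "instructlab-models"               # placeholder location
EXPECTED = ["model-1.gguf", "model-2.gguf", "model-3.gguf"]  # placeholder names

missing = [name for name in EXPECTED if not (MODEL_DIR / name).exists()]
if missing:
    print("Missing model files:", ", ".join(missing))
else:
    print("All three models found in", MODEL_DIR)
```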
Beginner - no experience needed
Shardul is an AI Engineer at Finfactor, developing AI-driven solutions for fintech. Previously, he worked at Red Hat, driving innovation within the internal ecosystem. His interests include LLMs, Java, DevOps, and Flutter. A passionate educator, he has taught over 500 students across various colleges. Beyond technology, Shardul is an advocate of mindfulness and meditation, emphasizing balance and clarity in both work and life.
Dasharath Masirkar
Senior Principal Technical Support Engineer at Red Hat 🇮🇳 with 16 years of experience in cloud-native development. Happy to discuss AI, RHEL AI, LLMs, Python, InstructLab, AI applications, Kubernetes, and OpenShift.
As a DevOps Engineer with over 2 years of experience, I am passionate about streamlining workflows and automating infrastructure. My expertise lies in Linux, networking, Infrastructure as Code using Ansible, containerization using OpenShift, and cloud infrastructure such as AWS and Oracle Cloud. I'm always seeking ways to leverage cutting-edge technologies to improve efficiency and scalability.
I am an AI evangelist working at Red Hat. I am genuinely optimistic about what AI has to offer the world and love discussing real-life applications of AI. Red Hat offers RHEL AI, which allows the community and customers to make the best use of LLMs in their enterprises.