Adarsh Dubey
I’m a Software Engineer at Red Hat and a current M.Tech AI student at IIT Jodhpur. I’ve been in tech for about four years and love working on cloud-native systems, applied AI, and the open-source ecosystem. I enjoy exploring how modern AI tools can be integrated into real production environments and sharing what I learn with the community.
Session
AI systems often look magical from the outside: they can write code, generate images, and understand language. But behind the illusion, AI is powered by surprisingly simple foundations: basic linear algebra, optimization, high-dimensional geometry, and the physics-driven constraints of modern hardware. In this talk, we break down how AI models actually operate at a fundamental level: how vectors and matrices describe meaning, how gradient descent lets neural networks learn, why high-dimensional geometry makes embeddings work, and how GPUs and tensor cores accelerate the math. By the end, participants will understand that AI systems are powerful but not mystical, and will see the clear connections between math, physics, and computing that enable modern models like transformers and LLMs.
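To give a flavor of one of the foundations the session covers, here is a minimal sketch of gradient descent on a toy one-dimensional loss. The quadratic function and step size below are illustrative assumptions, not material from the talk itself.

```python
# Gradient descent: repeatedly nudge a parameter in the direction
# opposite the gradient of a loss, shrinking the loss each step.

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Minimize a function given its gradient via fixed-step updates."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move against the slope
    return x

# Toy loss f(x) = (x - 3)^2 with gradient 2*(x - 3); minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # → 3.0
```

The same update rule, applied to millions of parameters at once, is what "learning" means for a neural network.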