AI is transforming the world, and at the core of this revolution are Large Language Models (LLMs). While cloud-based AI services like ChatGPT and Claude dominate the landscape, running AI models locally opens up new possibilities for control, customization, and efficiency. That’s where Ollama comes in.
This course empowers you to deploy and manage LLMs efficiently on your own infrastructure. Whether you’re a developer, data scientist, or AI enthusiast, mastering Ollama will give you the flexibility to build AI-powered applications while maintaining full control over your models.
The course combines hands-on labs with real-world scenarios, ensuring you gain practical experience deploying and working with LLMs. You’ll experiment with model customization, build AI-powered applications, and develop the skills to harness the full potential of AI on your local infrastructure.
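As a small taste of what the course builds toward, here is a minimal sketch that sends a prompt to a locally running Ollama server over its REST API. It assumes Ollama is already installed and serving on its default port, and that a model (here llama3.2, purely as an example) has been pulled; adjust both to match your setup.

```python
# Minimal sketch: ask a locally running Ollama server to generate text.
# Assumes Ollama is serving on the default port (11434) and that the
# example model "llama3.2" has already been pulled locally.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

payload = {
    "model": "llama3.2",  # swap in whichever model you have pulled locally
    "prompt": "Explain in one sentence why someone might run an LLM locally.",
    "stream": False,      # request a single JSON reply instead of a token stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read().decode("utf-8"))
    print(result["response"])  # the generated text
```

Everything runs on your own machine, with no API keys and no data leaving your infrastructure; this is the kind of workflow the course's labs build on.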
Arsh is a passionate technologist with a deep curiosity for exploring and experimenting with emerging technologies. A strong believer in continuous learning, he actively shares his knowledge through blogs and video tutorials. As a dedicated member of open-source communities, he has built a reputation as a prominent contributor and advocate. Recognized as a CNCF Ambassador and a recipient of the Kubernetes Contributor Award, Arsh has made significant contributions to the cloud-native ecosystem. His open-source journey includes working on the Kubernetes team at VMware and contributing to key CNCF projects such as cert-manager and Kyverno.
Beyond engineering, Arsh has extensive experience in technical marketing. At Okteto, a fast-growing startup in the platform engineering space, he led technical content creation and advocacy, bridging the gap between engineering and community engagement.