24/12/3 Deploying Private Open Source LLMs on Azure

Azure Cloud Talk

12/04/24 • 28 min

Join Feynman Liang, CTO of Blueteam AI, for a practical demonstration of running open source AI models privately in Azure. This session will walk through a production-grade reference implementation that deploys Ollama and Open WebUI on AKS using infrastructure-as-code patterns. You'll learn how to set up a secure, compliant AI infrastructure using familiar tools like OpenTofu and Kubernetes, and understand the key architectural decisions that make this implementation suitable for enterprise use. Whether you're evaluating open source LLMs or looking to deploy them in production, this talk will provide you with actionable patterns and hands-on examples you can start using today.

Key Topics:
- Infrastructure-as-code patterns for AI workloads on Azure
- Security and compliance considerations for private AI deployments
- Practical deployment steps using OpenTofu and AKS
- Live demo of the reference implementation
- Best practices for scaling and managing open source AI infrastructure
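To give a concrete flavor of the infrastructure-as-code pattern the session describes, here is a minimal OpenTofu (Terraform-compatible HCL) sketch of the AKS provisioning step. The resource names, region, node size, and provider version below are illustrative assumptions, not details taken from the reference implementation demoed in the talk.

# Minimal OpenTofu/Terraform sketch of the AKS layer (all names and values are illustrative).
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

# Resource group for the private LLM stack.
resource "azurerm_resource_group" "llm" {
  name     = "rg-private-llm" # illustrative name
  location = "eastus"         # illustrative region
}

# Private AKS cluster that would host Ollama and Open WebUI.
resource "azurerm_kubernetes_cluster" "llm" {
  name                    = "aks-private-llm"
  location                = azurerm_resource_group.llm.location
  resource_group_name     = azurerm_resource_group.llm.name
  dns_prefix              = "privatellm"
  private_cluster_enabled = true # API server is not exposed to the public internet

  default_node_pool {
    name       = "system"
    node_count = 1
    vm_size    = "Standard_D4s_v5" # illustrative size; GPU node pools would be added for larger models
  }

  identity {
    type = "SystemAssigned" # managed identity for the cluster
  }
}

Once tofu init and tofu apply bring the cluster up, Ollama and Open WebUI would then be deployed onto it with Kubernetes manifests or Helm charts, which is the part of the stack the episode's live demo covers.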
