AI is no longer a future investment—it’s a practical toolset you can start using today. In this session, we’ll explore how to enhance your modern .NET applications using the .NET Aspire stack alongside AI technologies, both locally and in the cloud.
We’ll show you how to bring LLMs into your apps using Microsoft.Extensions.AI, whether you’re calling models hosted in Azure OpenAI or running them locally with Ollama. From the simplicity of adding a chat interface to orchestrating complex interactions with multi-step reasoning agents, you’ll see firsthand how Aspire and AI unlock new possibilities.
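As a taste of what that looks like in code, here is a minimal sketch built on the IChatClient abstraction from Microsoft.Extensions.AI. It assumes the preview Microsoft.Extensions.AI.Ollama package, Ollama's default local endpoint, and an illustrative "llama3.2" model; method names have shifted between preview releases, so treat it as a sketch rather than the session's demo code.

```csharp
using Microsoft.Extensions.AI;

// Sketch only: IChatClient is the shared abstraction, so the same calling
// code works whether the model runs locally or in the cloud.
// Assumes the preview Microsoft.Extensions.AI.Ollama package and an Ollama
// instance at the default endpoint serving "llama3.2" (both illustrative).
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.2");

ChatResponse response = await client.GetResponseAsync(
    "In one sentence, what does .NET Aspire add to a distributed .NET app?");

Console.WriteLine(response.Text);

// Switching to Azure OpenAI means constructing a different IChatClient
// (via the Microsoft.Extensions.AI.OpenAI adapter); the call site stays the same.
```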
We’ll cover:

• Building AI-powered apps with Aspire’s opinionated orchestration (see the AppHost sketch after this list)
• Connecting to Azure OpenAI, Ollama-hosted LLMs, and GitHub Models
• Integrating Semantic Kernel to enable memory, embeddings, and planner-based agents
• Designing and deploying lightweight agents for real-time workflows
• Best practices for performance, reliability, and deployment (local and cloud)
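To make the first two bullets concrete, here is a rough sketch of the Aspire AppHost side. It assumes the community CommunityToolkit.Aspire.Hosting.Ollama integration, an illustrative "llama3.2" model, and a hypothetical Projects.ChatWeb front-end project; the resource names and overloads are placeholders, not the session's actual demo.

```csharp
// AppHost/Program.cs: Aspire orchestrates the model host and the app that uses it.
var builder = DistributedApplication.CreateBuilder(args);

// Run Ollama as a local resource and pull an illustrative chat model.
// (Assumes the CommunityToolkit.Aspire.Hosting.Ollama package.)
var chatModel = builder.AddOllama("ollama")
                       .AddModel("chat", "llama3.2");

// Hypothetical front-end project; WithReference flows the model's connection
// details to it through Aspire's configuration and service discovery.
builder.AddProject<Projects.ChatWeb>("chatweb")
       .WithReference(chatModel);

builder.Build().Run();
```

In the service project, that referenced connection is what a client registration (for example, the IChatClient shown earlier) would bind to.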
Whether you’re exploring AI for the first time or looking to add intelligent features to an existing .NET solution, this talk gives you the tools and examples you need to get started—with no rewrite required.