
1.1 | What You'll Learn

This 60-75 minute workshop consists of a series of lab exercises that teach you how to build a production-grade RAG (Retrieval Augmented Generation) LLM application using Prompt flow and Azure AI Studio.

You'll gain hands-on experience with the steps involved in the end-to-end application development lifecycle, from prompt engineering to LLMOps.
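To ground the lab exercises, here is a minimal sketch of the retrieve-then-generate pattern that a RAG application implements. This is plain Python rather than the Prompt flow graph you'll build in the labs, and the environment variable names, API version, and the `content` index field are illustrative assumptions, not values from the workshop materials.

```python
# Minimal retrieve-then-generate sketch (illustrative only, not the workshop's Prompt flow graph).
# Assumes these environment variables point at your own resources:
#   AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_API_KEY, AZURE_OPENAI_DEPLOYMENT
#   AZURE_SEARCH_ENDPOINT, AZURE_SEARCH_KEY, AZURE_SEARCH_INDEX
# The "content" field name in the search index is also an assumption.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name=os.environ["AZURE_SEARCH_INDEX"],
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def answer(question: str) -> str:
    # 1. Retrieve: pull the top matching documents from Azure AI Search.
    hits = search_client.search(search_text=question, top=3)
    context = "\n\n".join(doc["content"] for doc in hits)

    # 2. Generate: ground the model's answer in the retrieved context.
    response = openai_client.chat.completions.create(
        model=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content
```

In the labs you'll express this same pattern as a Prompt flow graph so it can be run, evaluated, and deployed from Azure AI Studio.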


Learning Objectives

By the end of this lab, you should be able to:

  1. Explain LLMOps - concepts & how it differs from MLOps.
  2. Explain Prompt flow - concepts & tools for building LLM apps.
  3. Explain Azure AI Studio - features & functionality for streamlining end-to-end app development.
  4. Design, run & evaluate RAG apps - using the Prompt flow extension for VS Code.
  5. Deploy, test & use RAG apps - from the Azure AI Studio UI (no-code experience).

Prerequisites

We assume you have familiarity with the following:

  1. Machine Learning & Generative AI concepts
  2. Python & Jupyter Notebook programming
  3. Azure, GitHub & Visual Studio Code tooling

You will need the following to complete the lab (a quick configuration check is sketched after this list):

  1. Your own laptop (charged) with a modern browser.
  2. A GitHub account with GitHub Codespaces quota.
  3. An Azure subscription with Azure OpenAI access.
  4. An Azure AI Search resource with Semantic Ranker enabled.
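Before the first lab, you can verify that the configuration for the Azure resources above is available in your environment. The variable names below are illustrative assumptions; the workshop materials define the exact names they expect.

```python
# Illustrative pre-flight check: confirm the assumed configuration values are set.
# Replace the names below with whatever the workshop's environment template actually uses.
import os

REQUIRED_VARS = [
    "AZURE_OPENAI_ENDPOINT",   # Azure OpenAI resource endpoint
    "AZURE_OPENAI_API_KEY",    # Azure OpenAI key
    "AZURE_SEARCH_ENDPOINT",   # Azure AI Search endpoint
    "AZURE_SEARCH_KEY",        # Azure AI Search key
]

missing = [name for name in REQUIRED_VARS if not os.getenv(name)]
if missing:
    print("Missing configuration:", ", ".join(missing))
else:
    print("All required configuration values are set.")
```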

Dev Environment

You'll make use of the following resources in this workshop:

  1. Code Samples (GitHub Repositories)
  2. Developer Tools (local and cloud)
  3. Azure Resources (Provisioned in Subscription)

===