3.2: Update Prompt Metadata

CUSTOMIZE YOUR PROMPTY ASSET FRONTMATTER

The Prompty asset uses a YAML format with frontmatter in the top half that specifies configuration metadata, and the template in the bottom half for specifying the prompt context, instructions and system persona. In this step we'll:

  1. Fix the deployment configuration error in the frontmatter
  2. Customize the frontmatter metadata to reflect our application
  3. Run the customized prompty asset to validate our changes
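
To make the two halves concrete, here is a minimal illustrative sketch of a Prompty asset (the names and values below are placeholders for illustration, not the workshop's actual file): the YAML frontmatter sits above the `---` divider, and the prompt template with its system persona and `{{placeholder}}` inputs sits below it.

```yaml
---
name: Example Prompt
description: Illustrates the two halves of a Prompty asset.
model:
  api: chat
  configuration:
    type: azure_openai
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
sample:
  question: What is Prompty?
---
system:
You are a helpful assistant.

user:
{{question}}
```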

1. Update model configuration

  1. Let's start by fixing the model deployment issue.
  2. First, copy basic.prompty to chat-0.prompty. This helps us mimic the iterative nature of prompt engineering with each version of the asset getting us closer to the final prototype.
    cp basic.prompty chat-0.prompty
    
  3. Open the chat-0.prompty file in VS Code. The model configuration section looks like this:

    chat-0.prompty
    model:
        api: chat
        configuration:
            type: azure_openai
            azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
            azure_deployment: <your-deployment>
            api_version: 2024-07-01-preview
        parameters:
            max_tokens: 3000
    
  4. Update it so that the deployment name and API version are also read from environment variables, as shown. These correspond to the values in the .env file that we refreshed earlier.

    chat-0.prompty
    model:
        api: chat
        configuration:
            type: azure_openai
            azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
            azure_deployment: ${env:AZURE_OPENAI_CHAT_DEPLOYMENT}
            api_version: ${env:AZURE_OPENAI_API_VERSION}
        parameters:
            max_tokens: 3000
    

    Prompty uses YAML syntax, which is sensitive to indentation (tabs vs. spaces). Make sure the various properties are aligned correctly.
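
The `${env:NAME}` placeholders are resolved from environment variables when the asset runs. As a rough sketch of how that substitution works (illustrative only; the real Prompty runtime handles this internally):

```python
import os
import re

def resolve_env_placeholders(text: str) -> str:
    """Replace ${env:NAME} placeholders with environment variable values."""
    def lookup(match: re.Match) -> str:
        name = match.group(1)
        value = os.environ.get(name)
        if value is None:
            raise KeyError(f"Environment variable {name} is not set")
        return value
    return re.sub(r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}", lookup, text)

# Example: resolve the deployment name from the environment.
# (The deployment name here is a made-up placeholder value.)
os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"] = "gpt-35-turbo"
line = "azure_deployment: ${env:AZURE_OPENAI_CHAT_DEPLOYMENT}"
print(resolve_env_placeholders(line))  # azure_deployment: gpt-35-turbo
```

This is also why a missing or misspelled variable in .env surfaces as a runtime error rather than a YAML parsing error.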


2. Edit Basic information

Basic information about the prompt template is provided at the top of the file. Update the name, description, and authors to reflect your application, as shown.

TIP: Copy this and replace the frontmatter in your asset - then update the author's name to your own.

---
name: Contoso Chat Prompt
description: A retail assistant for Contoso Outdoors products retailer.
authors:
  - Nitya Narasimhan

3. Edit the "sample" section

The sample section specifies the inputs to the Prompty asset, and supplies default values to use if no inputs are provided. Replace it with the following:

sample:
  firstName: Nitya
  question: What can you tell me about your tents?

This removes the context input that the default sample was using to ground the response data. Later, we will replace it with our own grounding data.
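
Conceptually, the sample values act as fallbacks: any input you supply at run time overrides the matching default, and missing inputs fall back to the sample. A minimal sketch of that behavior (the actual merge is performed by the Prompty runtime, not by your code):

```python
def merge_inputs(sample: dict, provided: dict) -> dict:
    """Provided inputs override sample defaults; missing keys fall back."""
    return {**sample, **provided}

sample = {
    "firstName": "Nitya",
    "question": "What can you tell me about your tents?",
}

# No inputs provided: the sample defaults are used as-is.
print(merge_inputs(sample, {}))

# A caller-supplied question overrides the default; firstName falls back.
print(merge_inputs(sample, {"question": "Do you sell sleeping bags?"}))
```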


4. Run updated Prompty file

Generative AI models use randomness when creating responses, so your results may not always be the same.

  1. Run chat-0.prompty. (Use the Run button or press F5.)

  2. Check the OUTPUT pane. You should see a response similar to this:

    • "[info] Hey Nitya! Thank you for asking about our tents. ..."
  3. Later, try other changes to build intuition for prompt engineering.

    • Try changing the question
    • Try changing the firstName value

CONGRATULATIONS. You updated your Prompty model configuration!