3.2: Update Prompt Metadata
OPTIONAL:
If you get stuck, you can skip this step and copy over a pre-edited file.
To execute the Prompty asset, we need to specify the language model to use for generating the response. This metadata is defined in the frontmatter of the Prompty file. In this section, we'll update that metadata with the model configuration and other information.
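For orientation, here is a minimal sketch of what a Prompty file's frontmatter typically contains, based on the Prompty specification. The deployment name and sample values shown here are placeholders, not the exact contents of basic.prompty:

```
---
name: Basic Prompt
description: A basic prompt that answers questions using the chat API
authors:
  - Placeholder Author
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: <your-deployment>
sample:
  firstName: Jane
  question: What can you do for me?
---
```

The body of the file, below the closing `---`, holds the prompt template itself; the steps in this section only touch the frontmatter.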
1. Update model configuration
- Return to the Visual Studio Code terminal pane.
- If you are still seeing the error message from the previous step, then you are in the Output tab. Switch to the Terminal tab to get a command prompt.
- Now, use this command to copy the previous Prompty file to a new one:

```
cp basic.prompty chat-0.prompty
```
- Open `chat-0.prompty` and replace Line 11 with this one (fixing the placeholder value `<your-deployment>`):

```
azure_deployment: ${env:AZURE_OPENAI_CHAT_DEPLOYMENT}
```
Prompty will use the `AZURE_OPENAI_CHAT_DEPLOYMENT` variable from the `.env` file we created earlier to locate the model deployment on the Azure OpenAI endpoint we pre-deployed. For now, that variable specifies `gpt-35-turbo` as the model.
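As a sanity check, the model block of `chat-0.prompty` should now look something like the sketch below. The surrounding lines are assumptions based on a typical Prompty scaffold and your file may differ slightly; the only line you changed in this step is `azure_deployment`:

```
model:
  api: chat
  configuration:
    type: azure_openai
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}
    azure_deployment: ${env:AZURE_OPENAI_CHAT_DEPLOYMENT}
```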
2. Edit Basic information
Basic information about the prompt template is provided at the top of the file. Update these fields as follows (a sketch of the result appears after this list).
- name: Change it to:

```
Contoso Chat Prompt
```

- description: Change it to:

```
A retail assistant for Contoso Outdoors products retailer.
```

- authors: Replace the provided name with your own.
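Assuming the rest of the scaffold is left as-is, the top of `chat-0.prompty` should now read roughly like this sketch (with your own name as the author):

```
---
name: Contoso Chat Prompt
description: A retail assistant for Contoso Outdoors products retailer.
authors:
  - Your Name
# ... model configuration and sample section follow ...
```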
3. Edit the "sample" section
The sample section specifies the inputs to the Prompty file, and supplies default values to use if no inputs are provided. Edit that section as well.
- firstName: Choose any name other than your own (for example, Nitya).
- context: Remove this entire section. (We'll update this later.)
- question: Replace the provided text with:

```
What can you tell me about your tents?
```
Your sample section should now look like this:

```
sample:
  firstName: Nitya
  question: What can you tell me about your tents?
```
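To see why these sample values matter: the body of the Prompty file (everything below the frontmatter) is a prompt template that references the inputs with double-brace placeholders, and the sample values fill those placeholders when you run the file with no other input. A minimal sketch of such a body, assuming the default basic template (your actual wording will differ):

```
system:
You are an AI assistant who helps people find information.
You are helping {{firstName}} to find answers to their questions.

user:
{{question}}
```

This is why changing `firstName` or `question` in the sample section changes the rendered prompt, and therefore the response.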
4. Run updated Prompty file
- Run `chat-0.prompty`. (Use the Run button or press F5.)
- Check the OUTPUT pane. You will see a response something like this:
"[info] Hey Nitya! Thank you for asking about our tents. ..."
Generative AI models use randomness when creating responses, so your results aren't always the same.
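If you want less variation while iterating, the Prompty specification also allows model parameters (such as temperature) in the frontmatter. Here is a minimal sketch, with illustrative parameter values that are not part of the workshop files:

```
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: ${env:AZURE_OPENAI_CHAT_DEPLOYMENT}
  parameters:
    max_tokens: 256
    temperature: 0.2
```

Lower temperature values make responses more deterministic; higher values make them more varied.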
OPTIONAL: Try these exercises at home
Continue ideating on your own! Try each change below, then run the Prompty and observe how the output changes. Can you see how the Prompty template is rendered based on these changes?
- Change the `question` field to ask a different question about camping gear.
- Change the `firstName` field to a different name.
CONGRATULATIONS. You updated your Prompty model configuration!