
Serverless Example

Prompty can be configured to use serverless models by setting the configuration `type` to `serverless` and pointing it at the model's inference endpoint, as in the example below.

Refer to the serverless.prompty file for more examples.

```yaml
---
name: Serverless Example
description: A prompt that uses a serverless model
authors:
  - example_author
model:
  api: chat
  configuration:
    type: serverless
    endpoint: https://models.inference.ai.azure.com
    model: Mistral-small
    key: ${env:SERVERLESS_KEY:KEY}
sample:
  question: What is the weather like today?
---
system:
You are a helpful assistant.
user:
{{question}}
```
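
The snippet below is a minimal sketch of running this example with the Prompty Python package. It assumes the file above is saved as `serverless.prompty`, that the package is installed with the serverless extra (`pip install "prompty[serverless]"`), and that `SERVERLESS_KEY` holds a valid key for the endpoint; the placeholder key value is illustrative only.

```python
import os

import prompty
import prompty.serverless  # registers the serverless invoker

# The key referenced by ${env:SERVERLESS_KEY} must be present in the environment.
# Replace the placeholder with a real endpoint key before running.
os.environ.setdefault("SERVERLESS_KEY", "<your-endpoint-key>")

# Execute the prompt; values passed via `inputs` override the `sample` defaults.
response = prompty.execute(
    "serverless.prompty",
    inputs={"question": "What is the weather like today?"},
)
print(response)
```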