Foundations

Prompts are the vehicle for delivering context, instructions, and embedded resources to a large language model. At their core, prompts are a collection of system instructions that govern the behavior of the LLM and the messages that are exchanged between the user and the LLM. Within re-factor, these basic prompt components make up the CompletionPrompt type. In the case of agents, prompts also include tools that can be used by the LLM to complete the request outlined in the system and user messages. These prompts make up the AgentPrompt type.

Types

CompletionPrompt

A completion prompt is one that contains only system, user, and assistant messages. It does not contain tools.
system
string
A system message that provides context and instructions for the LLM. This is often used to set the overall tone and direction of the conversation.
messages
array<UserMessage | AssistantMessage>
required
An array of messages that defines the conversation between the user and the LLM. Each item is a UserMessage or an AssistantMessage, with a role, content, and optional metadata. This field is required and must contain at least one user message and one assistant message.
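Put together, a completion prompt built from these fields might look like the following sketch. The type declarations are inferred from the field descriptions in this reference rather than imported from re-factor, so treat the exact names and shapes as assumptions.

```typescript
// Shapes inferred from the field tables above; re-factor's actual
// exported types may differ.
type UserMessage = {
  role: "user";
  // String content is the common case; an object form is also allowed.
  content: string | object;
};

type AssistantMessage = {
  role: "assistant";
  // When true, the LLM generates this message and content is omitted.
  generate: boolean;
  content?: string;
};

interface CompletionPrompt {
  system?: string; // sets the overall tone and direction
  messages: Array<UserMessage | AssistantMessage>; // required
}

// A minimal prompt: one user message plus one assistant message
// to be generated by the model.
const prompt: CompletionPrompt = {
  system: "You are a concise technical writing assistant.",
  messages: [
    { role: "user", content: "Explain what a system message does." },
    { role: "assistant", generate: true },
  ],
};
```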

AgentPrompt

Like a completion prompt, an agent prompt may contain system, user, and assistant messages. It differs in that it also includes a tools parameter: an array of tools that the LLM can use to complete the request outlined in the system and user messages.
system
string
A system message that provides context and instructions for the LLM. This is often used to set the overall tone and direction of the conversation.
messages
array<UserMessage | AssistantMessage>
required
An array of messages that defines the conversation between the user and the LLM. Each item is a UserMessage or an AssistantMessage, with a role, content, and optional metadata. This field is required and must contain at least one user message and one assistant message.
tools
array<Tool>
required
An array of tools that the LLM can use to complete the request outlined in the system and user messages. Each item in the array must be a Tool object. This field is required and must contain at least one tool.
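An agent prompt extends the completion prompt shape with a tools array, roughly as sketched below. The Tool shape is not defined in this section, so the name/description fields here are placeholders, not re-factor's actual Tool interface.

```typescript
// Shapes inferred from the field tables above; the Tool fields are
// assumed for illustration only.
type UserMessage = { role: "user"; content: string | object };
type AssistantMessage = { role: "assistant"; generate: boolean; content?: string };

interface Tool {
  name: string;        // assumed field
  description: string; // assumed field
}

interface AgentPrompt {
  system?: string;
  messages: Array<UserMessage | AssistantMessage>; // required
  tools: Tool[];                                   // required, at least one
}

const agentPrompt: AgentPrompt = {
  system: "You are a research agent. Use tools when needed.",
  messages: [
    { role: "user", content: "Find the latest release notes." },
    { role: "assistant", generate: true },
  ],
  tools: [
    { name: "web_search", description: "Search the web for a query." },
  ],
};
```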

UserMessage

A user message is a message sent by the user to the LLM. It typically contains a role, content, and optional metadata.
role
string
required
The role of the message. For a user message, this should be "user".
content
string | object
required
The content of the message. This is the text that the LLM will process. In many cases this will be a string, since you can use Resource Embedding to interpolate resources into your prompts. However, if you want fine-grained control, you can also provide any Vercel AI SDK CoreUserMessage object.
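The two content forms might look like the following. The multi-part object follows the style of the Vercel AI SDK's CoreUserMessage content parts; how re-factor passes that object through is an assumption here, not something this reference specifies.

```typescript
// Common case: plain string content.
const simple = {
  role: "user" as const,
  content: "Summarize the attached release notes in three bullets.",
};

// Fine-grained case: content as structured parts, in the style of the
// Vercel AI SDK's CoreUserMessage (shape assumed, not verified
// against re-factor).
const structured = {
  role: "user" as const,
  content: [
    { type: "text", text: "Describe this screenshot." },
    { type: "image", image: "https://example.com/screenshot.png" },
  ],
};
```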

AssistantMessage

An assistant message is a message sent by the LLM to the user. It typically contains a role, content, and optional metadata.
role
string
required
The role of the message. For an assistant message, this should be "assistant".
generate
boolean
required
A flag indicating whether the message should be generated by the LLM. If true, the content field should be omitted. If false, the content field should be provided.
format
enum<text, object>
default:"text"
The format of the message that will be generated by the LLM. This is required if generate: true.
set_output
string
The name of the output variable that will be set with the generated message. This is optional and should be omitted if generate: false.
schema
object
The schema of the message. This is required if generate: true and format: object. Must be a valid JSON Schema Draft-07 object.
content
string
The content of the message. This can be used to set a synthetic response from the LLM. It is optional and should be omitted if generate: true.
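Taken together, the fields above support three common assistant-message configurations, sketched below as plain object literals. The shapes are inferred from this reference; the schema and output-variable name are illustrative values, not fixed by re-factor.

```typescript
// Case 1: let the LLM generate a text reply (content omitted).
const generated = {
  role: "assistant" as const,
  generate: true,
  format: "text" as const,
};

// Case 2: generate structured output validated against a JSON Schema
// (Draft-07) object, storing the result in an output variable
// via set_output. The schema and variable name are examples.
const structuredOutput = {
  role: "assistant" as const,
  generate: true,
  format: "object" as const,
  set_output: "summary",
  schema: {
    type: "object",
    properties: { title: { type: "string" } },
    required: ["title"],
  },
};

// Case 3: a synthetic, pre-written assistant turn (generate: false),
// useful for seeding the conversation with a canned response.
const synthetic = {
  role: "assistant" as const,
  generate: false,
  content: "Sure, which file should I look at?",
};
```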