How to compose prompts together
This guide assumes familiarity with the following concepts:
- Prompt templates
LangChain provides a user-friendly interface for composing different parts of prompts together. You can do this with either string prompts or chat prompts. Constructing prompts this way allows for easy reuse of components.
String prompt composition
When working with string prompts, each template is joined together in order. You can work with either prompts directly or with plain strings (the first element in the list needs to be a prompt).
import { PromptTemplate } from "@langchain/core/prompts";
const prompt = PromptTemplate.fromTemplate(
`Tell me a joke about {topic}, make it funny and in {language}`
);
prompt;
PromptTemplate {
lc_serializable: true,
lc_kwargs: {
inputVariables: [ "topic", "language" ],
templateFormat: "f-string",
template: "Tell me a joke about {topic}, make it funny and in {language}"
},
lc_runnable: true,
name: undefined,
lc_namespace: [ "langchain_core", "prompts", "prompt" ],
inputVariables: [ "topic", "language" ],
outputParser: undefined,
partialVariables: undefined,
templateFormat: "f-string",
template: "Tell me a joke about {topic}, make it funny and in {language}",
validateTemplate: true
}
await prompt.format({ topic: "sports", language: "spanish" });
"Tell me a joke about sports, make it funny and in spanish"
Chat prompt composition
A chat prompt is made up of a list of messages. Similarly to the example above, we can concatenate chat prompt templates. Each new element is a new message in the final prompt.
First, let's initialize a SystemMessage to use as the start of the prompt.
import {
AIMessage,
HumanMessage,
SystemMessage,
} from "@langchain/core/messages";
const prompt = new SystemMessage("You are a nice pirate");
You can then easily create a pipeline combining it with other messages or message templates. Use a BaseMessage when there are no variables to be formatted, and a MessageTemplate when there are variables to be formatted. You can also use just a string (note: this will automatically be inferred as a HumanMessagePromptTemplate).
import { HumanMessagePromptTemplate } from "@langchain/core/prompts";
const newPrompt = HumanMessagePromptTemplate.fromTemplate([
prompt,
new HumanMessage("Hi"),
new AIMessage("what?"),
"{input}",
]);
Under the hood, this creates an instance of the ChatPromptTemplate class, so you can use it just as you did before!
await newPrompt.formatMessages({ input: "i said hi" });
[
HumanMessage {
lc_serializable: true,
lc_kwargs: {
content: [
{ type: "text", text: "You are a nice pirate" },
{ type: "text", text: "Hi" },
{ type: "text", text: "what?" },
{ type: "text", text: "i said hi" }
],
additional_kwargs: {},
response_metadata: {}
},
lc_namespace: [ "langchain_core", "messages" ],
content: [
{ type: "text", text: "You are a nice pirate" },
{ type: "text", text: "Hi" },
{ type: "text", text: "what?" },
{ type: "text", text: "i said hi" }
],
name: undefined,
additional_kwargs: {},
response_metadata: {}
}
]
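The composed template is itself a runnable, so you can pipe it straight into a chat model. Here's a minimal sketch, assuming @langchain/openai is installed and OPENAI_API_KEY is set in your environment; the model name is just an example:
import { ChatOpenAI } from "@langchain/openai";
// The model name here is illustrative; use whichever chat model you have access to.
const model = new ChatOpenAI({ model: "gpt-4o-mini" });
// Piping the prompt into the model gives a chain that takes the prompt's input variables.
const chain = newPrompt.pipe(model);
await chain.invoke({ input: "i said hi" });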
Using PipelinePrompt
LangChain includes a class called PipelinePromptTemplate, which can be useful when you want to reuse parts of prompts. A PipelinePrompt consists of two main parts:
- Final prompt: The final prompt that is returned
- Pipeline prompts: A list of tuples, consisting of a string name and a prompt template. Each prompt template will be formatted and then passed to future prompt templates as a variable with the same name.
import {
PromptTemplate,
PipelinePromptTemplate,
} from "@langchain/core/prompts";
const fullPrompt = PromptTemplate.fromTemplate(`{introduction}
{example}
{start}`);
const introductionPrompt = PromptTemplate.fromTemplate(
`You are impersonating {person}.`
);
const examplePrompt =
PromptTemplate.fromTemplate(`Here's an example of an interaction:
Q: {example_q}
A: {example_a}`);
const startPrompt = PromptTemplate.fromTemplate(`Now, do this for real!
Q: {input}
A:`);
const composedPrompt = new PipelinePromptTemplate({
pipelinePrompts: [
{
name: "introduction",
prompt: introductionPrompt,
},
{
name: "example",
prompt: examplePrompt,
},
{
name: "start",
prompt: startPrompt,
},
],
finalPrompt: fullPrompt,
});
const formattedPrompt = await composedPrompt.format({
person: "Elon Musk",
example_q: `What's your favorite car?`,
example_a: "Telsa",
input: `What's your favorite social media site?`,
});
console.log(formattedPrompt);
You are impersonating Elon Musk.
Here's an example of an interaction:
Q: What's your favorite car?
A: Tesla
Now, do this for real!
Q: What's your favorite social media site?
A:
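Because the sub-prompts are only filled in when you call format, the same composed template can be reused with different inputs. The values below are purely illustrative:
// Reuse the composed prompt with a different persona and question.
const secondFormattedPrompt = await composedPrompt.format({
  person: "Ada Lovelace",
  example_q: `What's your favorite machine?`,
  example_a: "The Analytical Engine",
  input: `What's your favorite number?`,
});
console.log(secondFormattedPrompt);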
Next steps
You've now learned how to compose prompts together.
Next, check out the other how-to guides on prompt templates in this section, like adding few-shot examples to your prompt templates.