Azure OpenAI
Azure OpenAI is a cloud service that helps you quickly develop generative AI experiences with a diverse set of prebuilt and curated models from OpenAI, Meta, and beyond.
LangChain.js supports integration with Azure OpenAI using either the dedicated Azure OpenAI SDK or the OpenAI SDK.
You can learn more about Azure OpenAI and how it differs from the OpenAI API on this page. If you don't have an Azure account, you can create a free account to get started.
Using the Azure OpenAI SDK
You'll first need to install the @langchain/azure-openai package:
- npm
- Yarn
- pnpm
npm install -S @langchain/azure-openai
yarn add @langchain/azure-openai
pnpm add @langchain/azure-openai
We're unifying model params across all packages. We now suggest using model instead of modelName, and apiKey for API keys.
You'll also need to have an Azure OpenAI instance deployed. You can deploy one from the Azure portal by following this guide.
Once you have your instance running, make sure you have the endpoint and key. You can find them in the Azure Portal, under the "Keys and Endpoint" section of your instance.
You can then define the following environment variables to use the service:
AZURE_OPENAI_API_ENDPOINT=<YOUR_ENDPOINT>
AZURE_OPENAI_API_KEY=<YOUR_KEY>
AZURE_OPENAI_API_DEPLOYMENT_NAME=<YOUR_DEPLOYMENT_NAME>
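Once these variables are set, you can instantiate the client without passing any credentials explicitly. Here's a minimal sketch, assuming the three variables above are defined in your environment:

import { AzureOpenAI } from "@langchain/azure-openai";

// Endpoint, key and deployment name are read from the environment variables above
const model = new AzureOpenAI({
  temperature: 0.7,
});

const res = await model.invoke("Tell me a joke about colorful socks.");
console.log({ res });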
Alternatively, you can pass the values directly to the AzureOpenAI constructor:
import { AzureOpenAI } from "@langchain/azure-openai";
const model = new AzureOpenAI({
  azureOpenAIEndpoint: "<your_endpoint>",
  apiKey: "<your_key>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
});
If you're using Azure Managed Identity, you can also pass the credentials directly to the constructor:
import { DefaultAzureCredential } from "@azure/identity";
import { AzureOpenAI } from "@langchain/azure-openai";
const credentials = new DefaultAzureCredential();
const model = new AzureOpenAI({
  credentials,
  azureOpenAIEndpoint: "<your_endpoint>",
  azureOpenAIApiDeploymentName: "<your_deployment_name>",
});
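Note that DefaultAzureCredential is provided by the separate @azure/identity package, so you may need to install it alongside @langchain/azure-openai, for example with npm install @azure/identity.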
LLM usage example
import { AzureOpenAI } from "@langchain/azure-openai";
export const run = async () => {
  const model = new AzureOpenAI({
    model: "gpt-4",
    temperature: 0.7,
    maxTokens: 1000,
    maxRetries: 5,
  });

  const res = await model.invoke(
    "Question: What would be a good company name for a company that makes colorful socks?\nAnswer:"
  );
  console.log({ res });
};
API Reference:
- AzureOpenAI from @langchain/azure-openai
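Like other LangChain.js models, the instance is a runnable, so you can also stream tokens as they are generated instead of waiting for the full completion. A short sketch of what that could look like (the prompt and parameters are just illustrative):

import { AzureOpenAI } from "@langchain/azure-openai";

const model = new AzureOpenAI({
  model: "gpt-4",
  temperature: 0.7,
});

// stream() yields partial string chunks as the model generates them
const stream = await model.stream(
  "Write a short tagline for a colorful sock company."
);
for await (const chunk of stream) {
  process.stdout.write(chunk);
}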
Chat usage example
import { AzureChatOpenAI } from "@langchain/azure-openai";
export const run = async () => {
  const model = new AzureChatOpenAI({
    model: "gpt-4",
    prefixMessages: [
      {
        role: "system",
        content: "You are a helpful assistant that answers in pirate language",
      },
    ],
    maxTokens: 50,
  });

  const res = await model.invoke(
    "What would be a good company name for a company that makes colorful socks?"
  );
  console.log({ res });
};
API Reference:
- AzureChatOpenAI from @langchain/azure-openai
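Instead of prefixMessages, you can also pass explicit message objects to invoke. A sketch of the same pirate example, assuming @langchain/core is installed as a dependency:

import { AzureChatOpenAI } from "@langchain/azure-openai";
import { HumanMessage, SystemMessage } from "@langchain/core/messages";

const model = new AzureChatOpenAI({
  model: "gpt-4",
  maxTokens: 50,
});

// Chat models accept an array of messages as input
const res = await model.invoke([
  new SystemMessage(
    "You are a helpful assistant that answers in pirate language"
  ),
  new HumanMessage(
    "What would be a good company name for a company that makes colorful socks?"
  ),
]);
console.log(res.content);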
Using the OpenAI SDK
You can also use the OpenAI class to call OpenAI models hosted on Azure.
For example, if your Azure instance is hosted under https://{MY_INSTANCE_NAME}.openai.azure.com/openai/deployments/{DEPLOYMENT_NAME}, you could initialize your instance like this:
- npm
- Yarn
- pnpm
npm install @langchain/openai
yarn add @langchain/openai
pnpm add @langchain/openai
import { OpenAI } from "@langchain/openai";
const model = new OpenAI({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY",
  azureOpenAIApiVersion: "YOUR-API-VERSION",
  azureOpenAIApiInstanceName: "{MY_INSTANCE_NAME}",
  azureOpenAIApiDeploymentName: "{DEPLOYMENT_NAME}",
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
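Instead of hardcoding these values, you can also provide them through environment variables, which the @langchain/openai package checks for Azure configuration:

AZURE_OPENAI_API_KEY=<YOUR_KEY>
AZURE_OPENAI_API_VERSION=<YOUR_API_VERSION>
AZURE_OPENAI_API_INSTANCE_NAME=<YOUR_INSTANCE_NAME>
AZURE_OPENAI_API_DEPLOYMENT_NAME=<YOUR_DEPLOYMENT_NAME>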
If your instance is hosted under a domain other than the default openai.azure.com, you'll need to use the alternate AZURE_OPENAI_BASE_PATH environment variable.
For example, here's how you would connect to the domain https://westeurope.api.microsoft.com/openai/deployments/{DEPLOYMENT_NAME}:
import { OpenAI } from "@langchain/openai";
const model = new OpenAI({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY",
  azureOpenAIApiVersion: "YOUR-API-VERSION",
  azureOpenAIApiDeploymentName: "{DEPLOYMENT_NAME}",
  azureOpenAIBasePath:
    "https://westeurope.api.microsoft.com/openai/deployments", // In Node.js defaults to process.env.AZURE_OPENAI_BASE_PATH
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
console.log({ res });
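The chat models follow the same pattern: the Azure-specific parameters can also be passed to ChatOpenAI. A sketch, assuming the deployment behind {DEPLOYMENT_NAME} hosts a chat model:

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY",
  azureOpenAIApiVersion: "YOUR-API-VERSION",
  azureOpenAIApiInstanceName: "{MY_INSTANCE_NAME}",
  azureOpenAIApiDeploymentName: "{DEPLOYMENT_NAME}",
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
console.log(res.content);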