ChatAzureOpenAI

```python
ChatAzureOpenAI(
    endpoint,
    deployment_id,
    api_version,
    api_key=None,
    system_prompt=None,
    kwargs=None,
)
```

Chat with a model hosted on Azure OpenAI.
The Azure OpenAI service hosts a number of open-source models as well as proprietary models from OpenAI.
Examples
```python
import os

from chatlas import ChatAzureOpenAI

chat = ChatAzureOpenAI(
    endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    deployment_id="REPLACE_WITH_YOUR_DEPLOYMENT_ID",
    api_version="YYYY-MM-DD",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
)

chat.chat("What is the capital of France?")
```

Parameters
| Name | Type | Description | Default |
|---|---|---|---|
| endpoint | str | Azure OpenAI endpoint URL with protocol and hostname, i.e. https://{your-resource-name}.openai.azure.com. Defaults to the value of the AZURE_OPENAI_ENDPOINT environment variable. | required |
| deployment_id | str | Deployment id for the model you want to use. | required |
| api_version | str | The API version to use. | required |
| api_key | Optional[str] | The API key to use for authentication. You generally should not supply this directly, but instead set the AZURE_OPENAI_API_KEY environment variable. | None |
| system_prompt | Optional[str] | A system prompt to set the behavior of the assistant. | None |
| kwargs | Optional['ChatAzureClientArgs'] | Additional arguments to pass to the openai.AzureOpenAI() client constructor. | None |
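As a sketch of how the kwargs parameter might be used: the dictionary is forwarded to the openai.AzureOpenAI() client constructor, so standard openai-python client options such as timeout and max_retries can be passed through. The values below are illustrative, not recommendations.

```python
import os

# Extra client options forwarded verbatim to openai.AzureOpenAI().
# timeout and max_retries are standard openai-python client arguments;
# the specific values here are only examples.
client_args = {
    "timeout": 30.0,   # per-request timeout, in seconds
    "max_retries": 2,  # automatic retries on connection errors and 429s
}

# Hypothetical usage (requires a real endpoint, deployment, and key):
# chat = ChatAzureOpenAI(
#     endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
#     deployment_id="REPLACE_WITH_YOUR_DEPLOYMENT_ID",
#     api_version="YYYY-MM-DD",
#     api_key=os.getenv("AZURE_OPENAI_API_KEY"),
#     kwargs=client_args,
# )
```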
Returns
| Name | Type | Description |
|---|---|---|
| | Chat | A Chat object. |