ChatAzureOpenAI
```python
ChatAzureOpenAI(
    endpoint,
    deployment_id,
    api_version,
    api_key=None,
    system_prompt=None,
    seed=MISSING,
    kwargs=None,
)
```
Chat with a model hosted on Azure OpenAI.
The Azure OpenAI server hosts a number of open source models as well as proprietary models from OpenAI.
Examples
```python
import os

from chatlas import ChatAzureOpenAI

chat = ChatAzureOpenAI(
    endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    deployment_id="REPLACE_WITH_YOUR_DEPLOYMENT_ID",
    api_version="YYYY-MM-DD",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
)

chat.chat("What is the capital of France?")
```
Parameters
Name | Type | Description | Default |
---|---|---|---|
endpoint | str | Azure OpenAI endpoint url with protocol and hostname, i.e. https://{your-resource-name}.openai.azure.com. Defaults to using the value of the AZURE_OPENAI_ENDPOINT environment variable. | required |
deployment_id | str | Deployment id for the model you want to use. | required |
api_version | str | The API version to use. | required |
api_key | Optional[str] | The API key to use for authentication. You generally should not supply this directly, but instead set the AZURE_OPENAI_API_KEY environment variable. | None |
system_prompt | Optional[str] | A system prompt to set the behavior of the assistant. | None |
seed | int \| None \| MISSING_TYPE | Optional integer seed that ChatGPT uses to try and make output more reproducible. | MISSING |
kwargs | Optional['ChatAzureClientArgs'] | Additional arguments to pass to the openai.AzureOpenAI() client constructor. | None |
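The kwargs parameter is passed through to the openai.AzureOpenAI() client constructor, so client-level options can be set there. A hedged sketch, assuming timeout and max_retries (standard options of the openai client constructor) are among the keys accepted by ChatAzureClientArgs:

```python
import os

from chatlas import ChatAzureOpenAI

# timeout and max_retries are constructor options of openai.AzureOpenAI();
# whether ChatAzureClientArgs exposes exactly these keys is an assumption.
chat = ChatAzureOpenAI(
    endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    deployment_id="REPLACE_WITH_YOUR_DEPLOYMENT_ID",
    api_version="YYYY-MM-DD",
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    kwargs={"timeout": 30, "max_retries": 2},
)
```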
Returns
Name | Type | Description |
---|---|---|
| | Chat | A Chat object. |
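The returned Chat object is what the Examples section calls .chat() on. A short sketch of follow-up use, assuming the object keeps conversation history between calls (typical for chat clients, though not stated on this page):

```python
# Reusing the chat object constructed in the Examples section above.
chat.chat("What is the capital of France?")
chat.chat("And how many people live there?")  # assumes prior turns are retained
```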