Deploy Azure OpenAI Resource With Terraform

This blog post shows how to create an Azure OpenAI resource using Terraform.

Azure OpenAI allows organizations to use OpenAI's generative AI models inside the Microsoft Azure cloud while taking advantage of Azure's security and integration capabilities.

To use Azure OpenAI, you must first request access to the Azure OpenAI platform using the following URL. Once your request is approved, the service becomes available under your Azure subscription.

Azure OpenAI services are also available through a newer portal called Azure AI Studio, reachable from the following direct URL.

Terraform and Azure AI

The Terraform resource that controls the deployment of Azure OpenAI resources is azurerm_cognitive_account. Azure Cognitive Services are now part of Azure AI Services, which is why the resource carries the "cognitive" name.

Steps to Use ChatGPT With Azure AI

The following steps are required to get ChatGPT working on Azure OpenAI.

  1. Create an Azure OpenAI Service Resource
  2. Deploy an AI model (GPT-4, GPT-35-turbo, embedding models, DALL-E models)
  3. Use prompts – call the service from C#, the AI portal, Python, or the REST API

This post creates an Azure OpenAI Service resource and a GPT-3.5 Turbo model deployment.
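Once the resource and deployment exist, step 3 above can be sketched in Python by building the Azure OpenAI chat completions REST endpoint. The account name "ntopenai" and deployment name "gpt-35-0613" below are assumptions matching the Terraform configuration later in this post; adjust them to your own values.

```python
# Hypothetical values matching the Terraform configuration in this post.
account = "ntopenai"
deployment = "gpt-35-0613"
api_version = "2023-05-15"

# Azure OpenAI chat completions REST endpoint for a given deployment.
url = (
    f"https://{account}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version={api_version}"
)

# Minimal chat request body.
payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

# To actually call the service, POST the payload with an "api-key" header:
# import requests
# r = requests.post(url, json=payload, headers={"api-key": "<your-key>"})
print(url)
```

The key is available on the resource's Keys and Endpoint blade in the Azure portal after the deployment completes.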

Terraform Configuration

The following Terraform configuration deploys a resource group, an Azure OpenAI Service resource, and a GPT-3.5 Turbo model deployment.

resource "azurerm_resource_group" "rg" {
  name     = "OpenAI"
  location = "AustraliaEast"
}

resource "azurerm_cognitive_account" "openai" {
  name                = "ntopenAI"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  kind                = "OpenAI"

  sku_name = "S0"
}

resource "azurerm_cognitive_deployment" "gpt" {
  name                 = "gpt-35-0613"
  cognitive_account_id = azurerm_cognitive_account.openai.id

  model {
    format  = "OpenAI"
    name    = "gpt-35-turbo"
    version = "0613"
  }

  scale {
    type = "Standard"
  }
}