Azure OpenAI

    Overview

    This guide configures Azure OpenAI as the large language model (LLM) provider for Atolio.

    Prerequisites

    • A healthy Atolio deployment
    • kubectl access to the cluster
    • atolioctl downloaded from the latest release
    • Azure OpenAI: API key, Endpoint URL, and Deployment Name
    • jq installed (used below)

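    Before starting, you can quickly confirm the required tools are on your PATH with a check like the following (a minimal sketch; adjust the tool list to your environment):

    ```shell
    # Report whether each required CLI tool is installed
    for tool in kubectl jq atolioctl; do
      if command -v "$tool" >/dev/null 2>&1; then
        echo "ok: $tool"
      else
        echo "missing: $tool"
      fi
    done
    ```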
    Prepare Azure OpenAI resources (if needed)

    Create an Azure OpenAI service

    • Navigate to the Azure AI Foundry Portal at https://ai.azure.com/ and create a new Azure AI Foundry resource.
    • Choose subscription, resource group, region, and resource name.
    • Create the resource.

    Create a model deployment

    • From the service homepage, open Deployments from the left-hand menu.
    • Create a new base model deployment for a supported model (e.g., gpt-4o).
    • Note the Deployment Name.

    Copy connection details

    • On the model deployment page, under Endpoint settings, copy the API key and the base Endpoint URL.
      • The base Endpoint URL will be in the format https://<your-resource>.openai.azure.com/
    • From the deployment info, copy the Deployment Name.
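    For convenience in the configuration steps below, you may want to export the three values as environment variables (the values shown are placeholders; substitute your own):

    ```shell
    # Placeholder values — replace each with the value copied from your deployment
    export AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
    export AZURE_OPENAI_DEPLOYMENT_NAME="<your-deployment-name>"
    export AZURE_OPENAI_API_KEY="<your-api-key>"
    ```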

    Configure Atolio

    Step 1: Port‑forward Atolio Feeder (for KV operations)

    kubectl -n atolio-db port-forward service/feeder 8889
    

    Leave this running in a separate terminal while you configure keys.
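    To confirm the forward is active, you can probe the local port from another terminal (a sketch using nc; any TCP check works):

    ```shell
    # Probe the forwarded port; prints a status line either way
    if nc -z localhost 8889 2>/dev/null; then
      echo "feeder reachable on :8889"
    else
      echo "port-forward not ready"
    fi
    ```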

    Step 2: Generate a JWT token

    # Get the JWT Secret. Needed for JWT generation
    export jwtSecret=$(kubectl -n atolio-svc get secret lumen-secrets -o json | jq -r .data.jwtSecretKey | base64 -d)
    
    # Change this to reflect your Atolio deployment's domain name
    export domainName="https://search.example.com"
    
    export JWT_TOKEN=$(atolioctl connector create-jwt --raw \
      --jwt-audience-sdk=${domainName} \
      --jwt-issuer-sdk=${domainName} \
      --jwt-secret-sdk=${jwtSecret} \
      "atolio:*:*:*")
    

    Confirm that JWT_TOKEN now contains a JWT (three dot-separated base64url segments).
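    To sanity-check the token, you can decode its payload segment locally. This is a sketch: JWTs are base64url-encoded without padding, so the segment is translated to the standard alphabet and padded before decoding.

    ```shell
    # Decode the JWT payload (second dot-separated segment) for inspection
    payload=$(printf '%s' "${JWT_TOKEN}" | cut -d. -f2 | tr '_-' '/+')
    # base64url omits padding; restore it so base64 -d accepts the input
    while [ $(( ${#payload} % 4 )) -ne 0 ]; do payload="${payload}="; done
    printf '%s' "$payload" | base64 -d | jq .
    ```

    The decoded JSON should show the issuer and audience matching your deployment's domain name.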

    Step 3: Set provider and credentials in the KV store

    atolioctl --feeder-address :8889 --jwt-token-sdk ${JWT_TOKEN} kv set /lumen/system/ask_model_slug azure-openai
    atolioctl --feeder-address :8889 --jwt-token-sdk ${JWT_TOKEN} kv set /lumen/system/ask_api_key_azure {AZURE_OPENAI_API_KEY}
    atolioctl --feeder-address :8889 --jwt-token-sdk ${JWT_TOKEN} kv set /lumen/system/ask_azure_base_uri {AZURE_OPENAI_ENDPOINT}
    atolioctl --feeder-address :8889 --jwt-token-sdk ${JWT_TOKEN} kv set /lumen/system/ask_azure_deployment_name {AZURE_OPENAI_DEPLOYMENT_NAME}
    

    Replace the placeholders with your own values:

    • {AZURE_OPENAI_ENDPOINT}: the base Endpoint URL, e.g. https://<your-resource>.openai.azure.com/
    • {AZURE_OPENAI_DEPLOYMENT_NAME}: the exact deployment name you created
    • {AZURE_OPENAI_API_KEY}: the API key you copied from the deployment page

    Step 4: Restart the Marvin service

    kubectl -n atolio-svc delete pod -l app=marvin
    

    Kubernetes will recreate the pod with the new configuration.
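    You can wait for the replacement pod to become Ready and glance at its logs to confirm a clean start (illustrative commands; the app=marvin label matches the delete command above):

    ```shell
    # Block until the new marvin pod reports Ready (timeout is illustrative)
    kubectl -n atolio-svc wait --for=condition=Ready pod -l app=marvin --timeout=120s

    # Tail recent logs to check for startup errors
    kubectl -n atolio-svc logs -l app=marvin --tail=50
    ```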

    Verification

    • In your Atolio UI, open the homepage and run a test question.
    • If requests are rate-limited, raise the tokens-per-minute (TPM) limit on your Azure OpenAI deployment.