Configuring AI Services

    Overview

    Atolio makes it easy to securely configure and manage access to commercial LLMs of your choice.

    High-level configuration workflow

    With a healthy Atolio deployment running, an administrator can open the deployment's root URL (e.g. https://search.example.com), navigate to the administration panel (/admin), and click LLM Provider to open the configuration.

    Atolio uses a range of models depending on the use case. The following models are currently supported for each provider:

      • OpenAI / Azure OpenAI: GPT-5, GPT-5 Mini
      • Anthropic / Anthropic Bedrock: Sonnet 4.6, Haiku 4.5
      • Google Gemini: Gemini Flash 3

    Configuration through API Key

    Most providers require an API key. To enable a provider:

    1. Open Atolio and head to the Admin panel

    2. Select LLM Provider from the left-pane menu.

    3. Select the LLM Provider and follow the specific configuration steps:

      • Azure OpenAI
        1. Navigate to the Azure AI Foundry Portal at https://ai.azure.com/ and create a new Azure AI Foundry resource.
        2. Choose subscription, resource group, region, and resource name. Create the resource.
        3. From the service home page, open Deployments from the left-hand menu.
        4. Create a new base model deployment for gpt5.2-chat.
        5. Create an additional base model deployment for gpt5-mini.
        6. On the model deployment page, under Endpoint Settings, copy the API key and base Endpoint URL.
        7. Back in Atolio, enter the copied Endpoint URL and API key.
      • OpenAI
        1. Navigate to https://platform.openai.com/api-keys
        2. Create a new secret key and copy it.
        3. Specify the API key within Atolio.
      • Anthropic
        1. Navigate to https://platform.claude.com/
        2. Create a new API key and copy it.
        3. Specify the API key within Atolio.
      • Anthropic Bedrock
        1. Ensure Atolio is deployed on an AWS instance with an IAM role that has Bedrock permissions.
        2. In the AWS Console, enable model access in Amazon Bedrock for Anthropic Sonnet 4.6 and Haiku 4.5. In many cases, this is now enabled by default.
        3. Select Anthropic Bedrock as the provider within Atolio. The region is automatically selected based on the deployed region. For more details, see Claude on Amazon Bedrock.
      • Google
        1. Navigate to https://aistudio.google.com
        2. Create a new project and API key, and copy the key.
        3. Specify the API key within Atolio.
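
    Before entering a key in Atolio, you can sanity-check the credential from a terminal. The commands below are a sketch, not part of Atolio itself: the environment variable names are placeholders, and the Azure API version, Anthropic API version, and Bedrock region are assumptions that may differ for your account.

    ```shell
    #!/bin/sh
    # Sanity checks for provider credentials before entering them in Atolio.
    # All keys, endpoints, and regions below are placeholders/assumptions.

    # OpenAI: a valid key can list available models.
    curl -fsS https://api.openai.com/v1/models \
      -H "Authorization: Bearer $OPENAI_API_KEY"

    # Azure OpenAI: list deployments on your resource
    # (substitute your resource endpoint and a current api-version).
    curl -fsS "$AZURE_OPENAI_ENDPOINT/openai/deployments?api-version=2023-05-15" \
      -H "api-key: $AZURE_OPENAI_API_KEY"

    # Anthropic: a valid key can list models.
    curl -fsS https://api.anthropic.com/v1/models \
      -H "x-api-key: $ANTHROPIC_API_KEY" \
      -H "anthropic-version: 2023-06-01"

    # Anthropic Bedrock: confirm the instance's IAM role can see Anthropic
    # models in the deployed region (us-east-1 is an example).
    aws bedrock list-foundation-models --by-provider anthropic --region us-east-1

    # Google Gemini: a valid key can list models.
    curl -fsS "https://generativelanguage.googleapis.com/v1beta/models?key=$GEMINI_API_KEY"
    ```

    Each command succeeds only with a working credential, so a failure here points to the key or permissions rather than the Atolio configuration.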

    Changing Providers

    Once a provider is configured and validated, Atolio locks the configuration to keep the system stable. To switch providers, please contact your Atolio admin at this time.