
SAMO AI Assistant - Configuration Guide

Complete configuration reference for SAMO AI Assistant v1.1.3

Overview

SAMO AI Assistant uses environment variables with the ai_ prefix for all configuration. Settings are managed through Pydantic Settings with automatic validation.

Configuration File: .env in the project root directory

Important: Never commit .env files or share API keys publicly!


Environment Variables

All configuration is done via environment variables loaded from a .env file:

# Format
ai_<setting_name>=<value>

# Example
ai_llm_api_key=sk-your-api-key
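The prefix convention maps onto Pydantic Settings' `env_prefix` mechanism. As a rough stdlib-only illustration of how `ai_`-prefixed variables are collected (the `load_samo_settings` helper is hypothetical; the real application uses Pydantic Settings with typed, validated fields):

```python
import os

def load_samo_settings(environ=None):
    """Collect every ai_-prefixed variable and strip the prefix,
    mirroring what pydantic-settings does with env_prefix="ai_"."""
    environ = os.environ if environ is None else environ
    return {k[len("ai_"):]: v for k, v in environ.items()
            if k.startswith("ai_")}
```

Non-prefixed variables (e.g. `PATH`) are ignored, so SAMO settings never collide with the rest of the environment.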

Required Configuration

LLM Configuration

# OpenAI API Key (Required)
ai_llm_api_key=sk-your-openai-api-key

# LLM Provider (Required)
# Options: openai
ai_llm_provider=openai

# Default Model Name (Required)
# Examples: openai/gpt-4o, openai/gpt-4o-mini, openai/o1
ai_llm_model_name=openai/gpt-4o

Optional - Separate Embedding Configuration:

# Use different provider/key for embeddings (optional)
ai_llm_embedder_provider=openai
ai_llm_embedder_api_key=sk-your-embedding-api-key

If not specified, embedding uses the same provider and API key as the main LLM.
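The fallback can be pictured as follows (an illustrative helper, not the application's actual code; `resolve_embedder` is a hypothetical name operating on the stripped-prefix setting names):

```python
def resolve_embedder(settings):
    """Fall back to the main LLM provider/key when the optional
    embedder settings are absent."""
    provider = settings.get("llm_embedder_provider") or settings["llm_provider"]
    api_key = settings.get("llm_embedder_api_key") or settings["llm_api_key"]
    return provider, api_key
```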

Vector Store Configuration

Choose ONE of the following:

# Option 1 - Local ChromaDB (embedded, development)
ai_vector_store=./vector-store-dspy

# Option 2 - Remote ChromaDB server (production)
ai_chromadb_host=localhost
ai_chromadb_port=8000

Note: Cannot configure both local and remote ChromaDB simultaneously.
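The mutual-exclusion rule can be sketched as a validation function (illustrative only; the actual check lives in the application's Pydantic settings validation):

```python
def validate_vector_store(local_path, chromadb_host):
    """Reject configs that set both (or neither) vector store backend."""
    if local_path and chromadb_host:
        raise ValueError("Set either ai_vector_store or "
                         "ai_chromadb_host/ai_chromadb_port, not both.")
    if not local_path and not chromadb_host:
        raise ValueError("One vector store backend must be configured.")
```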

Security Configuration

# JWT Secret Key (Required)
ai_jwt_secret_key=your-jwt-secret-key-here

# Basic Auth Key (Required for Basic Auth) - intended strictly for testing and development
# Format: base64(username:password)
ai_basickey=your-base64-encoded-credentials

# Application Secret Key (Required)
ai_secret_key=your-app-secret-key

# User Service URL (Required for JWT auth)
ai_user_service_url=https://user-service.example.com

# Authorization Method (Required)
# Options: jwt, basic
ai_authorization_method=jwt

Azure Document Intelligence

# Azure DI Endpoint (Required)
ai_azdi_endpoint=https://your-resource.cognitiveservices.azure.com/

# Azure DI Key (Required)
ai_azdi_key=your-azure-di-key

Database Connection

# Database Vendor (Required)
# Options: oracle, postgresql
ai_db_vendor=oracle

# Database URL (Required)
# Format varies by vendor (see Database Configuration section)
ai_db_url=localhost:1521:ORCL

# Database Credentials (Required)
ai_db_username=dbuser
ai_db_password=dbpass

LIDS Service

# LIDS AS URL (Required)
ai_lids_url=https://lids-service.example.com

Optional Configuration

Server Configuration

# Server Port (Default: 8080)
ai_server_port=8080

# Debug Mode (Default: false)
# Set to true for development only
ai_server_debug_mode=false

Conversation History Database

# Database Connection String for history management (Default: sqlite:///requests.db)
ai_db_connect_string=sqlite:///requests.db

# For production, consider PostgreSQL:
# ai_db_connect_string=postgresql://user:pass@localhost:5432/samo_history

Agent Feature Flags

# Enable/Disable Database Queries (Default: true)
ai_agent_enable_data_queries=true

# Metadata Cache Size (Default: 16)
# Higher values use more memory but improve performance
ai_agent_metadata_cache_size=16
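To picture the memory/performance trade-off: a bounded metadata cache like this is commonly built on `functools.lru_cache` (the function names below are hypothetical; the application's internals may differ):

```python
from functools import lru_cache

def make_metadata_cache(maxsize=16):
    """Build a bounded cache; a larger maxsize keeps more table
    metadata in memory and avoids repeated expensive lookups."""
    @lru_cache(maxsize=maxsize)
    def load_table_metadata(table_name):
        # Stand-in for an expensive schema/metadata lookup.
        return {"table": table_name}
    return load_table_metadata
```

Once the cache is full, the least recently used entry is evicted, which is why raising the size helps workloads that touch many tables.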

Agent Model Configuration

Configure separate models for different agent tasks to optimize performance and cost.

ReAct Agent Model

Used for database queries and document search (most token-intensive).

# Model Name (Default: openai/gpt-4o-mini)
ai_agent_react_model=openai/gpt-4o-mini

# Max Tokens (Default: 16000)
# Minimum: 16000 for reasoning models
ai_agent_react_max_tokens=16000

# Temperature (Default: 1.0)
# Must be 1.0 for reasoning models (o1, o3, gpt-5)
ai_agent_react_temperature=1.0

# API Key (Optional - uses ai_llm_api_key if not set)
ai_agent_react_api_key=sk-optional-specific-key

Intent Analysis Model

Used for query classification (low token usage).

# Model Name (Default: openai/gpt-4o-mini)
ai_agent_intent_model=openai/gpt-4o-mini

# Max Tokens (Default: 2000)
ai_agent_intent_max_tokens=2000

# Temperature (Default: 1.0)
ai_agent_intent_temperature=1.0

# API Key (Optional - uses ai_llm_api_key if not set)
ai_agent_intent_api_key=sk-optional-specific-key

Response Synthesis Model

Used for formatting final answers (medium token usage).

# Model Name (Default: openai/gpt-4o-mini)
ai_agent_synthesis_model=openai/gpt-4o-mini

# Max Tokens (Default: 5000)
ai_agent_synthesis_max_tokens=5000

# Temperature (Default: 0.7)
# Lower for more deterministic responses
ai_agent_synthesis_temperature=0.7

# API Key (Optional - uses ai_llm_api_key if not set)
ai_agent_synthesis_api_key=sk-optional-specific-key

Model Selection Guidelines

For Cost Optimization:

  • ReAct: openai/gpt-4o-mini (most API calls)
  • Intent: openai/gpt-4o-mini (fast classification)
  • Synthesis: openai/gpt-4o-mini (quality output)

For Maximum Quality:

  • ReAct: openai/gpt-5-mini or openai/gpt-5 (better reasoning)
  • Intent: openai/gpt-4o-mini (sufficient for classification)
  • Synthesis: openai/gpt-5-mini (best formatting)

Important: Reasoning models (o1, o3, gpt-5) require:

  • temperature=1.0 (cannot be changed)
  • max_tokens>=16000

The application validates these constraints at startup.
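The startup check can be sketched like this (an illustration of the constraints listed above; the model-name prefixes and function name are assumptions, not the application's actual code):

```python
REASONING_PREFIXES = ("openai/o1", "openai/o3", "openai/gpt-5")

def check_reasoning_constraints(model, temperature, max_tokens):
    """Fail fast at startup if a reasoning model is configured
    with an unsupported temperature or token budget."""
    if model.startswith(REASONING_PREFIXES):
        if temperature != 1.0:
            raise ValueError(f"{model} requires temperature=1.0")
        if max_tokens < 16000:
            raise ValueError(f"{model} requires max_tokens>=16000")
```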

Database Configuration

Supported Database Vendors

  1. Oracle
  2. PostgreSQL

Database URL Formats

The ai_db_url format varies by vendor. The application automatically constructs the SQLAlchemy connection string.

Oracle

SID Format:

ai_db_vendor=oracle
ai_db_url=hostname:1521:ORCL
ai_db_username=username
ai_db_password=password

Service Name Format:

ai_db_vendor=oracle
ai_db_url=hostname:1521/service_name
ai_db_username=username
ai_db_password=password

PostgreSQL

ai_db_vendor=postgresql
ai_db_url=hostname:5432/database
ai_db_username=username
ai_db_password=password
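The automatic construction described above can be sketched as follows. The driver suffixes (`oracledb`, `psycopg2`) and the exact Oracle URL shapes are assumptions; check them against the drivers your deployment actually installs:

```python
def build_sqlalchemy_url(vendor, url, username, password):
    """Turn ai_db_* settings into a SQLAlchemy-style connection URL."""
    if vendor == "oracle":
        parts = url.split(":")
        if len(parts) == 3:  # host:port:SID form
            host, port, sid = parts
            return f"oracle+oracledb://{username}:{password}@{host}:{port}/{sid}"
        # host:port/service_name form
        host_port, service = url.split("/", 1)
        return (f"oracle+oracledb://{username}:{password}@{host_port}"
                f"/?service_name={service}")
    if vendor == "postgresql":
        return f"postgresql+psycopg2://{username}:{password}@{url}"
    raise ValueError(f"Unsupported vendor: {vendor}")
```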

Vector Store Configuration

Local ChromaDB (Development)

Advantages:

  • No separate service needed
  • Simple setup
  • Data persistence in local directory

Configuration:

ai_vector_store=./vector-store-dspy

Data Location: Relative or absolute path to vector store directory

Remote ChromaDB (Production)

Advantages:

  • Scalable
  • Shared across multiple instances
  • Better performance for large datasets

Configuration:

ai_chromadb_host=chromadb.example.com
ai_chromadb_port=8000

Requirement: a running ChromaDB server

Setup ChromaDB Server:

# Using Docker
docker run -p 8000:8000 chromadb/chroma

# Or install and run directly
pip install chromadb
chroma run --host 0.0.0.0 --port 8000

Security Configuration

Authentication Methods

Basic Authentication (Development/Testing)

ai_authorization_method=basic
ai_basickey=dXNlcm5hbWU6cGFzc3dvcmQ= # base64(username:password)

Generate Basic Key:

# Linux/Mac
echo -n "username:password" | base64

# PowerShell
[Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("username:password"))
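The same key can also be produced from Python's standard library (`make_basic_key` is an illustrative helper, equivalent to the shell one-liners above):

```python
import base64

def make_basic_key(username, password):
    """Return base64(username:password) for ai_basickey."""
    return base64.b64encode(f"{username}:{password}".encode()).decode()
```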

JWT Authentication (Production)

ai_authorization_method=jwt
ai_jwt_secret_key=your-jwt-secret-key
ai_user_service_url=https://user-service.example.com

Requirements:

  • External User Service for token validation
  • User Service must respond to token validation requests
  • Tokens stored in browser session storage

Last Updated: November 2025