Provider Setup¶
This workflow covers configuring LLM providers that power AI Core, including API key setup, connection testing, and fallback configuration.
Overview¶
AI Core requires at least one active LLM provider to function. The module supports Groq, OpenAI, and Anthropic. Multiple providers can be configured simultaneously with priority-based fallback – if the primary provider is unavailable, requests automatically route to the next provider.
Required permissions¶
AI Admin to configure providers and API keys
System Administrator to access the Settings page
Step 1: Open provider configuration¶
Navigate to .
Step 2: Add a provider¶
Click New.
Fill in the provider details:
Name – A descriptive name (e.g., “Groq Primary”)
Provider – Select Groq, OpenAI, or Anthropic
API Key – Paste the API key from the provider’s console
Groq keys start with gsk_.
Get a Groq key at console.groq.com
Get an OpenAI key at platform.openai.com
Get an Anthropic key at console.anthropic.com
Model Name – The model to use (default: llama-3.3-70b-versatile for Groq)
Priority – Lower number = higher priority (primary provider should be 1)
Optionally adjust generation settings:
Max Tokens – Maximum response length (default: 4096)
Temperature – Response creativity, 0.0 = deterministic, 1.0 = creative (default: 0.2)
Rate Limit (RPM) – Provider-specific requests per minute limit
Click Save.
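The fields above can be pictured as a single provider record. A minimal sketch, assuming a simple dataclass shape — the names here are illustrative, not AI Core's actual schema:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProviderConfig:
    """Illustrative provider record; fields mirror the form above."""
    name: str                    # e.g. "Groq Primary"
    provider: str                # "Groq", "OpenAI", or "Anthropic"
    api_key: str
    model_name: str = "llama-3.3-70b-versatile"  # default for Groq
    priority: int = 1            # lower number = higher priority
    max_tokens: int = 4096       # maximum response length
    temperature: float = 0.2     # 0.0 = deterministic, 1.0 = creative
    rate_limit_rpm: Optional[int] = None  # provider-specific RPM cap
    active: bool = True


primary = ProviderConfig(
    name="Groq Primary",
    provider="Groq",
    api_key="gsk_...",  # Groq keys start with gsk_
)
```

Defaults match those listed above (priority 1, 4096 max tokens, temperature 0.2).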
Step 3: Test the connection¶
Click Test Connection on the provider form.
A success or failure notification appears.
On success, the Last Success timestamp updates.
On failure, the Last Error field shows the error message.
Step 4: Set as primary provider¶
Click Set as Primary to make this provider priority 1.
All other active providers shift their priorities accordingly.
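The priority shift amounts to a renumbering: the chosen provider becomes 1 and the rest keep their relative order. A sketch of that logic (not the module's actual code):

```python
def set_as_primary(providers: list, name: str) -> None:
    """Give `name` priority 1; renumber the rest, preserving their order."""
    providers.sort(key=lambda p: p["priority"])
    chosen = next(p for p in providers if p["name"] == name)
    rest = [p for p in providers if p["name"] != name]
    chosen["priority"] = 1
    for i, p in enumerate(rest, start=2):
        p["priority"] = i
```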
Tip
Groq is recommended as the primary provider due to its fast inference speed and generous free tier. Configure OpenAI or Anthropic as a fallback for when Groq is rate-limited or unavailable.
Configure fallback providers¶
To add a fallback provider:
Create a second provider record following the same steps above.
Set its Priority to a higher number (e.g., 2 or 3).
Ensure it is marked as Active.
When the primary provider fails, AI Core automatically tries the next provider by priority order. The Last Error field helps diagnose which providers have issues.
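The fallback behaviour boils down to trying active providers in priority order until one succeeds, recording each failure along the way. A simplified sketch under that assumption — the `call` signature is illustrative:

```python
def complete_with_fallback(providers: list, prompt: str, call) -> str:
    """Try each active provider in priority order; raise if all fail.

    call(provider, prompt) should return the response text or raise on error.
    """
    ordered = sorted(
        (p for p in providers if p.get("active")),
        key=lambda p: p["priority"],
    )
    errors = []
    for provider in ordered:
        try:
            return call(provider, prompt)
        except Exception as exc:
            provider["last_error"] = str(exc)  # aids diagnosis per provider
            errors.append(f"{provider['name']}: {exc}")
    raise RuntimeError("all providers failed: " + "; ".join(errors))
```

Inactive providers are skipped entirely, which is why fallbacks must be marked Active.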
Enable AI Core globally¶
Navigate to .
Ensure AI Assistant is enabled.
Select the Default Provider from the dropdown.
Adjust rate limits as needed.
Click Save.
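A requests-per-minute limit like the one above can be thought of as a sliding one-minute window over request timestamps. A minimal sketch of that idea — this is not AI Core's implementation, just the underlying mechanism:

```python
import time
from collections import deque


class RpmLimiter:
    """Allow at most `rpm` requests in any rolling 60-second window."""

    def __init__(self, rpm: int):
        self.rpm = rpm
        self.stamps = deque()  # timestamps of accepted requests

    def allow(self, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        # Drop timestamps that have aged out of the window.
        while self.stamps and now - self.stamps[0] >= 60.0:
            self.stamps.popleft()
        if len(self.stamps) >= self.rpm:
            return False  # over the limit; caller should wait or fall back
        self.stamps.append(now)
        return True
```

When a provider's limit is hit, the fallback ordering described earlier determines where the request goes next.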
Next steps¶
Model Access Control – Configure which models the AI can interact with
Conversations – Start using the AI assistant
Security – Review the full security model