Overview¶
AI Core (progrid_ai_core) is an ERP-aware AI assistant that runs inside Progrid. It combines
large language model capabilities with deep knowledge of your Progrid data to answer questions,
perform actions, and provide contextual guidance – all through a conversational interface.
Purpose¶
ERP systems contain vast amounts of data spread across many screens and models. Finding the right information often requires navigating multiple menus, building filters, and understanding the data model. AI Core eliminates this friction by allowing users to ask questions in plain English and receive immediate, accurate answers drawn directly from their Progrid database.
Beyond answering questions, AI Core can take action: creating records, updating fields, running aggregations, and performing bulk operations – all with built-in safety guardrails that require explicit confirmation before any write operation.
Target users¶
All Progrid users¶
Any user with the AI User role can interact with AI Core from any screen. Typical use cases include:
Asking “How many open opportunities do we have this month?” instead of building a filter
Requesting “Create a new contact for John Smith at Acme Corp” through conversation
Getting context-aware help based on the record currently being viewed
Looking up procedures and documentation through the built-in knowledge base
Sales and operations teams¶
Teams that work across multiple Progrid modules benefit from AI Core’s ability to search and aggregate data across models without switching screens. For example, a salesperson can ask about a partner’s open invoices, recent purchase orders, and support tickets in a single conversation.
Administrators¶
Administrators configure which models the AI can access, manage LLM provider connections, set rate limits, and review the audit log to monitor AI usage across the organization.
Architecture¶
AI Core uses an agentic loop architecture where the LLM can call tools iteratively until it has gathered enough information to respond:
```
User sends message
        |
        v
Build context (record, knowledge, history)
        |
        v
Send to LLM with available tools
        |
   +----+-----+
   |          |
Tool calls    No tool calls
   |          |
Execute       Return response
with          to user
guardrails
   |
Add results to context
   |
Loop back to LLM
(max 5 iterations)
```
This loop allows the AI to chain multiple operations – for example, searching for a partner, reading their details, then checking related invoices – all in response to a single user question.
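The loop above can be sketched in a few lines of Python. This is an illustration only, not the module's actual implementation: the function names (`call_llm`, `execute_tool`) and message shapes are assumptions, while the iteration cap comes from the diagram.

```python
MAX_ITERATIONS = 5  # cap from the diagram above

def run_agent(call_llm, execute_tool, user_message, context):
    """Call the LLM repeatedly, executing requested tools, until it answers."""
    messages = context + [{"role": "user", "content": user_message}]
    for _ in range(MAX_ITERATIONS):
        reply = call_llm(messages)           # reply may request tool calls
        if not reply.get("tool_calls"):
            return reply["content"]          # final answer for the user
        for call in reply["tool_calls"]:
            result = execute_tool(call)      # guardrails applied inside
            messages.append({"role": "tool", "content": result})
    return "Reached iteration limit without a final answer."
```

Each tool result is appended to the conversation before the next LLM call, which is what lets the model chain a search, a read, and an aggregation from one question.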
LLM providers¶
AI Core supports three LLM providers with automatic fallback:
| Provider | Description | Default model |
|---|---|---|
| Groq | High-speed inference using open models. Recommended as the primary provider for its speed and generous free tier. | |
| OpenAI | Industry-standard LLM provider with a broad model selection. | |
| Anthropic | Claude models with strong instruction-following capabilities. | |
Multiple providers can be configured simultaneously. The module uses a priority-based fallback system: if the primary provider fails or is rate-limited, requests automatically route to the next available provider.
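The fallback behaviour amounts to trying providers in priority order and moving on when one errors or is rate-limited. A minimal sketch, assuming each configured provider object exposes a `name` and a `complete()` method (hypothetical shapes, not the module's real API):

```python
def complete_with_fallback(providers, prompt):
    """Try each provider in priority order; raise only if all of them fail."""
    errors = []
    for provider in providers:               # assumed already sorted by priority
        try:
            return provider.complete(prompt)
        except Exception as exc:             # provider failure or rate limit
            errors.append((provider.name, exc))
    raise RuntimeError(f"All providers failed: {errors}")
```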
Security model¶
AI Core enforces a whitelist-based access model with multiple layers of protection:
Model whitelist – Only explicitly whitelisted models can be accessed by the AI
Operation permissions – Each whitelisted model has independent read, write, create, and delete toggles
Field blacklist – Sensitive fields (passwords, API keys, tokens) are automatically excluded
User permissions – The AI respects the logged-in user’s standard Progrid access rights
Confirmation prompts – Write operations require explicit user approval before execution
Rate limiting – Per-minute and per-hour limits prevent abuse, with configurable admin multipliers
Audit logging – Every AI action is recorded with user, model, parameters, and result
Tip

The default configuration includes a global field blacklist that automatically excludes password, api_key, totp_secret, oauth_access_token, and other sensitive fields from all AI interactions.
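As an illustration of how such a blacklist might be applied, the sketch below strips blacklisted fields from a record before it reaches the LLM. The field names come from the tip above; the function itself is an assumption, not the module's actual code:

```python
# Field names from the global blacklist described above.
GLOBAL_FIELD_BLACKLIST = {"password", "api_key", "totp_secret", "oauth_access_token"}

def scrub_record(record, extra_blacklist=()):
    """Return a copy of the record with all blacklisted fields removed."""
    blocked = GLOBAL_FIELD_BLACKLIST | set(extra_blacklist)
    return {key: value for key, value in record.items() if key not in blocked}
```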
Available tools¶
The AI has access to a set of tools that map to standard Progrid operations:
| Tool | Description | Confirmation | Access level |
|---|---|---|---|
| | Search any whitelisted model with domain filters | No | Read |
| | Read a specific record by ID | No | Read |
| | Count records matching a domain | No | Read |
| | Get field definitions for a model | No | Read |
| | Sum, average, count, min, max with grouping | No | Read |
| | Create a new record | Yes | Create |
| | Update fields on an existing record | Yes | Write |
| | Delete a record | Yes | Delete |
| | Update multiple records matching a domain | Yes | Write |
| | Delete multiple records matching a domain | Yes | Delete |
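Combining the table above with the security model, the gate in front of each tool call might look like the following. Every name here is hypothetical (including the example model name in the usage note); the logic simply restates the whitelist, per-operation toggle, and confirmation rules documented above:

```python
# Operations that require explicit user confirmation per the table above.
WRITE_OPERATIONS = {"create", "write", "delete"}

def check_tool_call(whitelist, model, operation, user_confirmed=False):
    """Return True if the call may proceed; raise PermissionError otherwise."""
    perms = whitelist.get(model)
    if perms is None:
        raise PermissionError(f"{model} is not whitelisted")
    if not perms.get(operation, False):
        raise PermissionError(f"{operation} is disabled for {model}")
    if operation in WRITE_OPERATIONS and not user_confirmed:
        raise PermissionError("write operation requires explicit confirmation")
    return True
```

For example, with a whitelist entry like `{"res.partner": {"read": True, "write": True}}` (a hypothetical model name), a read passes immediately, while a write passes only once the user has confirmed.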
Knowledge system¶
AI Core includes a RAG (Retrieval-Augmented Generation) knowledge system that allows administrators to add company-specific documentation, procedures, FAQs, and model metadata. When a user asks a question, the AI searches the knowledge base for relevant context before responding.
Knowledge sources can be categorized as:
Model metadata – Auto-generated descriptions of whitelisted model schemas
Documentation – Product or system documentation
Procedures – Step-by-step operational procedures
FAQ – Frequently asked questions and answers
Custom – Any other reference material
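To make the retrieve-then-answer flow concrete, here is a deliberately naive retriever based on word overlap. It only illustrates the RAG pattern; the module's actual retrieval mechanism is not specified here and likely uses something more sophisticated than word matching:

```python
def retrieve(knowledge_base, question, top_k=3):
    """Rank knowledge entries by word overlap with the question (toy scorer)."""
    question_words = set(question.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda entry: len(question_words & set(entry["text"].lower().split())),
        reverse=True,
    )
    return scored[:top_k]
```

The top-ranked entries would be prepended to the LLM context so the answer can cite company-specific procedures rather than generic knowledge.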
Module information¶
| | |
|---|---|
| Technical name | progrid_ai_core |
| Version | 18.0.1.1.0 |
| Category | Productivity |
| Dependencies | |