AI Features - Controls, Data Flow & Provider Selection
How Elker's AI capabilities are enabled, constrained, and routed to the provider of your choice - including AWS Bedrock in your own region.
Elker's AI features are off by default and configured per‑organisation. Every feature sits behind a global feature flag, a per‑organisation configuration, and a set of granular data‑access toggles that determine what information can leave the platform. No AI feature operates unless an administrator has both enabled it and chosen a provider.
Prefer no AI at all? AI can be turned off globally for your organisation with a single configuration change. When every AI feature flag is off, the platform makes zero outbound calls to any model, agent, or LLM provider - no context is gathered, no prompts are constructed, no network request is issued. Report data, chat history, case notes and reporter details remain entirely within Elker. This is the default posture for any organisation that has not explicitly opted in.
1. Feature inventory
| Feature | Description | Flag |
| --- | --- | --- |
| Analytics Assistant | Conversational querying across the dashboards (period filters, status breakdowns, trend comparisons). Surfaced as an expandable panel on the analytics overview. | ai_analytics_enabled |
| Reports List Assistant | Thematic summaries and pattern detection across the currently visible set of reports. Scoped to reports the user is already authorised to see. | ai_reports_list_enabled |
| Field Autofill | Suggests values for any empty custom fields on a report based on the report's existing context. Suggestions are proposed, never applied automatically. | ai_field_autofill_enabled |
| Field Suggest | Per‑field suggestion button, a targeted variant of autofill for a single field input at a time. | ai_field_suggest_enabled |
| Chat Draft | Generates an initial draft response for a chat conversation with a reporter, using the chat history and report context. | ai_chat_draft_enabled |
| Chat Redraft | Takes a user‑written draft and returns tone variants (trauma‑informed, empathetic, professional, direct). | ai_chat_draft_enabled |
Nothing runs without an enabled flag. The flag check happens before any data is gathered, before authorisation is evaluated, and before any outbound call to a provider. Disabling a flag takes effect immediately - no redeploy required.
Every AI request must clear four gates, in order, before it executes:

1. Configured: the organisation has an ai_setting record, a provider is chosen, and credentials (where required) are present.
2. Feature flag: the flag for the specific feature (e.g. ai_chat_draft_enabled) is on for the instance.
3. Policy: the user holds the role permission the feature requires (analytics needs overview access; field/chat features need contact access).
4. Rate limit: the organisation has not exceeded 100 AI requests in the current rolling hour window.
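The four checks above can be sketched as a single pre‑flight gate. This is an illustrative sketch only; the class, method, and field names are assumptions, not Elker's actual internals:

```ruby
# Hypothetical pre-flight gate combining the four checks: configuration,
# feature flag, role policy, and rate limit - evaluated in that order.
class AiGatekeeper
  Denied = Struct.new(:reason)

  def initialize(settings:, flags:, permissions:, request_count:)
    @settings = settings            # per-org ai_setting record (nil if absent)
    @flags = flags                  # e.g. { "ai_chat_draft_enabled" => true }
    @permissions = permissions      # permissions held by the current user
    @request_count = request_count  # org requests in the current rolling hour
  end

  # Returns nil when the request may proceed, or a Denied struct naming
  # the first gate that failed.
  def check(feature_flag, required_permission)
    return Denied.new(:ai_not_configured) if @settings.nil?
    return Denied.new(:feature_disabled) unless @flags[feature_flag]
    return Denied.new(:forbidden) unless @permissions.include?(required_permission)
    return Denied.new(:ai_rate_limit_exceeded) if @request_count >= 100
    nil
  end
end
```

Because the flag check sits ahead of everything else, a disabled feature short‑circuits before any context is gathered or any outbound call is attempted.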
View & edit AI settings: admin & owner roles only.
Use an AI feature (when enabled): subject to the role permission the feature requires.
Rotate/clear provider credentials: admins can change provider or clear keys at any time.
2. Data ingress & egress controls
When an AI feature runs, Elker builds a JSON context object for the provider. A small set of non‑identifying fields are always included (codename, status, report type, tags, flow, creation time, field values). Everything else is opt‑in per organisation via toggles on the AI settings page. Content is transient - prompts and responses are not persisted.
| Data class | Sharing | Notes |
| --- | --- | --- |
| Codename, status, type, tags, flow, created‑at | ALWAYS | Non‑identifying context |
| Custom field values | ALWAYS | Core report content |
| Chat messages (max 30, most recent) | OPT‑IN | Enabled by default |
| Case notes | OPT‑IN | Enabled by default |
| Submission responses (form answers) | OPT‑IN | Enabled by default |
| Reporter details (contact info) | OPT‑IN | Disabled by default |
Allow chat messages: include recent correspondent chat history as context.
Allow case notes: include internal case notes & comments.
Allow submission responses: include structured form answers from the initial response.
Allow reporter details: include reporter contact information (off by default).
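A minimal sketch of how a context builder might honour these toggles. The always‑included fields and the 30‑message chat cap come from the description above; the field and toggle names themselves are assumptions:

```ruby
require "json"

# Hypothetical context builder: the always-included, non-identifying fields
# go in unconditionally; everything else is gated by a per-org toggle.
CHAT_HISTORY_CAP = 30

def build_ai_context(report, toggles)
  context = {
    codename:   report[:codename],
    status:     report[:status],
    type:       report[:type],
    tags:       report[:tags],
    flow:       report[:flow],
    created_at: report[:created_at],
    fields:     report[:fields]  # custom field values: always included
  }
  if toggles[:allow_chat_messages]
    # Only the most recent messages, capped at 30
    context[:chat] = report[:chat].last(CHAT_HISTORY_CAP)
  end
  context[:notes]      = report[:notes]      if toggles[:allow_case_notes]
  context[:submission] = report[:submission] if toggles[:allow_submission_responses]
  context[:reporter]   = report[:reporter]   if toggles[:allow_reporter_details]
  JSON.generate(context)  # transient: serialised for the request, never persisted
end
```

A disabled toggle simply omits that source; the rest of the context is built as usual, which is the graceful‑degradation behaviour described later in this page.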
1. Authorise: the user's existing role permissions are evaluated against the target report or analytics view.
2. Gather: a context builder reads only the fields the per‑org toggles permit. Chat history is capped at 30 messages.
3. Send: the context is sent over TLS to the configured provider using the organisation's credentials.
4. Return: the provider's response is parsed and returned to the browser. Prompts and responses are not stored.
5. Act: suggestions are shown to the user. Nothing is written to the report until the user accepts it.
No training, no retention. Elker does not send data to any AI provider for training. When Bedrock is used, model invocation data stays within your AWS account boundary and region.
Schema‑isolated
AI queries read through the same tenant schema as the rest of the platform. One organisation's data cannot be included in another organisation's AI context.
Report‑scoped
Report‑level features (autofill, chat draft) run a Pundit policy check against the specific report before any data is gathered.
Rate‑limited
100 AI requests per organisation per rolling hour. Redis‑backed; trips an AI_RATE_LIMIT_EXCEEDED error that is surfaced to the user.
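The rolling‑hour limit could be sketched like this. A plain in‑memory array stands in for the Redis‑backed store (a per‑org sorted set of timestamps is a common shape for this pattern); all names are assumptions:

```ruby
# Illustrative rolling-window limiter. Each organisation keeps a list of
# request timestamps; entries older than an hour are pruned on every check.
class RollingWindowLimiter
  LIMIT  = 100   # requests per organisation
  WINDOW = 3600  # rolling hour, in seconds

  def initialize
    @timestamps = Hash.new { |h, k| h[k] = [] }  # org_id => request times
  end

  # Returns true if the request is allowed; false maps to the
  # AI_RATE_LIMIT_EXCEEDED error surfaced to the user.
  def allow?(org_id, now = Time.now.to_f)
    window = @timestamps[org_id]
    window.reject! { |t| t <= now - WINDOW }  # drop entries older than an hour
    return false if window.size >= LIMIT
    window << now
    true
  end
end
```

Because the window rolls rather than resetting on the hour, capacity frees up continuously as old requests age out.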
3. Provider selection
Each organisation chooses its own provider. Elker ships with four adapters: AWS Bedrock (the default), direct Anthropic Claude, OpenAI, and any OpenAI‑compatible endpoint (including self‑hosted models behind a gateway URL). The adapter is selected at request time from the per‑org AI settings - there is no global lock‑in to a single vendor.
AWS Bedrock (default)
No API key - authenticates via AWS SDK credentials. Region configurable (defaults to ap‑southeast‑2). Default model: Claude Sonnet 4.
Anthropic Claude (direct)
Uses the Anthropic Messages API. Requires an API key - stored encrypted at rest.
OpenAI
Uses the OpenAI Chat Completions API. Requires an API key - stored encrypted at rest.
OpenAI‑compatible
Any endpoint that implements the OpenAI wire format (Azure OpenAI, self‑hosted vLLM, Groq, LiteLLM, internal gateways). Requires an API key and a custom base URL.
Non‑Bedrock providers are additionally gated by a global ai_custom_llm_enabled flag. This allows Elker to ship an "AWS‑only" posture for regulated environments where only Bedrock is sanctioned for use.
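Request‑time adapter routing, including the global ai_custom_llm_enabled gate, might look like this sketch. The adapter classes and setting keys are hypothetical:

```ruby
# Hypothetical per-request adapter selection from the org's AI settings.
BedrockAdapter   = Struct.new(:region)
AnthropicAdapter = Struct.new(:api_key)
OpenAiAdapter    = Struct.new(:api_key, :base_url)

def adapter_for(setting, custom_llm_enabled:)
  case setting[:provider]
  when "bedrock"
    # No API key: AWS SDK credentials; region defaults to ap-southeast-2
    BedrockAdapter.new(setting[:region] || "ap-southeast-2")
  when "anthropic", "openai", "openai_compatible"
    # Every non-Bedrock provider sits behind the global gate, which is how
    # an "AWS-only" posture is enforced for regulated environments.
    raise "custom providers disabled" unless custom_llm_enabled
    if setting[:provider] == "anthropic"
      AnthropicAdapter.new(setting[:api_key])
    else
      OpenAiAdapter.new(setting[:api_key], setting[:base_url])
    end
  else
    raise "unknown provider"
  end
end
```

Selecting the adapter per request, rather than at boot, is what lets a provider change in the settings page take effect on the very next call.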
| Provider | API key | Base URL | Model override |
| --- | --- | --- | --- |
| Bedrock | - | - | Optional |
| Claude | Required | - | Optional |
| OpenAI | Required | - | Optional |
| OpenAI‑compatible | Required | Required | Recommended |
API keys are encrypted with Rails' built‑in attribute encryption. The settings page shows only whether a key is set - the value itself is never returned to the browser.
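A sketch of the "never return the key" behaviour: the settings payload sent to the browser carries only a presence flag, never the decrypted value. Attribute names here are assumptions:

```ruby
require "json"

# Hypothetical serialiser for the AI settings page: the admin UI only needs
# to know whether a key exists, so the key itself is never serialised.
def ai_settings_payload(setting)
  JSON.generate(
    provider:    setting[:provider],
    region:      setting[:region],
    api_key_set: !(setting[:api_key].nil? || setting[:api_key].empty?)
  )
end
```

Rotating a key is then a write‑only operation: the old value is overwritten or cleared without ever being read back out to the client.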
Data residency
Requests are made to the Bedrock regional endpoint. The default region is ap‑southeast‑2 (Sydney) and is configurable via environment. Your organisation's AI traffic stays within that AWS region.
No key‑management overhead
Elker authenticates with the AWS SDK using the instance role - there is no API key to store, rotate, or leak.
Single contractual surface
Your existing AWS agreement governs model invocation; no separate third‑party AI vendor needs to be onboarded.
4. Disabled states & graceful degradation
Feature flag is off
The AI button or panel is not rendered at all. There is no disabled control for the user to puzzle over.
Flag on, provider not configured
The API returns AI_NOT_CONFIGURED. The settings page surfaces the missing configuration to admins; feature entry points remain hidden.
User lacks role permission
The feature is hidden for that user. Other roles in the same organisation are unaffected.
Data toggle disables a source
Context for that source is simply omitted from the prompt. The feature still runs with whatever data remains permitted.
Rate limit tripped
The request returns AI_RATE_LIMIT_EXCEEDED. No partial result is shown.
Per‑feature flag
Toggle off the specific flag (e.g. ai_chat_draft_enabled) to disable that feature across all organisations immediately.
Per‑organisation credentials
Clear the API key or change the provider in the organisation's AI settings. Takes effect on the next request.
Global custom‑provider gate
Disable ai_custom_llm_enabled to force every organisation onto Bedrock (or off entirely, if Bedrock isn't configured).
All three levers are admin‑configurable. No code change, deploy, or vendor involvement is required to turn AI features on or off, to switch providers, or to tighten the data‑sharing posture.