AI assistant making up CRM data — investigate hallucination source #39
Reference: lhumina_code/hero_biz#39
Overview
The CRM AI surface returns invented contacts / projects / tasks instead of grounding on real OSIS data. Casper hit this; Timur is investigating.
Why
Meeting 2026-05-06: "asked timur to view the issue / AI making up stuff / explained to Casper / Timur helping Casper on blockers".
This is related to but distinct from home#215 (assistant non-functional from missing Groq key). The hallucination here happens even when the assistant responds — the response is fluent but factually wrong.
Likely candidates for the root cause:

1. Context routing: hero_biz → hero_osis requests mis-passing `X-Hero-Context` (cf. the UDS header bug in hero_rpc#42).
2. The AI tool surface may not exist yet: no tool definitions, no function-calling schema, no OSIS calls from the AI layer.
Acceptance
Related
Owner: timur (investigating) + casper (reporting).
Source: meeting notes 2026-05-06.
casper-stevens referenced this issue 2026-05-07 07:42:06 +00:00
The dominant cause is likely candidate 2: the AI tool surface may not exist yet. From reading the code, there appear to be no tool definitions, no function-calling schema, and no OSIS calls from the AI layer. `build_entity_context()` loads the currently-viewed entity and injects it as markdown into the prompt; that's likely the only OSIS data the LLM sees. Anything outside that one entity would then get fabricated.

Candidate 1 (context routing) is likely not the cause here. hero_biz → hero_osis traffic goes over HTTP through hero_router, and that path appears to already pass `X-Hero-Context` correctly. The UDS header bug in hero_rpc#42 is real but likely affects a different code path.

What would unblock this: wire CRM read tools into the assistant (`search_persons`, `search_companies`, `list_tasks`, `get_deal`) with a proper function-calling schema and dispatch in the `assistant_chat()` handler. Adding a grounding guardrail to the system prompt would help as a safety net regardless.
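A minimal sketch of that wiring, assuming an OpenAI-style function-calling API. The `osis` client object and its method names (`osis.search_persons`, `osis.get_deal`, etc.) are hypothetical placeholders for whatever the real OSIS client exposes; only the tool names come from the comment above.

```python
# Hypothetical tool schema for the assistant. Only search_persons is
# spelled out; the other three tools would follow the same pattern.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "search_persons",
            "description": "Search CRM contacts in OSIS by name or email.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    },
    # search_companies, list_tasks, get_deal defined the same way...
]

# Grounding guardrail appended to the system prompt as a safety net.
GROUNDING_RULE = (
    "Only answer from tool results or the provided entity context. "
    "If the data is not there, say so; never invent records."
)

def dispatch_tool(osis, name, args):
    """Route a tool call emitted by the model to the real OSIS client.

    `osis` is an assumed client; every method name here is a placeholder.
    """
    handlers = {
        "search_persons": lambda a: osis.search_persons(a["query"]),
        "search_companies": lambda a: osis.search_companies(a["query"]),
        "list_tasks": lambda a: osis.list_tasks(**a),
        "get_deal": lambda a: osis.get_deal(a["deal_id"]),
    }
    if name not in handlers:
        return {"error": f"unknown tool: {name}"}
    return handlers[name](args)
```

In `assistant_chat()` this would mean passing `TOOLS` to the completion call, looping while the model returns tool calls, and feeding each `dispatch_tool()` result back as a tool message so the final answer is grounded in real OSIS data.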