
Hallucination

Definition

AI hallucination occurs when a large language model generates output that sounds confident and plausible but is factually incorrect, fabricated, or unsupported by the input data. Hallucinations are a fundamental challenge in AI automation because automated workflows can propagate false information downstream without human review.

Reducing Hallucination

JieGou reduces hallucination risk through multiple mechanisms: RAG (grounding responses in your actual documents), structured output schemas (constraining what the model can return), eval quality gates (scoring outputs before they proceed), convergence loops (iterating until quality thresholds are met), and approval gates (human review at critical decision points).
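The eval-gate and convergence-loop mechanisms can be sketched roughly as follows. This is a minimal illustration, not JieGou's actual API: `generate` and `score` are hypothetical placeholders for a model call and an evaluator.

```python
def generate(prompt: str, attempt: int) -> str:
    # Placeholder model call: pretend later attempts are better grounded.
    return f"answer v{attempt}"

def score(answer: str) -> float:
    # Placeholder evaluator: uses the attempt number as a proxy quality score.
    return int(answer.rsplit("v", 1)[1]) / 3

def run_with_gate(prompt: str, threshold: float = 0.9, max_attempts: int = 5):
    """Regenerate until the eval score clears the threshold (convergence loop).

    If no attempt passes the quality gate, the result is flagged for a
    human approval gate instead of flowing downstream automatically.
    """
    answer = ""
    for attempt in range(1, max_attempts + 1):
        answer = generate(prompt, attempt)
        if score(answer) >= threshold:
            return answer, True   # passed the quality gate
    return answer, False          # escalate to human review
```

The key design point is that outputs which never clear the threshold are not silently passed along; they fall through to human review, which is what keeps hallucinations from propagating.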

See for Yourself

Start building your AI automation now with recipes and workflows.