KB article

Why AI Answers Change When Your Data Didn’t

Fluctuating AI answers are usually caused by inconsistent filter context, not by changes in the underlying data.

arf-kb, context-stability, filter-context, context-volatility, relationship-path

TL;DR

  • Same data can yield different answers if context shifts.
  • Stability requires predictable filter paths.

The problem

  • AI answers vary across runs even when data is unchanged.
  • The model can reach the same data through ambiguous context paths, so each run may resolve the question differently.

Why it matters

  • Users lose trust when answers are unstable.
  • Decisions can flip based on hidden context changes.

Symptoms

  • Two similar prompts return different totals.
  • Slightly different filters change the result dramatically.

Root causes

  • Ambiguous relationships or bidirectional filters.
  • Hidden default filters in measures.
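Both root causes can be sketched in plain Python. Below, a hypothetical "total sales" measure is defined twice: once as a plain sum over whatever rows the caller's filter context supplies, and once with a hidden default filter baked in. The table and column names are illustrative, not from any specific model.

```python
# Hypothetical mini fact table (all names and values are illustrative).
sales = [
    {"region": "EU", "status": "Booked",   "amount": 100},
    {"region": "EU", "status": "Canceled", "amount": 40},
    {"region": "US", "status": "Booked",   "amount": 250},
]

def total_sales(rows):
    """Plain measure: sums exactly the rows the filter context passes in."""
    return sum(r["amount"] for r in rows)

def total_sales_hidden_filter(rows):
    """Measure with a hidden default filter: silently drops canceled orders."""
    return sum(r["amount"] for r in rows if r["status"] == "Booked")

print(total_sales(sales))                # 390
print(total_sales_hidden_filter(sales))  # 350 -- same data, different answer
```

The two measures disagree on identical data, which is exactly the symptom users report as "the AI changed its answer": nothing in the data moved, but the context applied to it did.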

What good looks like

  • Deterministic filter paths to facts.
  • Explicit context documented for key measures.

How to fix

  • Map filter paths for each KPI.
  • Reduce ambiguity by simplifying relationships.
  • Add context tests for repeatable queries.
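A context test can be as simple as pinning the filter context explicitly and asserting the KPI is stable across repeated runs. The sketch below uses Python as a stand-in for whatever layer actually executes the query; `apply_filters` and `run_kpi` are hypothetical helpers, not part of any framework.

```python
# Hypothetical context test: make the filter path explicit, then assert
# that repeated runs with the same context return the same total.
sales = [
    {"region": "EU", "status": "Booked", "amount": 100},
    {"region": "EU", "status": "Booked", "amount": 60},
    {"region": "US", "status": "Booked", "amount": 250},
]

def apply_filters(rows, filters):
    """Deterministic filter path: apply each (column, value) pair explicitly."""
    for col, val in filters.items():
        rows = [r for r in rows if r[col] == val]
    return rows

def run_kpi(rows, filters):
    return sum(r["amount"] for r in apply_filters(rows, filters))

def test_total_sales_is_stable():
    filters = {"region": "EU"}
    first = run_kpi(sales, filters)
    # Same explicit context, same answer, every run.
    assert all(run_kpi(sales, filters) == first for _ in range(5))

test_total_sales_is_stable()
```

The value of the test is less the assertion itself than the discipline it forces: the filter context for the KPI must be written down before it can be pinned.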

Pitfalls

  • Using bidirectional filtering as a shortcut.
  • Ignoring role-playing dimensions (e.g., one Date table that can filter a fact as either order date or ship date).
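The role-playing-dimension pitfall is easy to see in miniature: one date filter can travel to the fact table along more than one relationship, and the answer depends on which path is taken. The sketch below uses hypothetical column names; the `date_role` argument is what makes the path explicit.

```python
# Hypothetical fact table with two date roles per row.
orders = [
    {"order_date": "2024-01", "ship_date": "2024-02", "amount": 100},
    {"order_date": "2024-02", "ship_date": "2024-03", "amount": 200},
]

def total_for_month(rows, month, date_role):
    """date_role ('order_date' or 'ship_date') pins the relationship path."""
    return sum(r["amount"] for r in rows if r[date_role] == month)

print(total_for_month(orders, "2024-02", "order_date"))  # 200
print(total_for_month(orders, "2024-02", "ship_date"))   # 100
```

A prompt like "sales for February" is ambiguous here; unless the model or the measure fixes the role, either 200 or 100 is a defensible answer, and users will experience the difference as instability.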

Checklist

  • Filter paths documented for top KPIs.
  • Context tests created and reviewed.
  • Ambiguous relationships resolved.

Framework placement

Primary ARF layer: Context Stability. Diagnostic bridge: data-movement-reliability, semantic-reliability, execution-reliability.