This paper introduces Goal-Relative Adaptive Graph Retrieval (GRAG), a retrieval architecture designed for autonomous goal-directed agents. Standard retrieval-augmented generation (RAG) systems retrieve documents based solely on semantic similarity to the current query. GRAG reframes retrieval as navigation through knowledge space toward a goal, introducing a directional signal D = G − C (Goal minus Current State) that steers knowledge access toward task completion rather than semantic proximity. GRAG augments traditional retrieval with three mechanisms: (1) Goal-Relative Directional Retrieval, guided by the vector direction between the agent's current state and its objective; (2) Adaptive Retrieval Blending, which dynamically weights semantic similarity against goal-directed retrieval based on task progress; and (3) Experience-Weighted Graph Memory, a SQLite-backed graph structure that captures and reinforces successful retrieval trajectories using logistic edge reinforcement and multi-tier temporal forgetting. Evaluation on a 100-case benchmark over a corpus of five AI research papers with deliberate semantic overlap yields a conditional performance profile. GRAG achieves a +0.202 nDCG@5 lift on goal-disambiguation tasks and +0.054 on distractor-noise resistance. However, GRAG registers a −0.086 regression on multi-hop tasks, revealing a structural tension between directional steering and lateral document traversal, a phenomenon we term retrieval tunnel vision. This conditional profile characterizes both the utility and the current limitations of goal-directed retrieval, and it motivates a routing-based extension as the primary direction for future work. GRAG is fully implemented within the Thoth autonomous agent platform, developed by Professor Hawkeinstein's Educational Foundation. Questions may be emailed to stevemeierotto@gmail.com.
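The two core mechanisms named above, directional scoring along D = G − C with progress-based blending, and logistic edge reinforcement for the graph memory, can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's implementation: the function names (`grag_score`, `reinforce_edge`), the linear blending schedule `gamma = 1 - progress`, and the logit-nudge reinforcement rate are all assumptions introduced here for clarity.

```python
import numpy as np

def unit(v):
    """Normalize a vector to unit length (guarding against zero norm)."""
    return v / (np.linalg.norm(v) + 1e-12)

def grag_score(doc, current, goal, progress):
    """Hypothetical GRAG-style score: blend cosine similarity to the
    current state with alignment to the direction D = G - C.

    `progress` in [0, 1] is assumed here to shift weight toward the
    directional term early in the task and back toward plain semantic
    similarity as the agent nears its goal.
    """
    direction = unit(goal - current)            # D = G - C, normalized
    semantic = float(unit(doc) @ unit(current)) # standard RAG-style cosine term
    directional = float(unit(doc) @ direction)  # goal-relative alignment term
    gamma = 1.0 - progress                      # assumed blending schedule
    return (1.0 - gamma) * semantic + gamma * directional

def reinforce_edge(weight, rate=0.5):
    """Hypothetical logistic edge reinforcement: nudge the edge's logit
    upward so repeated successes saturate toward (but never reach) 1.0."""
    weight = min(max(weight, 1e-6), 1.0 - 1e-6)  # keep logit finite
    logit = np.log(weight / (1.0 - weight))
    return float(1.0 / (1.0 + np.exp(-(logit + rate))))
```

Under this sketch, a document lying along the goal direction dominates early retrieval (progress near 0), while near-goal retrieval (progress near 1) reduces to ordinary semantic similarity; repeated `reinforce_edge` calls monotonically strengthen a trajectory edge without ever exceeding a weight of 1.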