Arivu’s intelligent query resolution uses a LangGraph pipeline that orchestrates schema ingestion, LLM execution, error verification, and memory logging. The system is designed for granular observability and deterministic human feedback.
The pipeline is the core engine that transforms natural language into secure, verified SQL operations.
Pipeline Lifecycle
Query Intake
Capture raw user intent with metadata context (database engines, session memory, user context).
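A sketch of what an intake record might hold; the class and field names below are illustrative, not Arivu's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class QueryIntake:
    raw_query: str                                       # natural-language user intent
    engine: str = "postgresql"                           # target database engine
    session_memory: list = field(default_factory=list)   # prior conversation turns
    user_context: dict = field(default_factory=dict)     # e.g. roles, tenant, user id

intake = QueryIntake(
    raw_query="Show me last month's top 10 customers by revenue",
    user_context={"user_id": "u-42"},
)
print(intake.engine)  # postgresql
```

Capturing the engine and session memory up front lets later nodes generate dialect-appropriate SQL without re-asking the user.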
Verification
Evaluate generated SQL for:
- Syntax validity - is the generated SQL well-formed?
- Safety checks - block destructive operations (DROP, DELETE, etc.)
- Schema validation - prevent references to hallucinated columns and tables
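The checks above can be sketched as a simple verification pass. `verify_sql`, the `DESTRUCTIVE` set, and the regex-based table extraction are illustrative simplifications; a real implementation would use a proper SQL parser:

```python
import re

# Keywords whose statements are blocked outright (illustrative list).
DESTRUCTIVE = {"DROP", "DELETE", "TRUNCATE", "ALTER"}

def verify_sql(sql: str, known_tables: set) -> list:
    """Return a list of verification errors; empty means the SQL passed."""
    errors = []
    # Safety check: refuse destructive statements.
    first_word = sql.strip().split()[0].upper() if sql.strip() else ""
    if first_word in DESTRUCTIVE:
        errors.append(f"destructive operation blocked: {first_word}")
    # Schema validation: every referenced table must exist.
    for table in re.findall(r"(?i)\bFROM\s+(\w+)", sql):
        if table.lower() not in known_tables:
            errors.append(f"unknown table: {table}")
    return errors

print(verify_sql("DROP TABLE users", {"users"}))
# ['destructive operation blocked: DROP']
```

An empty result list signals that the candidate SQL can proceed to execution; any entry routes the state back for correction.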
Graph Inspection
Arivu compiles the pipeline into a LangGraph for transparent execution tracking, so it can be inspected locally.

Key Concepts

State Management
The pipeline maintains a shared state dictionary that flows through each node:
- User query and session context
- Generated SQL candidates
- Verification results
- Final response and metadata
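The shared-state flow can be sketched as plain functions threading one dictionary through the pipeline in node order. The node names and state keys here are illustrative, not Arivu's actual ones:

```python
# Each node receives the shared state, enriches it, and passes it on.
def generate_sql(state: dict) -> dict:
    state["sql_candidates"] = ["SELECT name FROM customers LIMIT 10"]
    return state

def verify(state: dict) -> dict:
    state["verification"] = {"passed": True, "errors": []}
    return state

def respond(state: dict) -> dict:
    state["response"] = state["sql_candidates"][0]  # final response + metadata
    return state

PIPELINE = [generate_sql, verify, respond]

state = {"query": "top 10 customers", "session": {"user_id": "u-42"}}
for node in PIPELINE:
    state = node(state)

print(state["response"])  # SELECT name FROM customers LIMIT 10
```

Printing the `PIPELINE` list (or, with a real compiled LangGraph, its nodes and edges) is the simplest form of the local inspection described under Graph Inspection.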
Error Handling
When verification fails, the pipeline:
- Captures the error
- Provides feedback to the LLM
- Automatically attempts correction
- Re-verifies the corrected SQL
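A minimal sketch of this generate-verify-correct loop, with toy stand-ins for the LLM and the verifier (function names and the retry limit are illustrative):

```python
def run_with_correction(query, generate, verify, max_retries=2):
    """Generate SQL; on verification failure, feed the error back and retry."""
    feedback = None
    for _ in range(max_retries + 1):
        sql = generate(query, feedback)   # feedback carries the captured error
        error = verify(sql)               # None means the SQL passed
        if error is None:
            return sql                    # corrected SQL re-verified successfully
        feedback = error                  # provide the error to the LLM and retry
    raise RuntimeError(f"could not produce valid SQL: {feedback}")

# Toy stand-ins: the first attempt contains a typo that the "LLM" fixes
# once it sees the verifier's feedback.
def fake_generate(query, feedback):
    return "SELECT * FROM orders" if feedback else "SELECT * FORM orders"

def fake_verify(sql):
    return "syntax error near 'FORM'" if " FORM " in sql else None

print(run_with_correction("all orders", fake_generate, fake_verify))
# SELECT * FROM orders
```

Bounding the retries keeps a persistently failing query from looping forever; the final error surfaces to the caller instead.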
Observability
Every pipeline execution is tracked in your memory backend:
- Timing metrics for each node
- Token usage for LLM calls
- Full state history for debugging
- User session context preservation
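Per-node tracking of this kind can be sketched with a decorator, assuming a simple in-memory list in place of a real memory backend (names are illustrative):

```python
import time

LOG = []  # stand-in for the memory backend

def observe(node_name):
    """Decorator recording per-node timing and a state snapshot."""
    def wrap(fn):
        def inner(state):
            start = time.perf_counter()
            out = fn(state)
            LOG.append({
                "node": node_name,
                "seconds": time.perf_counter() - start,   # timing metric
                "state_keys": sorted(out.keys()),         # state history for debugging
            })
            return out
        return inner
    return wrap

@observe("generate_sql")
def generate_sql(state):
    state["sql"] = "SELECT 1"
    return state

generate_sql({"query": "ping", "session_id": "s-1"})
print(LOG[0]["state_keys"])  # ['query', 'session_id', 'sql']
```

Token usage would be appended to the same record by reading the LLM response metadata inside the instrumented node.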

