Documentation Index
Fetch the complete documentation index at: https://arivu.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.

Arivu’s Memory backends store a serialized graph state of the Text-to-SQL logic against a session_id.
When working with the pipeline, passing a session_id to the invoke function automatically fetches, uses, and updates the associated conversational thread.
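The behavior above can be sketched with a minimal, self-contained example. Note that the `InMemoryStore` class and the `invoke` signature below are illustrative assumptions for demonstration, not Arivu's actual API:

```python
from typing import Dict, List

# Illustrative sketch only: this mimics how a session_id keys a
# conversational thread; Arivu's real pipeline classes are not shown here.
class InMemoryStore:
    def __init__(self) -> None:
        self._threads: Dict[str, List[str]] = {}

    def fetch(self, session_id: str) -> List[str]:
        # Fetch the existing thread for this session, or start a new one.
        return self._threads.setdefault(session_id, [])

def invoke(store: InMemoryStore, session_id: str, question: str) -> str:
    # 1. Fetch the thread associated with this session_id.
    thread = store.fetch(session_id)
    # 2. Use the history as context (here, just count prior turns).
    answer = f"SQL for {question!r} (context: {len(thread)} prior turns)"
    # 3. Update the thread so the next call sees this exchange.
    thread.extend([question, answer])
    return answer

store = InMemoryStore()
invoke(store, "sess-1", "total sales by region")
print(invoke(store, "sess-1", "only for 2024"))  # sees 2 prior turns
```

Because the thread is keyed by session_id, two different sessions never see each other's history.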
Managing Sessions
You can directly interact with the Memory Store API to fetch history, manage sessions, or build custom dashboard views.

- SQLite Backend
- Redis Backend
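As a rough illustration of what fetching history and managing sessions against the SQLite backend might look like, here is a self-contained sketch; the schema, class, and method names are assumptions, not Arivu's actual Memory Store API:

```python
import sqlite3

# Hypothetical SQLite-backed session store, for illustration only.
class SQLiteSessionStore:
    def __init__(self, path: str = ":memory:") -> None:
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            "session_id TEXT, role TEXT, content TEXT)"
        )

    def append(self, session_id: str, role: str, content: str) -> None:
        # Persist one message under the given session.
        self.conn.execute(
            "INSERT INTO messages VALUES (?, ?, ?)",
            (session_id, role, content),
        )

    def history(self, session_id: str) -> list:
        # Fetch the full thread for a session, e.g. for a dashboard view.
        cur = self.conn.execute(
            "SELECT role, content FROM messages WHERE session_id = ?",
            (session_id,),
        )
        return cur.fetchall()

    def delete_session(self, session_id: str) -> None:
        # Remove a session's history entirely.
        self.conn.execute(
            "DELETE FROM messages WHERE session_id = ?", (session_id,)
        )

store = SQLiteSessionStore()
store.append("sess-1", "user", "show active users")
store.append("sess-1", "assistant", "SELECT * FROM users WHERE active = 1")
print(store.history("sess-1"))
store.delete_session("sess-1")
```

A Redis backend would expose the same operations, trading the single-file database for a networked key-value store.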
Context Window Management
LLMs have finite context windows. Arivu automatically handles this by intelligently pruning older, less-relevant messages when the state grows too large.
How Context Truncation Works
- Automatic pruning - Oldest messages are removed first
- Intelligent selection - Preserves semantically important interactions
- Recent emphasis - Always keeps the latest 5-10 messages
- No data loss - Full history still available in database
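The core of this policy can be sketched in a few lines. The window size below is illustrative, and Arivu's semantic-importance scoring is not reproduced; this only shows the recency rule and the no-data-loss property:

```python
# Minimal sketch: keep the most recent messages in the LLM context while
# the full history stays untouched in storage.
def build_context(full_history: list, keep_recent: int = 10) -> list:
    """Return the slice of history passed to the LLM. Oldest messages are
    pruned first; full_history itself is never modified (no data loss)."""
    if len(full_history) <= keep_recent:
        return list(full_history)
    return full_history[-keep_recent:]

history = [f"msg-{i}" for i in range(25)]
context = build_context(history, keep_recent=10)
print(len(history), len(context))  # 25 10
```

In the real pipeline, semantically important older turns would also be retained alongside the recent window, rather than pruning purely by age.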
Best Practices
- Use short session IDs for efficiency
- Periodically archive completed sessions
- Monitor memory usage for long-running sessions
- Clear old sessions regularly to maintain performance
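Clearing old sessions can be as simple as dropping any session idle past a cutoff. The sketch below is illustrative; the metadata shape and the 30-day cutoff are assumptions, not Arivu defaults:

```python
import time

# Hypothetical housekeeping helper: prune sessions by last-used timestamp.
def clear_stale_sessions(last_used: dict, max_age_seconds: float) -> dict:
    """Drop sessions idle longer than max_age_seconds; return survivors."""
    now = time.time()
    return {
        sid: ts for sid, ts in last_used.items()
        if now - ts <= max_age_seconds
    }

sessions = {
    "fresh": time.time(),
    "stale": time.time() - 90 * 24 * 3600,  # idle for ~90 days
}
kept = clear_stale_sessions(sessions, max_age_seconds=30 * 24 * 3600)
print(sorted(kept))  # ['fresh']
```

Running a job like this on a schedule keeps long-lived deployments from accumulating dead sessions.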

