Documentation Index
Fetch the complete documentation index at: https://arivu.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
Installation
The Arivu MCP server requires the mcp optional dependency extra.
pip install "arivu-ai[mcp]"
The MCP server runs inside your Python environment. For best results, install it in the same virtual environment from which you launch Claude Desktop or Cursor.
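To confirm the install succeeded, you can check that the relevant modules are importable from the active environment. This is a minimal sketch; the module names `arivu` and `mcp` are assumed from the package name and extra above, not confirmed by the package itself.

```python
import importlib.util


def has_module(name: str) -> bool:
    """Return True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None


# After `pip install "arivu-ai[mcp]"`, both of these should report
# "available" (module names assumed):
for mod in ("arivu", "mcp"):
    print(mod, "available" if has_module(mod) else "missing")
```

Running this from the wrong virtual environment is the most common cause of "missing" results.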
Global Configuration
The Arivu MCP server does not take arguments programmatically; instead, it is configured through environment variables passed to it by the parent MCP client.
Core Variables
# Common configuration applied across all dialects
ARIVU_DB_DIALECT="postgresql" # postgresql, mysql, sqlite, databricks, snowflake
ARIVU_DB_MODE="user" # Safety guardrail (admin bypasses approval restrictions)
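Because a bad value here only surfaces once the client launches the server, it can help to sanity-check the core variables up front. The sketch below is illustrative: the allowed dialect and mode values are taken from the comments above, but the `check_core_env` helper is not part of Arivu.

```python
# Hypothetical pre-flight check for the core Arivu variables.
SUPPORTED_DIALECTS = {"postgresql", "mysql", "sqlite", "databricks", "snowflake"}
SUPPORTED_MODES = {"user", "admin"}


def check_core_env(env: dict) -> list[str]:
    """Return a list of problems found in the core variables (empty if OK)."""
    problems = []
    dialect = env.get("ARIVU_DB_DIALECT")
    if dialect not in SUPPORTED_DIALECTS:
        problems.append(f"unsupported dialect: {dialect!r}")
    # Defaulting to "user" here is an assumption, chosen as the safer mode.
    mode = env.get("ARIVU_DB_MODE", "user")
    if mode not in SUPPORTED_MODES:
        problems.append(f"unsupported mode: {mode!r}")
    return problems


print(check_core_env({"ARIVU_DB_DIALECT": "postgresql", "ARIVU_DB_MODE": "user"}))
```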
Dialect-Specific Configurations
PostgreSQL / MySQL
Snowflake
Databricks
Standard RDBMS parameters:
ARIVU_DB_HOST="localhost"
ARIVU_DB_PORT="5432"
ARIVU_DB_USER="postgres"
ARIVU_DB_PASSWORD="supersecretpassword"
ARIVU_DB_NAME="analytics"
Requires warehouse routing context:
ARIVU_SNOWFLAKE_ACCOUNT="xy12345.us-east-1"
ARIVU_SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
ARIVU_DB_USER="admin"
ARIVU_DB_PASSWORD="mypassword"
ARIVU_DB_NAME="SNOW_DATA"
Requires Databricks catalog tracking:
ARIVU_DB_HOST="adb-123.azuredatabricks.net"
ARIVU_DATABRICKS_HTTP_PATH="/sql/1.0/endpoints/abc123"
ARIVU_DATABRICKS_TOKEN="dapi..."
ARIVU_DATABRICKS_CATALOG="main"
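Each dialect expects a different set of variables, so a quick way to catch a misconfigured client entry is to diff the environment against the lists above. The variable names below are taken from this page; the per-dialect grouping and the `missing_vars` helper are an illustration, not the server's own validation logic.

```python
# Required variables per dialect (names from the docs; grouping assumed).
REQUIRED = {
    "postgresql": ["ARIVU_DB_HOST", "ARIVU_DB_PORT", "ARIVU_DB_USER",
                   "ARIVU_DB_PASSWORD", "ARIVU_DB_NAME"],
    "snowflake": ["ARIVU_SNOWFLAKE_ACCOUNT", "ARIVU_SNOWFLAKE_WAREHOUSE",
                  "ARIVU_DB_USER", "ARIVU_DB_PASSWORD", "ARIVU_DB_NAME"],
    "databricks": ["ARIVU_DB_HOST", "ARIVU_DATABRICKS_HTTP_PATH",
                   "ARIVU_DATABRICKS_TOKEN", "ARIVU_DATABRICKS_CATALOG"],
}
# MySQL uses the same standard RDBMS parameters as PostgreSQL.
REQUIRED["mysql"] = REQUIRED["postgresql"]


def missing_vars(dialect: str, env: dict) -> list[str]:
    """Return required variables for the dialect that are unset or empty."""
    return [v for v in REQUIRED.get(dialect, []) if not env.get(v)]


print(missing_vars("snowflake", {"ARIVU_DB_USER": "admin"}))
```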
Server Boot Modes
Arivu supports both standard transport layers outlined in the MCP specification.
Stdio Transport (Local)
The standard mechanism for desktop integrations like Cursor. The server reads requests on stdin and writes responses to stdout.
SSE HTTP Transport (Remote)
Used when you are hosting the MCP server on a remote machine, such as a VPC droplet.
python -m arivu.mcp --transport http --port 8080