Privacy-first document processing with complete data control. Lightweight, fast, and 100% offline capable. Your data never leaves your machine.
Advanced RAG capabilities with MCP tools integration and multi-provider LLM support.
Complete RAG pipeline with minimal resource requirements:
- Document formats: PDF, Text, Markdown, JSON, HTML
- Chunking strategies: Sentence, Paragraph, Token-based
- Vector storage: SQLite with the sqvect extension
- Hybrid search: Vector + Keyword with RRF (Reciprocal Rank Fusion)
Comprehensive RAG capabilities with enterprise features:
- Complete document processing and Q&A system
- High-performance SQLite-based vector storage
- Flexible provider switching via configuration
- Comprehensive TOML-based configuration
- Model Context Protocol (MCP) for tool integration
- Complete agent workflow system with planning and execution
- Cron-based scheduling for automated operations
- Direct LLM chat with optional MCP tool integration
- Comprehensive data import, export, and management
- Complete RESTful API for all operations
- Use RAGO as a library in your Go applications
- Comprehensive command-line interface
# Ingest documents recursively
$ rago ingest ./docs --recursive
# Query with source attribution
$ rago query "What is the main concept?" --show-sources
# Interactive query mode
$ rago query -i
# Query with metadata filters
$ rago query "deployment options" --filter type=documentation
# Auto-run: Plan and execute automatically
$ rago agent auto-run "analyze project structure and create documentation"
# Step-by-step approach
$ rago agent plan "monitor system performance daily"
$ rago agent run-plan plan_id_12345
# Create reusable agents
$ rago agent create research-agent "Weekly market research automation"
$ rago agent execute research-agent
# Workflow templates
$ rago agent templates
$ rago agent generate-template monitoring --type system-monitoring
# Plan management
$ rago agent list-plans
$ rago agent plan-status plan_id_12345
# Direct LLM chat
$ rago chat "Explain the concept of RAG systems"
# MCP-enhanced chat with tool access
$ rago mcp chat --interactive
# MCP server management
$ rago mcp status
$ rago mcp start filesystem
$ rago mcp list
# Direct tool calls
$ rago mcp call filesystem read_file '{"path": "README.md"}'
$ rago mcp call sqlite query '{"database": "data.db", "query": "SELECT * FROM users"}'
# Data management
$ rago rag export --format json --output backup.json
$ rago rag import --file backup.json
$ rago rag list --detailed
$ rago rag reset --confirm
# Start API server with UI
$ rago serve --port 7127 --ui
# RAG Operations
curl -X POST http://localhost:7127/api/ingest \
-H "Content-Type: application/json" \
-d '{"file_path": "document.pdf"}'
curl -X POST http://localhost:7127/api/query \
-H "Content-Type: application/json" \
-d '{"query": "summarize", "top_k": 5, "show_sources": true}'
# Agent Operations
curl -X POST http://localhost:7127/api/agents \
-H "Content-Type: application/json" \
-d '{"name": "research-agent", "description": "Market research automation"}'
curl -X POST http://localhost:7127/api/agents/research-agent/execute \
-H "Content-Type: application/json" \
-d '{"goal": "analyze competitor pricing"}'
# MCP Tool Calls
curl -X POST http://localhost:7127/api/mcp/tools/call \
-H "Content-Type: application/json" \
-d '{"tool": "filesystem", "action": "read_file", "params": {"path": "data.txt"}}'
# Health and Status
curl http://localhost:7127/api/health
curl http://localhost:7127/api/status
01_basic_client - Client initialization methods
02_llm_operations - LLM generation, streaming, chat
03_rag_operations - Document ingestion, queries, search
04_mcp_tools - MCP tool integration
05_agent_automation - Task scheduling & workflows
06_complete_platform - Full integration demo

package main
import (
"context"
"fmt"
"log"
"github.com/liliang-cn/rago/v2/client"
)
func main() {
ctx := context.Background()
// Simplified client API - only 2 entry points!
c, err := client.New("rago.toml") // Or empty string for defaults
if err != nil {
log.Fatal(err)
}
defer c.Close()
// LLM operations with wrapper
if c.LLM != nil {
response, _ := c.LLM.Generate("Write a haiku about coding")
fmt.Println(response)
// Streaming
c.LLM.Stream(ctx, "Tell a story", func(chunk string) {
fmt.Print(chunk)
})
}
// RAG operations with wrapper
if c.RAG != nil {
// Ingest documents
c.RAG.Ingest("Your document content here")
// Query with sources
resp, _ := c.RAG.QueryWithOptions(ctx, "What are the key findings?",
&client.QueryOptions{TopK: 5, ShowSources: true})
fmt.Printf("Answer: %s\n", resp.Answer)
for _, source := range resp.Sources {
fmt.Printf("Source: %s (score: %.2f)\n", source.Source, source.Score)
}
}
// MCP tools with wrapper
if c.Tools != nil {
tools, _ := c.Tools.List()
fmt.Printf("Available tools: %d\n", len(tools))
result, _ := c.Tools.Call(ctx, "filesystem_read",
map[string]interface{}{"path": "README.md"})
fmt.Printf("Tool result: %v\n", result)
}
// Agent automation with wrapper
if c.Agent != nil {
result, _ := c.Agent.Run("Summarize recent changes")
fmt.Printf("Agent result: %v\n", result)
}
// Direct BaseClient methods also available
resp, _ := c.Query(ctx, client.QueryRequest{
Query: "test query",
TopK: 3,
})
fmt.Printf("Direct query: %s\n", resp.Answer)
}
Multiple installation methods to suit your needs
# Install with go install (Go 1.21+)
go install github.com/liliang-cn/rago/v2/cmd/rago-cli@latest
# Initialize configuration
rago init
# Start using
rago ingest documents/
rago query "your question?"
# Clone repository
git clone https://github.com/liliang-cn/rago
cd rago
# Build with make
make build
# Or build directly
go build -o rago ./cmd/rago-cli
# Option 1: Ollama (recommended for local)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull qwen3
# Option 2: LM Studio
# Download from https://lmstudio.ai
# Configure provider in rago.toml
[providers]
default_llm = "ollama"
default_embedder = "ollama"
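Switching providers is a matter of editing the same [providers] table; only the two keys shown above are taken from this page, and the values below are illustrative, so check the rago.toml generated by `rago init` for the exact options:

```toml
[providers]
default_llm = "lmstudio"    # switched from "ollama" (illustrative value)
default_embedder = "ollama" # LLM and embedder providers can differ
```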