Getting Started

GZOO Cortex is a local-first knowledge graph for developers. It monitors your project files, extracts meaningful entities using LLMs, and lets you query your codebase knowledge with natural language.

Quick Start

$ npm install -g @gzoo/cortex
$ cortex init
$ cortex projects add myapp ./src
$ cortex watch

The cortex init command walks you through configuring your LLM provider and routing mode interactively.

Installation

Prerequisites

  • Node.js 18 or later
  • npm, yarn, or pnpm
  • An LLM API key (Anthropic, Google, or OpenAI) — or Ollama for local-only mode

Global Install (Recommended)

$ npm install -g @gzoo/cortex

Using npx

$ npx @gzoo/cortex init

From Source

$ git clone https://github.com/gzoonet/cortex.git
$ cd cortex
$ npm install
$ npm run build

Configuration

Cortex stores its configuration in ~/.cortex/config.json. Run cortex init to generate it interactively, or create it manually.
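For manual setup, a minimal config might look like the following sketch. The field names here (provider, routingMode, storageDir) are illustrative assumptions; run cortex init to see the exact schema your version generates.

```json
// ~/.cortex/config.json (illustrative — field names may differ by version)
{
  "provider": "anthropic",
  "routingMode": "hybrid",
  "storageDir": "~/.cortex/data"
}
```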

Routing Modes

  • cloud-first: Best quality. All processing via cloud LLMs. Variable costs.
  • hybrid: Cloud for extraction, local for embeddings. Reduced costs.
  • local-first: Local by default, cloud only when needed. Minimal costs.
  • local-only: Everything runs on Ollama. Free, fully private.

Environment Variables

ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_AI_API_KEY=...
OPENAI_API_KEY=sk-...
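Variables set in a terminal only last for that session. To make a key available to cortex in every shell, export it (typically from your shell profile, e.g. ~/.bashrc or ~/.zshrc); the key value below is elided and must be replaced with your own:

```shell
# Export the provider key so child processes (like cortex) can read it.
# Replace the elided value with your real key.
export ANTHROPIC_API_KEY="sk-ant-..."

# Confirm the variable is visible in the environment.
printenv ANTHROPIC_API_KEY
```

Only the key for your configured provider is required; the others can be omitted.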

CLI Reference

cortex init

Interactive configuration wizard. Sets up LLM provider, routing mode, and storage location.

cortex projects add <name> <path>

Add a project to monitor. Use the --restricted flag for sensitive projects.

cortex projects list

List all registered projects with file counts and status.

cortex watch

Start watching all projects for file changes. Processes updates in real time.

cortex ingest --project <name>

Manually trigger ingestion for a specific project.

cortex query "<question>"

Query the knowledge graph with natural language. Returns answers with source citations.

cortex contradictions

Show detected contradictions across projects. Use the --severity flag to filter.

cortex serve

Start the web dashboard on port 3710 (configurable with --port).

cortex mcp start

Start the MCP server for Claude Code integration.

cortex status

Show overall status: projects, entities, relationships, and storage usage.

MCP Integration

Cortex includes a built-in MCP (Model Context Protocol) server that integrates directly with Claude Code and other MCP-compatible tools. This gives your AI assistant deep knowledge about your codebase.

Setup with Claude Code

Add Cortex to your Claude Code MCP configuration:

// ~/.claude/mcp.json
{
  "mcpServers": {
    "cortex": {
      "command": "cortex",
      "args": ["mcp", "start"]
    }
  }
}
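If you want the configuration scoped to a single repository rather than your user account, Claude Code also reads a project-level .mcp.json from the project root, which can be checked into version control and shared with your team. A sketch using the same server entry:

```json
// <project root>/.mcp.json
{
  "mcpServers": {
    "cortex": {
      "command": "cortex",
      "args": ["mcp", "start"]
    }
  }
}
```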

Available MCP Tools

  • cortex_query: Query the knowledge graph with natural language
  • cortex_relationships: Get relationships for a specific entity
  • cortex_entities: List entities matching a filter
  • cortex_contradictions: Show detected contradictions
  • cortex_status: Get current Cortex status and statistics
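Under the hood, MCP clients invoke these tools with a JSON-RPC tools/call request, per the Model Context Protocol specification. The request shape below follows that spec; the argument name ("question") for cortex_query is an assumption for illustration:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "cortex_query",
    "arguments": { "question": "Which modules depend on the auth service?" }
  }
}
```

You normally never write these requests by hand; Claude Code issues them automatically once the server is registered.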

Web Dashboard

Cortex includes a web-based dashboard for visualizing your knowledge graph, exploring queries, and monitoring activity in real time.

$ cortex serve
Dashboard running at http://localhost:3710

Dashboard Features

  • Interactive D3-powered knowledge graph visualization
  • Real-time activity feed via WebSocket
  • Query explorer with streaming responses
  • Project management (add, remove, configure)
  • Entity and relationship browser
  • Contradiction reports with severity levels