Drop in papers, documents, notes, and supported media. OpenCairn builds a workspace wiki, surfaces source-backed links, and keeps AI changes reviewable as workflow surfaces mature.
Reviewable AI actions included · Developer? Self-hosting guide ↗
Instead of one giant prompt, narrow-responsibility steps parse, embed, draft, link, and serve the knowledge base. Answers and graph surfaces carry source chips and evidence panels.
PDFs, common documents, Markdown/CSV ZIP exports, Google Drive file-ID imports, and supported media. OCR and media handling depend on provider capability and deployment settings.
Text, table structure, and OCR output are normalized to Markdown. Media transcription runs through configured provider paths.
Content is mapped into vector space, where similar concepts end up close together.
The compile workflow drafts wiki pages per topic.
Backlinks, opposing views, and causal edges are linked into a knowledge graph.
Q&A · review cards · Socratic tutor · document-generation surfaces.
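The embedding step in this pipeline can be illustrated with a toy sketch: notes that cover similar content get vectors that score high on cosine similarity. The note names and 4-dimensional vectors below are invented for illustration; real embeddings come from the configured provider at much higher dimension.

```typescript
// Cosine similarity: 1.0 means "pointing the same way", ~0 means unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical embeddings for three note titles (made-up values).
const attention = [0.9, 0.1, 0.0, 0.2];
const transformer = [0.8, 0.2, 0.1, 0.3];
const recipe = [0.0, 0.9, 0.1, 0.0];

cosineSimilarity(attention, transformer); // high: related concepts sit close
cosineSimilarity(attention, recipe);      // low: unrelated notes sit far apart
```

Retrieval then reduces to "find the stored vectors nearest to the query vector", which is what makes source-backed linking and Q&A possible over the same index.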
AI answers carry source chips, while graph, card, and mindmap surfaces expose evidence panels. Sentence-level runtime verification is not yet enforced across every writer path.
The current product exposes workflow-backed jobs, action ledgers, note actions, generated files, and staged plan/code surfaces. The public roadmap and feature registry remain the source of truth for what is active.
Uploads and imports become source notes, normalized text, and retrievable project context.
Chat and retrieval surfaces answer from scoped workspace context and carry source evidence.
Create, rename, move, delete, restore, and update flows run through preview-and-apply contracts.
Project-file and document-generation surfaces keep outputs traceable to workflow runs.
Workflow activity, recovery state, and approvals remain visible instead of disappearing into background jobs.
Planning, code workspace, and execution-loop features are staged behind product contracts and feature flags.
One dataset, five ways to see it. Switch to whichever view fits the moment. Every switch is logged and reversible.
Explore the full network at a glance
Radiate outward from a central concept
Search, filter, and spaced-repetition review
Changes across time
Collect related cards in one working board
Wiki pages aren't just text — they're the meeting point of sources, teammates, and reviewable AI work. Every page carries its citations, editors, and workflow history with it.
Attention weights the tokens of an input sequence by relevance. Vaswani et al. (2017), "Attention Is All You Need", made it the foundation of the transformer architecture and introduced self-attention and multi-head attention.
Related pages: Transformer, Scaled Dot-Product, Linear Attention.
Data sovereignty · living knowledge · one throughline to learning · open source. Tools that keep one or two of these are common. Keeping all four at once — that's OpenCairn.
Text, vectors, and original files all live in your infrastructure. With an Ollama-based local LLM configuration, model calls do not leave your deployment. It can also be run without external connectors.
AI actions can draft pages, propose updates, and link them into the existing wiki. You review and edit before important changes land, so the hours once spent copying by hand go away.
Q&A · graph surfaces · review cards · Socratic dialogue · document-generation surfaces — all fed from the same wiki. No tool-hopping.
Default is AGPLv3 — fork, modify, deploy yourself whenever you want. A separate commercial license is available for organizations that can't take an AGPL component. No risk of the service disappearing.
Scales from an individual to a lab to a regulated enterprise, without swapping tools.
Papers and documents become one woven wiki. Explore relationships in the concept graph, ask grounded questions, and self-check before exams with AI quizzes. Citations lead back to your original notes.
Run it with role-based permissions and live collaboration. Keep meeting notes and research sources in one project wiki, so knowledge stays behind when people move on.
Designed with finance, healthcare, government, and universities in mind — places where external SaaS can be restricted. Fully-local LLM paths and commercial licensing can support stricter deployment policies.
Docker Compose brings up the database, file storage, and workflow engine. After installing dependencies and running migrations, start the app processes; a profile flag enables fully-local Ollama mode.
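As a rough illustration of that stack, a compose file might look like the sketch below. The service names and images here are hypothetical placeholders, not OpenCairn's actual configuration; the repository's own docker-compose file is authoritative.

```yaml
# Hypothetical sketch only: service names and images are placeholders.
services:
  db:
    image: postgres:16               # database
  storage:
    image: minio/minio               # S3-compatible file storage
  workflows:
    image: example/workflow-engine   # placeholder workflow engine
  ollama:
    image: ollama/ollama
    profiles: ["local-llm"]          # started only in fully-local mode
```

With Compose profiles, a command like `docker compose --profile local-llm up` would start the Ollama service alongside the core stack, matching the "profile flag enables fully-local mode" behavior described above.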
Core features stay open, while monthly credits control free and paid usage. Included credits and detailed limits may change as operating costs change.
Start free · 500 credits/month
Personal work · 8,000 credits/month
Monthly allowances may change with operating costs
Heavy usage · 18,000 credits/month
Fair-use limits may apply
Your provider key · no managed credits
All prices are monthly subscription rates. Billing, VAT, refunds, and auto top-up follow the terms in effect at launch.
If managed cloud isn't for you, here are two more paths.
Full open-source code (AGPLv3, with separate commercial license available) · self-operated workspaces and members · fully-local LLM option. Hardware and ops are on you.
Dedicated install and support path for finance, healthcare, government, and universities — environments where external SaaS can be restricted. Fully-local LLM configuration · organization auth integration design · commercial license inquiry.
Yes, when the supported media path is configured for your deployment. Audio and video are transcribed through provider and worker settings, then handled like text sources. Real-time transcription is not supported; files are processed after upload.
All of it in infrastructure you operate. Text, vectors, and wiki pages live in the database; original files live in file storage. In Gemini mode, AI calls go to the external provider. With an Ollama-based local LLM configuration, model calls do not leave your deployment. External connectors such as Google Drive or Notion only contact those services when enabled.
OpenCairn is dual-licensed. The default AGPLv3 can be evaluated for internal use, but network service models and source-disclosure obligations should go through your legal review. A separate commercial license is available for organizations whose policy disallows AGPL components or source disclosure.
The default paths are Gemini and Ollama, and OpenAI-compatible gateways can also be configured. Feature coverage varies by provider; Gemini-specific capabilities only work when that provider path is enabled.
Upload a Markdown/CSV export ZIP and OpenCairn rebuilds the folder and note structure into a wiki. Google Drive currently supports file-ID import. App-connected imports, including Notion, are on the connector roadmap.
AI-generated changes go through review-and-apply flows, and you can edit them afterward. Note changes are tracked through versions and activity records; duplicate and contradiction review continues to expand through workflow-backed surfaces.
OpenCairn uses Hocuspocus and Yjs for synchronization. Collaboration surfaces such as cursors, comments, @mentions, notifications, and public links are implemented; real-world responsiveness still depends on your network and Hocuspocus deployment settings.
Start free on Solo today. How much your knowledge base has grown a year from now depends on what you start accumulating today.
GITHUB ↗ · Self-hosting guide · Start with your first source tonight.
A sample graph that shows how transformer papers can be woven into a wiki. Hover or tap a node to see how connections spread.