Plenty of tools hold
what you write.
OpenCairn
knows what you read.

Drop in papers, documents, notes, and supported media. OpenCairn builds a workspace wiki, surfaces source-backed links, and keeps AI changes reviewable as workflow surfaces mature.

Reviewable AI actions included · Developer? Self-hosting guide ↗

opencairn — live compile · ● compiling

INPUT: Paper PDF · DOCX/PPTX · Markdown ZIP · Google Drive file · Supported media
WORKFLOW SURFACES: Compiler · Research · Librarian · Synthesis · Socratic · Narrator · Curator · Connector · Temporal · Deep R. · Code · Visual · orchestration ● running · staged
OUTPUT: Wiki pages (concept graph · backlinks · 17 pages · 42 links) · Learning + Q&A (flashcards · quizzes · 23 review cards suggested) · Generated artifacts (docs · slides · sheets) · every source traceable
Upload detected · attention_is_all_you_need.pdf

AI: reviewable workflow actions
5 views: one graph · five lenses
0: model calls leave your deployment with a local LLM
AGPLv3: open-source + commercial

Turning one source
into a wiki page, in six steps.

Instead of one giant prompt, narrow-responsibility steps parse, embed, draft, link, and serve the knowledge base. Answers and graph surfaces carry source chips and evidence panels.

01 · INGEST
Intake

PDFs, common documents, Markdown/CSV ZIP exports, Google Drive file-ID imports, and supported media. OCR and media handling depend on provider capability and deployment settings.

02 · PARSE
Break down

Text, table structure, and OCR output are normalized to Markdown. Media transcription runs through configured provider paths.

03 · EMBED
Meaning

Mapped to vector space. Similar things end up close.
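The "similar things end up close" idea can be sketched with cosine similarity over toy vectors. This is a minimal illustration of the concept, not OpenCairn's actual embedding code; real embeddings have hundreds of dimensions.

```typescript
// Cosine similarity: vectors that point the same way score near 1,
// unrelated vectors score near 0.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-dimensional "embeddings" for illustration only.
const attention = [0.9, 0.1, 0.0];
const selfAttention = [0.85, 0.15, 0.05];
const cooking = [0.0, 0.1, 0.95];

console.log(cosineSimilarity(attention, selfAttention)); // close to 1
console.log(cosineSimilarity(attention, cooking));       // close to 0
```

Retrieval over the vector store then amounts to ranking stored chunks by this score against the query's embedding.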

04 · COMPILE
Weave

The compile workflow drafts wiki pages per topic.

05 · CONNECT
Link

Backlinks · opposing views · causal edges into a knowledge graph.
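One way to picture those edge types is a small typed-edge structure where each edge carries its own evidence. This is an illustrative sketch, not OpenCairn's internal schema; all names and note IDs are hypothetical.

```typescript
// Hypothetical typed edges; the kinds mirror the three link types above.
type EdgeKind = "backlink" | "opposes" | "causes";

interface Edge {
  from: string;
  to: string;
  kind: EdgeKind;
  evidence: string[]; // source notes backing the link
}

const edges: Edge[] = [
  { from: "attention-mechanism", to: "transformer", kind: "backlink",
    evidence: ["note: attention-is-all-you-need"] },
  { from: "linear-attention", to: "quadratic-complexity", kind: "opposes",
    evidence: ["note: linear-attention"] },
];

// Filtering by kind gives each surface its own lens over the same graph,
// e.g. a contradiction panel shows only opposing edges.
const disagreements = edges.filter(e => e.kind === "opposes");
```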

06 · SERVE
Use

Q&A · review cards · Socratic tutor · document-generation surfaces.

$ pnpm dev:docker
→ http://localhost:3000
it just runs.

Answers and graph views trace back to source evidence.

AI answers carry source chips, while graph, card, and mindmap surfaces expose evidence panels. Sentence-level runtime verification is not yet enforced across every writer path.

citation.sources: ["note: attention-is-all-you-need", "note: rotary-embedding"]

Workflow surfaces,
reviewed before they change data.

The current product exposes workflow-backed jobs, action ledgers, note actions, generated files, and staged plan/code surfaces. The public roadmap and feature registry remain the source of truth for what is active.

01 · ingest

Source processing

Uploads and imports become source notes, normalized text, and retrievable project context.

02 · answer

Grounded Q&A

Chat and retrieval surfaces answer from scoped workspace context and carry source evidence.

03 · action

Note actions

Create, rename, move, delete, restore, and update flows run through preview-and-apply contracts.
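The preview-and-apply idea can be sketched as a staged action: the AI proposes a change, the reviewer sees a human-readable preview, and nothing mutates until apply() runs. All names here are hypothetical, not OpenCairn's actual API.

```typescript
// Illustrative preview-and-apply contract for a note rename.
interface StagedAction {
  preview: string;    // diff shown to the reviewer before anything changes
  apply: () => void;  // executed only after approval
}

function stageRename(
  notes: Map<string, string>,
  noteId: string,
  newTitle: string,
): StagedAction {
  const oldTitle = notes.get(noteId);
  return {
    preview: `rename "${oldTitle}" -> "${newTitle}"`,
    // The mutation is deferred inside the closure until approval.
    apply: () => { notes.set(noteId, newTitle); },
  };
}

const notes = new Map([["n1", "Attention"]]);
const staged = stageRename(notes, "n1", "Attention Mechanism");
// notes is untouched here; the reviewer approves, then:
staged.apply();
```

The same split between a serializable preview and a deferred mutation extends to create, move, delete, and restore flows.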

04 · artifact

Generated files

Project-file and document-generation surfaces keep outputs traceable to workflow runs.

05 · operate

Action ledger

Workflow activity, recovery state, and approvals remain visible instead of disappearing into background jobs.

06 · staged

Plan and code surfaces

Planning, code workspace, and execution-loop features are staged behind product contracts and feature flags.

One graph, five lenses.

One dataset, five ways to see it. Switch to whichever view fits the moment. Every switch is logged and reversible.

01 · overview
Graph

Explore the full network at a glance

02 · radiate
Mindmap

Radiate outward from a central concept

03 · review
Cards

Search, filter, and spaced-repetition review

04 · chronology
Timeline

Changes across time

05 · arrange
Board

Collect related cards in one working board

Same shape for a solo user
as for a whole lab.

Wiki pages aren't just text — they're the meeting point of sources, teammates, and reviewable AI work. Every page carries its citations, editors, and workflow history with it.

opencairn.local / studio-kim / research-wiki
/ research-wiki / attention-mechanism

Attention Mechanism

reviewed AI action · 12 sources · updated 2h ago

Attention weights the tokens of an input sequence by relevance. Vaswani et al. (2017) — "Attention Is All You Need" — introduced it as the foundation of the transformer architecture; self-attention and multi-head attention followed.

CONTRADICTION
Three sources disagree on quadratic complexity. The Linear Attention family challenges this assumption.

Related pages: Transformer, Scaled Dot-Product, Linear Attention.

Hocuspocus · real-time co-editing
Block comments · @mentions
Role permissions · link sharing
Export paths · data sovereignty

Four promises
rarely kept together.

Data sovereignty · living knowledge · one throughline to learning · open source. Tools that keep one or two of these are common. Keeping all four at once — that's OpenCairn.

[ a ] sovereignty

Your data stays on your server.

Text, vectors, and original files all live in your infrastructure. With an Ollama-based local LLM configuration, model calls do not leave your deployment. It can also be run without external connectors.

[ b ] compile

AI does the sorting, you do the judging.

AI actions can draft pages, propose updates, and link them to the existing wiki. You review and edit before important changes land. The time you spent copying by hand disappears.

[ c ] one·throughline

Capture → explore → learn, one continuous line.

Q&A · graph surfaces · review cards · Socratic dialogue · document-generation surfaces — all fed from the same wiki. No tool-hopping.

[ d ] open·forever

Dual-licensed; exit always available.

Default is AGPLv3 — fork, modify, deploy yourself whenever you want. A separate commercial license is available for organizations that can't take an AGPL component. No risk of the service disappearing.

A tool you started solo,
grown into your organization.

Scales from an individual to a lab to a regulated enterprise — without swapping tools.

v0.1 · primary

Grad student · researcher

Papers and documents become one woven wiki. Explore relationships in the concept graph, ask grounded questions, and self-check before exams with AI quizzes. Citations lead back to your original notes.

  • Personal self-hosting
  • Your own API key (BYOK)
  • Import from existing note apps
v0.2 · expansion

Lab · small team

Run it with role-based permissions and live collaboration. Keep meeting notes and research sources in one project wiki, so knowledge stays behind when people move on.

  • Workspaces + permissions
  • Live collaboration · comments
  • Install on your lab server
v0.3 · enterprise

Regulated industries

Designed with finance, healthcare, government, and universities in mind — places where external SaaS can be restricted. Fully-local LLM paths and commercial licensing can support stricter deployment policies.

  • On-prem deployment path
  • Audit and security requirements by design
  • Commercial license inquiry path

Built to run
on your server.

Docker Compose brings up the database, file storage, and workflow engine. After installing dependencies and running migrations, start the app processes; a profile flag enables fully-local Ollama mode.

  • Minimum 4 vCPU · 8GB RAM, recommended 8 vCPU · 16GB RAM
  • x86_64 + ARM64 multi-arch (Raspberry Pi · Apple Silicon · Oracle Ampere)
  • Backup and restore strategy documented; automation scripts are follow-up work
  • Choose Cloudflare R2 / MinIO / S3
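The storage choice above boils down to pointing the app at any S3-compatible endpoint. A hypothetical .env sketch — the variable names here are illustrative, and the real keys live in .env.example:

```shell
# Illustrative only; check .env.example for the actual variable names.
S3_ENDPOINT=http://localhost:9000      # MinIO from the compose stack
S3_BUCKET=opencairn-files
S3_ACCESS_KEY_ID=your-access-key
S3_SECRET_ACCESS_KEY=your-secret-key
# Swap the endpoint for a Cloudflare R2 or AWS S3 URL without code changes.
```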
TERMINAL
$ git clone <your-fork-url>
$ cd opencairn-monorepo
$ cp .env.example .env
# set required secrets and your preferred model provider in .env
$ pnpm install
$ docker compose up -d postgres redis minio temporal
$ pnpm db:migrate
$ docker compose --profile app --profile worker --profile hocuspocus up -d --build
→ http://localhost:3000
app stack up
health checks passed
waiting for first workspace

Hosted pricing
starts with credits.

Core features stay open, while monthly credits control free and paid usage. Included credits and detailed limits may change as operating costs change.

Free · managed · starter
₩0

Start free · 500 credits/month

  • Hosted workspace start
  • 500 credits included monthly
  • Docs, notes, and basic AI features
  • Upgrade when usage exceeds the free pool
  • Open-source self-hosting remains available
  • Built for onboarding
Sign in
Pro · managed · personal
₩9,900 /mo

Personal work · 8,000 credits/month

Credits
8,000 credits included monthly
Large AI jobs show estimated credit usage before running
  • Managed AI calls included
  • Document generation, summarization, and RAG
  • Core connectors and workspace features
  • Usage and remaining-credit UI
  • Top up or upgrade to Max for excess usage
Start Pro

Monthly allowances may change with operating costs

Max · managed · heavy
₩19,900 /mo

Heavy usage · 18,000 credits/month

  • 18,000 credits included monthly
  • Long documents, large RAG jobs, and research
  • Batch embeddings and cache-aware usage
  • Higher monthly usage than Pro
  • Additional top-ups planned
Start Max

Fair-use limits may apply

BYOK · your key · solo
₩4,900 /mo

Your provider key · no managed credits

  • Your provider key where supported (encrypted at rest)
  • LLM API usage billed to your account
  • OpenCairn hosting and product features
  • No monthly managed credits
  • Use Pro or Max for managed calls
  • For users comfortable with API costs
Start with BYOK

Prices are monthly subscription prices. Billing, VAT, refunds, and auto top-up follow launch-time terms.

Before you ask.

Can OpenCairn analyze audio or video?

Yes when the supported media path is configured for your deployment. Audio and video are transcribed through provider and worker settings, then handled like text sources. Real-time transcription is not supported; files are processed after upload.

Where is my data stored?

All of it in infrastructure you operate. Text, vectors, and wiki pages live in the database; original files live in file storage. In Gemini mode, AI calls go to the external provider. With an Ollama-based local LLM configuration, model calls do not leave your deployment. External connectors such as Google Drive or Notion only contact those services when enabled.

Doesn't AGPLv3 mean my company can't use it?

OpenCairn is dual-licensed. The default AGPLv3 can be evaluated for internal use, but network service models and source-disclosure obligations should go through your legal review. A separate commercial license is available for organizations whose policy disallows AGPL components or source disclosure.

Can I use OpenAI-compatible models?

The default paths are Gemini and Ollama, and OpenAI-compatible gateways can also be configured. Feature coverage varies by provider; Gemini-specific capabilities only work when that provider path is enabled.
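As a sketch, pointing the app at an OpenAI-compatible gateway typically means supplying a base URL and key. The variable names below are illustrative assumptions, not OpenCairn's actual configuration keys:

```shell
# Hypothetical configuration sketch — not OpenCairn's real keys.
MODEL_PROVIDER=openai-compatible
OPENAI_BASE_URL=http://localhost:4000/v1   # e.g. a LiteLLM or vLLM gateway
OPENAI_API_KEY=your-gateway-key
```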

Can I migrate from my existing note app?

Upload a Markdown/CSV export ZIP and OpenCairn rebuilds the folder and note structure into a wiki. Google Drive currently supports file-ID import. App-connected imports, including Notion, are on the connector roadmap.

What if the AI gets the wiki wrong?

AI-generated changes go through review-and-apply flows, and you can edit them afterward. Note changes are tracked through versions and activity records; duplicate and contradiction review continues to expand through workflow-backed surfaces.

How smooth is the real-time collaboration?

OpenCairn uses Hocuspocus and Yjs for synchronization. Collaboration surfaces such as cursors, comments, @mentions, notifications, and public links are implemented; real-world smoothness still depends on network and Hocuspocus deployment settings.

A second brain
for your knowledge.

Start free on Solo today. How much your knowledge base has grown a year from now depends on what you start accumulating today.

GITHUB ↗ · Self-hosting guide · Tonight, start with your first source.