Most frameworks assume a managed dashboard, hosted inference, and phone-home telemetry by default. We wanted the opposite: an audit-friendly, air-gap-compatible foundation that an enterprise compliance team can sign off on in an afternoon.
The engine.
You ship the assistant.
Graphorin is a TypeScript framework for building long-lived personal AI assistants — a personal trainer, tutor, financial advisor, or business co-pilot that remembers, endures, and stays yours.
- Zero telemetry
- Vendor-neutral
- Type-safe
- Open source
Today's agent frameworks force a choice
between two halves of the same problem.
Most frameworks specialise in one frontier — agent memory, the agent loop, or durable execution — but few own all three together. Build a real personal assistant on any of them and you end up stitching four libraries together.
Graphorin is one composable surface across all of it: the agent loop, six-tier memory, durable workflows, secrets, observability, sandboxing, sessions — with an optional standalone server when you need to expose your assistant over a network.
Seven differences you'll feel
on the first day, the first week, and the first year.
A literal local-first promise
Zero version pings. Zero analytics. Zero crash uploads. Zero npm postinstall network calls. A continuous-integration check fails the build the moment anyone slips one in.
This isn't “mostly local”. It's a hard rule.
Memory worthy of the name
Six distinct layers — working, session, episodic, semantic, procedural, and shared — each with its own lifecycle. Old facts are superseded, never silently overwritten. A background consolidator quietly distils long conversations into long-term knowledge.
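The "superseded, never silently overwritten" rule can be pictured as bi-temporal versioning. A minimal sketch under that reading (the FactVersion shape and `supersede` helper are invented for illustration, not Graphorin's schema): a new value closes the previous version's validity window instead of deleting it.

```typescript
// Sketch of "superseded, never overwritten" (hypothetical record shape):
// a new value closes the previous version's validity window; nothing is
// deleted, so the full history stays queryable.

interface FactVersion {
  value: string;
  validFrom: number;      // when this version became true
  validTo: number | null; // null = still current
}

function supersede(history: FactVersion[], value: string, now: number): FactVersion[] {
  const current = history.find(f => f.validTo === null);
  if (current) current.validTo = now; // close the old version; don't delete it
  return [...history, { value, validFrom: now, validTo: null }];
}

let city: FactVersion[] = [];
city = supersede(city, "lives in Berlin", 1700000000);
city = supersede(city, "lives in Paris", 1710000000);
// Both versions survive; exactly one is current.
```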
Durable from the first turn
Workflows can pause for an overnight approval, resume on a different machine a week later, and continue exactly where they left off. Human-in-the-loop is a primitive, not an afterthought.
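One way to picture durable execution is journaled replay. This is a generic sketch of that idea, not Graphorin's API: `step`, `run`, and the journal shape are invented, and everything is kept synchronous to show only the replay mechanic.

```typescript
// Journaled-replay sketch of durable execution (generic; not Graphorin's API).
// Each completed step's result is persisted; on resume, even in a different
// process, finished steps replay from the journal instead of re-running,
// so the workflow continues exactly where it left off.

type Journal = Record<string, unknown>;

function step<T>(journal: Journal, id: string, fn: () => T): T {
  if (id in journal) return journal[id] as T; // already ran: replay the result
  const result = fn();                        // first execution
  journal[id] = result;                       // checkpoint before moving on
  return result;
}

function run(journal: Journal, approved: boolean | null) {
  const draft = step(journal, "draft", () => "Draft follow-up email");
  if (approved === null) return { status: "waiting-for-approval", draft };
  const outcome = step(journal, "send", () => (approved ? "sent" : "discarded"));
  return { status: outcome, draft };
}

const journal: Journal = {};
console.log(run(journal, null).status); // waiting-for-approval
// A week later, on any machine that loads the same journal:
console.log(run(journal, true).status); // sent
```

In practice the journal would live in a database and the steps would be async; the point of the sketch is only that suspension and resumption fall out of replaying a log.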
Vendor-neutral by principle
Bound to no LLM vendor. Switch models with a one-line change. Use a frontier API today and a model running on your own laptop tomorrow — without rewriting your assistant. Local LLMs are first-class.
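The mechanism behind a one-line model switch is usually a narrow provider interface. A sketch under that assumption (the `ChatModel` interface and both bindings are invented, not Graphorin's types):

```typescript
// Vendor neutrality as an interface boundary (all names illustrative):
// the assistant depends only on ChatModel, so swapping providers is one binding.

interface ChatModel {
  complete(prompt: string): Promise<string>;
}

// Stand-ins for a hosted frontier API and a local model runtime.
const hostedModel: ChatModel = { complete: async p => `hosted: ${p}` };
const localModel: ChatModel = { complete: async p => `local: ${p}` };

// The one-line change: rebind, and nothing downstream needs to know.
const model: ChatModel = localModel;
```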
Secrets as a first-class concern
A secret-value type that cannot be accidentally logged, serialised, or displayed. Cryptographically-signed skills. A tamper-evident audit log for every privileged operation. A linter that catches the most common mistakes before they ship.
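The "cannot be accidentally logged" property is achievable in plain TypeScript by keeping the raw value off the object entirely. A minimal sketch of the idea (this `Secret` class is illustrative, not Graphorin's actual type):

```typescript
// The raw value lives in a module-private WeakMap, never on the object itself,
// so string conversion, JSON serialisation, and object inspection all come up empty.
const vault = new WeakMap<object, string>();

class Secret {
  constructor(value: string) {
    vault.set(this, value);
  }
  /** The only way to read the value: a deliberate, greppable call. */
  expose(): string {
    return vault.get(this)!;
  }
  toString(): string { return "[REDACTED]"; }
  toJSON(): string { return "[REDACTED]"; }
}

const apiKey = new Secret("sk-live-123");
console.log(`key: ${apiKey}`);           // key: [REDACTED]
console.log(JSON.stringify({ apiKey })); // {"apiKey":"[REDACTED]"}
```

Because the instance carries no own properties, even a raw dump of the object reveals nothing; the value only ever crosses the boundary through `expose()`.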
Observability you'll actually use
OpenTelemetry-native traces with the GenAI semantic conventions for every LLM call, tool, memory write, and workflow step. A mandatory redaction layer keeps secrets and PII out of your traces — even if you forget.
TypeScript ergonomics, end-to-end
Zero any in public APIs. Schemas flow through tools, memory blocks, and
structured outputs. Streaming-first by design — every operation is a typed
AsyncIterable of events your UI can render as they happen.
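The streaming shape described above can be sketched as a discriminated union delivered over an AsyncIterable; the event names here are illustrative, not Graphorin's actual event types.

```typescript
// A typed event stream: the union narrows inside the loop, so each
// event variant is fully typed where the UI handles it.

type AgentEvent =
  | { type: "token"; text: string }
  | { type: "tool-call"; name: string }
  | { type: "done"; reason: string };

async function* fakeStream(): AsyncGenerator<AgentEvent> {
  yield { type: "token", text: "Hello" };
  yield { type: "tool-call", name: "calendar.lookup" };
  yield { type: "token", text: " world" };
  yield { type: "done", reason: "complete" };
}

async function render(stream: AsyncIterable<AgentEvent>): Promise<string> {
  let text = "";
  for await (const ev of stream) {
    if (ev.type === "token") text += ev.text; // append tokens as they arrive
  }
  return text;
}

render(fakeStream()).then(t => console.log(t)); // Hello world
```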
A real memory model
— not a vector database with a retrieval helper.
Most frameworks treat memory as one undifferentiated bag. Graphorin treats it as six layers, each with its own lifecycle, conflict-resolution strategy, and privacy posture. Together they give your assistant a memory it can actually live with — for years.
Working
Short, structured blocks holding what the assistant is doing right now — persona, current task, immediate context.
milliseconds → minutes

Session
The rolling message log of the current conversation. The thread you're inside, with all of its turn-by-turn context.
minutes → hours

Episodic
Things that happened — decisions, events, milestones — captured with proper bi-temporal validity. The assistant's autobiography.
days → years

Semantic
Facts about you, the world, the task. Conflicts resolved through a multi-stage pipeline. Old facts superseded, never destroyed.
weeks → permanent

Procedural
How to do things — workflows, recipes, learned patterns. The assistant gets better at the tasks it does most often.
grows over time

Shared
Common knowledge across multiple agents in the same household, team, or organisation — with private layers for each individual.
cross-agent, lasting

Multi-stage conflict resolution
Exact dedup → embedding three-zone → heuristic → subject/predicate. No coin flips. Every decision is auditable.
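The staged pipeline can be sketched as a chain of deciders where each stage either rules or defers. The thresholds, the toy heuristic, and the Fact shape below are invented for illustration, but the stage order follows the description above.

```typescript
// Staged conflict resolution (illustrative thresholds and rules).
// Each stage either decides or defers, and every decision records which
// stage made it, which is what makes the pipeline auditable.

type Verdict = "duplicate" | "supersede" | "keep-both";
interface Decision { verdict: Verdict; stage: string }
interface Fact { subject: string; predicate: string; object: string; text: string }

function resolve(incoming: Fact, existing: Fact, similarity: number): Decision {
  // Stage 1: exact dedup. Byte-identical statements are duplicates, full stop.
  if (incoming.text === existing.text) return { verdict: "duplicate", stage: "exact-dedup" };

  // Stage 2: embedding three-zone. Very similar: duplicate; very different:
  // unrelated; the ambiguous middle zone falls through to later stages.
  if (similarity >= 0.95) return { verdict: "duplicate", stage: "embedding" };
  if (similarity < 0.70) return { verdict: "keep-both", stage: "embedding" };

  // Stage 3: heuristic. Toy rule: an explicit negation supersedes the old fact.
  if (incoming.text.includes("no longer")) return { verdict: "supersede", stage: "heuristic" };

  // Stage 4: subject/predicate. Same subject and predicate with a new object
  // means the new fact supersedes (the old one is kept, just no longer current).
  if (incoming.subject === existing.subject && incoming.predicate === existing.predicate) {
    return { verdict: "supersede", stage: "subject-predicate" };
  }
  return { verdict: "keep-both", stage: "subject-predicate" };
}
```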
Hybrid search by default
Dense vectors plus full-text, fused with Reciprocal Rank Fusion. Pluggable rerankers for when you need a sharper answer.
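Reciprocal Rank Fusion itself is a small, standard algorithm: each document's fused score is the sum of 1 / (k + rank) over every ranking it appears in, conventionally with k = 60. A generic sketch, not Graphorin's internal code:

```typescript
// Reciprocal Rank Fusion over any number of rankings.
// Documents near the top of several lists accumulate the highest scores.

function rrfFuse(rankings: string[][], k = 60): string[] {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((docId, index) => {
      const rank = index + 1; // ranks are 1-based
      scores.set(docId, (scores.get(docId) ?? 0) + 1 / (k + rank));
    });
  }
  return [...scores.entries()]
    .sort((a, b) => b[1] - a[1])
    .map(([docId]) => docId);
}

const dense = ["doc-a", "doc-b", "doc-c"];    // vector-similarity order
const fullText = ["doc-b", "doc-d", "doc-a"]; // keyword-match order
console.log(rrfFuse([dense, fullText]));      // [ 'doc-b', 'doc-a', 'doc-d', 'doc-c' ]
```

A pluggable reranker would then take the top of this fused list and re-score it with a heavier model when a sharper answer is worth the extra latency.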
Background consolidator with a budget
Light, standard, and deep phases distil sessions into long-term memory. A built-in cost cap means it can never run away with your bill.
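The cost cap can be pictured as a pre-flight check before each unit of work; the Session shape and per-session cost estimate below are invented for illustration.

```typescript
// A budget-capped consolidation pass: distil sessions until the next one
// would exceed the cap, then stop cleanly instead of running up the bill.

interface Session { id: string; estimatedCostUsd: number }

function consolidate(sessions: Session[], budgetUsd: number) {
  const processed: string[] = [];
  let spentUsd = 0;
  for (const s of sessions) {
    if (spentUsd + s.estimatedCostUsd > budgetUsd) break; // hard cost cap
    spentUsd += s.estimatedCostUsd;
    processed.push(s.id); // in a real pass: run light/standard/deep distillation here
  }
  return { processed, spentUsd };
}
```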
Privacy levels on every record
Public, internal, and secret tags flow through traces and exports. Sensitive content is redacted by default — and you can't accidentally turn that off.
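Redact-by-default falls out naturally when the export path masks everything that is not explicitly public. A sketch assuming a hypothetical record shape (the tag names come from the text above):

```typescript
// Privacy tags on every record. The export path masks anything not explicitly
// public, so redaction is the default rather than something a caller must remember.

type Privacy = "public" | "internal" | "secret";
interface MemoryRecord { content: string; privacy: Privacy }

function exportForTrace(records: MemoryRecord[]): string[] {
  return records.map(r => (r.privacy === "public" ? r.content : "[REDACTED]"));
}
```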
Pick a domain.
The shape is the same.
Personal trainer
Remembers your injury history, your last six months of workouts, your nutrition preferences, and your favourite phrasing.
Personal tutor
Knows what your child mastered last week, what they struggled with, and adjusts every lesson accordingly.
Financial advisor
Has watched a year of your spending, understands your goals, and waits for your approval before doing anything irreversible.
Business co-pilot
Lives alongside your team, knows your customers and contracts, and quietly drafts follow-ups overnight.
Research companion
Builds up a multi-month understanding of a topic, never forgets a citation, and surfaces the right paper at the right time.
Household assistant
Several family members talk to it — each with their own private memory, plus a shared layer for the things that concern everyone.
Embed it. Or run it as a daemon.
Same code. Different lifetime.
As a library
Embed Graphorin directly in any Node.js process. Your application owns the lifecycle — perfect for desktop apps, CLIs, scripts, and short-lived services.
- Lives in your process
- You own the event loop
- Zero infrastructure overhead
- Ideal for embedded use
As a standalone server
Promote your assistant to a long-lived daemon with a network API the moment it has to outlive a terminal. Same code, different process model.
- REST + WebSocket + SSE fallback
- Durable human-in-the-loop across restarts
- Built-in triggers and consolidator daemons
- Lifecycle hooks, replay, and audit
The promise
Local-first. Vendor-neutral. Durable. Observable. Type-safe. Honest.
Eighteen design principles, encoded in the repository. If a feature contradicts a principle, the feature loses. No drift. No scope creep. No surprises.
We believe the next decade of personal AI
will not be won by the cleverest chat window.
It will be won by whoever earns the user's trust over time — by remembering what matters, forgetting what doesn't, never leaking what's private, and being there next year when the user comes back. That is a framework problem before it is a product problem. The frameworks that exist today were not designed for it. So we designed one that is.
Build the assistant you've been
wanting to build.
Open source. MIT licensed. Local-first. The engine is ready when you are.