FAQ
What is Graphorin in one sentence?
Graphorin is a TypeScript framework for building long-living personal AI assistants — a personal trainer, tutor, financial advisor, or business co-pilot that remembers, endures, and stays yours.
Is Graphorin a product?
No. Graphorin is a framework. Assistant products (a fitness app, a tutor app, a finance copilot) are built on top of Graphorin in your own application code.
Is it free / open source?
Yes. Graphorin is distributed under the MIT License. © 2026 Oleksiy Stepurenko.
Who maintains Graphorin?
Graphorin is created and maintained by Oleksiy Stepurenko (step.oleksiy@gmail.com). See the Contributing guide for how to contribute.
What does the name mean?
It's the project's own coined name; no acronym, no expansion. The official logo and the colour palette (graphite + terracotta) are shared with graphorin.com.
Does Graphorin phone home?
No. Graphorin makes zero implicit network calls. The only outbound traffic is what your code initiates explicitly. A continuous-integration check fails the build the moment a forbidden network primitive is introduced. See Privacy.
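The CI gate described above can be imagined as a simple source scan. The sketch below is purely illustrative — the pattern list and function name are assumptions, not the framework's actual check:

```typescript
// Illustrative sketch of a "no implicit network" CI gate: scan source text
// for forbidden network primitives. The rule set here is invented.
const FORBIDDEN: RegExp[] = [
  /\bfetch\s*\(/,          // global fetch
  /\bhttps?\.request\s*\(/, // node:http / node:https
  /new\s+WebSocket\s*\(/,   // WebSocket constructor
];

function findForbiddenCalls(source: string): string[] {
  return FORBIDDEN.filter((re) => re.test(source)).map((re) => re.source);
}

// A file that sneaks in a fetch call would fail the gate:
const offending = `export async function ping() { return fetch("https://example.com"); }`;
console.log(findForbiddenCalls(offending).length > 0); // true — build fails
```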
Which LLMs are supported?
Through the Provider interface, any LLM that speaks one of the supported wire formats:
- The Vercel AI SDK (`ai` package) — covers OpenAI, Anthropic, Google, Mistral, Groq, Cohere, etc.
- Ollama (HTTP).
- OpenAI-compatible HTTP servers (LM Studio, LocalAI, vLLM, Together.ai, …).
- The `llama.cpp` HTTP server.
- In-process GGUF via the `@graphorin/provider-llamacpp-node` companion package.
See Providers.
Can I run Graphorin entirely offline?
Yes. SQLite + the bundled multilingual embedder + a local LLM (Ollama, llama.cpp, or in-process GGUF) cover every default. Set GRAPHORIN_OFFLINE=1 to verify the offline contract; the runtime refuses to phone home.
Does Graphorin store user data in the cloud?
Only if you configure it to. The default storage adapter writes to a local SQLite file; the audit log lives in a local encrypted SQLite file; secrets land in the OS keychain. Cloud storage is a deliberate, opt-in configuration.
What's the minimum Node.js version?
22.x LTS or newer. Older versions are not supported. See Installation.
ESM only? Why?
ESM is Node.js' native module system, and async-flow primitives behave correctly under it. Maintaining a parallel CommonJS surface would double the test matrix and force compromises that contradict the framework's "no any in the public surface" principle.
Can I use Graphorin in a browser?
The @graphorin/client package is the browser-friendly client for the standalone server. It depends only on @graphorin/protocol. The runtime packages (memory, agent, workflow, server) are Node.js-only — they assume better-sqlite3 and OS-level facilities.
How does the memory system handle conflicts?
Through a five-stage pipeline — exact dedup, embedding three-zone classification, locale-aware regex heuristics, subject / predicate split, and a deferred LLM judge in the consolidator's deep phase. See Memory system.
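The "three-zone" embedding stage can be pictured as a similarity cutoff: a candidate memory is a duplicate, a conflict candidate for the later stages, or novel. The thresholds and names below are invented for illustration, not Graphorin's actual values:

```typescript
// Hypothetical sketch of three-zone classification by cosine similarity
// between a candidate memory and its nearest stored neighbour.
type Zone = "duplicate" | "conflict-candidate" | "novel";

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] ** 2; nb += b[i] ** 2; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function classifyZone(similarity: number): Zone {
  if (similarity >= 0.95) return "duplicate";          // near-identical: drop
  if (similarity >= 0.75) return "conflict-candidate"; // escalate to later stages
  return "novel";                                      // store as new memory
}

console.log(classifyZone(cosine([1, 0], [1, 0]))); // "duplicate"
console.log(classifyZone(cosine([1, 0], [0, 1]))); // "novel"
```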
How big can a session get?
Sessions are append-only streams. The context engine auto-compacts the buffer when it crosses the configured budget; the consolidator distils older content into long-term memory. A multi-month session is normal.
What's a "trigger"?
A scheduled invocation of an agent — cron, fixed interval, idle, or event. See Standalone server § Triggers.
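The four trigger kinds map naturally onto a discriminated union. The field names below are assumptions for illustration, not Graphorin's actual config shape:

```typescript
// Hypothetical trigger shapes; only the interval/idle arithmetic is shown,
// since cron and event triggers need a scheduler or an event bus.
type Trigger =
  | { kind: "cron"; expression: string }
  | { kind: "interval"; everyMs: number }
  | { kind: "idle"; afterMs: number }
  | { kind: "event"; topic: string };

function nextFire(t: Trigger, lastFiredAt: number, now: number): number | null {
  switch (t.kind) {
    case "interval": return lastFiredAt + t.everyMs;
    case "idle":     return now + t.afterMs; // re-armed on each user activity
    default:         return null;
  }
}

console.log(nextFire({ kind: "interval", everyMs: 60_000 }, 1_000, 5_000)); // 61000
```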
How does HITL work?
Two complementary mechanisms:
- Tool approvals. Tools whose `needsApproval` predicate returns `true` raise a `tool.approval.requested` event. The run state can be persisted, the process can shut down, and another machine can resume it later via `agent.run(savedRunState, { directive: { approvals: [...] } })`.
- Workflow pause/resume. A workflow node calls `pause(value)`; the engine yields a `workflow.suspended` event and persists the checkpoint. `workflow.resume(threadId, directive)` re-enters the paused node.
See Agent runtime and Workflow engine.
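The approval round trip can be modelled minimally: a run suspends on a pending approval, its state is serialised, and a later process resumes it with a directive carrying the decision. The types and `resume` helper below are assumptions sketching the idea, not the framework's API:

```typescript
// Minimal model of the tool-approval round trip (shapes are hypothetical).
type RunState = { runId: string; pendingApproval?: { toolName: string; args: unknown } };
type Directive = { approvals: Array<{ toolName: string; approved: boolean }> };

function resume(saved: RunState, directive: Directive): "ran-tool" | "skipped-tool" {
  const pending = saved.pendingApproval;
  if (!pending) return "ran-tool";
  const decision = directive.approvals.find((a) => a.toolName === pending.toolName);
  return decision?.approved ? "ran-tool" : "skipped-tool";
}

// State persisted by machine A, decision applied by machine B:
const saved: RunState = { runId: "r1", pendingApproval: { toolName: "sendEmail", args: {} } };
console.log(resume(saved, { approvals: [{ toolName: "sendEmail", approved: true }] })); // "ran-tool"
```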
How is observability handled?
OpenTelemetry-native. Every span follows the published GenAI Semantic Conventions. A mandatory withValidation(...) exporter wrapper enforces sensitivity-aware redaction with 14 built-in PII patterns. See Observability.
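Pattern-based redaction of span attributes can be sketched as a regex sweep. The two patterns below are illustrative stand-ins, not the framework's built-in set of 14:

```typescript
// Sketch of sensitivity-aware redaction over exported span attributes.
const PII_PATTERNS: Array<{ name: string; re: RegExp }> = [
  { name: "email",  re: /[\w.+-]+@[\w-]+\.[\w.]+/g },
  { name: "us-ssn", re: /\b\d{3}-\d{2}-\d{4}\b/g },
];

function redact(value: string): string {
  return PII_PATTERNS.reduce((s, p) => s.replace(p.re, `[REDACTED:${p.name}]`), value);
}

console.log(redact("reach me at jane@example.com")); // "reach me at [REDACTED:email]"
```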
Where do skills live?
Skills can come from a local folder, an npm package, or a Git repository. Untrusted sources require a verifiable Ed25519 signature, run with --ignore-scripts, and are sandboxed per the skill's frontmatter. See Skills.
What is MCP, exactly?
The Model Context Protocol — a public protocol for tool / prompt / resource servers. Graphorin's client wraps @modelcontextprotocol/sdk over stdio and Streamable HTTP. See MCP client.
How do I deploy Graphorin in production?
See Deployment for systemd, Docker, Kubernetes, and CI templates.
What's the relationship between agents and workflows?
Agents and workflows compose orthogonally. An agent loop runs the LLM-driven model → tool calls → model cycle. A workflow runs a durable step-graph that may invoke agents from individual nodes. Use `Dispatch(...)` for durable cross-step parallelism; use `agent.fanOut(...)` for inline reasoning-loop parallelism. See Workflow engine § Composition.
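Inline fan-out parallelism amounts to running several sub-calls concurrently inside one reasoning step and joining the results. This conceptual sketch uses a plain `Promise.all` stand-in, not the real agent.fanOut API:

```typescript
// Conceptual fan-out: concurrent, order-preserving map over async calls.
async function fanOut<T, R>(inputs: T[], call: (input: T) => Promise<R>): Promise<R[]> {
  return Promise.all(inputs.map(call));
}

// Hypothetical sub-task standing in for an LLM call:
const summarise = async (doc: string) => `summary-of-${doc}`;
fanOut(["a", "b"], summarise).then((r) => console.log(r)); // ["summary-of-a", "summary-of-b"]
```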
Can I extend Graphorin?
Yes. The contracts in @graphorin/core/contracts are deliberately small. Most extension points are pluggable:
- Custom storage adapters (`MemoryStore`, `SessionStore`, `CheckpointStore`, …).
- Custom embedders (`EmbedderProvider`).
- Custom rerankers (`ReRanker`).
- Custom provider adapters.
- Custom secrets resolvers.
- Custom redaction patterns.
- Custom token counters.
- Custom skill sources.
See the API reference for the contract signatures.
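To give a feel for how small such a contract can be, here is a hypothetical minimal `MemoryStore` with an in-memory adapter. The real contract in `@graphorin/core/contracts` may differ in shape; this is a sketch of the pattern only:

```typescript
// Hypothetical minimal storage contract and an in-memory implementation.
interface MemoryRecord { id: string; text: string }
interface MemoryStore {
  put(record: MemoryRecord): Promise<void>;
  get(id: string): Promise<MemoryRecord | undefined>;
}

class InMemoryStore implements MemoryStore {
  private records = new Map<string, MemoryRecord>();
  async put(record: MemoryRecord) { this.records.set(record.id, record); }
  async get(id: string) { return this.records.get(id); }
}

const store = new InMemoryStore();
store.put({ id: "m1", text: "prefers morning workouts" })
  .then(() => store.get("m1"))
  .then((r) => console.log(r?.text)); // "prefers morning workouts"
```

Swapping in a SQLite- or cloud-backed adapter is then a matter of implementing the same two methods.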
Where do I report a security issue?
See the Security policy. Please do not open a public GitHub issue.
Where can I get help?
- GitHub Discussions on the repository (enabled post-launch).
- Issues on the repository for bug reports and feature requests.
- Email the maintainer at step.oleksiy@gmail.com for non-public matters.
What's on the roadmap?
The framework is currently on the v0.1.0 pre-release line. Watch the repository releases for milestones; the Changelog has the rolled-up history.
Graphorin · v0.1.0 · MIT License · © 2026 Oleksiy Stepurenko