Changelog

All notable changes to the Graphorin framework are documented in this file. The format follows Keep a Changelog; the project follows Semantic Versioning (pre-1.0: minor bumps may carry breaking changes; patch bumps do not).

Per-package changelogs live in each package's CHANGELOG.md and are generated by Changesets on every release.


0.1.0 — 2026-05-09

The first public release of the Graphorin framework. All @graphorin/* packages ship together at 0.1.0 (lockstep release while the framework is on the 0.x line). Every package is published on the npm registry with Sigstore build provenance.

Added

Runtime, memory, workflow

  • @graphorin/agent — agent runtime with streaming events, steering / follow-up queues, prepareStep hook, HITL durable resume, multi-agent handoffs (Agent.toTool({ secretsInheritance })), composable stop conditions, fan-out + evaluator-optimizer loops, structured-handoff artifacts.
  • @graphorin/memory — six-tier memory: working / session / episodic / semantic (bi-temporal default-on) / procedural / shared. Multi-stage conflict resolution (exact dedup → embedding three-zone → heuristic regex → subject/predicate). Hybrid search with Reciprocal Rank Fusion default. Memory-aware system prompt with built-in English locale pack and a pluggable per-locale extension point. Background consolidator with light + standard + minimum-viable deep phases, mandatory noise filter, lock-then-defer policy, idempotency cursor, dead-letter queue, and a default tier: 'free' cost budget.
  • @graphorin/workflow — durable step-graph runtime with a synchronous-step model, in-memory channels (LatestValue, Reducer, Stream, Barrier, Ephemeral, AnyValue, ListAggregate), pause (HITL) / resume, four stream modes (values / updates / tasks / debug), Directive and Dispatch primitives.
  • @graphorin/sessions — hybrid session facade with the agent registry, handoff records, JSONL export schema 1.0, and replay reconstruction. Session messages are owned by @graphorin/memory (single source of truth, DEC-147).
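The hybrid-search default mentioned above, Reciprocal Rank Fusion, is simple enough to sketch. The following is a generic textbook RRF implementation for merging ranked lists from multiple retrievers (e.g. vector search plus keyword search); the function name and the k = 60 default are illustrative, not @graphorin/memory's actual code:

```typescript
// Reciprocal Rank Fusion: each document's fused score is the sum over
// all input rankings of 1 / (k + rank), with rank 1-based. Documents
// ranked highly by several retrievers float to the top.
export function reciprocalRankFusion(
  rankings: string[][],
  k = 60,
): Array<{ id: string; score: number }> {
  const scores = new Map<string, number>();
  for (const ranking of rankings) {
    ranking.forEach((id, index) => {
      const contribution = 1 / (k + index + 1); // rank = index + 1
      scores.set(id, (scores.get(id) ?? 0) + contribution);
    });
  }
  return [...scores.entries()]
    .map(([id, score]) => ({ id, score }))
    .sort((a, b) => b.score - a.score);
}
```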

External surface

  • @graphorin/tools — typed tool registry, parallel execution, needsApproval flow, sandboxed execution, four-strategy result truncation pipeline, streaming-tool execution surface, built-in tool_search lookup tool.
  • @graphorin/skills — Anthropic Agent Skills format compatible loader with graphorin-* namespaced extensions; ed25519 signature verification on install (DEC-140 / ADR-034); slash commands (/skill:name); progressive-disclosure activation; sandbox-tier-aware execution.
  • @graphorin/mcp — Model Context Protocol client (stdio + Streamable HTTP + legacy SSE); typed MCPClient; toTools() adapter with inbound prompt-injection sanitization, deferred-loading auto-default, structured-content + outputSchema round-trip, per-server priority and collision strategy; pluggable EventStore for resumable sessions; OAuth bridge backed by @graphorin/security/oauth.
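As a rough illustration of the needsApproval flow in the tool registry, here is a self-contained sketch. All type and method names (ToolDef, ToolRegistry, execute, approve) are hypothetical; the real @graphorin/tools surface may differ:

```typescript
// A tool may be gated unconditionally (needsApproval: true) or
// per-input (needsApproval: (input) => boolean). Execution asks the
// caller-supplied approver before running a gated tool.
type ToolDef<In, Out> = {
  name: string;
  needsApproval?: boolean | ((input: In) => boolean);
  run: (input: In) => Promise<Out> | Out;
};

class ToolRegistry {
  private tools = new Map<string, ToolDef<any, any>>();

  register<In, Out>(tool: ToolDef<In, Out>): void {
    this.tools.set(tool.name, tool);
  }

  async execute<Out>(
    name: string,
    input: unknown,
    approve: (name: string, input: unknown) => Promise<boolean>,
  ): Promise<Out> {
    const tool = this.tools.get(name);
    if (!tool) throw new Error(`unknown tool: ${name}`);
    const gated =
      typeof tool.needsApproval === "function"
        ? tool.needsApproval(input)
        : tool.needsApproval ?? false;
    if (gated && !(await approve(name, input))) {
      throw new Error(`tool ${name} rejected by approver`);
    }
    return tool.run(input);
  }
}
```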

Persistence + provider

  • @graphorin/store-sqlite — default storage adapter on top of better-sqlite3@^12.9.0 + sqlite-vec@~0.1.9 + FTS5 with unicode61 remove_diacritics 2 tokenchars '-_.@/'. WAL hardening pragmas, WorkerPool wrapper for the standalone server.
  • @graphorin/embedder-transformersjs — default in-process embedder (Xenova/multilingual-e5-base, multilingual). WebGPU when available.
  • @graphorin/embedder-ollama — first-class opt-in alternative against an Ollama daemon (nomic-embed-text default, multi-model support).
  • @graphorin/triggers — cron / interval / idle / event triggers; same code path in library and standalone-server modes (DEC-150).
  • @graphorin/provider — vendor-neutral Provider interface; default vercelAdapter wrapping the Vercel AI SDK (v7 beta); ollamaAdapter, llamaCppServerAdapter, and openAICompatibleAdapter for local LLMs; shared LocalProviderTrust classifier; provider middleware composer with enforced ordering (DEC-145 / ADR-039) — withRedaction is mandatory innermost.
  • @graphorin/provider-llamacpp-node — companion package for in-process GGUF execution via node-llama-cpp@^3.5.
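The enforced-ordering middleware composer (DEC-145 / ADR-039) can be illustrated with a self-contained sketch. The names composeMiddleware, Middleware, and the "redaction" identifier are assumptions for illustration; only the invariant, that the redaction layer is mandatory and always innermost, comes from the changelog:

```typescript
// Each middleware wraps the call chain. The composer refuses to build
// a chain without redaction, and always applies redaction innermost
// (directly around the provider), whatever order the caller listed it
// in, so outer middlewares like logging only ever see redacted text.
type Call = (prompt: string) => string;
type Middleware = { name: string; wrap: (next: Call) => Call };

function composeMiddleware(base: Call, middlewares: Middleware[]): Call {
  const redaction = middlewares.find((m) => m.name === "redaction");
  if (!redaction) throw new Error("redaction middleware is mandatory");
  const rest = middlewares.filter((m) => m.name !== "redaction");
  let call = redaction.wrap(base); // forced innermost
  for (const m of [...rest].reverse()) call = m.wrap(call);
  return call;
}
```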

Cross-cutting infrastructure

  • @graphorin/security — SecretValue wrapper end-to-end with leakage barriers; SecretRef URI scheme (env: / keyring: / file: / encrypted-file: / op:// / vault:// / ref:); KeyringSecretsStore default via @napi-rs/keyring; sandbox tiers (worker-threads default + docker + isolated-vm + none); memory-modification guard (xxhash-fingerprint hash chain); HMAC-SHA256 + pepper server-token auth (DEC-122 / ADR-027); encrypted audit.db with SHA-256 hash chain; OAuth flows via openid-client@^6.x; ed25519 skill-signature verifier; process hardening (umask, refuse-root, file-mode policy).
  • @graphorin/observability — OpenTelemetry tracer with GenAI Semantic Conventions; typed AISpan<SpanType>; ConsoleExporter / JSONLExporter / OTLPHttpExporter with mandatory RedactionValidator (default-deny non-public, DEC-141 / ADR-035). Eval interfaces only; the full eval framework ships in @graphorin/evals.
  • @graphorin/pricing — separate package; bundled @pydantic/genai-prices snapshot; graphorin pricing refresh opt-in (never invoked automatically).
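The SecretValue + SecretRef pattern above can be sketched as follows. The class and function shapes are hypothetical, not the actual @graphorin/security API; the point is that each URI scheme maps to a pluggable resolver, and the raw string only leaves the wrapper through an explicit call:

```typescript
// Opaque secret wrapper: toString/toJSON are deliberately unhelpful so
// the secret cannot leak into logs or serialized output by accident.
class SecretValue {
  #value: string;
  constructor(value: string) { this.#value = value; }
  toString(): string { return "[SecretValue]"; }
  toJSON(): string { return "[SecretValue]"; }
  reveal(): string { return this.#value; } // explicit unwrap only
}

type Resolver = (rest: string) => string | undefined;

// Parse "scheme:rest" and dispatch to the resolver registered for that
// scheme (env:, keyring:, file:, and so on).
function resolveSecretRef(
  ref: string,
  resolvers: Record<string, Resolver>,
): SecretValue {
  const match = /^([a-z-]+):(.*)$/.exec(ref);
  if (!match) throw new Error(`not a SecretRef URI: ${ref}`);
  const [, scheme, rest] = match;
  const resolver = resolvers[scheme];
  if (!resolver) throw new Error(`no resolver for scheme ${scheme}:`);
  const raw = resolver(rest);
  if (raw === undefined) throw new Error(`secret not found: ${ref}`);
  return new SecretValue(raw);
}
```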

Standalone runtime

  • @graphorin/server — optional REST + WebSocket + SSE runtime built on Hono. REST Idempotency-Key per IETF draft-07 (DEC-142 / ADR-036), durable HITL across process restarts, lifecycle hooks, triggers daemon, consolidator daemon, replay endpoints, /v1/health + /v1/metrics (Prometheus, opt-in auth), audit verify endpoint.
  • @graphorin/cli — operator CLI binary (graphorin start | init | migrate | doctor | token | secrets | storage | audit | memory | consolidator | triggers | auth | pricing | skills | traces | migrate-export | guard | telemetry).
  • @graphorin/protocol — browser-friendly schemas for the WebSocket protocol contract graphorin.protocol.v1 (DEC-127 / ADR-031).
  • @graphorin/client — browser-friendly TypeScript client for the standalone server.
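The server's Idempotency-Key behavior follows the usual pattern from the IETF draft: execute the handler once, cache the response keyed by the client-supplied header, and replay the cached response on retry. A minimal in-memory sketch (names are hypothetical; the real store is durable across restarts):

```typescript
// First request with a given key runs the handler and stores the
// response; any later request with the same key replays the stored
// response instead of re-executing, making unsafe retries safe.
type StoredResponse = { status: number; body: string };

class IdempotencyCache {
  private done = new Map<string, StoredResponse>();

  async handle(
    key: string,
    handler: () => Promise<StoredResponse>,
  ): Promise<{ replayed: boolean; response: StoredResponse }> {
    const cached = this.done.get(key);
    if (cached) return { replayed: true, response: cached };
    const response = await handler();
    this.done.set(key, response);
    return { replayed: false, response };
  }
}
```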

Optional sub-packs

  • @graphorin/store-sqlite-encrypted — SQLCipher v4 encryption-at-rest via better-sqlite3-multiple-ciphers@^12.9.0 (DEC-129 / ADR-030). Required for the always-encrypted audit.db on fresh installations.
  • @graphorin/secret-1password — reference SecretResolver for op:// URIs through the 1Password CLI.
  • @graphorin/reranker-transformersjs — pluggable cross-encoder reranker on top of @huggingface/transformers.
  • @graphorin/reranker-llm — pluggable LLM-judge reranker.
  • @graphorin/eslint-plugin — ESLint rules for projects that build on Graphorin (no-secret-unwrap, no-secret-in-deps, provider-middleware-order, no-implicit-network-call, no-third-party-workflow-aliases, no-bare-tool-exec, tool-discovery surface).
  • @graphorin/evals — full evaluation framework (scorers, datasets, runner, reporters; decoupled from @graphorin/observability per RB-17 / DEC-152).

Privacy and security baselines

  • Zero default telemetry (DEC-154 / ADR-041). The framework generates no outbound network call you did not initiate. The CI workflow check-no-network.yml enforces this against the source tree on every push and pull request.
  • Sigstore build provenance on every published package (publishConfig.provenance: true + npm provenance on the GitHub Actions release workflow).
  • Pre-launch security audit completed against the project's STRIDE threat model and the OWASP LLM Top 10 (2025): 0 Critical, 0 High findings; Medium / Low findings documented with v0.2 owners.

Examples

The repository ships eight example apps:

  • personal-assistant-cli — single-agent local CLI (library mode, hello-world target).
  • slack-bot-integration — server mode + WebSocket + durable HITL approvals across server restart.
  • background-consolidator — server mode + cron triggers + light / standard consolidator phases.
  • multi-agent-crew — supervisor + 2 worker agents (RB-33 acceptance scenario).
  • approval-workflow — @graphorin/workflow HITL durable resume via pause() / Directive(resume) across server restart.
  • document-pipeline — @graphorin/workflow Dispatch + parallel nodes + every channel type.
  • three-agent-harness — Planner / Generator / Evaluator harness with structured-handoff artifacts; Agent.fanOut(...) + evaluatorOptimizer(...) (RB-50 reference).
  • local-stack-cli — fully local stack (Ollama LLM + Ollama embeddings + SQLite + sqlite-vec, no cloud calls).

Distribution templates: docker/, k8s/, systemd/, github-actions/.

Benchmarks

  • benchmarks/locomo — LoCoMo benchmark runner (10 conversations, 200 questions); per-question accuracy + per-conversation aggregates + cost summary.
  • benchmarks/locomo-multilingual — community-contribution hooks for per-locale subsets.
  • benchmarks/dialsim — DialSim runner.
  • benchmarks/memory-sim — synthetic-dialogue memory simulator.
  • benchmarks/latency — p50 / p95 latency probes for streaming response, fact extraction, and memory search.
  • benchmarks/cost — per-conversation token-cost regression suite (CI budget assertion, must not increase > 10 % between runs).
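For reference, the p50 / p95 figures the latency probes report are plain percentiles over a sample set; a minimal nearest-rank helper (illustrative only, not the benchmark suite's actual code):

```typescript
// Nearest-rank percentile: sort the samples and take the value at
// rank ceil(p/100 * n), clamped to valid indices.
function percentile(samples: number[], p: number): number {
  if (samples.length === 0) throw new Error("no samples");
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}
```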

Documentation

  • Per-package README.md covers the public surface, configuration, and dependency footprint.
  • SECURITY.md documents the disclosure process, supported versions, cryptographic baselines, and the privacy promise.
  • CONTRIBUTING.md covers the development workflow, conventions, and commit format.
  • CODE_OF_CONDUCT.md reproduces the unmodified Contributor Covenant v2.1 text.
  • THIRD_PARTY_NOTICES.md lists every runtime, optional, and build-time dependency with its license and the role it plays in Graphorin.

Hello-world target

A 20-line script that creates a memory-backed agent, streams tokens, persists facts to local SQLite via local transformers.js embeddings, survives process restart for HITL approvals when run via graphorin start, and emits OpenTelemetry spans (file or console exporter). The example lives in examples/personal-assistant-cli/.


Project Graphorin · v0.1.0 · MIT License · © 2026 Oleksiy Stepurenko · https://graphorin.com · https://github.com/o-stepper/graphorin