Documentation · v0.1

Everything you need to install STORM, define your first agent, and ship it to production. Boring, composable, honest — start anywhere below.

Getting started

STORM is a single primitive — install, configure, deploy. No orchestrator, no SDK soup, no vendor lock-in.

Install

Add storm to your project with one command.

STORM ships as a single package with zero peer dependencies. It runs anywhere JavaScript runs — Node, Bun, Deno, Cloudflare Workers, or your own host.

```shell
bun add @storm/core
```

Quickstart

Spin up your first agent in under five minutes.

Define a signal source, a scoring policy, and an action handler. STORM wires them into a cycle and starts logging. That's the whole tutorial.

```typescript
import { storm } from "@storm/core";

storm({
  signals: ["http"],
  policy: (s) => s.confidence,
  action: async (s) => console.log("fire", s),
}).start();
```

Configuration

Signals, policies, sizing, memory — wired in one block.

Every knob lives in a single config object. Sensible defaults out of the box; override only what you need. Configuration is plain data — diff it, version it, replay it.

Primitives

Four primitives, one cycle. Everything else is a composition.

Signals

HTTP, webhooks, streams, and model outputs as a typed event bus.

Signals are the only way data enters a STORM cycle. They are typed, timestamped, and addressable. Build your own source by implementing a single async iterator.
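As a sketch of that contract — the `Signal` shape and the generator below are illustrative assumptions, not the published types:

```typescript
// A signal is typed, timestamped, and addressable (per the description above).
interface Signal<T> {
  type: string;    // the signal's type tag
  ts: number;      // timestamp, ms since epoch
  address: string; // stable address for this event
  payload: T;
}

// A custom source is just an async iterator: yield one tick signal per loop.
async function* tickSource(count: number): AsyncGenerator<Signal<number>> {
  for (let i = 0; i < count; i++) {
    yield { type: "tick", ts: Date.now(), address: `tick/${i}`, payload: i };
  }
}
```

Anything that can be expressed as `for await` — a socket, a poll loop, a model stream — fits this shape.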

Policies

Score events deterministically and gate actions by confidence.

A policy is a pure function from signal to score. Determinism is enforced — no Date.now(), no Math.random() unless seeded. This is what makes replay possible.
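A minimal sketch of a seeded policy — `mulberry32` is a standard tiny PRNG; the policy shape here is an assumption, not the published type:

```typescript
// mulberry32: a small seedable PRNG. Same seed, same sequence — always.
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) | 0;
    let t = Math.imul(a ^ (a >>> 15), 1 | a);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// A pure policy: jitter the confidence deterministically. Re-seeding per
// call keeps the function pure, so replay produces identical scores.
function makePolicy(seed: number) {
  return (s: { confidence: number }): number => {
    const rand = mulberry32(seed);
    return s.confidence * (0.95 + 0.1 * rand());
  };
}
```

Swap `makePolicy(seed)` for a `Math.random()` call and replay diverges on the first cycle.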

Handlers

Execute on-chain, off-chain, or shell — with adaptive sizing.

Handlers are the only place side effects happen. STORM gates every handler call with the score from the policy and the size from the sizer. Failed handlers are retried with exponential backoff and logged either way.
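A sketch of that retry behaviour — the names (`withRetry`, `attempts`, `baseMs`) are illustrative, not the real API:

```typescript
// Retry a handler with exponential backoff; log every attempt either way.
async function withRetry<T>(
  fn: () => Promise<T>,
  { attempts = 3, baseMs = 100 } = {},
  log: (msg: string) => void = console.log,
): Promise<T> {
  let lastErr: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      const result = await fn();
      log(`attempt ${i + 1}: ok`);
      return result;
    } catch (err) {
      lastErr = err;
      log(`attempt ${i + 1}: failed`);
      // Backoff doubles each attempt: 100ms, 200ms, 400ms, ...
      if (i < attempts - 1) await new Promise((r) => setTimeout(r, baseMs * 2 ** i));
    }
  }
  throw lastErr; // exhausted — surface the last error
}
```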

Logs

Immutable, replayable cycle records.

Every cycle writes one log entry containing the signal, score, action, and result. Logs are append-only and content-addressed. Replay any window of history bit-for-bit.
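Content-addressing can be sketched as a hash chain — each entry's hash covers its body plus the previous hash, so editing any entry invalidates everything after it. The entry shape below is an assumption:

```typescript
import { createHash } from "node:crypto";

interface LogEntry {
  prev: string; // hash of the previous entry ("" for the first)
  body: string; // serialised signal, score, action, result
  hash: string; // sha256(prev + body) — the entry's content address
}

// Append-only: each new entry is chained to the hash of the last.
function append(log: LogEntry[], body: string): LogEntry[] {
  const prev = log.length ? log[log.length - 1].hash : "";
  const hash = createHash("sha256").update(prev + body).digest("hex");
  return [...log, { prev, body, hash }];
}

// Offline verification: recompute every hash in order.
function verify(log: LogEntry[]): boolean {
  let prev = "";
  for (const e of log) {
    const h = createHash("sha256").update(prev + e.body).digest("hex");
    if (e.prev !== prev || e.hash !== h) return false;
    prev = e.hash;
  }
  return true;
}
```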

Runtime

The runtime is small on purpose. Bring your own host.

Memory

Persistent, replayable cycles. Every decision is auditable.

Memory is a key-value store scoped per cycle. Use any backend that exposes get/set — Redis, SQLite, Cloudflare KV, or the in-memory default for tests.
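The get/set contract is small enough to sketch in full — the interface names here are assumptions:

```typescript
interface MemoryBackend {
  get(key: string): Promise<unknown | undefined>;
  set(key: string, value: unknown): Promise<void>;
}

// The in-memory default: a Map behind the async contract, good for tests.
function inMemory(): MemoryBackend {
  const store = new Map<string, unknown>();
  return {
    async get(key) { return store.get(key); },
    async set(key, value) { store.set(key, value); },
  };
}
```

A Redis or SQLite backend is the same two methods over a client connection.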

Observability

Structured logs, traces, and metrics out of the box.

STORM emits OpenTelemetry-compatible spans for every cycle. Pipe them into your existing stack — Datadog, Honeycomb, Grafana, or just stdout.

Deploy

Runtime-agnostic — Node, Bun, Workers, or your own host.

There is no STORM runtime to install. Your handler runs in your process. We ship adapters for the common hosts and a 40-line example for everything else.

Patterns

Composable shapes that show up over and over in real systems.

Fan-out

One signal, many handlers, one log entry per action.

Multiple handlers on a single cycle are first-class. Each one is gated, sized, and logged independently. Use this for mirrored execution or shadow rollouts.
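The shape can be sketched as a dispatch loop — gating is simplified to a per-handler score threshold, and every name below is illustrative:

```typescript
type Handler = (signal: string) => Promise<string>;

// One signal in, one log entry out per handler — fired or skipped,
// each handler is gated independently.
async function fanOut(
  signal: string,
  score: number,
  handlers: { name: string; gate: number; run: Handler }[],
): Promise<string[]> {
  const log: string[] = [];
  for (const h of handlers) {
    if (score < h.gate) {
      log.push(`${h.name}: skipped (score ${score} < gate ${h.gate})`);
      continue;
    }
    const result = await h.run(signal);
    log.push(`${h.name}: ${result}`);
  }
  return log;
}
```

A shadow rollout is just a second handler with a higher gate or a no-op side effect.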

Replay

Re-run any window of history against a new policy.

Point STORM at a log range and a new policy. It replays every signal, scores it under the new rules, and produces a delta. No production traffic required.

Backpressure

Signals queue, scores throttle, handlers stay honest.

When a downstream consumer is slow, STORM applies backpressure at the signal layer rather than dropping work or blocking the loop. Configurable per source.
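The mechanism amounts to a bounded queue where producers await capacity instead of dropping or spinning — a sketch, with an illustrative class name:

```typescript
class BoundedQueue<T> {
  private items: T[] = [];
  private waiters: (() => void)[] = [];
  constructor(private capacity: number) {} // per-source capacity

  // Producers block (await) when the queue is full — backpressure.
  async push(item: T): Promise<void> {
    while (this.items.length >= this.capacity) {
      await new Promise<void>((r) => this.waiters.push(r));
    }
    this.items.push(item);
  }

  // Consuming frees capacity and wakes one waiting producer.
  pop(): T | undefined {
    const item = this.items.shift();
    const wake = this.waiters.shift();
    if (wake) wake();
    return item;
  }

  get size(): number { return this.items.length; }
}
```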

SDK reference

The full surface area of @storm/core. Small on purpose.

storm(config)

Construct an agent. Returns a controllable cycle runner.

Accepts a config object describing signals, policy, action, sizing, and memory. Returns an instance with .start(), .stop(), .replay(range), and .snapshot().

```typescript
const agent = storm({
  signals: [http({ port: 8080 }), webhook("/x")],
  policy: (s) => model.score(s),
  action: async (s) => handler.fire(s),
  sizing: confidence({ floor: 0.7 }),
  memory: postgres(process.env.DB_URL),
});

await agent.start();
```

agent.replay(range)

Re-run any window of cycle history against the current policy.

Returns a stream of synthetic cycles with the would-be score and action for each historical signal. Use it to validate policy changes against real traffic before shipping.

```typescript
const delta = await agent.replay({
  from: "2025-04-01",
  to: "2025-04-08",
});
```

agent.snapshot()

Capture full agent state for cold-start or migration.

Returns a serialisable snapshot — config hash, memory contents, in-flight signals. Restore on a new host with storm.restore(snapshot).

Examples

Three agents, three domains, the same primitive.

Swarm → decision

Fan out four specialist agents and synthesize a structured verdict.

The reference swarm that ships with this site: it streams Researcher, Skeptic, Quant, and Strategist deliberations in parallel, then a Coordinator agent returns verdict, confidence, and key risks via a tool call. See it running on /app.

Webhooks → ops

Score incoming webhooks and route to the right on-call handler.

A 60-line example showing how STORM replaces a typical PagerDuty + workflow tool stack. Confidence gating, adaptive cooldown, full replay.

LLM → email

Score model outputs as signals; only fire emails above 0.85.

Wire any chat completion as a signal source. The policy thresholds the score; the handler is your transactional email provider. Hallucinated drafts never leave the cycle.
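The gate itself is a few lines — `sendEmail` below is a stand-in for your transactional provider, and every name is illustrative:

```typescript
// Fire the email handler only when the policy's score clears the threshold.
async function gateEmail(
  draft: { text: string; confidence: number },
  send: (text: string) => Promise<void>,
  threshold = 0.85,
): Promise<"sent" | "suppressed"> {
  if (draft.confidence < threshold) return "suppressed"; // never leaves the cycle
  await send(draft.text);
  return "sent";
}
```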

Security model

Trust boundaries are explicit. The runtime never owns secrets.

Secrets

Stored in your environment. STORM never serialises them into logs.

Logs redact known secret-shaped values by default. Bring-your-own scrubber for custom formats.

Replay safety

Replay runs handlers in a sandbox by default — no real side effects.

Side-effecting handlers see a stub network and stub clock during replay. Opt in per-handler if you want a real-money replay (you usually don't).

Audit trail

Every cycle is content-addressed. Tampering is detectable.

Logs form a hash chain. A modified entry invalidates every later hash. Verify offline with storm verify <log-file>.

FAQ

Is this an LLM framework?

No. LLMs are just one kind of signal source.

STORM is a runtime for autonomous cycles. Plug an LLM in as a signal or a policy if you want — but the runtime does not assume one.

Do I need an orchestrator?

No. The cycle is the orchestrator.

If you want to chain cycles, have one cycle's action emit a new signal that another cycle subscribes to.
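That chaining pattern can be sketched as a signal bus — cycle A's action emits, cycle B's subscription receives. The bus below is an illustrative assumption, not shipped API:

```typescript
type Listener = (signal: string) => void;

// A minimal typed signal bus: cycles chain by emitting signals, not by
// calling each other.
class SignalBus {
  private listeners = new Map<string, Listener[]>();

  subscribe(type: string, fn: Listener): void {
    const list = this.listeners.get(type) ?? [];
    list.push(fn);
    this.listeners.set(type, list);
  }

  emit(type: string, signal: string): void {
    for (const fn of this.listeners.get(type) ?? []) fn(signal);
  }
}
```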

Production-ready?

v0.1 is stable for the core cycle. Adapters are evolving.

We run STORM ourselves at six figures of cycles per day. Public APIs follow semver from v1.0 onward.

How does it compare to LangChain / CrewAI?

Different abstraction. STORM is a runtime, not a chain DSL.

Chain frameworks compose calls. STORM composes cycles. You can build a chain on top of STORM in a few lines, but the inverse is much harder.