# Event Sourcing
Ironflow provides native event sourcing primitives that allow you to derive application state from an append-only log of events.
## Why Event Sourcing?
Traditional databases store only the “current state.” Ironflow stores every change that led to that state. This gives you:
- Auditability: A perfect history of every user action.
- Time Travel: Reconstruct state at any point in the past.
- Analytics: Build new data models (projections) months after events occurred.
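Deriving state from an append-only log is, at its core, a fold over the events. A minimal sketch in plain JavaScript — the event shapes below are illustrative, not Ironflow's wire format:

```js
// Rebuild an order's current state by folding over its event stream.
// Event shapes here are illustrative, not Ironflow's wire format.
function applyEvent(state, event) {
  switch (event.name) {
    case "order.created":
      return { status: "open", total: event.data.total };
    case "order.paid":
      return { ...state, status: "paid" };
    case "order.cancelled":
      return { ...state, status: "cancelled" };
    default:
      return state; // unknown event types are ignored
  }
}

const events = [
  { name: "order.created", data: { total: 99.99 } },
  { name: "order.paid", data: {} },
];

// Current state: fold the whole log.
const current = events.reduce(applyEvent, null);

// "Time travel": fold only a prefix of the log to see a past state.
const afterFirst = events.slice(0, 1).reduce(applyEvent, null);
```

Because the log is append-only, replaying a prefix of it is all that "time travel" requires.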
## Core Concepts
| Concept | Description |
|---|---|
| Entity Stream | An ordered log of events for one ID (e.g., `order-123`). |
| Optimistic Concurrency | Rejects a write unless the stream's current version matches the caller's expected version. |
| Upcasters | Migrate old event data to new schemas at read time. |
| Global Event Log | Every event across all entity streams, ordered by `nats_seq`; the source that projections fold over. |
| Projections | Read models built by folding the global event log across many streams. |
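Optimistic concurrency is typically used in a read-then-append retry loop: read the stream's version, attempt the append, and retry if another writer got there first. A minimal sketch with an in-memory stand-in for the client — the `ConflictError` name and the error-handling shape are assumptions for illustration, not Ironflow's documented API:

```js
// Assumed error type for a version mismatch (illustrative only).
class ConflictError extends Error {}

// Read the current version, attempt the append, retry on conflict.
// `append` and `readVersion` stand in for client calls.
async function appendWithRetry(append, streamId, event, readVersion, retries = 3) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    const expectedVersion = await readVersion(streamId);
    try {
      return await append(streamId, event, { expectedVersion });
    } catch (err) {
      if (!(err instanceof ConflictError) || attempt === retries) throw err;
      // Another writer appended first; re-read the version and try again.
    }
  }
}

// Demo: an in-memory stream where a concurrent writer wins the first race.
let version = 3;
let failures = 1;
const readVersion = async () => version;
const append = async (id, event, { expectedVersion }) => {
  if (failures-- > 0) {
    version++; // simulate the concurrent writer's append
    throw new ConflictError("stale version");
  }
  if (expectedVersion !== version) throw new ConflictError("stale version");
  version++;
  return { entityVersion: version };
};
```

The loop succeeds on the second attempt after re-reading the bumped version.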
## Quick Example (Node.js)

```js
import { createClient } from "@ironflow/node";

const ironflow = createClient({ apiKey: process.env.IRONFLOW_API_KEY });

// 1. Append an event (auto-creates the stream)
const { entityVersion } = await ironflow.streams.append(
  "order-123",
  {
    entityType: "order",
    name: "order.created",
    data: { total: 99.99 },
  },
  { expectedVersion: 0 } // 0 = ensure this is the first event
);

// 2. Read the history
const { events, totalCount } = await ironflow.streams.read("order-123");
console.log(`Stream version is now: ${entityVersion} (${totalCount} events)`);
```

## Guides
- Entity Streams — Appending, reading, and concurrency.
- Projections — Building read-optimized views.
- Event Versioning — Schema evolution via Upcasters.
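The upcaster pattern mentioned above can be sketched as a pure function applied to each event at read time, leaving the stored events untouched. The `schemaVersion` field and the v1/v2 payload shapes here are hypothetical:

```js
// Upcaster sketch: migrate a v1 `order.created` payload to a v2 schema
// at read time. The schemaVersion field and shapes are hypothetical.
function upcastOrderCreated(event) {
  if (event.schemaVersion >= 2) return event; // already current
  // v1 stored a bare `total`; v2 splits it into amount + currency.
  return {
    ...event,
    schemaVersion: 2,
    data: { amount: event.data.total, currency: "USD" },
  };
}

const v1Event = {
  name: "order.created",
  schemaVersion: 1,
  data: { total: 99.99 },
};

const upcasted = upcastOrderCreated(v1Event);
```

Because the upcaster returns a new object, the stored v1 event is never rewritten; old and new readers can coexist.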