
winkComposer · open-sourcing soon

Composable Streaming
Intelligence.

Turn streaming data into actionable insights — from edge to cloud. A high-performance JavaScript framework for IIoT. Signal conditioning, anomaly detection, and health assessment, underpinned by neural-network intelligence.

Star on GitHub · Read the docs
Live · NASA IMS bearing dataset · replaying in your browser

[Live demo: pipeline state, health score, and the insight surfaced before total failure, replayed over days 0–7]

4-bearing test rig · vibration RMS · 20 kHz sampling · See the full use case

The same pipeline runs on a Raspberry Pi, a gateway, or a server.

What it surfaces

Every stream has a story.
Composer reveals it.

Two recurring patterns — drift and classification. The hero already shows a third: bearing degradation on real NASA data. More use cases span telematics, HVAC, and IT infrastructure.

01
process drift
caught while it's still developing

A control loop hides what variance reveals.

A catalyst slowly degrades in the Tennessee Eastman benchmark. Temperature stays flat — the control loop compensates. Pressure oscillations grow from ±1 to ±50 kPa. Three detectors catch the instability before the reactor trips.

Tennessee Eastman benchmark
see the use case
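The Page-Hinkley test, one of the detectors in the node catalogue, can be sketched in a few lines. This is a minimal, illustrative implementation of the statistical test itself, not the API of winkComposer's `pageHinkley` node:

```javascript
// Minimal Page-Hinkley drift detector — an illustrative sketch,
// not winkComposer's pageHinkley node or its actual API.
function pageHinkley({ delta = 0.005, lambda = 25 } = {}) {
  let n = 0, mean = 0, cumSum = 0, minCumSum = 0;
  return function step(x) {
    n += 1;
    mean += (x - mean) / n;             // running mean of the signal
    cumSum += x - mean - delta;         // cumulative positive deviation
    minCumSum = Math.min(minCumSum, cumSum);
    return cumSum - minCumSum > lambda; // true once upward drift is confirmed
  };
}
```

Feed it samples one at a time: it stays quiet on a flat signal and fires once the cumulative deviation from the running mean exceeds `lambda` — which is how a slow oscillation growth gets caught before a trip.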
02
driving modes
classified from real GPS

Average speed says nothing. Its behaviour tells you the road.

Real GPS from a truck on an Indian highway averages 40 km/h whether it’s cruising or crawling through jams. A flow reads the same speed stream three ways and fuses them into Highway, City, or Jam.

Real truck on Indian highways
see the use case
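The three-views idea can be illustrated in plain JavaScript. The features and thresholds below are hypothetical, not winkComposer's actual flow — but they show the point: windows can share the same average speed and still classify differently on behaviour alone.

```javascript
// Illustrative three-view fusion over one speed stream (hypothetical
// features and thresholds — not winkComposer's flow logic).
function classifyWindow(speedsKmh) {
  const n = speedsKmh.length;
  const mean = speedsKmh.reduce((a, b) => a + b, 0) / n;
  const variance = speedsKmh.reduce((a, b) => a + (b - mean) ** 2, 0) / n;
  const stopFraction = speedsKmh.filter((v) => v < 5).length / n;
  if (stopFraction > 0.3) return "Jam"; // frequent standstills
  if (variance > 150) return "City";    // stop-and-go churn
  return "Highway";                     // steady cruising
}
```

A steady window around 40 km/h reads Highway; a window that swings between 10 and 70 but averages the same 40 reads City; a window dominated by standstills reads Jam.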

WiFi AP health, wash-cycle states, server latency, and more — the library grows with the community.

Explore all use cases →

How it fits

One moving part.

Composer is the processing layer in a stack of proven, boring, open-source parts. No re-platforming — it slots into the brokers, historians, and dashboards you already run.

01 · your infrastructure
Connect
MQTT · OPC-UA · HTTP · CSV replay
02 · one declarative flow
Compose
signal + stats + anomaly + neural
03 · every value, typed
Store
QuestDB with full semantics
04 · tools your team knows
Act
Grafana · MQTT · LLMs via MCP

The output layer

Your operations,
at a glance.

Composer computes. QuestDB stores. Mosquitto alerts in real time. Grafana shows. Health scores, evidence, cycle signatures — the same intelligence your pipeline produces, in dashboards your team already knows.

Grafana · winkComposer dashboard
Grafana dashboard fed by winkComposer: equipment health verdict, appraise evidence breakdown, and wash-cycle pressure signature — all materialised from a live pipeline into QuestDB.

winkComposer · QuestDB · Mosquitto · Grafana — open source, end to end

Express what, not how

Building blocks.
Endless combinations.

Each block, or "node", does one thing well: connect, filter, smooth, detect, classify, act, and more. Compose them through one declarative flow language into any pipeline you need.

Read the flow language
Arithmetic
accumulate · diff · invertFlag · ratio · transform
Signal Conditioning
sanitize · butterworthFilter · esMean · esStats
Feature Extraction
trend · dwellTimeTracker · esCorrelation · stateChangeDetector
Detection
pageHinkley · persistenceCheck · threshold · processIndex
Intelligence
appraise · kalman1d
Flow Control
passIf
Observability
emitIf · persistIf
Orchestration
controller
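The composition idea — small single-purpose nodes chained into one flow — can be sketched with plain function composition. The `pipe` helper and the toy nodes here are illustrative stand-ins, not the declarative flow syntax the docs describe:

```javascript
// Function composition as a stand-in for the declarative flow language
// (illustrative only — see the flow-language docs for the real syntax).
const pipe = (...nodes) => (x) => nodes.reduce((value, node) => node(value), x);

// Toy nodes for three roles: condition, transform, detect.
const sanitize = (x) => (Number.isFinite(x) ? x : 0); // drop NaN/Infinity
const square = (x) => x * x;                          // crude energy feature
const threshold = (limit) => (x) => x > limit;        // boolean detector

const flow = pipe(sanitize, square, threshold(100));
```

`flow(12)` is `true`, `flow(5)` and `flow(NaN)` are `false` — each node stays small, and the flow does the combining.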

AI-native · via MCP

Your fleet,
in plain English.

Every insight is pre-computed, typed, and stored. Any LLM retrieves the facts via the Model Context Protocol — then reasons over them. No prompt engineering. No sampling. No hallucinated metrics.

How MCP grounding works
Technician

“Check sensor health and wash-cycle state for Pump #42. Give me a visual summary I can share with the supervisor.”

Claude Opus · grounded in live pipeline via MCP
AI diagnostic response for Pump #42 — structured health assessment with sensor degradation separated from healthy pump operation, produced by Claude Opus querying a winkComposer pipeline via MCP.

Actual output from Claude Opus querying a live winkComposer pipeline via MCP.
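The grounding claim reduces to retrieval: the model calls a tool, and the tool returns stored, typed records verbatim. A hypothetical record shape and lookup — every field name below is invented for illustration, not winkComposer's MCP schema:

```javascript
// Hypothetical pre-computed insight records — field names invented for
// illustration. The point: the LLM retrieves stored facts, it never samples them.
const insights = [
  { asset: "Pump #42", metric: "sensor_health", value: 0.31, verdict: "degrading" },
  { asset: "Pump #42", metric: "wash_cycle_state", value: 1.0, verdict: "nominal" },
];

// The MCP tool's job is lookup, not generation: facts come back unchanged.
function getInsights(records, asset) {
  return records.filter((r) => r.asset === asset);
}
```

Because the tool only filters and returns, the numbers in the model's answer are exactly the numbers the pipeline computed.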

Proven at scale

Same engine.
Any scale.

An 8-node pipeline, benchmarked end-to-end. Your flow runs the same whether you ship it to a Pi in a gatehouse or a server tracking a full fleet.

~100K
msg/sec
Edge gateway (Pi 5-class)
1M+
msg/sec
Modern server
~300K
msg/sec
tracking 200K assets

Same engine powers every use case on this site — in your browser.

Verify performance

Transitioning to open source

Follow the journey.

winkComposer is in active development. Interfaces will break, details will go stale, whole sections will reshape. If something looks wrong — please tell us.

Star on GitHub · Watch for Updates

Or subscribe for email updates.