// the stack

Flux Command Center

A multi-tab Electron + React application I built to orchestrate Flux — my local AI agent — across my entire stack.

Tabs

  • Chat: Direct dialogue with Flux — local model, no cloud, no network latency.
  • Canvas: Visual agent workspace — build, connect, and run AI workflows on a node canvas.
  • Dashboard: System observability — connectivity status, model presence, token usage.
  • Skills: Capability registry — browse, activate, and deactivate agent skills on the fly.
  • Terminal: Full shell access through the command center interface.
  • Site: Live preview of this website, informativ.io, rendered locally.

The orchestrator is Flux — a local 8B model running via llama.cpp on an M4 Pro with 48GB RAM. Fully offline. Zero API cost. No rate limits.
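As a sketch, serving an 8B model this way with llama.cpp looks roughly like the following. The model filename and quantization are illustrative, not the exact files used here:

```shell
# Sketch: serve a quantized 8B model with llama.cpp's built-in server.
# The .gguf path is illustrative; -ngl 99 offloads all layers to the
# Apple Silicon GPU via Metal, and the server exposes an OpenAI-compatible
# API at http://localhost:8080/v1/chat/completions.
llama-server -m ./models/llama-3.1-8b-instruct-q4_k_m.gguf -ngl 99 --port 8080
```

Anything that can speak the OpenAI chat API can then talk to the model locally, which is what makes a fully offline orchestrator practical.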

flux@m4pro ~ canvas %
▶ Executing flow: informativ-deploy
[01] trigger — manual start received
[02] brain — generating site copy update...
model: llama-3.1-8b-instruct | tokens: 312
[03] skill — github.push → informativ-io/main
status: 200 | sha: a7f3c19
[04] output — netlify build triggered
✓ Flow complete — 4 nodes — 1.2s
flux@m4pro ~ canvas %
M4 Pro
48GB RAM
Apple Silicon
100% Local
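The transcript above is a sequential four-node flow: each node's output feeds the next. A minimal sketch of such a runner, with stub nodes standing in for the real ones (the `FlowNode` interface and node bodies are hypothetical, not the actual Flux API):

```typescript
// Minimal sequential flow runner. Node names mirror the transcript;
// the interface and implementations are illustrative stubs.
interface FlowNode {
  id: string;
  run(input: string): Promise<string>;
}

async function runFlow(nodes: FlowNode[], input: string): Promise<string> {
  let payload = input;
  for (const [i, node] of nodes.entries()) {
    // Log the step number the way the transcript does: [01], [02], ...
    console.log(`[${String(i + 1).padStart(2, "0")}] ${node.id}`);
    payload = await node.run(payload); // each node's output feeds the next
  }
  return payload;
}

// Four stub nodes matching the transcript's trigger → brain → skill → output shape.
const nodes: FlowNode[] = [
  { id: "trigger", run: async () => "manual start" },
  { id: "brain",   run: async (s) => `copy update for ${s}` },
  { id: "skill",   run: async (s) => `pushed: ${s}` },
  { id: "output",  run: async (s) => `build triggered after ${s}` },
];

runFlow(nodes, "").then((result) => console.log(result));
```

In the real app the brain node would call the local model's endpoint and the skill node would hit an external API; the pipeline shape stays the same.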

// demo

Would you buy
Flux Command Center?

A local AI orchestrator for Mac. Vote to help me decide whether to release it publicly.

Would you buy this?