## Data modeling and full-stack docs
Brick, 223P, SPARQL, CRUD APIs, Docker Compose, and lab automation for Open-FDD as a deployed platform now live in a separate repository, `open-fdd-afdd-stack`.
That site documents how the stack uses this `open-fdd` PyPI package under the hood (RuleRunner, YAML rules, pandas).
## In this repository (rules engine only)
- Column map resolvers — map Brick, Haystack, DBO, 223P, or vendor labels to DataFrame columns (dict, manifest, composite resolvers).
- Expression rule cookbook — fault logic on pandas, including schedule and weather gates via `params.schedule` / `params.weather_band`.
- `examples/column_map_resolver_workshop/` — runnable, ontology-agnostic demo (`simple_ontology_demo.py`).
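The dict resolver idea can be sketched as follows. This is an illustrative example, not the actual open-fdd resolver API: the mapping labels, short column names, and `resolve_columns` helper are all hypothetical.

```python
import pandas as pd

# Hypothetical mapping from Brick point labels to the short column
# names a fault rule expects. A real map would come from your model.
BRICK_TO_COLUMN = {
    "Supply_Air_Temperature_Sensor": "sat",
    "Supply_Air_Temperature_Setpoint": "sat_sp",
}

def resolve_columns(df: pd.DataFrame, mapping: dict) -> pd.DataFrame:
    """Rename ontology-labeled columns to the names rules expect,
    failing loudly if any mapped point is missing from the frame."""
    missing = [label for label in mapping if label not in df.columns]
    if missing:
        raise KeyError(f"unmapped points: {missing}")
    return df.rename(columns=mapping)

df = pd.DataFrame(
    {
        "Supply_Air_Temperature_Sensor": [55.0, 57.5],
        "Supply_Air_Temperature_Setpoint": [55.0, 55.0],
    }
)
resolved = resolve_columns(df, BRICK_TO_COLUMN)
```

A composite resolver would layer several such maps (vendor labels first, then an ontology fallback), returning the first mapping that covers the frame.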
The base PyPI wheel is the rules engine only. `pip install "open-fdd[desktop]"` adds the local FastAPI gateway, Feather-backed ingest, `model.json` on disk, and BRICK TTL generation under `open_fdd.desktop` (paths in `open_fdd.desktop.storage.paths`). Larger deployed-platform concerns (Compose, production topology, extra RDF/SQL services) stay documented in `open-fdd-afdd-stack`.
## AI-assisted modeling workflows
For the full OpenClaw + Codex OAuth + gateway HTTP integration picture, see the Open FDD Claw architecture docs and Phase 0 of `scripts/OPENCLAW_RUNBOOK.md`.
For AI-assisted data modeling (OpenClaw, ChatGPT, or human-in-the-loop review), use a simple loop:
- Export model JSON from your backend (`/model/export` or the stack export endpoint).
- Review and revise with an LLM (OpenClaw agent or ChatGPT web UI).
- Validate the edited JSON before import.
- Import the JSON back into the backend (`/model/import`) and re-run SPARQL/rules checks.
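The validation step of the loop can be sketched as a pre-flight check run before anything is POSTed to `/model/import`. The schema keys here (`equipment`, `points`) are hypothetical; substitute the import schema your backend actually enforces.

```python
import json

# Hypothetical top-level keys the import schema requires.
REQUIRED_KEYS = {"equipment", "points"}

def validate_model_json(text: str) -> tuple[bool, str]:
    """Cheap pre-flight check on LLM-edited model JSON:
    is it parseable, and does it carry the expected top-level keys?"""
    try:
        model = json.loads(text)
    except json.JSONDecodeError as exc:
        return False, f"invalid JSON: {exc}"
    if not isinstance(model, dict):
        return False, "expected a JSON object at the top level"
    missing = REQUIRED_KEYS - model.keys()
    if missing:
        return False, f"missing keys: {sorted(missing)}"
    return True, "ok"

ok, msg = validate_model_json('{"equipment": [], "points": []}')
```

Rejecting malformed output here, before it reaches the backend, is what keeps the human-in-the-loop step cheap: the operator only reviews JSON that already parses and has the right shape.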
The same flow works for:
- OpenClaw agents running local automation loops.
- The ChatGPT web interface, where a human copies the export JSON in and the validated JSON back out.
- Hybrid workflows where AI drafts and a human confirms before import.
For robust prompts, import schema guidance, and operator-safe pre-flight checks, see:
## Local HTTP gateway note (open-fdd repo)
The FastAPI gateway (`open_fdd.gateway`, CLI `open-fdd-gateway` / `open-fdd-desktop-bridge`) supports agent-friendly backend operations such as:
- model export/import/validate,
- SPARQL query endpoints (`/data-model/sparql`, `/data-model/sparql/upload`),
- timeseries bounds/query over Feather data,
- weather/BACnet ingest and ML training routes.
This enables OpenClaw-style local assistants to do data modeling, retrieve and join data in pandas/Feather workflows, run fault rules, and iterate with a human operator. For how to run the gateway locally, storage paths, and `start-local`, see the Desktop app docs.
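A minimal sketch of driving the gateway from a script: the endpoint path comes from the list above, but the host/port, JSON payload shape, and the `brick:AHU` query are assumptions to check against your running gateway.

```python
import json
import urllib.request

# Assumed local gateway address; adjust to your open-fdd-gateway config.
GATEWAY = "http://127.0.0.1:8000"

def sparql_request(query: str) -> urllib.request.Request:
    """Build a POST request for the gateway's /data-model/sparql
    endpoint (payload shape is an assumption, not the documented API)."""
    body = json.dumps({"query": query}).encode()
    return urllib.request.Request(
        f"{GATEWAY}/data-model/sparql",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = sparql_request("SELECT ?ahu WHERE { ?ahu a brick:AHU } LIMIT 5")
# With a gateway running locally, urllib.request.urlopen(req) would
# execute the query; an agent loop would parse the response and feed
# the resulting point names into a pandas/Feather join.
```

Building the request separately from sending it keeps the agent loop testable: the same payload construction can be exercised without a live gateway.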