Data model flow
Open-FDD uses a unified graph: one semantic model that combines Brick (sites, equipment, points), BACnet discovery RDF (from bacpypes3 in diy-bacnet-server), platform config, and—as the project evolves—other ontologies such as ASHRAE 223P. CRUD and discovery both update this graph; all backend queries are SPARQL-driven (rdflib Graph parse + SPARQL; no grep or text search on the TTL). Rules resolve inputs via ofdd:mapsToRuleInput in the TTL.
Flow
```
Sites + Equipment + Points (DB)  ← single source of truth
        │
        ▼
Data-model export / CRUD
        │
        ▼
Brick TTL (config/data_model.ttl)  ← Brick section reserialized on every
        create/update/delete; the same file can include a BACnet discovery
        section (one file for SPARQL)
        │
        ▼
FDD column_map (external_id → rule_input)
        │
        ▼
RuleRunner
```
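The last hop above can be sketched with plain dicts; the dict-of-lists "frame" stands in for whatever DataFrame type RuleRunner actually consumes:

```python
# Sketch: applying a column_map (external_id -> rule_input) so raw
# timeseries columns arrive at RuleRunner under the names rules expect.
column_map = {"oat_sensor": "oat", "sat_sensor": "sat"}

raw = {
    "oat_sensor": [54.2, 55.1],
    "sat_sensor": [72.0, 71.8],
    "unmapped_pt": [0, 0],  # points without a mapping are dropped
}

frame = {column_map[k]: v for k, v in raw.items() if k in column_map}
print(sorted(frame))  # ['oat', 'sat']
```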
CRUD and Brick TTL sync
The database is the single source of truth. Every create, update, or delete on sites, equipment, or points (via the API or a data-model import) triggers a reserialize: the Brick TTL file (config/data_model.ttl, or OFDD_BRICK_TTL_PATH) is regenerated from the current DB and written to disk, so the Brick model is always in sync with CRUD. Deleting a site, device (equipment), or point also cascades to dependent data (timeseries, fault results, etc.), as in a typical CRUD app; see Danger zone — CRUD deletes.
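The trigger pattern can be sketched as below; `serialize_brick` is a stand-in for the real DB-to-TTL export, and only the mutate-then-reserialize ordering is the point:

```python
# Sketch of reserialize-on-write: every CRUD mutation regenerates the
# Brick TTL from the DB. serialize_brick is a placeholder, not the
# real exporter.
import os
import tempfile

TTL_PATH = os.environ.get("OFDD_BRICK_TTL_PATH", "config/data_model.ttl")

def serialize_brick(points):
    lines = [f":{p} a brick:Point ." for p in points]  # placeholder triples
    return "\n".join(lines) + "\n"

def reserialize(points, path=TTL_PATH):
    with open(path, "w") as f:
        f.write(serialize_brick(points))

def create_point(db, name, path=TTL_PATH):
    db.add(name)            # 1. mutate the source of truth (the DB)
    reserialize(db, path)   # 2. immediately regenerate the TTL on disk

demo_db = set()
demo_path = os.path.join(tempfile.mkdtemp(), "data_model.ttl")
create_point(demo_db, "oat_sensor", demo_path)
print(open(demo_path).read())  # :oat_sensor a brick:Point .
```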
TTL structure
```turtle
:oat_sensor a brick:Outside_Air_Temperature_Sensor ;
    rdfs:label "OAT (°F)" ;
    ref:hasExternalReference [
        a ref:TimeseriesReference ;
        ref:hasTimeseriesId "oat_sensor" ;
        ref:storedAt "postgresql://localhost:5432/openfdd/timeseries_readings"
    ] ;
    brick:isPointOf :ahu_7 ;
    ofdd:mapsToRuleInput "oat" .
```
- Brick classes define sensor types
- ofdd:mapsToRuleInput maps points to FDD DataFrame columns
- ref:hasExternalReference links points to external systems (BACnet/timeseries)
- rdfs:label is for display
Data modeling process (discover → tag → import → validate)
- Discover — BACnet discovery and/or manual entry populate sites, equipment, and points in the DB.
- Export — Use GET /data-model/export (or the Export card on the Data Model Setup page) to get JSON for tagging.
- Tag — Either manually (copy the exported JSON, tag it with an external LLM or a human, paste it back) or automatically with an external agent such as Open‑Claw. If your agent needs platform documentation as context, fetch it from GET /model-context/docs.
- Import — PUT /data-model/import (or the Import card / auto-import from the agent) writes the tagged points and optional equipment relationships into the DB and reserializes the Brick TTL.
- Validate — Use the Data Model Testing page (SPARQL, “Summarize your HVAC”) to confirm the model; treat the result as pass or fail.
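The tag → import step can be sketched as below. The exact JSON schema is not shown in this section, so the field names (external_id, brick_class, rule_input) and the `build_import_body` helper are assumptions to illustrate the shape of a PUT /data-model/import body:

```python
# Sketch: assembling an import body with a required "points" list and an
# optional "equipment" list. Field names here are hypothetical; check
# the actual import schema.
def build_import_body(tagged_points, equipment=None):
    body = {"points": tagged_points}
    if equipment:
        body["equipment"] = equipment  # feeds / fed_by relationships
    return body

body = build_import_body([
    {"external_id": "oat_sensor",
     "brick_class": "Outside_Air_Temperature_Sensor",
     "rule_input": "oat"},
])
# PUT this to /data-model/import; the backend writes the DB and
# reserializes the Brick TTL.
print(sorted(body))  # ['points']
```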
Data-model API
| Endpoint | Description |
|---|---|
| GET /data-model/export | Single export route: BACnet discovery + DB points (optional ?bacnet_only=true, ?site_id=...). Use for AI-assisted tagging. |
| PUT /data-model/import | Import JSON: points (required) and optional equipment (feeds/fed_by). Creates/updates points; does not accept sites or equipments. |
| GET /data-model/ttl | Generate Brick TTL from the DB (and the in-memory BACnet graph). Optional ?save=true to write to file. |
| POST /data-model/sparql | Run a SPARQL query against the current data model. |
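A stdlib-only sketch of calling the SPARQL route. The request shape (`{"query": ...}` JSON in) and the base URL are assumptions; check the actual API schema:

```python
# Sketch: building a POST /data-model/sparql request with urllib.
# The JSON body shape and base URL are assumptions.
import json
import urllib.request

def sparql_request(base_url, query):
    return urllib.request.Request(
        f"{base_url}/data-model/sparql",
        data=json.dumps({"query": query}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = sparql_request(
    "http://localhost:8000",
    "SELECT ?p WHERE { ?p a brick:Point } LIMIT 5",
)
print(req.full_url)  # http://localhost:8000/data-model/sparql
```

Send it with `urllib.request.urlopen(req)` (or any HTTP client) against a running backend.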
Validation
Use SPARQL to validate:
- All rule inputs have ofdd:mapsToRuleInput
- Equipment types and points are consistent
- Brick schema compliance (optional)