BRICK Model Process Flow
This section describes the lifecycle of resources and semantic models in DCH2.0.
The DCH Model API has five 'top-level' resource types:
Organisation: represents an organisation which owns other DCH resources.
Site: represents a parcel of real estate which belongs to an organisation.
Building: represents an enclosed physical structure, constructed upon a Site.
Data Pool: represents a collection of any number of 'Points' (timeseries data streams), linked to any number of Sites or Buildings.
Application: represents a deployed application, linked to any number of Sites and/or Buildings.
Each of the resources has basic metadata fields. Additionally, Sites, Buildings and Data Pools each have a semantic model describing their internal components.
Semantic models are managed in a draft-publish lifecycle. This means that each Site, Building and Data Pool resource has both a 'draft' version and a 'published' version of its semantic model graph. Uploading a draft semantic model does not immediately update the published model. When a site is later published, its semantic model draft (and drafts of any buildings on the site) undergo a process of inference and validation. If validation is successful, the merged outputs are then copied to the 'published' version for each resource. When a BRIQL query is invoked for a combination of sites and buildings, it is the published version that is interrogated. This section explains these operations in more detail.
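To make the lifecycle concrete, the sketch below walks the draft-publish cycle as API calls. It is illustrative only: the base URL, endpoint paths and token are hypothetical placeholders, not the documented DCH Model API routes.

```python
import requests

# Hypothetical base URL and credentials -- substitute real DCH values.
BASE = "https://dch.example.org/api/v1"
HEADERS = {"Authorization": "Bearer <token>"}

# 1. Upload a draft semantic model (Turtle payload) for a site.
#    Only the 'draft' graph changes; the published graph is untouched.
with open("site_draft.ttl", "rb") as f:
    requests.put(f"{BASE}/sites/hobart/model/draft", data=f,
                 headers={**HEADERS, "Content-Type": "text/turtle"})

# 2. Publish the site: drafts for the site and its buildings are
#    inferenced, validated, merged, and copied to 'published' graphs.
requests.post(f"{BASE}/sites/hobart/publish", headers=HEADERS)

# 3. BRIQL queries interrogate only the *published* graphs, so results
#    reflect the state at the last successful publication.
```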
For the purposes of describing the data lifecycle, Figure 1 shows an example Site with one or more Buildings. Our example Site has one site-level electrical meter, and each Building has a separate submeter. Meters are connected to an on-site gateway device, which is configured with DCH Data Source credentials to report energy metering points to DCH via MQTT.
Data Pool creation
Availability: Model API (not available via Dashboard).
Prerequisites: Organisation resource exists.
Data Pools in DCH are typically managed automatically by other services, rather than by human operators or off-system tools; this step is mentioned here for context and completeness.
The lifecycle of point-related data is typically as follows:
An on-site gateway is configured with Data Source credentials for the MQTT API.
On-site equipment and sensors send time-series data to the gateway.
Gateway relays live time-series data to DCH (a minimal sketch of this step follows the list).
The Alethia service creates a Data Pool and semantic model data for each Data Source.
The 'live' semantic model for a Data Pool is continually updated, rather than using a draft-publish lifecycle.
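As an illustration of the gateway's role, a minimal sketch of relaying one reading over MQTT is shown below. The broker host, topic layout and JSON payload shape are assumptions for illustration only; the real values come from the Data Source credentials and the documented DCH MQTT API.

```python
import json
import time

import paho.mqtt.client as mqtt  # paho-mqtt 2.x

# Hypothetical connection details -- in practice these come from the
# Data Source credentials provisioned in DCH.
client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.username_pw_set("data-source-id", "data-source-secret")
client.connect("mqtt.dch.example.org", 1883)

# Hypothetical topic and payload layout: one reading from the
# site-level energy meter point.
reading = {"timestamp": time.time(), "value": 42.7}
client.publish("hobart/points/site_energy", json.dumps(reading))
client.disconnect()
```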
Site creation
Availability: Dashboard, Model API.
Prerequisites: Organisation resource exists.
Sites can be created in DCH through the Dashboard, or through direct interaction with the API. This will create the Site resource and basic metadata, and a pre-populated minimal draft semantic model containing only a single statement (i.e. that the site's ID represents a brick:Site).
Uploading a draft semantic model for a Site
Availability: Dashboard, Model API.
Prerequisites: Site resource exists.
A new semantic model draft can be uploaded for a Site either through the Dashboard, or through direct interaction with the API. This operation expects RDF payload data in the Turtle format (also known as TTL, MIME type 'text/turtle'); the uploaded data is subjected to very basic checks for formatting and content, but not full Brick Schema validation.
In this example the uploaded draft makes four statements, which deserve brief explanation (the complete draft is sketched after the list):
site: a brick:Site
The node 'site:' represents an instance of 'Site' (a Location subclass in the Brick Schema)...
brick:isMeteredBy site:siteMeter
...and the Site is metered by the node 'site:siteMeter'.
site:siteMeter a brick:Electric_Meter
The node 'site:siteMeter' is an instance of 'Electric_Meter' (an Equipment subclass in the Brick Schema)...
brick:hasPoint dp1:site_energy
...and the meter has a Point with URI 'dp1:site_energy'.
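Putting the four statements together, the uploaded draft would look roughly like the Turtle below. The prefix IRIs are placeholders assumed for illustration; DCH assigns the real namespace IRIs for sites and data pools.

```python
from rdflib import Graph

# Illustrative site draft; prefix IRIs are placeholders, not DCH's.
SITE_DRAFT_TTL = """\
@prefix brick: <https://brickschema.org/schema/Brick#> .
@prefix site:  <urn:dch:example:site-hobart#> .
@prefix dp1:   <urn:dch:example:dp1#> .

site: a brick:Site ;
    brick:isMeteredBy site:siteMeter .

site:siteMeter a brick:Electric_Meter ;
    brick:hasPoint dp1:site_energy .
"""

g = Graph()
g.parse(data=SITE_DRAFT_TTL, format="turtle")  # basic well-formedness check
print(len(g))  # 4 triples
```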
Building creation
Availability: Dashboard, Model API.
Prerequisites: Site resource exists.
Buildings can be created in DCH through the Dashboard, or through direct interaction with the API.
This will create the Building resource and basic metadata, and a pre-populated minimal draft semantic model containing only a single statement (i.e. that the building's ID represents a brick:Building).
Uploading a draft semantic model for a Building
Availability: Dashboard, Model API.
Prerequisites: Building resource exists.
A new semantic model draft can be uploaded for a Building either through the Dashboard, or through direct interaction with the API. As with Sites, this also expects a TTL-formatted upload, and is subject to the same basic checks.
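For our example, a Building draft might look like the sketch below. The 'b1:subMeter' node name and the prefix IRIs are illustrative assumptions, not values assigned by DCH.

```python
from rdflib import Graph

# Illustrative building draft: B1 has a submeter reporting to the
# point dp1:b1_energy.
B1_DRAFT_TTL = """\
@prefix brick: <https://brickschema.org/schema/Brick#> .
@prefix b1:    <urn:dch:example:building-b1#> .
@prefix dp1:   <urn:dch:example:dp1#> .

b1: a brick:Building ;
    brick:isMeteredBy b1:subMeter .

b1:subMeter a brick:Electric_Meter ;
    brick:hasPoint dp1:b1_energy .
"""

Graph().parse(data=B1_DRAFT_TTL, format="turtle")  # subject to the same basic checks
```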
Site publication
Availability: Dashboard (invoked automatically by the Dashboard after updating a Site or Building draft), Model API.
Prerequisites:
Site (and, optionally, any Building) resources exist.
Draft semantic models are ready and valid.
A Site can be published either through the Dashboard (this happens automatically after uploading a draft model), or through direct interaction with the API.
Publication is the process of linking, inference and validation necessary to create reliable semantic models that are suitable for querying. After publication of a site or building, the published version will remain unchanged until the next publication, regardless of changes to draft versions.
Publication always happens simultaneously across a whole Site and all of its constituent Buildings, meaning that Buildings are not separately published.
The steps performed by a publication operation are:
Dependency discovery: The Data Pools that contain any Points referenced in the semantic model draft(s) are discovered. If necessary, new dependency relationships are recorded in the database between the Site/Building resource(s) and the Data Pool resource(s).
Intersection: Triples about any Points that are 1) referenced in Site/Building draft and 2) declared in linked Data Pools are extracted for merging into output graphs.
Root node composition: Triples that link the 'root' nodes of Site and Building models are automatically generated, to assist querying.
Inferencing: SHACL inference 'Rules', as defined in the Brick Schema, are applied to create the triples implied by explicit triples in models. Most of these concern 'inverse' relationships (e.g. a statement 'X brick:hasPart Y' also implies 'Y brick:isPartOf X').
Validation: SHACL validation 'Shapes', as defined in the Brick Schema, are used to validate the models. 'Error' level results will prevent publication from continuing; results at 'Warning' or lower levels will allow publication to continue (a sketch of the inference and validation steps follows this list).
Merging: Triples from the draft(s) and from inferencing are merged into output graphs.
Representation update: Finally, the merged output graphs are linked to the Site and/or Building resources that they represent; these are the graphs that will be interrogated by any future queries mentioning these resources.
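The inferencing and validation steps can be approximated offline with standard SHACL tooling. The sketch below uses rdflib and pySHACL (not DCH's actual implementation) to apply the Brick Schema's rules and shapes to a draft graph; the file name is a placeholder.

```python
from rdflib import Graph
from pyshacl import validate

# Load a draft model and the Brick Schema, which supplies both the
# SHACL inference rules and the validation shapes. (Network access
# is needed to fetch Brick.ttl here.)
draft = Graph().parse("site_draft.ttl", format="turtle")
brick = Graph().parse("https://brickschema.org/schema/Brick.ttl", format="turtle")

# advanced=True enables SHACL-AF rules (e.g. materialising inverse
# predicates); inplace=True writes inferred triples back into `draft`;
# allow_warnings=True mirrors DCH letting Warning-level results pass.
conforms, report_graph, report_text = validate(
    draft, shacl_graph=brick, advanced=True, inplace=True,
    allow_warnings=True)

if not conforms:  # 'Error' level results block publication
    print(report_text)
```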
In the case of our example Site, these steps will cause the following actions during publication (the resulting published graph is sketched after this list):
Site 'Hobart' depends on Data Pool 'dp1', due to finding a reference to Point 'dp1:site_energy', so this dependency is recorded.
Building 'B1' depends on Data Pool 'dp1', due to finding a reference to Point 'dp1:b1_energy', so this dependency is recorded.
The triples 'dp1:site_energy a brick:Electric_Energy_Sensor' and 'dp1:site_energy brick:hasUnit unit:KiloW-HR' are extracted for merging into the Site's published graph.
The triples 'dp1:b1_energy a brick:Electric_Energy_Sensor' and 'dp1:b1_energy brick:hasUnit unit:KiloW-HR' are extracted for merging into the Building's published graph.
The triple 'site: brick:hasPart b1:' is derived for merging into the Site's published model.
The triple 'b1: brick:isPartOf site:' is derived for merging into the Building's published model.
Triples with implied inverse predicates are created by SHACL inference rules, along with other inferencing to expedite queries.
Site draft is merged and linked to Site resource.
Building draft is merged and linked to Building resource.
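Putting those actions together, the Site's published graph for the example would contain at least the triples below (prefix IRIs are again illustrative placeholders).

```python
from rdflib import Graph

PUBLISHED_SITE_TTL = """\
@prefix brick: <https://brickschema.org/schema/Brick#> .
@prefix unit:  <http://qudt.org/vocab/unit/> .
@prefix site:  <urn:dch:example:site-hobart#> .
@prefix b1:    <urn:dch:example:building-b1#> .
@prefix dp1:   <urn:dch:example:dp1#> .

# Carried over from the site draft:
site: a brick:Site ;
    brick:isMeteredBy site:siteMeter .
site:siteMeter a brick:Electric_Meter ;
    brick:hasPoint dp1:site_energy .

# Intersected from Data Pool dp1:
dp1:site_energy a brick:Electric_Energy_Sensor ;
    brick:hasUnit unit:KiloW-HR .

# Root node composition:
site: brick:hasPart b1: .

# A sample of the inferred inverse triples:
site:siteMeter brick:meters site: .
dp1:site_energy brick:isPointOf site:siteMeter .
"""

Graph().parse(data=PUBLISHED_SITE_TTL, format="turtle")
```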
Site discovery
DCH2.0 provides basic discovery of Sites within an organisation by geolocation.
Currently, the search geometry is a bounding box, given by centroid latitude and longitude (both in decimal degrees), and an edge length in km.
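As a rough guide to how those parameters translate into coordinate bounds, the sketch below converts a centroid and edge length into a latitude/longitude box using an equirectangular approximation (the function is ours for illustration, not a DCH API).

```python
import math

def bbox(lat: float, lon: float, edge_km: float):
    """Approximate bounds (min_lat, min_lon, max_lat, max_lon) of a
    square with side edge_km centred on (lat, lon), in decimal degrees.

    One degree of latitude is ~111.32 km; a degree of longitude
    shrinks with cos(latitude). Accurate enough away from the poles.
    """
    half = edge_km / 2
    dlat = half / 111.32
    dlon = half / (111.32 * math.cos(math.radians(lat)))
    return (lat - dlat, lon - dlon, lat + dlat, lon + dlon)

# Example: a 10 km search box around Hobart's approximate centroid.
print(bbox(-42.88, 147.33, 10.0))
```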
Querying
DCH allows querying for entities and metadata in Site and Building models via BRIQL. The BRIQL API is fully documented, and a set of example queries is available.
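BRIQL's syntax is covered by its own documentation, but as an illustration of the kind of question a published model can answer, the sketch below runs an equivalent plain SPARQL query (not BRIQL) over a local copy of the published graph with rdflib; the file name is a placeholder.

```python
from rdflib import Graph, Namespace

BRICK = Namespace("https://brickschema.org/schema/Brick#")

g = Graph().parse("published_site.ttl", format="turtle")  # placeholder file

# Find every electric meter in the model and the points it exposes.
q = """
SELECT ?meter ?point WHERE {
    ?meter a brick:Electric_Meter ;
           brick:hasPoint ?point .
}
"""
for meter, point in g.query(q, initNs={"brick": BRICK}):
    print(meter, point)
```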