Constructing a building model
This section of the walkthrough covers the development of a BRICK model for your buildings/sites. We will be using the DCH Model Generation Tool (mgtool), which requires a set of CSVs, so reading its documentation before proceeding is encouraged.
It is assumed:
That you have already read our onboarding documentation and have defined your onboarding scope. This walkthrough is for a full level 3 model.
That you have collated all technical information about your onboarding scope, such as BMS screenshots, Single Line Diagrams (SLDs), plant blueprints, etc.
This is NOT a real model; it is an imaginary building. The points for the equipment may not be complete compared to the real world (e.g. the FCUs in this model do not have an air flow sensor). This is strictly to outline the general process of modelling a building.
Although we recommend you follow these steps in order, you may also take a different approach to developing your model CSVs. Our approach is a top-down method which starts with the high-level entities and their relationships (i.e. skeleton and object CSVs first) before dealing with the points.
To ensure compatibility with DCH applications, building models MUST abide by our norms as detailed in:
The skeleton CSV declares:
What Equipment, Locations, Zones and Collections (entities) are in the building
How these entities relate to each other
We also require a prefix to be used for URI mapping. For this model we will use bldA as the prefix.
For this walkthrough we are modelling a building with the following technical documentation available. We will fill in our skeleton CSV as we go.
Things we take away:
The building has 2 wings
Each wing has 2 floors
Each floor has 2 rooms
HVAC
All VAVs except VAV7 feed a specific room (VAV7 feeds a zone containing 2 rooms) [7 VAVs]
FCUs are serving 2 rooms each [4 FCUs]
There are Temperature and Humidity (TH) IoT sensors installed that are not integrated with the HVAC systems
Based on these notes, we can start populating our skeleton CSV. Each column header takes the form ['relationship', n], where n is the zero-based index of the column that the relationship points to: for example, ['isPartOf',0] in the wing column means each wing isPartOf the building in column 0 of the same row, and ['feeds',3] in the VAV column means each VAV feeds the room in column 3. Empty cells mean the relationship does not apply to that row:
['hasPart',1],['isPartOf',0],['isPartOf',1],['isPartOf',2],['hasPart',3],['feeds',3],['feeds',4],['feeds',2]
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_1,bldA|Eastern_Floor_1_Room_1,,bldA|VAV5,,bldA|FCU3
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_1,bldA|Eastern_Floor_1_Room_2,,bldA|VAV6,,
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_2,bldA|Eastern_Floor_2_Room_1,bldA|HVAC_Zone_A,,bldA|VAV7,bldA|FCU4
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_2,bldA|Eastern_Floor_2_Room_2,bldA|HVAC_Zone_A,,,
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_1,bldA|Western_Floor_1_Room_1,,bldA|VAV1,,bldA|FCU1
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_1,bldA|Western_Floor_1_Room_2,,bldA|VAV2,,
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_2,bldA|Western_Floor_2_Room_1,,bldA|VAV3,,bldA|FCU2
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_2,bldA|Western_Floor_2_Room_2,,bldA|VAV4,,
Things we take away:
FCUs in this building have the following sub-equipment:
Filter
CHW Valve
CHW Coil
HW Valve
HW Coil
Supply Fan
Now we can declare these sub-equipment in the skeleton CSV, picking up from column 7, where we declared our FCUs and what they feed. Note that we have moved many rows down to make our job easier, so we are no longer operating in the same rows as before (the snippet below shows columns 7 and 8 only):
['feeds',2],['isPartOf',7]
bldA|FCU3,
bldA|FCU4,
bldA|FCU1,
bldA|FCU2,
...
bldA|FCU1,bldA|FCU1_Filter
bldA|FCU1,bldA|FCU1_CHWValve
bldA|FCU1,bldA|FCU1_CHWCoil
bldA|FCU1,bldA|FCU1_HWValve
bldA|FCU1,bldA|FCU1_HWCoil
bldA|FCU1,bldA|FCU1_SupplyFan
We repeat this for all FCUs. The skeleton CSV at the end of this step looks like this:
Things we take away:
VAVs in this building have the following sub-equipment:
HW Valve
HW Coil
We repeat the same process as we did with FCUs to declare these sub-equipment.
Things we take away:
There is only 1 AHU in this building, and it feeds the VAVs
The AHU has the following sub-equipment:
OA Damper
RA Damper
Filter
CHW Valve
CHW Coil
HW Valve
HW Coil
Return Fan
Return Fan VSD
Supply Fan
Supply Fan VSD
Note that, based on our norms, we MUST model each VSD as feeding its respective fan; the VSDs do not take on the isPartOf relationship with the AHU.
With the AHU there are a couple of things we need to add:
Add the sub-equipment listed above.
Add the relationship between the AHU and the VAVs.
This brings our CSV to this state:
Things we take away:
The Chilled Water Plant contains:
A Chiller
A Chilled Water Pump
To close the chilled water loop, we model from our chilled water coils back to the chiller, and then to the chilled water pump (CHWP):
Things we take away:
The Hot Water Plant contains:
A Boiler
A Hot Water Pump
We repeat closing the loop, this time for our hot water plant. The result is a full plant skeleton CSV:
Things we take away:
There are 4 total electrical meters in the building.
Main building supply (point of connection with grid) is metered by MSSB.
Building has a submeter for the Chiller as measured by M1.
Building has a submeter for the Boiler as measured by M2.
Building has solar generation as measured by both the inverter and M3.
Starting from the bottom of the electrical tree, we establish the lower-level meters and work our way up. In this scenario, M1, M2 and M3 are submeters of the MSSB meter, and each meters its respective equipment (the Chiller, Boiler and Inverter). Additionally, we link the solar array with the inverter as described in our norms. This results in our final skeleton CSV:
This step is relatively easy: all we have to do is list every entity instance we used in the skeleton and assign a BRICK class to each. For this model, the objects CSV would look like:
bldA|AHU,AHU
bldA|Boiler,Boiler
bldA|Building_A,Building
bldA|MSSB_Primary,Building_Electrical_Meter
bldA|AHU_CHWCoil,Chilled_Water_Coil
...
This step is where we will link the entities created in the last steps to their points. For this step we require our points list:
On the sidebar, go to Data and select Data Sources
Find your data source from the list and select it
Click on Download Points List
For our mock model, we have the following points list:
The points_list CSV file requires the full URI path of the points, so we copy the uri field into a new CSV and save it as points_list.csv (No header required). The Data Pool URI can also be found in the Data Source page.
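For example, using the point URIs discussed below, the first few lines of points_list.csv are simply one full URI per line with no header row:

```
dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.AHU.Enable
dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.TH3.Air_Temp
dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.TH3.Air_Hum
```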
Now we have to use this list for our point_linkage CSV. This operation requires a string pattern match, which looks for the declared expression anywhere in the point name. For example, for the point:
dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.AHU.Enable
The expression is .AHU., which links the point to the bldA|AHU entity we created in our skeleton and object CSVs.
However, for points such as:
dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.TH3.Air_Temp
dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.TH3.Air_Hum
Since IoT equipment is not directly modelled, we have to link these points to their respective location or zone. In the case of TH3, since the sensor platform is installed in room bldA|Western_Floor_2_Room_1, the expression for this room would be .TH3. (note the dots), which links both points to the room.
The final point_linkage CSV looks like this:
bldA|AHU,.AHU.
bldA|Western_Floor_2_Room_1,.TH3.
...
Be aware that if you use the expression AHU (without the dots) for the AHU entity, all points containing AHU are linked to it and you can no longer link the points for the AHU's sub-equipment (e.g. AHU_CHWCoil would be left pointless, since its point names contain AHU but not .AHU.). You can overcome this by either reordering your pattern matches or changing the pattern altogether. In this case we included the dots before and after AHU to avoid this pitfall.
Now that we are done with our CSVs, we need to write a manifest file. You need the following information before you get started:
Organisation ID (in this case I am using my sandbox organisation sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62)
Site ID: exactly as created on DCH (for this example it is walkthrough_site)
Building ID: exactly as created on DCH (for this example it is walkthrough_building)
Data pool ID: you can get this by referring to your points_list.csv. The point URI names follow the pattern <dch:org/YOUR_ORG/datapool/YOUR_DATAPOOL_ID#> (in our case, the data pool ID is managed_datapool_walkthrough_datasource_390c330e-8335-4084-abf0-00d6daf87c5a).
First, we set the manifest version. We are currently at version 1.0.0:
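The manifest snippets in this section are sketches reconstructed from the information in this walkthrough rather than a verbatim file, so double-check the exact key names against the mgtool documentation. Assuming a top-level version field, this first part is simply:

```json
{
  "version": "1.0.0"
}
```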
Next, we add our URI mappings based on the info above:
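A sketch of the mappings, assuming a uri_mappings object keyed by prefix (the key name is our assumption): bldA points at our building on DCH, and WALKTHROUGH_DATA is an alias for the data pool path, following the dch:org/YOUR_ORG/datapool/YOUR_DATAPOOL_ID# pattern described above. Substitute the real URIs for your organisation:

```json
{
  "uri_mappings": {
    "bldA": "<URI of walkthrough_building on DCH>",
    "WALKTHROUGH_DATA": "dch:org/YOUR_ORG/datapool/YOUR_DATAPOOL_ID#"
  }
}
```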
Note how we assigned the URI prefix bldA to the actual building in DCH. The alias for your data pool path can be called whatever you want; it is added to reduce the size of the final model file by replacing the long data pool path with WALKTHROUGH_DATA.
Finally, we tell the tool what operations we are using and what the CSVs are called:
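Putting it all together, a complete manifest could look roughly like the sketch below. The operation names (CREATE, RELATE, POINT_CREATE_LINK_PATTERN_MATCH) and the point_list_files field come from this walkthrough; the other field names and the CSV file names are illustrative assumptions, so match them to your own files and the mgtool documentation:

```json
{
  "version": "1.0.0",
  "uri_mappings": {
    "bldA": "<URI of walkthrough_building on DCH>",
    "WALKTHROUGH_DATA": "dch:org/YOUR_ORG/datapool/YOUR_DATAPOOL_ID#"
  },
  "operations": {
    "CREATE": { "files": ["objects.csv"] },
    "RELATE": { "files": ["skeleton.csv"] },
    "POINT_CREATE_LINK_PATTERN_MATCH": {
      "files": ["point_linkage.csv"],
      "point_list_files": ["points_list.csv"]
    }
  }
}
```

Keep CREATE before RELATE; the order matters, as the warnings below explain.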
Save this file and call it manifest.json.
A couple of warnings:
The tool reads these fields in order, so if you put RELATE before CREATE, the tool will fail because it thinks no objects have been created yet.
In the POINT_CREATE_LINK_PATTERN_MATCH operation, point_list_files MUST be set as a list with square brackets.
Now that we have all of our CSVs and manifest, we can use the mgtool to generate our model.
Make a folder and call it whatever you wish (we will call it walkthrough_model).
Place your manifest in that folder.
Place all of your CSVs in a folder called input_csvs in the root folder.
Place the manifest and the input_csvs folder in a ZIP file (see the layout sketch after these steps).
In the DCH sidebar, go to Tools and select Generate Model(s).
Drag and drop the zip file in the box and press Generate Model.
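For reference, the ZIP you upload should look roughly like the sketch below; the CSV file names are only examples and should match whatever names you referenced in your manifest:

```
walkthrough_model.zip
├── manifest.json
└── input_csvs/
    ├── skeleton.csv
    ├── objects.csv
    ├── point_linkage.csv
    └── points_list.csv
```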
After the model generation is done, a ZIP file will be downloaded by your browser with the full report and the generated TTL file.
Check the mgtool.log for any warnings or errors.
Now we can upload our model TTL to DCH. Navigate to your building page by clicking on it.
Press Upload Model File and select your TTL file.
Press Upload & Overwrite. If everything is alright with your model, you should get this message:
Your building page should now show Site Published.
The last step is to use the Point Classifier tool to assign BRICK classes to our points. From the sidebar, go to Tools and select Point Classifier.
Select Create New and enter a name for this classifier set. Then select your data pool from the list.
Press the + sign in the top right to add a new rule to this set. We will be focusing on the ID field, which represents the point IDs as shown in the points_list CSV. It is a good idea to have your points_list CSV open on the side.
As the first rule, we will target all points ending with .Enable: type *Enable in the left field and select Enable Command in the Class field on the right. This classifies all points ending with .Enable as Enable Command. We repeat this for all other point types. For all of our KWH points, we also have to set Entity Properties to abide by our norms: enable the Properties field and start adding the values to their properties. You may also add Units of Measure in the UoM fields on the right.
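As an illustration only, a rule set for this mock building could end up looking something like the sketch below; apart from Enable Command, the class choices are our guesses at suitable BRICK classes, so pick whatever matches your points and our norms:

```
*Enable    ->  Enable Command
*Air_Temp  ->  Air Temperature Sensor
*Air_Hum   ->  Humidity Sensor
*KWH       ->  Energy Sensor   (plus Entity Properties and UoM)
```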
When you are done, press Generate Rules. A new section opens on the same page with the changes mapped out for your approval. If everything looks good and you wish to proceed, press Apply Changes at the bottom of the page. Your points are now classified and your model is DONE!