Constructing a building model

This section of the walkthrough covers the development of a BRICK model for your buildings/sites. We will be using the DCH Model Generation Tool (mgtool), which requires a set of CSVs, so reading that section of the documentation before proceeding is encouraged.

It is assumed:

  1. That you have already read our Model Scope and Requirements document and have defined your onboarding scope. This walkthrough is for a full level 3 model.

  2. That you have collated all technical information about your onboarding scope: documents such as BMS screenshots, Single Line Diagrams (SLDs), plant blueprints, etc.

Although we recommend you follow these steps in order, you may also take a different approach to developing your model CSVs. Our approach is a top-down method which starts with the high-level entities and their relationships (i.e. skeleton and object CSVs first) before dealing with the points.

Step 1: skeleton CSV

This CSV declares:

  1. What Equipment, Locations, Zones and Collections (entities) are in the building

  2. How these entities relate to each other

We also require a prefix to be used for URI mapping. For this model we will use bldA as the prefix.

For this walkthrough we are modelling a building with the following technical documentation available. We will fill in our skeleton CSV as we go.

Floorplans, HVAC zones/locations and IoT sensors

Things we take away:

  • The building has 2 wings

  • Each wing has 2 floors

  • Each floor has 2 rooms

  • HVAC

    • All VAVs except VAV7 feed a specific room (VAV7 feeds a zone containing 2 rooms) [7 VAVs]

    • Each FCU serves 2 rooms [4 FCUs]

  • There are Temperature and Humidity (TH) IoT sensors installed that are not integrated with the HVAC systems

Based on these notes, we can start populating our skeleton CSV:

0,1,2,3,4,5,6,7
"['hasPart',1]","['isPartOf',0]","['isPartOf',1]","['isPartOf',2]","['hasPart',3]","['feeds',3]","['feeds',4]","['feeds',2]"
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_1,bldA|Eastern_Floor_1_Room_1,,bldA|VAV5,,bldA|FCU3
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_1,bldA|Eastern_Floor_1_Room_2,,bldA|VAV6,,
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_2,bldA|Eastern_Floor_2_Room_1,bldA|HVAC_Zone_A,,bldA|VAV7,bldA|FCU4
bldA|Building_A,bldA|Eastern_Wing,bldA|Eastern_Floor_2,bldA|Eastern_Floor_2_Room_2,bldA|HVAC_Zone_A,,,
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_1,bldA|Western_Floor_1_Room_1,,bldA|VAV1,,bldA|FCU1
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_1,bldA|Western_Floor_1_Room_2,,bldA|VAV2,,
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_2,bldA|Western_Floor_2_Room_1,,bldA|VAV3,,bldA|FCU2
bldA|Building_A,bldA|Western_Wing,bldA|Western_Floor_2,bldA|Western_Floor_2_Room_2,,bldA|VAV4,,

Each column heading is an index, and the relationship row beneath it says how that column's entity relates to the entity in the referenced column (e.g. ['hasPart',1] in column 0 means the building hasPart the wing in column 1). Blank cells are simply skipped.
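To make the column mechanics concrete, here is a minimal Python sketch of how the data rows above expand into relationship statements. This is not the mgtool itself; the file name and the header layout (index row, then relationship row) are taken from the example above.

import ast
import csv

# Read the skeleton CSV: row 0 holds column indices, row 1 holds
# ['relationship', target_column] specs, the remaining rows hold entities.
with open("skeleton_FINAL.csv", newline="") as f:
    rows = list(csv.reader(f))

relationship_specs = [ast.literal_eval(cell) if cell else None for cell in rows[1]]

for row in rows[2:]:
    for col, cell in enumerate(row):
        spec = relationship_specs[col] if col < len(relationship_specs) else None
        if not cell or spec is None:
            continue
        relationship, target_col = spec
        target = row[target_col] if target_col < len(row) else ""
        if target:
            # e.g. "bldA|Building_A hasPart bldA|Eastern_Wing"
            print(f"{cell} {relationship} {target}")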

FCU BMS screens

Things we take away:

  • FCUs in this building have the following sub-equipment:

    • Filter

    • CHW Valve

    • CHW Coil

    • HW Valve

    • HW Coil

    • Supply Fan

Now we can declare this sub-equipment in the skeleton CSV, picking up from column 7, where we declared our FCUs and what they feed. Note that we have moved many rows down to make our job easier; we are no longer operating in the same rows as before. In the excerpt below only columns 7 and 8 are shown, and the "..." row stands for the blank rows we skipped.

7,8
"['feeds',2]","['isPartOf',7]"
bldA|FCU3,
bldA|FCU4,
bldA|FCU1,
bldA|FCU2,
...
bldA|FCU1,bldA|FCU1_Filter
bldA|FCU1,bldA|FCU1_CHWValve
bldA|FCU1,bldA|FCU1_CHWCoil
bldA|FCU1,bldA|FCU1_HWValve
bldA|FCU1,bldA|FCU1_HWCoil
bldA|FCU1,bldA|FCU1_SupplyFan

We repeat this for all FCUs. The skeleton CSV at the end of this step looks like this:

VAV BMS screens

Things we take away:

  • VAVs in this building have the following sub-equipment:

    • HW Valve

    • HW Coil

We repeat the same process as we did with the FCUs to declare this sub-equipment.

AHU BMS screen

Things we take away:

  • There is only 1 AHU in this building, and it feeds the VAVs

  • The AHU has the following sub-equipment:

    • OA Damper

    • RA Damper

    • Filter

    • CHW Valve

    • CHW Coil

    • HW Valve

    • HW Coil

    • Return Fan

    • Return Fan VSD

    • Supply Fan

    • Supply Fan VSD

With the AHU there are a couple of things we need to add:

  • Add the sub-equipment listed above.

  • Note that based on our VSD norms, we MUST model the VSDs as feeding their respective sub-equipment (the fans); they do not take on the isPartOf relationship with the AHU.

  • Add the relationship between the AHU and the VAVs.

This brings our CSV to this state:

Chilled Water Plant BMS screen

Things we take away:

  • The Chilled Water Plant contains:

    • A Chiller

    • A Chilled Water Pump

To close the chilled water loop, we model from our chilled water coils to the chiller and then to the chilled water pump (CHWP):

Hot Water Plant BMS screen

Things we take away:

  • The Hot Water Plant contains:

    • A Boiler

    • A Hot Water Pump

We repeat closing the loop, this time for our hot water plant. The result is a full plant skeleton CSV:

Electrical

Things we take away:

  • There are 4 total electrical meters in the building.

    • Main building supply (point of connection with grid) is metered by MSSB.

    • Building has a submeter for the Chiller as measured by M1.

    • Building has a submeter for the Boiler as measured by M2.

    • Building has solar generation as measured by both the inverter and M3.

Starting from the bottom of the electrical tree, we establish the lower-level meters and work our way up. In this scenario, M1, M2 and M3 are submeters of the MSSB, and each meters its respective equipment (the Chiller, Boiler and Inverter respectively). Additionally, we link the solar array with the inverter as described in our norms. This results in our final skeleton CSV:

Step 2: objects CSV

This step is relatively easy: all we have to do is list every entity instance used in the skeleton and assign a BRICK class to each. You can also use our BRICK patch classes. For this model, the objects CSV would look like:

Model_ID|Entity_Name,Class
bldA|AHU,AHU
bldA|Boiler,Boiler
bldA|Building_A,Building
bldA|MSSB_Primary,Building_Electrical_Meter
bldA|AHU_CHWCoil,Chilled_Water_Coil
...,...
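It is easy to miss an entity at this point. A small hedged helper (not part of the mgtool; the file names match the manifest used later in this walkthrough) can confirm that every entity named in the skeleton has a class in the objects CSV:

import csv

# Every entity mentioned anywhere in the skeleton (skip the index and relationship rows).
with open("skeleton_FINAL.csv", newline="") as f:
    skeleton_entities = {cell for row in list(csv.reader(f))[2:] for cell in row if cell}

# Every entity declared in the objects CSV (first column, skip the header).
with open("objects.csv", newline="") as f:
    object_entities = {row[0] for row in list(csv.reader(f))[1:] if row and row[0]}

missing = skeleton_entities - object_entities
if missing:
    print("Skeleton entities with no class assigned:", sorted(missing))
else:
    print("Every skeleton entity has a class in objects.csv")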

Step 3: points_list and point_linkage CSVs

This step is where we will link the entities created in the last steps to their points. For this step we require our points list:

  1. On the sidebar, go to Data and select Data Sources

  2. Find your data source from the list and select it

  3. Click on Download Points List

For our mock model, we have the following points list:

The points_list CSV file requires the full URI path of the points, so we copy the uri field into a new CSV and save it as points_list.csv (no header required). The Data Pool URI can also be found on the Data Source page.

points_list.csv
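If the downloaded points list is large, extracting the uri field by hand is tedious. Here is a convenience sketch; the downloaded file name is a placeholder, and we assume the column is literally headed uri as described above:

import csv

# Copy only the 'uri' column of the downloaded points list into points_list.csv,
# with no header row, as required above.
with open("downloaded_points_list.csv", newline="") as src, \
        open("points_list.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for row in csv.DictReader(src):
        writer.writerow([row["uri"]])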

Now we have to use this list for our point_linkage CSV. This operation requires a string pattern match, as detailed here. The pattern match looks for the declared expression anywhere in the point name. For example, for the point:

dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.AHU.Enable

The expression .AHU. links the point to the bldA|AHU entity we created in our skeleton and objects CSVs.

However, for points such as:

dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.TH3.Air_Temp

dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_a710d406-9eb4-4070-b2fe-45c4a0bc8782#walkthrough_site.walkthrough_building.TH3.Air_Hum

Since IoT equipment is not directly modelled, we have to link these points to their respective location/zone. In the case of TH3, since the sensor platform is installed in room bldA|Western_Floor_2_Room_1, the expression for this room is .TH3. This links both points to the room.
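Because expressions are matched anywhere in the point name, it is worth previewing what each expression captures before generating the model. A hedged sketch using plain substring containment (our reading of the pattern match described above), assuming one expression per row:

import csv

# points_list.csv has no header; point_linkage.csv has a header row.
with open("points_list.csv", newline="") as f:
    points = [row[0] for row in csv.reader(f) if row]

with open("point_linkage.csv", newline="") as f:
    linkage_rows = list(csv.reader(f))[1:]

for row in linkage_rows:
    if len(row) < 2 or not row[1]:
        continue
    entity, expression = row[0], row[1]
    matches = [p for p in points if expression in p]
    print(f"{entity}: {expression!r} matches {len(matches)} point(s)")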

The final point_linkage CSV looks like this:

Model_ID|Entity_Name,Point_Expressions
bldA|AHU,.AHU.
bldA|Western_Floor_2_Room_1,.TH3.
...,...

A good way of populating the point_linkage CSV is to copy the Model_ID|Entity_Name column from the objects CSV, to ensure all of your entities are covered. Then start linking points until you are out of points to link.
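A sketch of that tip, assuming the column headers shown above; it seeds point_linkage.csv with every entity and an empty expression cell for you to fill in:

import csv

# Copy the Model_ID|Entity_Name column from objects.csv into a fresh
# point_linkage.csv so that no entity is forgotten while linking points.
with open("objects.csv", newline="") as src, \
        open("point_linkage.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["Model_ID|Entity_Name", "Point_Expressions"])
    for row in csv.DictReader(src):
        writer.writerow([row["Model_ID|Entity_Name"], ""])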

Step 4: manifest.json

Now that we are done with our CSVs, we need to write a manifest file. You need the following information before you get started:

  • Organisation ID (in this case we are using our sandbox organisation sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62)

  • Site ID: exactly as created on DCH (for this example it is walkthrough_site)

  • Building ID: exactly as created on DCH (for this example it is walkthrough_building)

  • Data pool ID: you can get this by referring to your points_list.csv. You can see the point URIs follow the pattern <dch:org/YOUR_ORG/datapool/YOUR_DATAPOOL_ID#> (in our case, the data pool ID is managed_datapool_walkthrough_datasource_390c330e-8335-4084-abf0-00d6daf87c5a).

First, we set the manifest version. We are currently at version 1.0.0:

{
  "manifest_version": "1.0.0"
}

Next, we add our URI mappings based on the info above:

{
  "manifest_version": "1.0.0",
  "id_mapping": {
    "bldA": "dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/site/walkthrough_site/building/walkthrough_building#",
    "WALKTHROUGH_DATA": "dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_390c330e-8335-4084-abf0-00d6daf87c5a#"
  }
}

Note how we assigned the URI prefix bldA to the actual building in DCH. The prefix for your data pool URI can be called whatever you want; it exists to reduce the size of the final model file by replacing the full path with WALKTHROUGH_DATA.

Finally, we tell the tool what operations we are using and what the CSVs are called:

{
  "manifest_version": "1.0.0",
  "id_mapping": {
    "bldA": "dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/site/walkthrough_site/building/walkthrough_building#",
    "WALKTHROUGH_DATA": "dch:org/sandbox-eddf8c4b-a3a7-49a1-82d4-1e4821baca62/datapool/managed_datapool_walkthrough_datasource_390c330e-8335-4084-abf0-00d6daf87c5a#"
  },
  "operations": [{
      "operation_type": "CREATE",
      "config": {
        "object_file": "objects.csv"
      }
    }, {
      "operation_type": "RELATE",
      "config": {
        "skeleton_file": "skeleton_FINAL.csv"
      }
    }, {
      "operation_type": "POINT_CREATE_LINK_PATTERN_MATCH",
      "config": {
        "point_linkage_file": "point_linkage.csv",
        "point_list_files": ["points_list.csv"]
      }
    }]
}

Save this file and call it manifest.json.
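Before packaging everything up, a quick hedged sanity check (not part of DCH; run it in the folder that holds your manifest and CSVs) confirms that the manifest parses as JSON and that every CSV it names actually exists:

import json
from pathlib import Path

# Parse the manifest and list the CSV files referenced by its operations.
manifest = json.loads(Path("manifest.json").read_text())

referenced = []
for operation in manifest.get("operations", []):
    config = operation.get("config", {})
    for key, value in config.items():
        if key.endswith("_files"):
            referenced.extend(value)
        elif key.endswith("_file"):
            referenced.append(value)

for name in referenced:
    print(f"{name}: {'found' if Path(name).exists() else 'MISSING'}")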

Step 5: generating a TTL

Now that we have all of our CSVs and manifest, we can use the mgtool to generate our model.

  1. Make a folder and call it whatever you wish (we will call it walkthrough_model).

  2. Place your manifest in that folder.

  3. Place all of your CSVs in a folder called input_csvs in the root folder.

  4. Place the manifest and the input_csvs folder in a ZIP file (see the packaging sketch after this list).

  5. In the DCH sidebar, go to Tools and select Generate Model(s).

  6. Drag and drop the zip file in the box and press Generate Model.

  7. After the model generation is done, a ZIP file will be downloaded by your browser with the full report and the generated TTL file.

  8. Check the mgtool.log for any warnings or errors.
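If you prefer to script step 4, here is a minimal packaging sketch; the folder name matches the walkthrough, and the ZIP layout (manifest.json plus an input_csvs folder at the top level) is our reading of the steps above:

from pathlib import Path
import zipfile

root = Path("walkthrough_model")

# Build the upload archive: manifest.json and the input_csvs folder at the top level.
with zipfile.ZipFile("walkthrough_model.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write(root / "manifest.json", arcname="manifest.json")
    for csv_path in sorted((root / "input_csvs").glob("*.csv")):
        zf.write(csv_path, arcname=f"input_csvs/{csv_path.name}")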

Step 6: upload your model

Now we can upload our model TTL to DCH. Navigate to your building page by clicking on the building.

Press Upload Model File and select your TTL file.

Press Upload & Overwrite. If everything is alright with your model, you should get this message:

Your building page should now show Site Published.

Step 7: point classification

The last step is to use the Point Classifier tool to assign BRICK classes to our points.

From the sidebar, go to Tools and select Point Classifier.

Select Create New and enter a name for this classifier set. Then select your data pool from the list.

Press the + sign at the top right to add a new rule to this set. We will be focusing on the ID field, which represents the point IDs as shown in the points_list CSV. It is a good idea to have your points_list CSV open on the side.

As the first rule, we will target all points ending with ".Enable". Type *Enable in the left field and select Enable Command in the right Class field. This classifies all points ending with .Enable as Enable Command.
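If you want to preview a rule locally before adding it, the same wildcard idea can be reproduced with glob-style matching; a hedged sketch, assuming * matches any run of characters, as the behaviour described above suggests:

import csv
from fnmatch import fnmatch

pattern = "*Enable"  # the rule typed into the ID field

# Show which point IDs from points_list.csv the pattern would capture.
with open("points_list.csv", newline="") as f:
    for row in csv.reader(f):
        if row and fnmatch(row[0], pattern):
            print(row[0])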

We repeat this for all other point types. For all of our KWH points, we have to set Entity Properties to abide by our Electrical Modelling Norms. To do this, enable the Properties field and start adding the values to their respective properties. You may also add Units of Measure in the UoM fields on the right.

When you are done, press Generate Rules. A new section opens on the same page with the changes mapped out for your approval. If everything looks good and you wish to proceed, press Apply Changes at the bottom of the page. Your points are now classified and your model is DONE!
