Key words: SIMBOT, real-time simulation, data centre, DPP, digital product passport, BDTA

Pablo Vicente-Legazpi, BDTA, Antwerpen, Belgium, p.legazpi@buildingdigitaltwin.org
Paula de Miguel, BDTA, Antwerpen, Belgium
Robert Birke, UNITO, Torino, Italy
Miguel Lopez, ECOSIMPRO, Madrid, Spain


Between 2018 and 2022, the newly established Building Digital Twin Association (BDTA) proposed an ambitious and innovative approach during the SPHERE project[1] (GA No. 820805): transferring technologies originally developed for life-support systems in the aerospace field to everyday applications such as thermal comfort control and ventilation. SPHERE was one of the first European H2020 projects on digital twins in the construction sector. The challenge was to make these technologies affordable, fast and reliable, so that they could truly revolutionise conventional building systems design and operation.

Subsequent projects, Hycool-IT (2023–2026, GA No. 101138623) and DYMAN (2024–2027, GA No. 101161930), have seen this vision evolve towards the definition of real-time simulation standards. These standards aim to ensure that all developers can use the same functional data from equipment published in future Digital Product Passports (DPPs), and that the methodology for model creation is standardised and controlled while allowing each developer to offer their own software platforms and support services.

Currently, this standardisation process is at an early stage and must be agreed upon by all key stakeholders, primarily equipment manufacturers, simulation software developers and engineering companies. In future, these standardisation efforts will be channelled through CEN/TC 442 WG9 'Digital Twins in the Built Environment'.

Why real-time mathematical simulation?

In complex life-support systems, such as those used at the International Space Station (ISS), human-environment interaction and air purification are analysed in extreme detail. This results in highly interconnected systems involving multi-physical phenomena, all of which are described mathematically through large sets of equations. Solving such a system provides insight into the dynamic behaviour of the overall process.

Figure 1. Environmental Control and Life Support Systems (ECLSS) at ISS. (Source: NASA free images: https://images.nasa.gov/ / https://www.youtube.com/watch?v=iht75kq0RrU)

If the dynamics are relatively slow, as in the case of a building's thermal response, the solution of the system of equations can be obtained in real time, meaning that each calculation step is completed faster than the actual elapsed time of that step. This facilitates real-time mathematical simulation running in parallel with monitoring, providing highly valuable information for supervision and control purposes. It facilitates intelligent supervision in unattended environments or assists operators who are not HVAC specialists.

Mathematical simulation also allows engineers to evaluate system performance under abnormal conditions, such as during start-up, or to perform energy and performance assessments not only at building level but also for individual pieces of equipment, such as compressors or control elements. These capabilities are especially valuable during the design and commissioning phases, where a clear and reliable process reference is essential. This approach makes commissioning more robust and verifiable, helping to demonstrate compliance with design requirements.

Model validation is immediate, since both sensor data and simulated variables can be observed simultaneously. Simulation outputs are typically more accurate and less influenced by calibration drift or measurement noise. However, it is crucial to consider sampling delays and the dynamic offsets they introduce: inputs sampled at one step only affect the simulated outputs at the following step, leading to discrepancies that require proper understanding and modelling.

Figure 2. Monitored and simulated values of the air supply in a mechanical ventilation machine. Delay in the simulated signal is due to sampling, but agreement with monitoring is excellent. (Source: ECOSIMPRO team, validation in SPHERE Project)
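The sampling delay described above can be illustrated with a minimal sketch (not project code; the first-order thermal model and its time constants are arbitrary examples). Because the input measured at one step is only applied during the next step, the simulated output lags the monitored signal by one sampling interval, exactly the kind of offset visible in Figure 2.

```python
def simulate_with_sampling_delay(measured_inputs, tau=300.0, dt=60.0, y0=20.0):
    """First-order response y' = (u - y)/tau, explicit Euler.

    The input sampled at step k is only applied during step k+1,
    reproducing the one-sample delay seen in real-time simulation.
    """
    y = y0
    outputs = []
    u_prev = measured_inputs[0]          # input held from the previous sample
    for u in measured_inputs:
        y = y + dt * (u_prev - y) / tau  # step uses the previously sampled input
        outputs.append(y)
        u_prev = u                       # new sample only affects the next step
    return outputs
```

With a step change in the measured input, the simulated output only starts to react one sampling interval later; this is the dynamic offset that must be understood rather than mistaken for model error.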

Mathematical simulations can be applied not only during the design and commissioning phases, but throughout the entire building lifecycle, including the operation phase. The models used for equipment selection and sizing during the design phase can be reused for commissioning and operational monitoring, ensuring conceptual continuity and traceability throughout the project.

However, the main challenges of mathematical simulation, particularly in real time, remain the cost and complexity of software systems, the time required for implementation and the lack of standardised methodologies that guarantee objective, reliable analysis.

The BDTA aims to address these challenges by reducing the cost and complexity of simulation models through the implementation of common standards.

The no-operator problem or the non-specialist supervisor

In the construction industry, it is common for control systems to be unmonitored, and in many cases poorly maintained as well. In other cases, the system may be installed in a home whose owner is not a technician. In any of these scenarios, intelligent edge technology that can detect and solve problems adds significant value. In the Hycool-IT and DYMAN projects, which aim to improve data centres, the on-site operators are IT engineers, not HVAC specialists, so they may not fully understand problems related to chillers and server cabinet cooling. With the right AI tools, however, such staff would be able to solve problems effectively.

Intrinsically complex control problems must also be considered, such as cooling/heating processes involving long time-outs, or the presence of many similar machines in a data centre where it is not possible to pay attention to the cooling of each CPU individually. The interaction between meteorology, chillers, piping systems, humidity, ventilation networks and emergency and redundancy systems makes control itself a sophisticated matter. An AI system at the edge can be instrumental in improving control quality and the installation's intrinsic safety.


Figure 3. Data centre supervision using EcoStruxure software, by Schneider Electric. (Source: POLIMI)

Virtual supervisor using AI

Supervision combining monitoring, mathematical simulation and reinforcement learning within an interoperable framework enables the development of AI agents capable of detecting measurement device failures, suboptimal performance or even malfunctions with non-sensorised causes. These agents can work at the edge, circumventing potential privacy or cybersecurity issues, and help supervisors understand the type of failure and how to react, even if they are not specialists. Rather than working inside a control system to increase its intelligence and capacity, the objective is to provide external supervision, potentially with a contractual commitment. The supervisor obtains an independent, intelligent overview of the installation's performance, completely independent of the manufacturer and based on standardisation that enables the use of competing software systems. The observation window used by the agent does not need to extend over many hours, which avoids the problem of storing long histories of sensor values. The agent therefore navigates along time much as an autonomous vehicle does: it recognises patterns and deviations and can report or take decisions when necessary.
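A minimal sketch of this bounded-memory supervision idea follows (the class, window length and threshold are hypothetical, not a project implementation): the agent keeps only a short rolling window of residuals between monitored and simulated values, so it can flag a sustained deviation without storing long sensor histories.

```python
from collections import deque

class EdgeSupervisor:
    """Edge agent comparing monitored and simulated values over a short window."""

    def __init__(self, window=10, threshold=2.0):
        self.residuals = deque(maxlen=window)  # bounded memory: old samples drop out
        self.threshold = threshold

    def observe(self, measured, simulated):
        """Record one residual; return True if a sustained deviation is detected."""
        self.residuals.append(abs(measured - simulated))
        # Flag a fault only when the mean deviation over the full window
        # exceeds the threshold, filtering out isolated noise spikes.
        window_full = len(self.residuals) == self.residuals.maxlen
        mean_dev = sum(self.residuals) / len(self.residuals)
        return window_full and mean_dev > self.threshold
```

The bounded `deque` is what lets the agent "navigate along time": it reacts to recent patterns without ever accumulating a database of raw sensor values.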

How does the simulation work in real time? From deck to SCADA[2]

A simulation model is built from pre-designed components that have connection ports. Once the model is completed and compiled, we have a potential mathematical calculation engine. However, there are usually more unknowns than equations, which forces us to introduce boundary conditions to make the system of equations solvable. In more specialised terms, it is necessary to define the mathematical partition, that is, the way in which we want to solve the system, and to set some independent parameters that act as boundary conditions.

Contrary to popular belief, this process of "mathematical refinement" – not the initial graphical modelling – is the key to arriving at robust models that integrate rapidly. With the model compiled, the partition defined, and additional values such as boundary conditions and configuration parameters set, the simulation can be carried out via integration. The integration advances in time step by step, with the system of equations solved implicitly and iteratively at each step. Once the residual at a step falls below a threshold, the integration moves on to the next step. This can continue indefinitely, or inputs and outputs can be exchanged at each step to interact with the integration, generating a "synthetic" real-time process.
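The step-by-step implicit integration described above can be sketched as follows (an illustrative toy, not the solver used in any of the projects): each implicit Euler step for y' = f(y) is solved by fixed-point iteration until the residual drops below a threshold, and only then does the integrator move on to the next step.

```python
def implicit_euler(f, y0, dt, n_steps, tol=1e-10, max_iter=100):
    """Implicit Euler for y' = f(y), solving each step iteratively."""
    y = y0
    trajectory = [y]
    for _ in range(n_steps):
        y_next = y                              # initial guess for the new state
        for _ in range(max_iter):
            residual = y_next - y - dt * f(y_next)
            if abs(residual) < tol:             # converged: accept the step
                break
            y_next = y + dt * f(y_next)         # fixed-point update
        y = y_next
        trajectory.append(y)
    return trajectory

# Example: exponential cooling y' = -k (y - y_ambient)
k, y_amb = 0.5, 20.0
traj = implicit_euler(lambda y: -k * (y - y_amb), y0=80.0, dt=0.1, n_steps=100)
```

The inner loop is the "residual below a threshold" test from the text; swapping the fixed inputs for values exchanged at each step is what turns this batch integration into a synthetic real-time process.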

Simulation programs usually provide tools to generate a simplified interface for interacting with the simulation, called a deck [3]. A deck is an interface where the user enters values, sees results and can interact with the simulation process; it is a box encapsulating the original model. For real time, a specific deck can be generated that communicates through the standard OPC UA [4] protocol. Through this protocol, SCADA systems can read and write values in the simulation as if it were a control element, for example a PLC.

Figure 4. Real-time deck running and OPC UA supervision and interaction using UaExpert. (Source: BDTA's deck of a small data centre prototype)

Figure 4 shows a deck running continuously; on the right, the OPC UA inputs and outputs are viewed with the UaExpert software. It is possible to interact with the simulator by changing variable values and observing the response to the changes.
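Conceptually, the read/write contract a SCADA system sees can be sketched with a plain stub (all names and the toy heating-coil model are hypothetical; a real deployment would publish these variables as OPC UA nodes through an OPC UA server library rather than a Python dictionary):

```python
class SimulationDeck:
    """Black box encapsulating a model, exposing named inputs and outputs."""

    def __init__(self, inputs, step_fn):
        self.inputs = dict(inputs)      # writable boundary conditions
        self.outputs = {}               # read-only simulation results
        self._step_fn = step_fn

    def write(self, name, value):
        """SCADA writes a boundary condition, as it would to a PLC register."""
        if name not in self.inputs:
            raise KeyError(f"unknown input: {name}")
        self.inputs[name] = value

    def step(self):
        """Advance the encapsulated model by one integration step."""
        self.outputs = self._step_fn(self.inputs)

    def read(self, name):
        """SCADA reads a simulated variable."""
        return self.outputs[name]

# Toy model: supply air temperature after a heating coil (hypothetical names)
deck = SimulationDeck(
    inputs={"T_in": 15.0, "coil_power_kW": 2.0},
    step_fn=lambda u: {"T_supply": u["T_in"] + 1.5 * u["coil_power_kW"]},
)
deck.step()
```

The key design point is that the SCADA side never sees the model's equations, only named variables it can read and write, which is exactly what makes the deck interchangeable with a physical control element.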

An important condition for achieving real-time simulation is that each calculation step must execute in less time than the monitoring sampling interval. This interval may range from 1 to 15 minutes, so the calculation should complete in about 30 seconds (or within that order of magnitude). If the simulation components are designed correctly, such calculation times are achievable even in large models.

The SIMBOT concept

To turn simulation into an economical process, it is necessary to greatly reduce the deployment times of the models and to control how components are defined. Mathematical simulation programs currently use extensive libraries of components to help in this process. These libraries usually contain parameterised components; the problem is knowing which parameters to use and how. A large number of parameters adds considerable complexity, increasing the likelihood of errors or of long execution times (precluding real time) due to poor modelling. This is the current state of the art.

Figure 5. Parametric component of a pump, state of the art. (Source: ECOSIMPRO HVAC library)

The proposed improvement consists of greatly simplifying the libraries, with components closely linked to market models and with few or no parameters. An example can be seen in Figure 6, showing a ceiling fan from the company S&P (model CTB4). The only two parameters are the air mixture and the power factor of the motor driving the fan.

Figure 6. Component (SIMBOT) of a market model with practically no parameters. (Source: ECOSIMPRO HVAC CATALOGUE library)

Specific library components can be programmed easily if abstract components are available, that is, components acting as templates into which the user introduces a series of specific properties (such as performance curves), but which already include much of the code needed for the final component.

This could also be achieved by connecting the functional content of the digital product passport (DPP) with part of the simulation component. We are currently pursuing this line of work, seeking consensus with developers, who must complete the functional information (performance curves) with information specific to their programming environment. An example of what is proposed can be seen in Figure 7.

Figure 7. Example of a component (SIMBOT) for a TROX_DRV_ECS_500H fan, with the lines defining the performance curves. (Source: ECOSIMPRO HVAC CATALOGUE library)

Figure 7 shows the different parts of a specific simulation component, or, as we have named it within the BDTA, a SIMBOT. The part of the code that is interchangeable with information from a DPP is indicated in a box. All similar components (all fans of a similar series) would share the same code, varying only the performance curves. This opens the door to the creation of extensive and open simulation libraries for each developer and software environment, without the equipment manufacturer having to do anything more than share its DPP. In a simple and easy-to-implement way, all software platforms would share functional market models, improving libraries and model implementation times.
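The SIMBOT idea can be sketched in a few lines (all names and curve data below are hypothetical examples, not actual DPP content): an abstract fan template carries the generic code, and the only part taken from a Digital Product Passport is the performance curve, so every product in a series yields a working component by swapping in its own curve points.

```python
class AbstractFan:
    """Generic fan template, parameterised only by DPP performance-curve data."""

    def __init__(self, dpp_curve):
        # dpp_curve: list of (flow_m3h, pressure_Pa) points from the DPP
        self.curve = sorted(dpp_curve)

    def pressure_at(self, flow):
        """Linear interpolation on the published performance curve."""
        pts = self.curve
        if flow <= pts[0][0]:
            return pts[0][1]
        for (q0, p0), (q1, p1) in zip(pts, pts[1:]):
            if flow <= q1:
                return p0 + (p1 - p0) * (flow - q0) / (q1 - q0)
        return pts[-1][1]

# Hypothetical DPP payload for one fan model of a series
dpp = {"model": "EXAMPLE_FAN_500", "curve": [(0, 400), (1000, 300), (2000, 100)]}
fan = AbstractFan(dpp["curve"])
```

Every fan of the series reuses `AbstractFan` unchanged; publishing a new model then amounts to publishing a new `curve` list in its DPP, which is the "interchangeable box" highlighted in Figure 7.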

Standards in simulation: CEN/TC 442 WG12 (DPP) and WG9 (digital twins)

CEN/TC 442 WG12 is currently developing the standards that will govern digital product passports in the future. Product information will be semantic, structured and machine-readable, which requires that the functional representation of these products (the SIMBOTs) also be represented semantically. In parallel, CEN/TC 442 WG9 (digital twins) is beginning a phase of concept expansion, now that the basic definitions of the digital twin environment in construction have been approved.

Alignment of objectives between the two standardisation groups could be very fruitful: within a few years we could see SIMBOTs integrated easily and flexibly into design, commissioning and operation processes. It is at least worth trying, as it would mean a revolution in engineering methods, bringing the tools used today in aerospace life-support environments closer to everyday designs.



[2] SCADA stands for Supervisory Control and Data Acquisition, which is a system of software and hardware used to monitor and control industrial processes in real time. It collects data from remote equipment, displays it on a central dashboard, and allows operators to remotely control machinery to optimize efficiency and safety.

[3] A "deck" is a simplified, encapsulated interface to a compiled simulation model, through which the user sets inputs and reads outputs without accessing the full model.

[4] OPC UA (Open Platform Communications Unified Architecture) is an industrial communication standard that enables different software and hardware in industrial automation to exchange data. The protocol uses a client-server model and is crucial for Industry 4.0 and IIoT applications, providing secure and standardised data exchange between devices from various manufacturers.

Pages 19–23
