ML Single Model Evaluator
This unit is a Source unit type.
The ML Single Model Evaluator provides the name and parameters of Machine Learning models that have been updated. If your domain contains more than one model, you will need to create a separate Single Model Evaluator for each one, since each unit matches a single model name.
This unit is linked to the ML Update Notifier unit to obtain data on updates made to Machine Learning models in the domain.
An event enters via the model port to signal to the unit that an update has been detected.
Then, the unit receives information on the fields that have been updated via the data port.
Detected updates are emitted as events via the out port.
If the fields checked for updates do not match those configured in the unit linked to the ML Single Model Evaluator, the events exit via the discarded port.
If an error occurs, the input event is enriched with new fields describing the problem, and the event is sent through the error port.
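The routing described above can be sketched as follows. This is an illustrative model only, not the unit's actual implementation; the event structure, field names (`model`, `updated_fields`), and the error-field name are assumptions for the sake of the example.

```python
# Illustrative sketch of how the ML Single Model Evaluator routes events.
# All field names and the event shape are hypothetical.

def evaluate(event, configured_model, configured_fields, result_field,
             overwrite=False):
    """Return (port, event): port is 'out', 'discarded', or 'error'."""
    try:
        # Events for a different model do not match this unit's Model Name.
        if event.get("model") != configured_model:
            return "discarded", event

        # Compare the updated fields against those configured upstream.
        matched = set(event.get("updated_fields", [])) & set(configured_fields)
        if not matched:
            return "discarded", event

        # Respect the Overwrite toggle: leave an existing field untouched
        # when Overwrite is OFF.
        if result_field in event and not overwrite:
            return "out", event

        enriched = dict(event)
        enriched[result_field] = sorted(matched)
        return "out", enriched
    except Exception as exc:
        # On failure, enrich the input event with error information.
        enriched = dict(event)
        enriched["error_message"] = str(exc)
        return "error", enriched
```

A matching event leaves via `out` enriched with the configured result field; a non-matching one leaves unchanged via `discarded`.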
Configuration
After dragging this unit into the Flow canvas, double-click it to access its configuration options. The following table describes the configuration options of this unit:
Tab | Field | Description |
---|---|---|
General | Name | Enter a name for the unit. It must start with a letter and cannot contain spaces. Only letters, numbers, and underscores are allowed. |
 | Description | Enter a description for the unit. |
 | Model Name | Enter the name of the Machine Learning model whose updates you wish to receive. This name must exactly match the name of the model. Use a separate Single Model Evaluator unit for each individual model. |
 | Out result field | Enter the name of the output event field that will contain the results. |
 | Overwrite | Toggle ON to overwrite an input field if one already exists with the same name. |
Input ports
Port | Description |
---|---|
data | Events providing the parameters that have been updated. |
model | Events signalling that a model has been updated. |
Output ports
Port | Description |
---|---|
out | Outputs events for which updates were successfully detected. |
error | Outputs events that contain errors, enriched with standard error fields. |
discarded | Outputs events whose updated fields did not match those configured. |
Example
Imagine you have an Iris Petal machine learning model in your domain and you wish to know when updates have been carried out, and the exact fields that have been changed.
You can use the ML Single Model Evaluator to receive real-time information on all updates to that specific model.
For this, add a Tick unit to your Flow to configure the fields you wish to view changes on. Then, add an ML Single Model Evaluator to the canvas and link its data port to the out port of the Tick unit. Finally, add an ML Update Notifier unit to the canvas and link its data port to the model port of the ML Single Model Evaluator unit.
In the Tick unit properties, go to the Fields tab and click the + icon to configure the fields of the model to check. The names and types must exactly match those of the ML model. In this Iris Petal example, configure the following 4 fields:
In the ML Single Model Evaluator unit, enter the name of the Model to check for updates, as well as the name for the output event.
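To make the result concrete, here is a hypothetical sketch of the event the evaluator might emit for this example. The model name, the field names, and the result field `update_result` are all assumptions for illustration, not actual output of the product.

```python
# Hypothetical input event signalling an update to the Iris Petal model.
# Field names here are assumed for illustration only.
input_event = {
    "model": "iris_petal",
    "updated_fields": ["petal_length", "petal_width"],
}

# With "Out result field" configured as "update_result", the unit would emit
# the input event enriched with that field:
output_event = dict(input_event)
output_event["update_result"] = list(input_event["updated_fields"])
```

If an event already carried an `update_result` field, the Overwrite toggle would decide whether it is replaced.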
You can try this flow by downloading this JSON file and uploading it to your domain using the Import option.