How to build a Flow monitoring application with Decision Insight

This document is a best practices guide for building applications with Decision Insight.


Setting up the basics

New application

Create an application and give it a name applicable to the monitoring context. For information about how to create an application, see The application.

Select the creation time of the application. The date should correspond to the earliest date at which you expect to inject data into the system.
The creation time of the application will be used to add an Initial application version, which is the starting point for data capture, analysis and dashboards.


Spaces

Spaces are used to organize what you create, and are the support for modularity. A good structure makes your application easier to manage and evolve.

As a starting point, the following set of spaces is automatically created whenever you create a new application:

  • Classifiers: Definition of classifier attributes. For more information about classifiers, see Classifier.
  • Instances: Instances of configuration entities that can be imported and exported along with the application. Threshold configuration is also usually stored in this space.
  • Initializer: Integration routes, mappings, data, etc. used to initialize the application. You may not need to provision this space when deploying an application into production, except for maintenance of configuration and reference data.
  • Integration: Integration routes and mappings for the actual data sources.
  • Model: Definition of the entities, attributes, relations, and keys.
  • Rhythms: Definition of the rhythms. For information about rhythms, see Rhythm.

For reference, this configuration leads to the following dependencies between the spaces:


Rhythms

Identify the required and applicable rhythms from a functional standpoint.

  • In Configuration > Rhythms, create rhythms in the Rhythms space.
  • For a typical intraday monitoring application, applicable rhythms are:
      • 10 minutes
      • 1 hour
      • 1 day


Model

Define the model of your application based on the functional analysis of the application to be built.

A model is the data model of your monitored process in Decision Insight. The model is the central component that describes what information is available to the application and includes the description of entities, their attributes and how they relate to each other. These relationships are the glue that binds an application together.

The model is the cornerstone of Data Integration, Indicators and Dashboards configuration.

Some configuration best practices include:

  • Create all entities, attributes, relations and keys in the Model space.
Relations cardinalities
  • When you create relation attributes, i.e. attributes that configure the relation between one entity and another, always configure one-to-many (1-n) relations from the entity on the 'n' side of the cardinality.
    • For example, if purchase orders have a relation to a single customer (a customer potentially having multiple concurrent orders), the relation should be defined on the Order entity targeting the Customer entity.
  • Many-to-many (n-m) relations should be configured from the entity with the highest probable cardinality.
Temporal attributes
  • Do not define temporal attributes to track the creation, deletion, or status change of entities.

    • Creation and Deletion are native systematic attributes of the DI model. 

    • To monitor the status change of an entity, we recommend monitoring changes to the value of relation attributes.

  • A valid use case for temporal attributes is when you require additional information that represents functional deadlines, value dates, expected processing, etc.

Entity type
  • Carefully choose the Entity storage type. Basic rules for selecting the correct type are:
    • Configuration: for entities whose instances are part of the configuration of the application and, as such, need to be part of the export, or for a globally stable, typically configuration-related entity (for example: step, channel, stage, category, process).
    • Transaction: for an entity representing appearing and disappearing items (for example: transaction, cutoff, period, batch).
Global entity
  • Plan for a Global entity in your model, a singleton entity used for aggregations and analysis on the overall application activity.
    • All other entities of the model that need to be aggregated should be linked to the Global entity, directly or indirectly.
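The relation-direction rule above (define 1-n relations on the 'n' side) can be sketched with plain data structures. This is an illustrative analogy only, with hypothetical Order/Customer names; in Decision Insight, relations are configured in the model, not in code:

```python
from dataclasses import dataclass

@dataclass
class Customer:
    # The '1' side of the relation carries no reference to its orders.
    name: str

@dataclass
class Order:
    # The 'n' side (Order) holds the relation to its single Customer,
    # mirroring a 1-n relation attribute defined on the Order entity.
    order_id: str
    customer: Customer

acme = Customer(name="ACME")
orders = [Order(order_id="PO-1", customer=acme),
          Order(order_id="PO-2", customer=acme)]
# Both orders point to the same customer; the customer does not list its orders.
```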

Data Integration

Build the data integration for both the Initializer space (resources and routes for configuration, a.k.a. static, data) and the Integration space (transaction data).


Configuration data

Prepare CSV files to upload configuration data.

Create specific routes and mappings for feeding the configuration entities:

  • Ensure that these instances are created in the Instances space, so that they can be properly included in the application export file.
    • Do not forget the setSpace operation in your mappings!
  • Ensure that instances of entities linked to the Global entity have a proper link to the singleton instance.
    • You can use a hardcoded key in your mappings, using a constant named Global.

Using the proper creation time for configuration instances

For first initialization, create configuration entities at application start.

For subsequent changes and updates to the configuration entities, be mindful of the times at which changes should apply when:

  • Adding a new instance
  • Removing an instance
  • Updating an instance

Managing thresholds for configuration instances

It is also a best practice to include thresholds for configuration instances in your initialization data, routes, and mappings, by adding them as additional columns.

This gives your users an easy way to edit thresholds manually, for example in Excel, and to inject the updated values into the application as needed.
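As an illustration, a configuration CSV for a hypothetical Step entity might look like the fragment below. The column names are assumptions for this example, not a Decision Insight requirement; note the Global key column linking each instance to the singleton, and the extra threshold columns that users can edit in Excel:

```csv
stepName,globalKey,warningThresholdMinutes,criticalThresholdMinutes
Validation,Global,10,30
Clearing,Global,20,60
Settlement,Global,30,90
```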

Transaction data

Build and configure the routes, resources, connectors, and mappings responsible for integrating the Transaction data, depending on your specific context (flat files, message queues, web services, databases, etc.).

  • Be mindful of the times for resolution and operations in mappings
  • Ensure you are handling the termination of transactions, in order to avoid 'polluting' the system with live data
    • Terminated transactions cannot be reopened, so be careful in identifying the final status that should lead to an instance termination.

Add a history log for transactions, using a specific Log entity. This history log will enable you to conveniently display the evolution of a transaction over time, including the status changes, users in charge, etc.
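The termination and history-log ideas above can be sketched in plain Python. The status names and data structures are hypothetical; in Decision Insight this logic lives in your mappings:

```python
# Statuses considered final: once reached, the transaction is terminated.
# Termination is irreversible, so choose this set carefully.
FINAL_STATUSES = {"COMPLETED", "REJECTED", "CANCELLED"}

def process_status_event(transaction, history_log, event):
    """Append each status change to the history log, and terminate
    the transaction only when a final status is reached."""
    history_log.append({
        "transaction_id": transaction["id"],
        "time": event["time"],
        "status": event["status"],
        "user": event.get("user"),  # user in charge, if present
    })
    transaction["status"] = event["status"]
    if event["status"] in FINAL_STATUSES:
        transaction["terminated"] = True  # cannot be reopened afterwards
    return transaction
```

Each event both updates the transaction and leaves a row in the Log entity, which is what makes the history view on the TRANSACTION DETAIL dashboard possible.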

First set of dashboards

Build a set of dashboards that display the list of live transactions and let users search for and access the details of a specific transaction.

  • Create a SEARCH FOR LIVE TRANSACTIONS dashboard.
    • Display transaction entities live at the instant, ordered by age descending.
      • Add interesting columns in the context.
      • Add filters.
  • Create a TRANSACTION DETAIL view.
    • Display all attributes related to a transaction, including interesting time information (creation, completion, etc.).
    • Configure a drill-down from the SEARCH view(s) to the TRANSACTION DETAIL view.
    • Take care of proper time configuration for all displayed attributes, so that the dashboard supports displaying both Live and Completed transactions.
      • Usually, the proper time configuration is to correlate with Transaction and retrieve attribute when the instance of Transaction is last available [...].
    • Include a view on the history log of the transaction.

Depending on the context, you might also need to create a SEARCH FOR ACQUIRED TRANSACTIONS and/or SEARCH FOR COMPLETED TRANSACTIONS view, displaying entities that were acquired / completed during the lookup interval.

  • Data Grid on the Current Day interval (or whichever interval makes sense: last 2 days, last 7 days, etc.):
    • Filter transaction entities using a time configuration on the Time Range interval, using Begin During (acquired) or End During (completed).
    • Add interesting columns in the context; be careful to set the correct time configuration for each attribute.
    • Add filters; be careful to set the correct time configuration for each attribute.

The two search views should point to the same TRANSACTION DETAIL dashboard.

Application Foundation

Build a set of dashboards and indicators to provide a first overview of available data and processing flow, as well as a foundation for building the application (set of basic indicators).


The application foundation focuses on three aspects of the processing flow: Acquisition, Processing and Completion.

It relies on a set of indicators and is built on the rhythms defined earlier:

Acquisition: count of transactions that entered the process:

  • Every 10 minutes, during the last 10 minutes.
  • Every 10 minutes, since the beginning of the day.
  • Every hour, during the last hour.
  • Every hour, since the beginning of the day.
  • Every day, at the end of the day.

In process: count of transactions currently in process

  • Every 10 minutes, at instant.

Completion: count of transactions that completed the process:

  • The same set of indicators as acquisition, only with a different time configuration.

For Acquisition, In process and Completion, also configure a second set of the same indicators but substituting the "count of transactions" with the "total amount of transactions" as the basis for analysis.

It is usually useful to create this set of indicators on the [Global] entity, as well as on the [Step] entity (or its equivalent in your application design).
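The acquisition, in-process, and completion windows above boil down to counting transactions whose timestamps fall in a given interval. A minimal sketch of these semantics (plain datetime logic, outside Decision Insight; field names are assumptions):

```python
from datetime import datetime, timedelta

def count_acquired(transactions, now, window):
    """Count transactions acquired during [now - window, now]
    (the 'during the last 10 minutes' / 'last hour' style indicators)."""
    return sum(1 for t in transactions
               if now - window <= t["acquired_at"] <= now)

def count_acquired_since_day_start(transactions, now):
    """Count transactions acquired since the beginning of the day
    (the cumulative, 'since the beginning of the day' indicators)."""
    day_start = now.replace(hour=0, minute=0, second=0, microsecond=0)
    return sum(1 for t in transactions
               if day_start <= t["acquired_at"] <= now)

def count_in_process(transactions, now):
    """Count transactions live 'at instant': already acquired, not yet completed."""
    return sum(1 for t in transactions
               if t["acquired_at"] <= now
               and (t.get("completed_at") is None or t["completed_at"] > now))
```

Swapping the count for a sum over a transaction amount field gives the "total amount" variants; the completion indicators use the same windows on the completion timestamp instead of the acquisition timestamp.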


This is only a proposal for a basic set of views, and remains subject to analysis and adaptation depending on the context.

The following are only example layouts for the views of the application foundation. They need to be adapted to the specific context of the observed activity.

  • Display the profile of acquisition, in process, and completion throughout the day, at a 10-minute interval. For acquisition and completion, include both the cumulative and non-cumulative sets of indicators. Include both sets of indicators, based on transaction count and transaction total amount.
  • Enhance the graphs by adding for comparison the same intraday values, timeshifted to -1 and -2 weeks (depending on the available data range).
  • Same set of indicators as the INTRADAY OVERVIEW: GLOBAL, but using Step as a dimension. Because there are multiple steps, you can rely on a table with sparklines to get an overview of the profiles, and then drill down to the "INTRADAY OVERVIEW: STEP DETAILS" dashboard.
  • Same set of indicators and same layout as the INTRADAY OVERVIEW: GLOBAL, but using Step as a dashboard parameter and as a dimension.
  • Display the profile of acquisition, in process, and completion by hour during the last week. Include both sets of indicators, based on transaction count and transaction total amount.
  • Display the profile of acquisition, in process, and completion by day during the last 60 days. Include both sets of indicators, based on transaction count and transaction total amount.
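The week-over-week comparison above amounts to pairing each intraday slot with the same slot 7 and 14 days earlier. A minimal sketch of that re-alignment (illustrative only; Decision Insight handles timeshifting in the view configuration):

```python
from datetime import datetime, timedelta

def timeshifted(series, shift_days):
    """Re-label a {slot_start: value} series so that values from
    `shift_days` ago line up with today's slots for overlay comparison."""
    delta = timedelta(days=shift_days)
    return {slot + delta: value for slot, value in series.items()}

# A single 10-minute slot observed one week ago...
today_slot = datetime(2024, 1, 15, 9, 0)
last_week = {today_slot - timedelta(days=7): 42}
# ...re-aligned onto today's axis, so both curves share the same x positions:
aligned = timeshifted(last_week, 7)
```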

Analytics Views

Based on the functional analysis, configure indicators and views.

  • Thresholds and constants should be put in the Instances space.
  • Indicators can be put either in the space of the view they belong to, or in a dedicated Analysis space in the application.

It is also a good practice to add Dimension Details views to the application (e.g. "Customer Details", "Channel Details", "Message Type Details", ...) in order to drill down / zoom in on a specific perspective.

Additional tips & best practices

Importing and exporting (with multiple fragments)

In order to move the application between instances of the product, you should now be able to export it in two fragments:

  • Part 1, "Core": contains the rhythms, model, configuration instances, and data integration.
  • Part 2, "Analysis": contains the views and indicators.

When importing on the target node, you can now follow these steps:

  1. Import the CORE fragment (containing the rhythms, the model, the configuration instances & data integration)
  2. Inject historical data (if any)
    1. Wait until the initial injection phase is over
    2. Potentially, flush the pending Availability Events by running the lla command in the node shell
  3. Import the ANALYSIS fragment
    1. This will trigger computation of the indicators from the start date of the application

Create a new application version

Once your application is ready to be delivered on your production environment, create a new version of your application from the Applications screen.

