For the list of all supported Decision Insight downloads and releases, see the Downloads page.

How to build a Flow monitoring application with Decision Insight

This document is a best-practices guide for building applications with Decision Insight.


Setting up the basics

New application

Create an application, giving it a name that matches the monitoring context.

Select the creation time of the application: it should be the earliest date at which you expect to inject data into the system.
The creation time of the application is used as the Default applicability date for the application, which is the starting point for data capture, analysis and dashboards.


Spaces

Spaces are used to organize what you create and provide the foundation for modularity. A good structure makes an application easier to manage and evolve.

If you plan for multiple applications, a good practice is to prefix all names with a codename for the application, for example ABC.

As a starting point, the following set of spaces is recommended:

  • ABC Classifiers: Definition of Classifiers.
  • ABC Instances: Instances of configuration entities that can be imported/exported along with the application. Thresholds configuration is also usually stored in this space.
  • ABC Initializer: Integration routes, mappings, data, etc. for initializing the application. This space might be optional when deploying an application into production, except for maintenance of configuration & reference data.
  • ABC Integration: Integration routes and mappings for the actual data sources.
  • ABC Model: Definition of the Entities, Attributes, Relations and Keys.
  • ABC Rhythms: Definition of the Rhythms.

For reference, this configuration leads to the space dependencies illustrated in the space-dependency diagram.


Rhythms

Identify the required & applicable rhythms from a functional standpoint.

  • Create rhythms in the Rhythms space.
  • For a typical intraday monitoring application, the applicable Rhythms are:
      • 10 minutes
      • 1 hour
      • 1 day
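
Rhythms define the recurring instants at which the application computes and displays indicators. As a purely illustrative sketch (this is not the product's rhythm engine), a 10-minute or 1-hour rhythm can be thought of as flooring timestamps to period boundaries:

```python
from datetime import datetime

def align_to_rhythm(ts: datetime, minutes: int) -> datetime:
    """Floor a timestamp to the start of its rhythm period (illustrative only)."""
    bucket = (ts.minute // minutes) * minutes if minutes < 60 else 0
    return ts.replace(minute=bucket, second=0, microsecond=0)

ts = datetime(2024, 3, 5, 14, 37, 12)
print(align_to_rhythm(ts, 10))  # 2024-03-05 14:30:00
print(align_to_rhythm(ts, 60))  # 2024-03-05 14:00:00
```

A transaction acquired at 14:37 is thus accounted for in the 14:30 bucket of the 10-minute rhythm and the 14:00 bucket of the 1-hour rhythm.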


Entity Model

Define the Entity Model based on the functional analysis of the application to be built.

This is one of the most critical and structural parts of building the application, as the Entity Model is the cornerstone of Data Integration, Indicators and Dashboards configuration.

Some configuration best practices include:


  • Create all entities, attributes, relations and keys in the Model space.

Relations cardinalities

  • 1-n relations are always configured from the entity with the 'n' cardinality.
    • e.g. If orders have a relation to a single customer (a customer potentially having multiple concurrent orders), the relation should be defined on the "Order" entity targeting the "Customer" entity.
  • n-m relations should be configured from the entity with the highest probable cardinality.
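The "relation on the 'n' side" rule can be pictured with a plain object model. The sketch below is illustrative Python, not Decision Insight configuration; the Order / Customer names come from the example above:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Customer:
    key: str
    name: str

@dataclass
class Order:
    key: str
    amount: float
    customer: Customer  # the 1-n relation lives on the 'n' side (Order)

alice = Customer("CUST-1", "Alice")
orders = [Order("ORD-1", 120.0, alice), Order("ORD-2", 80.0, alice)]

# Navigating n -> 1 is a direct field access; 1 -> n is a query.
orders_of_alice = [o for o in orders if o.customer is alice]
print(len(orders_of_alice))  # 2
```

Defining the relation on the Order entity means a customer never has to be updated when its set of orders changes; only the order itself carries the link.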

Temporal attributes

  • Do not define temporal attributes for Creation / Deletion / Status Change of entities.
    • Creation and Deletion are native systematic attributes of Decision Insight's model. Status change should be modeled through relation value changes.
  • Valid use cases for temporal attributes are additional pieces of information that represent functional deadlines, value dates, expected processing times, etc.

Entities type

  • Carefully choose the Entity storage type. Basic rules for selecting the correct type are:
    • Configuration for entities whose instances are part of the configuration of the application and as such need to be part of the export, or for a globally stable, typically configuration-related entity (for example: step, channel, stage, category, process).
    • Transaction for an entity representing appearing and disappearing items (for example: transaction, cutoff, period, batch).

"Global" entity

  • Plan for a "Global" entity, a singleton entity used for aggregations and analysis on the overall activity.
    • All other entities of the model that need to be aggregated should be linked to the Global entity, directly or indirectly.

Data Integration

Build the data integration for both the Initializer (resources and routes for configuration - a.k.a. static - data) and the Integration (transaction data) spaces.


Prepare CSV files for Configuration data

Create specific routes and mappings for feeding the configuration entities.

  • Ensure that these instances are associated with the "Instances" space, in order to be properly included in the application export file.
    • Do not forget the "setSpace" operation in your mappings!
  • Ensure that instances of entities linked to the Global entity have a proper link to the singleton instance.
    • You can use a hardcoded key in your mappings, using a constant named Global.
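
As a sketch of what such a configuration file might look like, the snippet below builds and reads a hypothetical CSV for a "Step" configuration entity; the column names are illustrative and must be adapted to your own model. Every row carries the constant key "Global" so the mapping can link each instance to the Global singleton:

```python
import csv
import io

# Hypothetical configuration CSV (column names are assumptions, not product schema).
CONFIG_CSV = """\
stepKey,stepName,global
S1,Acquisition,Global
S2,Validation,Global
S3,Settlement,Global
"""

rows = list(csv.DictReader(io.StringIO(CONFIG_CSV)))
for row in rows:
    # In the real mapping this is where the setSpace operation and the
    # resolution of the Global relation would happen; here we only check
    # that every row carries the constant key.
    assert row["global"] == "Global"

print([r["stepName"] for r in rows])  # ['Acquisition', 'Validation', 'Settlement']
```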

Using the proper creation time for configuration instances

For first initialization, create configuration entities at application start.

For subsequent changes & updates to the configuration entities, additional attention must be paid to the time at which each change needs to be applied:

  • Adding a new instance
  • Removing an instance
  • Updating an instance

Managing thresholds for configuration instances

It is also a best practice to include thresholds in the initialization data, routes and mappings, by adding them as additional columns of the configuration files.

This gives the customer an easy way to edit thresholds manually, for example in Excel, and to inject the updated values into the application as needed.
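
For illustration, the hypothetical file below (column names are assumptions) carries two threshold values as extra columns next to the configuration attributes, so a business user can maintain them in a spreadsheet and re-inject the file when values change:

```python
import csv
import io

# Hypothetical initialization file: thresholds ride along as extra columns.
STEPS_CSV = """\
stepKey,stepName,warningThreshold,criticalThreshold
S1,Acquisition,100,250
S2,Validation,50,120
"""

thresholds = {
    row["stepKey"]: (int(row["warningThreshold"]), int(row["criticalThreshold"]))
    for row in csv.DictReader(io.StringIO(STEPS_CSV))
}
print(thresholds["S2"])  # (50, 120)
```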

Transaction data

Build and configure the routes, resources, connectors and mapping for integrating the Transaction data, depending on your specific context (flat files, message queues, web services, databases, etc.).

  • Take care of the times used for resolution and operations in mappings.
  • Make sure to handle the termination of transactions, in order to avoid 'polluting' the system with transactions that remain live forever.
    • Terminated transactions cannot be reopened, so be careful when identifying the 'final' statuses that should lead to termination.
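
One way to keep that decision explicit is a whitelist of final statuses; the status names below are illustrative placeholders, not values from the product:

```python
# Because termination is irreversible, this set must contain only statuses
# from which a transaction can never move again (adapt to your own flow).
FINAL_STATUSES = {"COMPLETED", "CANCELLED", "REJECTED"}

def should_terminate(status: str) -> bool:
    """Return True when the incoming status should terminate the transaction."""
    return status in FINAL_STATUSES

print(should_terminate("COMPLETED"))       # True
print(should_terminate("PENDING_REPAIR"))  # False
```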

Add a history log for transactions, using a specific "Log" entity. This log conveniently displays the evolution of a transaction over time, including status changes, users in charge, etc.
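
Conceptually, the "Log" entity holds one immutable record per transaction event. The sketch below is an illustrative data shape only (field and event names are assumptions), not the Decision Insight entity definition:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class LogEntry:
    transaction_key: str
    timestamp: datetime
    event: str
    user: str

log = [
    LogEntry("TX-1", datetime(2024, 3, 5, 9, 0), "Status -> RECEIVED", "system"),
    LogEntry("TX-1", datetime(2024, 3, 5, 9, 12), "Status -> VALIDATED", "jdoe"),
]

# The detail view lists the full history of one transaction in time order.
history = sorted((e for e in log if e.transaction_key == "TX-1"),
                 key=lambda e: e.timestamp)
print([e.event for e in history])
```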

First set of views

Build a set of views to allow displaying / searching the live transactions list and accessing the details of a specific transaction.

  • Create a SEARCH view
    • Display transaction entities live @ instant, ordered by age descending
    • Add interesting columns in the context
    • Add filters
  • Create a TRANSACTION DETAIL view
    • Display all attributes related to a transaction, including interesting time information (creation, completion, etc.)
    • Configure a drill-down from the SEARCH view(s) to the TRANSACTION DETAIL view
    • Take care of the proper time configuration for all displayed attributes, in order for the dashboard to support displaying both Live and Completed transactions
      • Usually, the proper time configuration is to correlate with Transaction and retrieve the attribute when the instance of Transaction is last available [...].
    • Include a view on the history log of the transaction.

Depending on the context, you might also need to create a SEARCH FOR ACQUIRED TRANSACTIONS and/or SEARCH FOR COMPLETED TRANSACTIONS view, displaying entities that were acquired / completed during the lookup interval.

  • Data Grid on Current Day interval (or whichever interval makes sense: last 2 days, last 7 days, etc.):
    • Filter transaction entities using a time configuration on the Time Range interval, using Begin During (acquired) or End During (completed)
    • Add interesting columns in the context - be careful to set the correct time configuration for each attribute
    • Add filters - be careful to set the correct time configuration for each attribute

The two search views should point to the same TRANSACTION DETAIL dashboard.

Application Foundation

Build a set of dashboards and indicators to provide a first overview of available data and processing flow, as well as a foundation for building the application (set of basic indicators).


The application foundation focuses on three aspects of the processing flow: Acquisition, Processing and Completion.

It relies on a set of indicators and is built on the Rhythms:

Acquisition: count of transactions that entered the process:

  • Every 10 minutes, during the last 10 minutes.
  • Every 10 minutes, since the beginning of the day.
  • Every hour, during the last hour.
  • Every hour, since the beginning of the day.
  • Every day, at the end of the day.

In process: count of transactions currently in process

  • Every 10 minutes, at instant.

Completion: count of transactions that completed the process:

  • The same set of indicators as Acquisition, only with a different time configuration.

For Acquisition, In process and Completion, also configure a second set of the same indicators, substituting the "total amount of transactions" for the "count of transactions" as the basis for analysis.

It is usually useful to create this set of indicators on the [Global] entity, as well as on the [Step] entity (or its equivalent in your application design).
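
To make the time configurations concrete, the sketch below computes two of the acquisition indicators by hand (this is plain Python, not the product's indicator engine; the sample data is invented): the count and amount acquired during the last 10 minutes, and cumulatively since the beginning of the day.

```python
from datetime import datetime, timedelta

# Invented sample data: (acquisition time, amount) pairs for one day.
acquisitions = [
    (datetime(2024, 3, 5, 9, 2), 100.0),
    (datetime(2024, 3, 5, 9, 55), 40.0),
    (datetime(2024, 3, 5, 10, 4), 60.0),
]

now = datetime(2024, 3, 5, 10, 10)  # a 10-minute rhythm tick
day_start = now.replace(hour=0, minute=0, second=0, microsecond=0)
window = now - timedelta(minutes=10)

last_10_min = [a for t, a in acquisitions if window <= t < now]
since_day_start = [a for t, a in acquisitions if day_start <= t < now]

print(len(last_10_min), sum(last_10_min))          # count & amount, last 10 minutes
print(len(since_day_start), sum(since_day_start))  # count & amount, since day start
```

The same pattern, with a different window, yields the hourly and daily variants, and replacing `sum` of amounts with the count alone switches between the two bases of analysis.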


This is only a proposal for a basic set of views, and it remains subject to analysis and adaptation depending on the context.

The following are only example layouts for the views of the application foundation. They need to be adapted to the specific context of the observed activity.


INTRADAY OVERVIEW: GLOBAL

  • Display the profile of acquisition, in process and completion throughout the day, at a 10-minute interval. For acquisition and completion, include both the cumulative and non-cumulative sets of indicators. Include both sets of indicators, based on transaction count and on transaction total amount.
  • Enhance the graphs by adding, for comparison, the same intraday values timeshifted to -1 and -2 weeks (depending on the available data range).
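
The week-over-week comparison amounts to reading the same indicator series at earlier instants. The sketch below illustrates the idea with invented values (it is not the product's timeshift feature): the value at t is compared with the values at t - 7 days and t - 14 days.

```python
from datetime import datetime, timedelta

# Invented indicator values keyed by instant (same weekday, three weeks).
series = {
    datetime(2024, 3, 5, 10, 0): 42,
    datetime(2024, 2, 27, 10, 0): 38,
    datetime(2024, 2, 20, 10, 0): 35,
}

t = datetime(2024, 3, 5, 10, 0)
current = series[t]
minus_1w = series[t - timedelta(weeks=1)]
minus_2w = series[t - timedelta(weeks=2)]
print(current, minus_1w, minus_2w)  # 42 38 35
```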


  • Same set of indicators as the INTRADAY OVERVIEW: GLOBAL, but using Step as dimension. Because there are multiple steps, you can rely on a table with sparklines to overview the profiles, and then drill-down to the "INTRADAY OVERVIEW: STEP DETAILS" dashboard.


INTRADAY OVERVIEW: STEP DETAILS

  • Same set of indicators and same layout as the INTRADAY OVERVIEW: GLOBAL, but using Step as a dashboard parameter and as dimension.


  • Display the profile of acquisition, in process and completion by hour during the last week. Include both sets of indicators, based on transaction count and transaction total amount.


  • Display the profile of acquisition, in process and completion by day during the last 60 days. Include both sets of indicators, based on transaction count and transaction total amount.

Analytics Views

Based on the functional analysis, configure indicators and views.

  • Thresholds & Constants should be put in the Instances space
  • Indicators can either be put in the space of the view they belong to, or in a dedicated Analysis space in the application

It is also a good practice to add Dimension Details views to the application (e.g. "Customer Details", "Channel Details", "Message Type Details", ...) in order to drill-down / zoom in on a specific perspective.

Additional tips & best practices

Importing & Exporting (w/ multiple fragments)

In order to move the application between instances of the product, you should now be able to export it in two fragments:

  • Part 1, "Core": contains the rhythms, model, configuration instances & data integration.
  • Part 2, "Analysis": contains the views and indicators.

When importing on the target node, you can now follow these steps:

  1. Import the CORE fragment (containing the rhythms, the model, the configuration instances & data integration)
  2. Inject historical data (if any)
    1. Wait until the initial injection phase is over
    2. Potentially, flush the pending Availability Events by running the lla command in the node shell
  3. Import the ANALYSIS fragment
    1. This will trigger computation of the indicators from the start date of the application
