Configure interactions

The AI Suite products interact with different products or components to govern and operate flows of data for operations such as transformation, data correction, data entry, audit trail, reconciliation, report generation, and administration.

The following is a typical scenario that illustrates some of these interactions:

  • An input event file arrives in a folder.
  • AccountingIntegrator reads the file and performs the accounting transformation to produce postings.
  • If some input events are rejected, they are sent to InterPlay for manual correction. Once the input events are fixed, they are resubmitted for transformation.
  • The postings are sent to the General Ledger.
  • Traces generated during the transformation are sent to the Datastore, which imports them to constitute an audit trail.
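The scenario above can be sketched as a simple routing function. Note that everything here is an illustrative assumption: the event structure, the rejection criterion, and names such as `transform_event` and `process_file` are hypothetical, not the products' actual APIs.

```python
# Illustrative sketch of the scenario: transform input events into postings,
# route rejects to manual correction, and collect traces for the audit trail.
# All names and the event shape are hypothetical.

def transform_event(event):
    """Pretend accounting transformation: reject events missing an amount."""
    if "amount" not in event:
        return {"status": "rejected", "event": event}
    return {"status": "posted",
            "posting": {"account": event["account"], "amount": event["amount"]}}

def process_file(events):
    postings, rejects, traces = [], [], []
    for event in events:
        result = transform_event(event)
        traces.append({"event": event, "status": result["status"]})  # audit trail
        if result["status"] == "rejected":
            rejects.append(result["event"])      # sent to InterPlay for correction
        else:
            postings.append(result["posting"])   # sent to the General Ledger
    return postings, rejects, traces
```

In this sketch, the rejected events would be corrected and fed back through `process_file`, mirroring the correction loop described above.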

There are two methods to deal with these interactions:

  • Publish REST APIs to react to incoming events.
  • Use the Event Processor to react to incoming events by automatically triggering associated actions.

Configure automatic flows

Interactions can be managed automatically using the Event Processor module. This is an additional module embedded into the AI Suite products and the other components that interact with them. Its role is to react to incoming events by triggering the associated actions, automatically managing dataflows and interactions without using scripts.

Another method to deal with these interactions is to publish REST APIs to react to incoming events. However, Event Processor brings more value in the context of a cloud application.

AI Suite products interact to operate flows of data for operations such as transformation, data correction, data entry, audit trail, reconciliation, report generation, and administration. An interaction can also involve just one component: for example, an import in InterPlay or Datastore, or exports triggered by time events with the results copied to S3 or to the local shared folder.

The Event Processor reacts to incoming events, generated by a file, time, or REST listener, which is the entry point for a dataflow:

  • It evaluates the event properties (user, system) based on rules defined in Designer.
  • It identifies the dataflow.
  • It reacts by triggering the actions/operations defined for the identified dataflow.
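The three steps above can be sketched as rule-based dataflow identification. The rule table, predicates, and action names below are hypothetical assumptions standing in for the rules actually defined in Designer.

```python
# Hypothetical sketch: each rule pairs a predicate over event properties with
# a dataflow name and the actions configured for that dataflow.
RULES = [
    (lambda e: e["type"] == "file" and e["name"].endswith(".csv"),
     "accounting-import", ["trigger_transformation"]),
    (lambda e: e["type"] == "time",
     "nightly-export", ["export_file", "copy_to_s3"]),
]

def identify_dataflow(event):
    """Return (dataflow, actions) for the first matching rule, or None."""
    for predicate, dataflow, actions in RULES:
        if predicate(event):
            return dataflow, actions
    return None
```

An event that matches no rule would simply not trigger any dataflow in this sketch.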

The incoming events that can be managed by the Event Processor are:

  • Files that arrive in a folder
  • Scheduled time events
  • REST events produced by REST calls
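As a minimal sketch of the first event type, a file listener can be approximated by polling a folder and reporting files that were not seen before. This polling approach is an assumption for illustration; a production listener would also handle partially written files.

```python
from pathlib import Path

def poll_folder(folder, seen):
    """Return files that appeared in `folder` since the last call.

    `seen` is the set of paths already reported; it is updated in place.
    """
    current = {p for p in Path(folder).iterdir() if p.is_file()}
    new = current - seen
    seen |= new
    return sorted(new)
```

Calling `poll_folder` periodically yields each new file exactly once, which is the behavior a file listener needs before raising an event.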

The actions represent the actual application service executions (such as importing or exporting a file, running a transformation, or importing an audit). The actions that can be triggered by the Event Processor depend on the components involved. Each Event Processor is configured to define the actions to trigger depending on incoming events and action results. This configuration involves two steps: the user defines it in Designer, then deploys it to the components.

According to the configuration, the possible actions are:

  • Import/export a file
  • Trigger a transformation
  • Move a file to a folder
  • Send an event to another component using a REST call
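The last action in the list, sending an event to another component via a REST call, could look like the following sketch using the standard library. The endpoint URL and the payload shape are illustrative assumptions, not the components' documented API.

```python
import json
import urllib.request

def build_event_request(component_url, event):
    """Build a POST request notifying another component of an event.

    The URL and JSON payload shape are hypothetical.
    """
    data = json.dumps(event).encode("utf-8")
    return urllib.request.Request(
        component_url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# urllib.request.urlopen(build_event_request(url, event)) would send it.
```

Separating request construction from sending makes the action easy to inspect or retry before any network call happens.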

When an action is executed locally by a component, the component must collect the results and report them to the Event Processor, which can then decide which actions to execute next.
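This result-driven chaining can be sketched as a lookup from (action, result) pairs to the next action. The transition table, action names, and `execute` callable below are hypothetical, standing in for the configuration deployed from Designer.

```python
# Hypothetical sketch of result-driven chaining: after each action the
# component reports a result, and the next action is looked up from it.
NEXT_ACTION = {
    ("import_file", "ok"): "trigger_transformation",
    ("trigger_transformation", "ok"): "send_postings",
    ("trigger_transformation", "rejects"): "send_to_correction",
}

def run_dataflow(first_action, execute):
    """Run actions until no follow-up is configured.

    `execute` is the component-side callable that performs an action
    and returns its result.
    """
    executed, action = [], first_action
    while action is not None:
        result = execute(action)
        executed.append((action, result))
        action = NEXT_ACTION.get((action, result))
    return executed
```

The same mechanism lets a rejection branch into the correction flow while a success continues toward posting.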

The following image is a general view of the Event Processor:
