Event-driven control

The core products of AI Suite can be triggered by three types of events:

  • File event – A file event is triggered when a file whose name matches one of a defined list of patterns is stored in a monitored folder.
  • Time event – A time event is triggered at a scheduled time.
  • REST event – A REST event is triggered by a REST call executed on a component.

The Event Processor module, available in each core product, processes events by applying rules: when a rule is activated by an event, it executes one or more actions. Examples of actions are: applying an accounting transformation to a file, importing a file into InterPlay or Datastore, exporting files from InterPlay or Datastore, moving a file to a folder, or sending a REST event.

The rule execution generates result objects that are also processed by the Event Processor. For instance, applying an accounting transformation to a file generates one result object per generated file (reject, output, or audit). A rule on a reject result object can then send an event to InterPlay to import the file so that a user can correct it manually.
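
The following Python sketch is purely illustrative: the classes, event kinds, and rule structure are assumptions made for the example and do not reflect the actual AI Suite or Event Processor APIs. It only shows the general pattern described above, in which rules match events, execute actions, and feed the resulting objects back into the Event Processor.

  # Illustrative sketch only: names and structures are assumptions, not the product API.
  from dataclasses import dataclass, field
  from typing import Callable, List

  @dataclass
  class Event:
      """A generic event: kind is 'file', 'time', 'rest', or 'result'."""
      kind: str
      payload: dict = field(default_factory=dict)

  @dataclass
  class ResultObject:
      """Result of an action, for example a reject, output, or audit file."""
      category: str          # 'reject', 'output', or 'audit'
      path: str

  @dataclass
  class Rule:
      """A rule pairs a condition on an event with an action to execute."""
      name: str
      matches: Callable[[Event], bool]
      action: Callable[[Event], List[ResultObject]]

  class EventProcessor:
      """Applies rules to incoming events; result objects are fed back as new events."""
      def __init__(self, rules: List[Rule]):
          self.rules = rules

      def process(self, event: Event) -> None:
          for rule in self.rules:
              if rule.matches(event):
                  for result in rule.action(event):
                      # Each result object is itself processed, which is how a rule
                      # on a reject file can trigger an import into InterPlay.
                      self.process(Event(kind="result", payload={"result": result}))

  # Example rules: transform incoming *.csv files, then forward rejects for correction.
  def transform(event: Event) -> List[ResultObject]:
      print(f"Transforming {event.payload['path']}")
      return [ResultObject("output", "out.xml"), ResultObject("reject", "rej.csv")]

  def forward_reject(event: Event) -> List[ResultObject]:
      print(f"Sending REST event to import {event.payload['result'].path} for correction")
      return []

  processor = EventProcessor([
      Rule("transform-input",
           lambda e: e.kind == "file" and e.payload.get("path", "").endswith(".csv"),
           transform),
      Rule("handle-reject",
           lambda e: e.kind == "result" and e.payload["result"].category == "reject",
           forward_reject),
  ])

  processor.process(Event(kind="file", payload={"path": "invoices.csv"}))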

The following diagram illustrates how the AI Suite products work together when controlled by events:

The following describes a typical scenario:

Accounting Transformation

  1. An input file and a meta property file arrive in a folder that is monitored by the Rule Engine Server component of AccountingIntegrator.
  2. A File event is generated and processed by the Event Processor of the Rule Engine Server. The Event Processor triggers a rule to process the file.
  3. The Rule Engine Server selects an available Rule Engine or creates a new one. The meta properties identify the flow and the accounting rules to use, and the selected Rule Engine is configured accordingly from the configuration stored in the repository (a sketch of this selection follows the procedure).
  4. The Rule Engine processes the input file and generates various types of files:
    • Output files containing the accounting entries that will be posted to the General ledger and other financial systems
    • A reject file containing the input data that contains errors
    • Audit files containing the traces of the transformations that have been performed by the Rule Engine
    • Reports that log information on the transformation process
  5. Result objects are generated and processed by the Event Processor of the Rule Engine Server. Rules are executed to handle the reject file, the output files and the audit files. For instance, a REST event is sent to InterPlay and Datastore to import the reject file and audit files respectively.
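
The sketch below is a hypothetical illustration of step 3: the meta property file format, the property names, and the repository structure are assumptions made for the example, not the actual product formats. It only shows how meta properties delivered with the input file could identify the flow and the rule configuration to apply.

  # Hypothetical illustration of step 3: file format and repository content are assumptions.
  from configparser import ConfigParser

  # A hypothetical repository mapping flow identifiers to rule configurations.
  REPOSITORY = {
      "SALES_INVOICES": {"ruleset": "sales-to-gl", "output_format": "xml"},
      "PAYROLL":        {"ruleset": "payroll-to-gl", "output_format": "csv"},
  }

  # Hypothetical meta property file arriving next to the input file.
  META_PROPERTIES = """
  [flow]
  id = SALES_INVOICES
  sender = ERP_EUROPE
  """

  parser = ConfigParser()
  parser.read_string(META_PROPERTIES)
  flow_id = parser["flow"]["id"]
  configuration = REPOSITORY[flow_id]
  print(f"Flow {flow_id}: configure the Rule Engine with {configuration}")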

Correction of errors

  1. InterPlay receives a File event that is processed by the Event Processor of InterPlay. A rule is executed to import the file.
  2. A user manually corrects the errors in the file using the InterPlay user interface, then sets the status of the file to ready.
  3. A Time event is generated every 10 minutes. This event is processed by the Event Processor, which triggers a rule that exports all files with status ready and generates a meta property file (see the sketch after this procedure). The exported file is stored in a folder that is monitored by the Rule Engine Server.
  4. The Rule Engine Server processes the input file.
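
The following sketch illustrates the time-triggered export rule of step 3. The status field, the file representation, and the export folder are assumptions made for the example; in the product, the rule would be activated by the Time event rather than called directly.

  # Hypothetical sketch of the export rule; names and folder layout are assumptions.
  from dataclasses import dataclass
  from typing import List

  @dataclass
  class InterPlayFile:
      name: str
      status: str   # for example 'error' or 'ready'

  def export_ready_files(files: List[InterPlayFile], export_folder: str) -> None:
      """Rule body: export every file whose status is 'ready', plus a meta property file."""
      for f in (f for f in files if f.status == "ready"):
          print(f"Exporting {f.name} and its meta property file to {export_folder}")
          # The export folder is the one monitored by the Rule Engine Server,
          # so a File event picks the corrected file up again (step 4).

  # In the product this rule is activated by a Time event every 10 minutes;
  # here it is simply called once for illustration.
  corrected = [InterPlayFile("invoices.csv", "ready"), InterPlayFile("payroll.csv", "error")]
  export_ready_files(corrected, "/data/in")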

Storage of Audit traces

  1. Datastore receives a File event that is processed by the Event Processor of Datastore. A rule is executed to import the file.
  2. Business users use the Datastore user interface to browse the audit trail.

Visibility

  1. Each product sends events that represent its activities to Sentinel.
  2. Using the Sentinel user interface, the operator monitors the execution of the flow through dedicated queries and dashboards.

Security

  1. The user interfaces of Datastore and InterPlay access PassPort to authenticate the users and verify their privileges.
  2. Rule Engine Server, Datastore, and InterPlay authenticate the user that calls their APIs and verify that user's privileges before processing the operations (a generic sketch follows this list).
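
The sketch below is a generic illustration of this server-side check only: the token lookup, the privilege model, and the function names are assumptions made for the example, not the actual PassPort integration or product APIs.

  # Generic illustration: the privilege model and check_token are assumptions.
  from typing import Optional

  def check_token(token: str) -> Optional[dict]:
      """Stand-in for the identity provider: return the caller's profile, or None."""
      known = {"token-abc": {"user": "ops1", "privileges": {"import", "export"}}}
      return known.get(token)

  def handle_request(token: str, operation: str) -> str:
      profile = check_token(token)
      if profile is None:
          return "401 Unauthorized"                 # caller not authenticated
      if operation not in profile["privileges"]:
          return "403 Forbidden"                    # caller lacks the privilege
      return f"200 OK: {operation} executed for {profile['user']}"

  print(handle_request("token-abc", "import"))      # 200 OK
  print(handle_request("token-abc", "delete"))      # 403 Forbidden
  print(handle_request("bad-token", "import"))      # 401 Unauthorized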

To implement custom scenarios, use Designer to define flow management rules, then store the rules in the repository. The Event Processor installed in each product reads the repository to update its configuration.
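
The following sketch shows how such a configuration refresh might look. The stored rule format and the refresh function are hypothetical; they only illustrate the idea of each product's Event Processor rebuilding its rule set from definitions kept in the repository.

  # Hypothetical sketch of configuration refresh; the rule format is an assumption.
  import json

  # Rule definitions as they might be stored in the repository (hypothetical format).
  STORED_RULES = json.dumps([
      {"name": "import-reject", "on": {"event": "file", "pattern": "*_reject.csv"},
       "do": {"action": "import", "target": "InterPlay"}},
      {"name": "import-audit", "on": {"event": "file", "pattern": "*_audit.xml"},
       "do": {"action": "import", "target": "Datastore"}},
  ])

  def refresh_configuration(raw: str) -> list:
      """Reload the rule set; each product's Event Processor would do this from the repository."""
      rules = json.loads(raw)
      print(f"Loaded {len(rules)} rules: {[r['name'] for r in rules]}")
      return rules

  refresh_configuration(STORED_RULES)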

The Event Processor installed in each product stores the flow events that have been triggered in the repository database. Using the administration tool, it is possible to follow the executed flows from end to end across the products.
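
The sketch below illustrates the idea of following a flow across products from the stored events. The event fields and the correlation identifier are assumptions made for the example, not the actual repository schema or administration tool behavior.

  # Hypothetical sketch of end-to-end flow tracking; fields are assumptions.
  flow_events = [
      {"flow_id": "F-001", "product": "AccountingIntegrator", "event": "file", "time": "10:00"},
      {"flow_id": "F-001", "product": "InterPlay",            "event": "rest", "time": "10:01"},
      {"flow_id": "F-001", "product": "Datastore",            "event": "rest", "time": "10:01"},
      {"flow_id": "F-002", "product": "AccountingIntegrator", "event": "file", "time": "10:05"},
  ]

  def follow_flow(flow_id: str) -> None:
      """List the events recorded for one flow, across all products."""
      for e in sorted((e for e in flow_events if e["flow_id"] == flow_id), key=lambda e: e["time"]):
          print(f'{e["time"]} {e["product"]:22} {e["event"]}')

  follow_flow("F-001")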

The event-driven control architecture provides a way to handle simple flows involving mainly the AI Suite core products. At runtime, each product is configured to execute its portion of the flow.
