Get started with AI Suite

AI Suite defines the interaction between Axway products including AccountingIntegrator, Datastore, and InterPlay.

The following procedures illustrate an end-to-end scenario that uses the samples provided with the AI Suite products:

Prerequisites

Before you start, make sure that:

  • Components have been installed and started. For AccountingIntegrator, also select the Rule Engine Server component.
  • The AI Enabler has been set up with a user and a folder.

Samples provided

To help you get started or to help you define a configuration based on an existing one, the AI Suite products provide samples that can be used by different services.

The samples are provided as predefined configurations, as follows:

  • Get started with AI Suite contains configuration samples for the Rule Engine, Datastore, and InterPlay.
  • Flow management for AI Suite contains the Event Processor configuration describing the interaction between the Rule Engine Server, Datastore, and InterPlay.

For details about each configuration, see Samples description.

Set up the interaction flows

Generate and import the AI Enabler configuration

You need to import the Composer configuration into the Repository so that the Rule Engine Server can use it to configure the Rule Engine. This configuration is named aiConfiguration.

To create the initial folder and the configuration for the sample in Composer:

  1. Open the Topography workbench.
  2. Create a comm network and a host.
  3. Create a new Axway Server of type AccountingIntegrator - Repository:
    • Complete the Agent host and port (the default port is 4801).
    • In the Parameters tab, specify the Application name and the Configuration name. These parameters uniquely identify the AI configuration that will be imported into a repository. Also set the generation folder to the path where the generated configuration will be saved. In this scenario, use:
      • Application name: default
      • Configuration name: finance
      • Generation folder: <a path of your choice>
  4. Open the Integration Services workbench.
  5. Import the sample Composer configuration provided in the composer.xml file from the [AccountingIntegrator_install_path]/AccountingIntegrator/getStarted/AccountingIntegration/configuration directory. On import, select the Update the folder hierarchy option. A new GetStarted_<AIversion> folder is created.
  6. Perform a recursive check on all Input-Events.
    You can use the Entity Browser (F7) to validate the objects. All objects in the folder must be in "Checked" status.
  7. Send all the Input-Events to the server, using the AccountingIntegrator - Repository server you created. The command generates the aiConfiguration files in the [Generation_folder_path]/enables/AccountingIntegrator/AISRepository/default/aiConfiguration folder.
  8. Go to the Repository installation path and start the Repository console. Import the AI configuration deployed from Composer by using the importAIConfiguration command, which takes the path to the aiConfiguration folder as its argument (see the example below).
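
For example, with the generation folder configured in step 3, the import command takes a form similar to the following (adapt the path to your own generation folder):

    importAIConfiguration [Generation_folder_path]/enables/AccountingIntegrator/AISRepository/default/aiConfiguration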

Transform the Composer configuration for Designer and deploy it

  1. In Composer, open the Integration Services workbench.
  2. From the Tools menu, select Finance > Finance XML Export.
    • In the export window, choose the GetStarted_<AIversion> folder. You can check Input-Event, Financial-Event, or Financial-Procedure.
    • From the Export Preferences menu item, check Include Input-Event structure.
    • Click Launch XML data export of selected data from the toolbar.
    • Select the path for the export and click Finish.
  3. Go to the Designer Tools folder: <Suite_installation_path>/AIS/Designer/extra/Tools.
  4. Open ant-task.properties and search for the Transform Accounting Integrator configuration action (example settings are shown after this procedure):
    • Set TransformConfigurationSourceDirectoryName to the folder where the finance export was saved.
    • Set TransformConfigurationExportDirectoryName to the folder where the transformed configuration will be saved.
    • Caution: Make sure this folder exists; otherwise the transformation fails.
    • You can leave TransformConfigurationAnnotationFileName empty.
    • For details on the ant transformConfiguration command, see the README.txt file.
    • In the ant-task.properties file, make sure you correctly set ApplicationUser and ApplicationPassword.
  5. Run the ant transformConfiguration command.
  6. In Designer, go to Load Configuration, select Base for Accounting Integration and load it.
  7. Use Import project to import the transformed configuration you just obtained over this base configuration.
  8. Select all items of the configuration and click Deploy.
    • Select the application to which you want to deploy the configuration (default application).
    • Choose Datastore and InterPlay.
    • Launch the deployment.
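
For reference, the Transform Accounting Integrator configuration entries in ant-task.properties (step 4 above) might look like the following sketch. The paths, user, and password shown here are placeholder values only; adapt them to your environment.

    # Folder containing the Finance XML export produced in Composer (placeholder path)
    TransformConfigurationSourceDirectoryName=C:/AIS/export/finance
    # Existing folder where the transformed configuration is saved (placeholder path)
    TransformConfigurationExportDirectoryName=C:/AIS/export/transformed
    # Can be left empty in this scenario
    TransformConfigurationAnnotationFileName=
    # Credentials used by the ant tasks (placeholder values)
    ApplicationUser=admin
    ApplicationPassword=changeme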

You can also load the sample business configuration in Designer. For details, see Load the business configuration into Designer and deploy it.

Load the business configuration into Designer and deploy it

  1. In Designer, go to Load configuration and select "Get started with AI Suite".
  2. Deploy the configuration to Datastore and InterPlay:
    1. Select the configuration to deploy.
    2. Click Deploy.
    3. Select the application to which you want to deploy the configuration (default application).
    4. Choose Datastore and InterPlay.
    5. Launch the deployment.

Load and deploy the Event Processor configuration

  1. Click Configuration management.
  2. Add a new configuration named AI Suite interaction.
  3. Switch to the newly created configuration.
  4. Click Load configuration and select the "Flow management for Accounting Integration" predefined configuration.
  5. Select all items and click Deploy.
    • Select the interaction application.
    • Launch the deployment.

Configure listeners in Administration

Connect to Administration and go to the Components > Listeners tab.

  1. From the Listener list, click Add File Listener to add a new file listener.
  2. Complete the listener fields:
    • Component: AIRuleEngineServer
    • Interaction: Accounting Integration
    • Flow: StartProcessingSession1
    • Folder: a path to a folder where you will copy the files
    • User Id: the user identifier
    • User Domain: the user domain
  3. Add a file pattern:
    • Key: dataFile
    • File: IEvent.seq
  4. Save your changes.

Process the Input Event files and see the results

  1. Check that the Rule Engine Server is started.
  2. Copy the InputEvent.seq file from [Install_path]/AccountingIntegrator/getStarted/withAccountingIntegration/data/ to the folder specified in the File Listener (a sample copy command is shown after this procedure).
  3. Connect to Administration and select the Flows tab. The events list displays the Event Processor events. For an interaction, you should have the following events:
    • On StartProcessingSession1 flow: A FileEvent created when the file arrives in the folder
    • On DataProcessingSession1 flow:
      • ProcessResult for the result of the first Rule Engine session
      • A FileResult for each file generated by the Rule Engine:
        • 1 FileResult for the rejects
        • 2 FileResult events for the Input and Output traces
        • 6 FileResult events for the Output events, one for each Output Collection: IAS, IAS_GL, GL, LOCAL, SESSION2, NULL_FEES
        • 2 FileResult events for logging information: RuleEngine.log, rdjfic.msg
        • 7 FileResult events for reporting on the LOCAL collection
    • On RecyclingSession1 flow:
      • 1 FileEvent
      • 1 SendEventResult
      • 1 ImportResult for the result of the rejects imported to InterPlay
    • On AuditStorageSession1 flow:
      • 2 FileEvent
      • 2 SendEventResult
      • 2 ImportResult for the result of the import of the Input and Output traces of the first session in Datastore
    • On DataProcessingSession2 flow:
      • ProcessResult for the result of the second Rule Engine session
      • A FileResult for each file generated by the Rule Engine:
        • 2 FileResult events for the Input and Output traces
        • 1 FileResult for the redirected Input event
        • 1 FileResult for the Output events of the Output Collection DESTAMORT
        • 2 FileResult events for logging information: RuleEngine.log, rdjfic.msg
        • 4 FileResult events for reporting on the LOAN_AMORT collection
    • On AuditStorageSession2 flow:
      • 2 FileEvent
      • 2 SendEventResult
      • 2 ImportResult for the result of the import of the Input and Output traces of the second session in Datastore
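
Step 2 above is a plain file copy; for example, on Linux it could look like the following, where /data/listener/in is only a placeholder for the folder you configured in the File Listener:

    cp [Install_path]/AccountingIntegrator/getStarted/withAccountingIntegration/data/InputEvent.seq /data/listener/in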

Prepare Datastore

You can import predefined queries or create your own queries based on sample queries.

Import the sample queries

  1. Start the Configuration Repository Console.
  2. Launch the following command, where DatastoreClient is the name you gave to the Datastore Client component when you installed it:
    importQueries -f [Install_path]/Datastore/getStarted/withAccountingIntegration/queries/queries.xml -c DatastoreClient

The following queries are imported:

  • Audit/Get all ACC_ME_Audit
  • Audit/Get all TAM_ME_Audit
  • Audit/Get all ACC_ME_Audit per CUSTOMER
  • Audit/Get all TAMORT_Audit
  • Audit/Get all NOM_ME_Audit
  • Audit/Get CUSTOM1 from TAM_ME_Audit
  • Audit/Get IBAN1 from ACC_ME_Audit
  • Audit/Get CUSTOM1 from NOM_ME_Audit
  • Audit/Get NOFILE002 from RELEASE_Audit
  • Audit/Get CUSTOM2 from TAMORT_Audit
  • Audit/Get all FEES_Audit
  • Audit/Get all RELEASE_Audit
  • Audit/Get PERIOD content
  • Audit/Get Children
  • Audit/Get Ancestors
  • Audit/Get Parents
  • Audit/Get Descendants
  • Audit/Get all Collections
  • Reconciliation/RECO_LOCAL_CURRENCY_ViewResults
  • Reconciliation/get All ACC
  • Reconciliation/get all GL0
  • Reconciliation/get all ERRORS
  • Reconciliation/RECO_INIT_CURRENCY_ViewResults
  • Reconciliation/Get all COMMENTS
  • Reconciliation/listReconciliationCollections
  • Reconciliation/getSource1
  • Reconciliation/getSource2
  • Reconciliation/getSources
  • Reconciliation/getReconciliationResults
  • Reconciliation/RECO_LOCAL_CURRENCY
  • Reconciliation/RECO_INIT_CURRENCY

Inject the period data files

  1. Go to the Datastore Runtime installation path, where the dstools console is located: [Install_path]/Datastore/DatastoreRuntime
  2. Inject the data for partition management using this command:
    dstools inject -c PERIOD_COLLECTION ../getStarted/withAccountingIntegration/data/period.dat

Check results

Execute queries on imported traces

  1. Connect to the Datastore Client.
  2. Execute the imported predefined queries.
    Results are displayed in the P1 panel:
    Query name                        Number of lines
    Get all ACC_ME_Audit              372
    Get all FEES_Audit                249
    Get all NOM_ME_Audit              34
    Get all RELEASE_Audit             77
    Get all TAM_ME_Audit              2040
    Get all TAMORT_Audit              34
    Get CUSTOM1 from NOM_ME_Audit     3
    Get CUSTOM1 from TAM_ME_Audit     180
    Get CUSTOM2 from TAMORT_Audit     3
    Get IBAN1 from ACC_ME_Audit       14
    Get NOFILE002 from RELEASE_Audit  4
    Get PERIOD content                108
    Get all Collections               5

Export a query result to an Excel file

In Datastore Client, when viewing a query result, you can export it to an Excel file:

  1. From the main toolbar, select More > Export into Excel.
  2. Check that the data in the original table has been exported to the Excel file.

Import the report templates

The delivered sample contains two report templates written to take into account the formats defined in the Finance1 configuration.

The reports provided are:

  • Report_ACC_ME_Audit.rptdesign, Report_ACC_ME_Audit.xml
  • Report_ACC_ME_AuditMasterDetails.rptdesign, Report_ACC_ME_AuditMasterDetails.xml

In the Report_ACC_ME_AuditMasterDetails.xml file, you will find the subqueries that are part of the master/details report template:

<subqueries>
  <subquery>./Audit/Get all ACC_ME_Audit</subquery>
  <subquery>./Audit/Get all ACC_ME_Audit per CUSTOMER</subquery>
</subqueries>

In the Repository console, execute the following command to update the configuration with the report templates:

    importReportTemplates [Install_path]/Datastore/getStarted/withAccountingIntegration/

The report templates are now available in the Datastore Client.

Generate reports

In Datastore Client, when viewing a query result, you can generate a report with that data.

To use the report templates:

  1. Connect to the Datastore Client.
  2. Execute the Get all ACC_ME_Audit query.
  3. From the main toolbar, select More > Generate report:
    • Select a report template.
    • Give a name to your report.
    • Select the report type.
    • Click Generate.
  4. A notification appears with the link for your report.
  5. Check that the data in the report represents the data from the original table.

Recycle data

To correct the rejected events in InterPlay:

  1. Connect to the InterPlay user interface.
  2. The LOAN_CRA_1 Collection is displayed and contains 2 objects to correct:
    • One is rejected because there is no NOMINAL segment.
    • The other one is rejected because a FEES currency is not correct.
  3. Create the missing NOMINAL segment and modify the corresponding FEES segment to select a proper CURRENCY value.
  4. Change the collection status to ready.

Samples description

Get started with AI Suite

RuleEngine Session1

The RuleEngine Session1 handles RELEASE Input-Events to provide:

  • NOM_ME Output-Events that are transformed into TAMORT Input-Events to be handled by the RuleEngine Session2. Each RELEASE Input-Event that is processed generates one NOM_ME Output-Event.
  • ACC_ME that are General Ledger entries supposed to be sent to a General Ledger application. At least four ACC_ME Output-Events are generated for each RELEASE Input-Event that is processed.
  • Audit files that contain traces to be stored. Five audit lines are generated for each RELEASE Input-Event that is processed.
  • Rejected events that are stored and manually updated in InterPlay before recycling to RuleEngine Session1. Two Input-Events are rejected in Session1.

RuleEngine Session2

The RuleEngine Session2 handles TAMORT Input-Events to provide:

  • TAM_ME Output-Events that contain the lines of the calculated amortization schedule for the loan. One line per month of amortization is provided for each TAMORT Input-Event processed.
  • Audit files that contain traces to be stored. One audit line is generated for each TAMORT line and TAM_ME line that is processed.

Input-Events in AI Enabler

The configured Input-Events in AI Enabler are:

  • RELEASE: Loan release, multi-segment Input-Event based on Business-Documents NOMINAL and FEES.
  • TAMORT: Amortization of loan.

The fictive Input-Events in AI Enabler are:

  • ACC_ME is a General Ledger entry event generated after the RELEASE Input-Event and the TAMORT Input-Event have been processed.
  • NOM_ME is generated when the RELEASE Input-Event has been processed, to be handled in a second Rule Engine session. It provides the loan amortization table transferred to the TAMORT Input-Event.
  • TAM_ME is generated when TAMORT is processed. It contains the lines of the calculated amortization schedule for the loan.

Session1 rules
Transformation-Rules:
  • RT001: Takes the Nominal amount of the Loan release (transformation of NOMINAL into ACC) into account to provide General Ledger entries.
  • RT002: Takes the Fees amount types (transformation of FEES into ACC) into account to provide General Ledger entries.
  • RT003: Multi sessions: transforms NOMINAL into NOM and provides the Input-Events for the next session.

Enrichment-Rules:
  • RM001: NOMINAL Segment enrichment: calculates the variables for RT001 and updates the REL_DATE value in the Input-Events.
  • RM002: FEES Segment enrichment: calculates the variables for RT002 and updates the REL_DATE and FEES_DATE values in the Input-Events.

Aggregation-Rules:
  • AGR_FEES: Aggregates the FEES Input-Events.
  • AGR_ACC: Aggregates the ACC generated Output-Events.

Audit-Rules:
  • NOMINAL: Displays audit trace information for the NOMINAL segment in the RELEASE Input-Event.
  • FEES: Displays audit trace information for the FEES segment in the RELEASE Input-Event.
  • NOM_ME: Displays audit trace information for the NOM_ME generated Output-Event.
  • ACC_ME: Displays audit trace information for the ACC_ME generated Output-Event.

Session2 rules
Transformation-Rules:
  • RT004: Calculates the amortization schedule (takes the TAMORT Input-Event into account) to provide TAM_ME Output-Events.

Audit-Rules:
  • TAMORT: Displays audit trace information for the TAMORT Input-Event.
  • TAM_ME: Displays audit trace information for the TAM_ME generated Output-Event.

Reconciliation configuration
Reconciliation definition
  • Reconciliation name: REC_ALPHA_GL0_ACC
  • Source 1 details (subledger inventory)
    • Object type name: ACC_ME
    • Version: 1
    • Collection type name: LOCAL
    • Version: 1
    • Keys: ACCOUNT, CURRENCY
    • Amount: AMOUNT, AMOUNT_CUR
    • Direction: DIRECTION
    • Direction credit possible values: C ; +
    • Direction debit possible values: D ; -
    • Additional properties: FILE_ID
  • Source 2 details (general ledger position)
    • Object type name: GL0_ME
    • Version: 1
    • Collection type name: IAS_GL
    • Version: 1
    • Keys: ACCOUNT_GL, PRDT_CAT, AMOUNT_NAT, CURRENCY
    • Amount: AMOUNT, AMOUNT_CUR
    • Direction: DIRECTION
    • Direction credit possible values: C
    • Direction debit possible values: D
    • Additional properties:
  • Reconciliation common
    • Reconciliation property: RECONCILIATION_ID
    • Result name: RECONCILIATION_RESULT
Generated reconciliation formats
  • Source 1 ACCOUNT (Account number)
  • Source 1 CURRENCY
  • Source 1 DIRECTION
  • Source 1 AMOUNT (Amount in currency)
  • Source 1 AMOUNT_CUR (Amount in local currency)
  • Source 2 ACCOUNT_GL (Account number)
  • Source 2 PRDT_CAT (Product category)
  • Source 2 AMOUNT_NAT (Amount nature)
  • Source 2 CURRENCY
  • Source 2 DIRECTION
  • Source 2 AMOUNT (Amount in currency)
  • Source 2 AMOUNT_CUR (Amount in local currency)
  • Reconciliation key
  • Reconciliation amount (Source 1 AMOUNT_CUR - Source 2 AMOUNT_CUR)
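
As a purely hypothetical illustration (these figures are not part of the sample data): if Source 1 aggregates ACC_ME entries with a local-currency amount (AMOUNT_CUR) of 1,200.00 EUR debit for a given account, and Source 2 holds a GL0_ME position of 1,200.00 EUR debit under the same reconciliation key, the reconciliation amount is 1,200.00 - 1,200.00 = 0.00 and the two sources reconcile; any non-zero difference would show up as a discrepancy in the RECONCILIATION_RESULT result.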

Flow management for AI Suite

The AI Suite interaction scenario delivered with the Flow management for AI Suite sample is the following:

  1. A file containing the input events is copied to a folder on which a File Listener has been defined. The Event Processor embedded in the Rule Engine Server creates a FileEvent on the StartProcessingSession1 flow.
  2. The FileEvent triggers the following rule defined on this flow:

    processFileSession1 [FileEvent] for StartProcessingSession1 flow
    do processFileEvent(
      event : obj,
      outputFlow : "DataProcessingSession1",
      configurationId : "finance",
      applicationName : "default",
      generateRejects : "yes",
      singleLine : "no",
      collectionTypeName : "LOAN",
      collectionTypeVersion : 1,
      session : "session1")

  3. The rule executes the processFileEvent operation and the results of the operation are pushed by the Rule Engine Server to the Event Processor. The following rules are executed:

    processFileSession2 [FileResult] for DataProcessingSession1 flow
    if obj.outputType="OUTPUT_EVENT" and obj.collectionTypeName="SESSION2"
    then
    processFileEvent(
      event : obj,
      outputFlow : "DataProcessingSession2",
      applicationName : obj.applicationName,
      configurationId : obj.configurationId,
      collectionTypeName : "LOAN_AMORT",
      collectionTypeVersion : 1,
      session : "session2")

    sendRejectsSession1 [FileResult] for DataProcessingSession1 flow
    if obj.outputType="REJECT"
    then
    sendFileEvent(
      service : "DataEntry",
      dataFile : obj.dataFile,
      flow : "RecyclingSession1",
      interaction : obj.interaction,
      collectionTypeName : "LOAN_CRA",
      collectionTypeVersion : 1 )

    sendTracesSession1 [FileResult] for DataProcessingSession1 flow
    if obj.outputType in {"INPUT_AUDIT", "OUTPUT_AUDIT"}
    then
    sendFileEvent(
      service : "Storage",
      dataFile : obj.dataFile,
      flow : "AuditStorageSession1",
      interaction : obj.interaction,
      collectionTypeName : "AUDIT_COLLECTION",
      collectionTypeVersion : 1 )

  4. InterPlay receives a FileEvent and the following rule is executed:

    importRejectsSession1 [FileEvent] for RecyclingSession1 flow
    do importDocument( event : obj )

  5. Datastore receives a FileEvent for each trace and the following rule is executed:

    importTracesSession1 [FileEvent] for AuditStorageSession1 flow
    do importDocument( event : obj )

  6. The Rule Engine Server processes the file from the first session in a second session and pushes the results to the Event Processor. The following rule is executed:

    sendTracesSession2 [FileResult] for DataProcessingSession2 flow
    if obj.outputType in {"INPUT_AUDIT", "OUTPUT_AUDIT"}
    then
    sendFileEvent(
      service : "Storage",
      dataFile : obj.dataFile,
      flow : "AuditStorageSession2",
      interaction : obj.interaction,
      collectionTypeName : "AUDIT_COLLECTION",
      collectionTypeVersion : 1 )

  7. Datastore receives a FileEvent for each trace and the following rule is executed:

    importTracesSession1 [FileEvent] for AuditStorageSession2 flow
    do importDocument( event : obj )

 
