Install and configure Decision Insight

Introduction

This section presents how to set up a Decision Insight node and configure Prebuilt Dashboards on it.

Install Decision Insight

Configure Decision Insight

  • Before the first start of the platform, choose an application start date, for example 2017-01-01T00:00:00.000+01:00.
  • Update or set the following properties in the platform conf/platform.properties file: 
conf/platform.properties
# Update the following properties:
com.systar.titanium.initialPeriodValidTimeEnd=[application-start-date-chosen-above]
com.systar.calcium.totalPartitions=16
com.systar.titanium.memtable.globalMaxSize=4G
com.systar.titanium.memtable.individualMaxSize=256M
 
# Add the following properties:
com.systar.krypton.scheduler.collector.defaultComputingRhythm.scalar=5
com.systar.krypton.scheduler.collector.defaultComputingRhythm.unit=minutes
com.systar.krypton.scheduler.collector.defaultComputingLag.scalar=30
com.systar.krypton.scheduler.collector.defaultComputingLag.unit=seconds
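
For example, if you keep the application start date suggested above, the first updated property would read as follows (the date is illustrative; use your own start date):
conf/platform.properties
com.systar.titanium.initialPeriodValidTimeEnd=2017-01-01T00:00:00.000+01:00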

JVM memory

If the JVM memory Xmx is equal to or less than 4 GB, you must reduce the memory space used by Titanium (globalMaxSize). For example, for Xmx=4G, set com.systar.titanium.memtable.globalMaxSize=2G in conf/platform.properties.
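
As an illustration of this rule (assuming Xms is set equal to Xmx, as in the 16 GB example below), a 4 GB JVM would be configured as follows:
conf/jvm.conf
-Xms4G
-Xmx4G
conf/platform.properties
com.systar.titanium.memtable.globalMaxSize=2G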


  • Update the JVM memory size in conf/jvm.conf. See the sizing guide to determine the correct Xmx size. For example, for 16 GB, enter these values in conf/jvm.conf:
conf/jvm.conf
-Xms16G
-Xmx16G
  • Start the platform
  • Download the Prebuilt Dashboards application package from the marketplace.

  • Unzip the downloaded file.
  • In Decision Insight, go to Settings > Platform > Applications and import the application appx and properties.xml files, using the password 'admin' for import. At this step, set the application import date to the application start date chosen above.


  • In ADI Settings > Data Integration:
    • In space 04-API-Integration > Properties, update APPLICATION_START_DATE with the value chosen above.

    • Configure the same APPLICATION_START_DATE in space 05-API-Initializer.



    • In Platform management > Current Activity, wait until the platform CPU usage drops to a few percent. This allows time for all pre-computings to bootstrap and for computings to become LIVE.
    • In space 04-API-Integration > Routes, make sure the route "collectAndProcessEvents" is started. 

    • Create the keystore and truststore for TLS communication with the Filebeat agents. For more information, see How to generate keys and certificates files for TLS mutual authentication?. The Decision Insight keystore and truststore must be in PKCS12 or JKS format. A hedged keytool sketch is provided after this configuration list.
    • In HOME > Data Integration > APPLICATION DATA INTEGRATION > ENDPOINTS > Connectors: sslContextParameters, check that the connector information below is present.

SSL connector parameters
Name sslContextParameters
Class name com.systar.aluminium.engine.impl.util.SSLContextParameters
keyManagers (class: org.apache.camel.util.jsse.KeyManagersParameters)
    keyPassword {{keyStore_password}}
    keyStore (class: org.apache.camel.util.jsse.KeyStoreParameters)
        password {{keyStore_password}}
        resource {{keyStore_filePath}}
cipherSuitesList TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
trustManagers (class: org.apache.camel.util.jsse.TrustManagersParameters)
    keyStore (class: org.apache.camel.util.jsse.KeyStoreParameters)
        password {{trustStore_password}}
        resource {{trustStore_filePath}}
secureSocketProtocolsList TLSv1.2
serverParameters (class: org.apache.camel.util.jsse.SSLContextServerParameters)
    clientAuthentication REQUIRE


    • In HOME > Data Integration > APPLICATION DATA INTEGRATION > TRANSFORMATIONS > Routes:

Route must use the secure lumberjacks URI

The URI in the collectAndProcessEvents route (normally on line 5) must be: <from uri="lumberjacks:0.0.0.0:5044?sslContextParameters=#sslContextParameters"/>

The #sslContextParameters reference is the name of the SSL connector configured above.

If you consider your network secure enough to use a non-secure connection, you can uncomment the non-secure URI (line 4) <from uri="lumberjack:0.0.0.0:5044"/> and comment out the secure URI (line 5) <!-- <from uri="lumberjacks:0.0.0.0:5044?sslContextParameters=#sslContextParameters"/> -->.
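
For reference, this is a sketch of the two <from> lines of the collectAndProcessEvents route in the default (secure) configuration, where the non-secure URI (line 4 of the route) is commented out and the secure URI (line 5) is active:
<!-- <from uri="lumberjack:0.0.0.0:5044"/> -->
<from uri="lumberjacks:0.0.0.0:5044?sslContextParameters=#sslContextParameters"/>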


    • In HOME > Data Integration > APPLICATION DATA INTEGRATION > TRANSFORMATIONS > Properties, add the correct file paths and passwords of the trustStore and the keyStore for the Filebeat SSL connection (Lumberjack).

Properties for the SSL connector
keyStore_filePath /path-to-DI-install/DecisionInsight/security/keyStore.jks
keyStore_password ****************
trustStore_filePath /path-to-DI-install/DecisionInsight/security/trustStore.jks
trustStore_password ***************
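
Keystore and truststore creation (sketch)
The commands below are a minimal sketch using the JDK keytool, assuming a JKS keystore and a Filebeat agent certificate available as filebeat-agent.crt; the alias names and the certificate file name are illustrative assumptions. For the complete procedure, see How to generate keys and certificates files for TLS mutual authentication?.
# Illustrative only: generate the Decision Insight server key pair in a JKS keystore
keytool -genkeypair -alias decision-insight -keyalg RSA -keysize 2048 -validity 365 -keystore /path-to-DI-install/DecisionInsight/security/keyStore.jks -storetype JKS
# Illustrative only: import the Filebeat agent certificate into the Decision Insight truststore
keytool -importcert -alias filebeat-agent -file filebeat-agent.crt -keystore /path-to-DI-install/DecisionInsight/security/trustStore.jks -storetype JKS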

Set up automated data purge

It is recommended to activate automated data purge to remove old data when it is no longer used.

In the context of the Prebuilt Dashboards for example, setting a retention period for transactions and transaction legs to 1 month allows you to release disk space on a regular basis.

For more information, see Purge data in the Axway Decision Insight product documentation.

Policy on customizing Embedded Analytics applications

The entity models underpinning EA apps define the data flow scope that they cover. As such, the base ADI data models for EA apps cannot be altered without upgrading to a full ADI license. If considering any customization, customers should in the first instance contact their assigned Axway PSO contact to discuss additional requirements and the potential impact of their implementation. For further context, the following sections set out prohibited vs. permitted customizations, as well as best practice guidance on preserving permitted changes when migrating to later Embedded Analytics releases.

Prohibited Customizations when using Embedded Analytics

The following customizations are prohibited without first upgrading from EA entitlement to a full ADI license:

  • No alterations to the data model are allowed through the Administrator’s Model > Entities menu selection.
  • No additional data sources can be integrated with Embedded Analytics applications.

Permitted Customizations when using Embedded Analytics

Certain customizations to EA4ST applications are permitted and are covered in the following sections.

Capture of additional data using User Definable fields

User-definable fields exist within the base data model. New entities containing custom fields have been added for the Gateway Instance, Client Application, and API entities in the data model. Each of these entities has 1 Boolean and 5 String type fields. Note that when populating these custom field entities, the id must match an existing Gateway Instance, Client Application, or API id as relevant. Integration Specialists with Administrator privileges can populate these fields as needed to hold additional information not captured or presented out of the box. These uses and the associated data will be preserved during upgrades of EA applications.

These user-defined fields are provided as aids to augment dashboards and/or reports. They can be used for list display, as search criteria, or as filters. From a technical standpoint, the fields are defined as type string, so they cannot be used for numerical aggregation or derivation. Some string functions are available for aggregation and derivation, but they are not expected to be of practical use in the EA applications.

Other Permitted Customizations

With adherence to the best practice steps outlined in the following section, EA customers are entitled to implement the following customizations to their EA4ST instances:

  1. Clone an Existing Dashboard
  2. Create a New Dashboard
  3. Configure a New Event
  4. Add a New Route
  5. Define a New Query
  6. Create a New Trigger
  7. Define a New Resource

Preservation of Customizations during upgrade of Embedded Analytics applications - Best Practice approach

The following section sets out a methodology to help preserve permitted customizations when migrating to later EA releases.

Custom Spaces

The underlying principle for managing modifications is to define and use a custom space. For example, create a space called “09 Customization” to hold all your local modifications.

Use this space when creating new endpoints, automation, and transformations in Data Integration.

When creating a new dashboard, the best practice is to clone an existing dashboard and make changes to the cloned version. When creating new aggregated or derived attributes for a dashboard, it is strongly recommended to save those KPIs either in the dashboard space or the customization space you created.

Keeping all your changes in a dedicated customization space or the cloned dashboards allows you to easily migrate those items when upgrading to a new version of the EA application.

Other considerations when customizing Embedded Analytics applications

Sizing guide invalidation

The sizing guide is only valid for the unaltered, out-of-the-box Embedded Analytics applications. Applying customization introduces changes to the applications that are not accounted for in the Embedded Analytics application sizing utilities.

Performance effects

Performance is at risk if customization increases the computational/processing demands of the application. Different types of modifications have different impacts on performance, sizing, or the application footprint. In general, adding custom information through the user-defined fields has very little impact on performance. Creating additional information for display on a dashboard through aggregation or derivation has the potential to create a large overhead.

Increased footprint requirements

Customization can increase the computational demand of an ADI application, which may require additional hardware/VM resources to adequately run the application.

Implementation of Customisations

Customizations are implemented at the customer's cost where in-house and/or billable PSO effort is needed for their delivery.

Upgrade the Prebuilt Dashboards

You have to consider the following elements before performing an upgrade:

  • All custom application items (such as data integration routes and mappings, model entities and attributes, dashboards, and indicators) are not guaranteed to be kept after an application upgrade. It is not recommended to perform an upgrade on an application that has already been customized for specific purposes. It is safe to assume that any customization done to the application will not be kept after an upgrade.
  • For safety reasons, create a manual checkpoint using a meaningful comment as described here.

You can perform an upgrade of the Prebuilt Dashboards application by downloading and importing its latest version, as described in the previous section. You might need to upgrade Decision Insight as described in the Axway Decision Insight product documentation. For more information, see Upgrade.

Additional information regarding large configurations and performance demands

Currently, whilst EA4APIM can support higher throughput rates (e.g. 200 TPS and above), the main limiting factor is configuring the solution for the number of API Methods and Client Applications. The EA4APIM application was designed on the assumption that each Client Application (e.g. 200) requires every API Method (e.g. all 1300), so it computes all the values for those 260,000 instances every 5 minutes, regardless of whether a Client Application actually uses an API Method.

For customers who require a larger configuration of these key entities (Client Applications and API Methods), the EA4APIM application has to hold a large number of instances in memory and calculate the relevant KPIs (approximately 10) every 5 minutes for each instance.

In addition, further computations are based on these KPIs, leading to more computations that may be redundant.

Depending on the timeline for your opportunity, it may be worth waiting until later in Q2 for more visibility of an improved EA4APIM offering, which may be able to handle these dimensions more natively.

Whilst the EA4APIM application is under review for how to improve this situation, we have looked at potential options with the current EA4APIM application to meet these larger configurations beyond the limits of vertical scaling of the ADI server.

 

Option 1 – Horizontal Scaling – Primary-Replica

Depending on the customer need, it may be possible to use the Primary-Replica configuration to distribute the computational load across multiple instances of EA4APIM with one or more replicas.

This will provide more horizontal scalability, where vertical scalability cannot achieve the required computational capacity.

This will, however, not remove the redundant computing; it only provides the capacity to compute it.

Option 2 – Segregating the configuration

The most typical case is where the number of APIs and API Methods grows with each Client Application that is added, as each may add new APIs. However, the current application also expects each Client Application to use all the existing API Methods.

Some analysis of the Client Application and API Method pairings could identify a more efficient way to group API Methods with Client Applications. This would reduce the amount of computing required and reduce the estimated number of servers, possibly by 50% to 75%.

As an example, rather than computing 260,000 instances (200 Client Applications and 1300 API Methods), which would require 10 servers, breaking these into 4 groups of 50 Client Applications with fewer API Methods each (e.g. 500 API Methods for those 50 Client Applications, i.e. 25,000 instances per group) would lead to 4 servers instead of 10.
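
To make the arithmetic explicit with the illustrative figures above: 200 × 1,300 = 260,000 instances in a single group, versus 4 × (50 × 500) = 100,000 instances across the four groups, which is roughly a 60% reduction in computed instances and is consistent with the 50% to 75% estimate above.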

However, in addition to using the existing application to provide the monitoring, this approach is expected to lose the overarching visibility, and would therefore need a Cockpit view across these 4 instances, which would require a full ADI licence and some PSO effort to design and develop the Cockpit solution.

 

Option 3 – Reduced Analytics

Another example use case is where there is a set number of API Methods, but the customer allows Client Applications to be installed to use these API Methods. Again, they may not need to use all the methods, but the growth is driven more by the Client Applications.

This too can utilise the segregation approach described above. However, the use case indicates a need to monitor the Client Applications and the API Methods separately, so there is less need for the Client Application x API Method analytics. In this instance, it may be possible to have reduced analytics in the EA4APIM solution, focusing solely on these two perspectives (Client Applications and API Methods) separately.

