Data Stores


This article covers how to use data stores in Appian. Data stores connect Appian to external relational databases in order to store and retrieve application data. Learn how to create data stores, run DDL scripts, and retrieve external data in your application.

Data stores allow you to insert, update, query, and delete data in the format needed by your applications without writing SQL statements by hand. You must have a business data source configured to use this feature.

See also: Configuring Relational Databases

The data for custom data types used by your processes is stored in an external relational database. The type hierarchies you define are used to create the tables and keys in that database that hold your custom data.

See also: Custom Data Types

Once you define a custom data type and create a data store for it, you can publish data out to the database related to the data store using the Write to Data Store Entity Smart Service. You can also use a!queryEntity() to access information in the data store without knowing the underlying table schemas of the database or SQL.

See also: Write to Data Store Entity Smart Service and a!queryEntity().
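As an illustration, a call to a!queryEntity() might look like the following sketch; the cons!EMPLOYEE_ENTITY constant and the field names are hypothetical, not part of this article:

```sail
a!queryEntity(
  /* hypothetical constant pointing at the data store entity */
  entity: cons!EMPLOYEE_ENTITY,
  query: a!query(
    /* return only open records, fifty at a time */
    filter: a!queryFilter(field: "status", operator: "=", value: "Open"),
    pagingInfo: a!pagingInfo(startIndex: 1, batchSize: 50)
  )
)
```

Note that the expression refers to entity fields by name; no table names or SQL appear anywhere in the query.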

Creating a Data Store

To create a data store, complete the following:

  1. Open an application in Appian Designer and use the New menu to select Data Store.
  2. Select a Data Source from the available list. See also: Configuring Relational Databases
  3. Click Create & Edit to open the newly created data store in a new browser window.
  4. Enter a name for the new entity.
  5. Select a data type for the entity by clicking Browse and selecting a custom data type from the displayed list.
  6. Click Verify to validate that the necessary data structures have been created in the database, and that the database is available.
    • When the tables needed for your entity do not already exist (which is expected for a new entity), the following message is displayed: No matching tables found!
    • If the letter casing of the schema in Oracle differs from the schema attribute of the XSD's @Table annotation (which is lowercase by default), the entity cannot map to the table. If you encounter this issue, create a synonym in the Appian user schema that points to the tables in the other schema. The synonym must have the same name as the table or view it points to.
    • Select Create tables automatically to have the system execute the SQL commands needed to configure your data store entity - OR - select Create tables manually if you want or need to run the necessary SQL commands yourself, outside the system.
      See below: Running DDL Scripts
    • If you selected Create tables automatically and want Appian to always execute DDL statements, check the Automatically Update Database Schema checkbox on the data store's page. When automatic schema updates are enabled both globally in the Administration Console and locally on the data store, Appian automatically attempts to resolve discrepancies between your database schema and your data store definition by adding tables and columns to match the entity definitions; it never removes or alters existing tables or columns.
      • When automatic schema updates are enabled:
        • Columns and tables missing from the database, but present in the data type, are created.
        • Columns and tables missing from the data type, but present in the database, remain untouched.
        • Columns that are incorrectly configured to map to a data type field are not altered, so verifying the data store requires manual changes to the database. For example, a database column of type text that must map to a data type field of type integer has to be altered by hand. Additionally, fields renamed in the Data Type Designer are treated as entirely new fields, not renamed ones, and a new column is added if one is not already present in the database.
      • If automatic schema updates are disabled, Appian does not address any discrepancies between your database schema and data store definition. This means:
        • Tables missing from the database are not automatically created.
        • Columns missing from tables are not added.
      • When automatic schema updates are enabled, new columns and tables are automatically added to the data store's schema in the following scenarios:
        • The data store is imported into an environment for the first time.
        • Changes to the data store are imported.
        • A designer updates a data type used by the data store (precedent data type).
        • WARNING: Existing column definitions are not updated. For example, Appian will not make changes to a column's type or length.
      • For automatic schema updates to be considered "enabled," the setting must be turned on both globally in the Administration Console and locally on the data store.
    • If the database is not available or you want to avoid updating the database immediately, click Save Draft.
  7. Click Save and Publish.
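As noted in step 6, an Oracle schema whose letter casing differs from the XSD's @Table annotation can be bridged with a synonym in the Appian user schema. A minimal sketch, assuming hypothetical schema and table names:

```sql
-- Run as the Appian database user; HR_SCHEMA and EMPLOYEE are hypothetical names.
-- The synonym must have the same name as the table or view it points to.
CREATE SYNONYM EMPLOYEE FOR HR_SCHEMA.EMPLOYEE;
```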

Running DDL Scripts

When creating or editing a data store entity, the system checks your RDBMS to see if the necessary data structures already exist.

When the data structures in your database do not match the entity definition, you are given the option to create tables manually, with a link to download the SQL needed to generate the necessary structure.

To create tables manually, complete the following:

  1. After selecting Create tables manually, click Download DDL Script.
    • The browser's File Download dialog box displays, listing the DDL script file for download.
    • The script file is named using the following convention: <DATA_STORE_NAME> DDL.sql
  2. Optionally, you can edit the file name to use your own version numbering scheme for your DDL scripts, such as: <DATA_STORE_NAME> DDL_<MONTH_DAY_YEAR>_<VERSION_NUMBER>.sql.
  3. Save the file to your local machine.
  4. Review the script or have it reviewed by your DBA to ensure that it meets your organization's requirements.
  5. If the script does not meet your requirements, you can modify your custom data type to better suit your needs by completing the following:
    • Edit the custom data type that was used by the entity that you were creating or updating. See also: Custom Data Types
    • Edit the Data Store entity to use the new custom data type. You can skip this step if you updated all outdated dependents when editing the data type.
    • Verify the Data Store again to generate a new DDL script.
  6. If the script meets your organization's requirements, run the DDL script to create the necessary tables for your data store entity.
  7. Click Verify Again.
    • When the database tables are properly defined, the message Entity mappings verified is displayed.
    • If the database tables are not properly defined, use the commented commands in the DDL script to drop and recreate the tables.
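To illustrate, a downloaded DDL script for a simple entity might resemble the following sketch; the table and column names are hypothetical, and the exact SQL depends on your data type and RDBMS (MySQL-style auto-increment is assumed here):

```sql
-- Commented commands for dropping and recreating the table (see step 7):
-- DROP TABLE EMPLOYEE;

CREATE TABLE EMPLOYEE (
  ID INTEGER NOT NULL AUTO_INCREMENT,  -- generated primary key
  FIRST_NAME VARCHAR(255),
  LAST_NAME VARCHAR(255),
  STATUS VARCHAR(255),
  PRIMARY KEY (ID)
);
```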

Retrieving External Data

Use the following guidelines when loading data into a data store from an external RDBMS.

Perform Bulk Operations as Much as Possible

Minimize the number of individual queries to the database.

You can write arrays of data to your data store using the Write to Data Store Entity Smart Service, which allows you to avoid running the Query Database Smart Service with multiple node instances configured.

See also: Write to Data Store Entity Smart Service
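For instance, rather than writing one record per node instance, a single write can carry an array. A sketch using the smart service function form, where the constant and the Employee type are hypothetical:

```sail
a!writeToDataStoreEntity(
  /* hypothetical constant pointing at the data store entity */
  dataStoreEntity: cons!EMPLOYEE_ENTITY,
  /* both records are persisted in a single write */
  valueToStore: {
    type!Employee(firstName: "Ada", lastName: "Lovelace"),
    type!Employee(firstName: "Grace", lastName: "Hopper")
  }
)
```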

Retrieve Data in Smaller Batches

The amount of data that can be returned by a query is limited to 1MB by default. If you expect to return more than 1MB of data, you can design your process models to retrieve data in smaller batches and loop the process flow to retrieve the desired data.

See also: Query Limits
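In a looping process model, batching usually means advancing the paging window on each pass. A sketch, assuming a hypothetical pv!startIndex process variable that the loop increments by the batch size:

```sail
a!queryEntity(
  entity: cons!EMPLOYEE_ENTITY,  /* hypothetical constant */
  query: a!query(
    pagingInfo: a!pagingInfo(
      /* advanced by the loop: 1, 51, 101, ... */
      startIndex: pv!startIndex,
      batchSize: 50
    )
  )
)
```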

When using a query to start processes, you may quickly reach the default limit of 1000 activated instances, because your Send Message Event and gateways activate repeatedly and do not provide the option to delete previously completed instances in the way process flow activities do.

See also: Maximum Activity Instances

You may want to configure your Send Message Event in a sub-process and configure the Process Model Properties to delete the completed instances when they are no longer needed.

See also: Sub-Process Activity

You can also configure the other process activities to automatically delete previously completed instances to avoid reaching the 1000-instance limit. This setting also reduces the memory (RAM) and disk usage of your processes.

See also: Releasing the System Memory Used by Completed Nodes

Built: Fri, Jun 03, 2022 (01:08:29 PM)