Connecting Bizagi processes to feed the Dataset


Overview

Once you have defined a Dataset for a specific purpose, you can easily configure a Bizagi process to send data to that Dataset upon the completion of each case.

This allows your processes to seamlessly and continuously feed the Dataset day by day.

For introductory information on Datasets, refer to Bizagi datasets.

 

Cloud_datasets_overviewprocess

 

This section describes how to set your Bizagi processes as a data source of a Dataset.

 

Before you continue

Note that these steps assume that you have already:

 

1. Defined a process in Bizagi that captures and works with specific information.

No special considerations are needed for this process; you implement it just as you would any other process in Bizagi.

For information on Bizagi basics and how to implement processes with Bizagi Studio, refer to:

http://help.bizagi.com/bpm-suite/en/index.html?process_wizard.htm.

 

2. Clearly identified the point in the process at which the information is final.

Recall that in order to make the most of your stored information, you need to ensure that the data is final, so that it is reliable as a source for analysis (as explained at concepts of Bizagi datasets).

 

3. Created a Dataset whose structure considers the specific information that is captured and worked on during the Bizagi process.

For more information about this step, refer to Creating a dataset.

 

What you need to do

In order to have your Bizagi process automatically send data to the Dataset, these steps need to be carried out:

 

1. Download the Dataset connector.

2. Obtain the service endpoint and its access keys to use the connector.

3. Configure the connector in Bizagi Studio.

 

Example

Assume we have a Sales process in Bizagi that starts when a customer order is received (an activity called Analyze customer order) and ends when the items of that order are shipped (an activity called Ship item). We will rely on the following workflow definition to have this process send its business information to the Dataset:

 

Cloud_Datasets_sales1

 

Business information that is captured during the process and sent to the Dataset includes: Transaction date, Price, Payment (type), City/State/Country, Product, Customer (Name), and Longitude plus Latitude.

For this example, we also define that the information is reliable and final when the process finishes (no further modifications take place after the Ship item activity).

 

Therefore, in terms of Datasets, the first prerequisite is to make sure we have created a Dataset whose structure includes the fields described above (Transaction date, Price, Payment (type), City/State/Country, Product, Customer (Name), and Longitude plus Latitude).

To do so, a Dataset was created with the column definitions given by a sample .csv file downloadable at http://samplecsvs.s3.amazonaws.com/SalesJan2009.csv (from this insurance services company's site: https://support.spatialkey.com/spatialkey-sample-csv-data/).

 

 

Cloud_BizagiDataset7

 

For instance, you may check that this sample .csv file contains the following information:

 

Cloud_BizagiDataset10

 

For detailed information on how to create a Dataset based on the definition contained in a .csv file, refer to Creating a dataset with the structure of a csv.
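
If you want to quickly double-check the column structure of that sample file before creating the Dataset, you may use a short script. The sketch below is an illustration only; it assumes Python is available on your machine and simply reads the public sample .csv referenced above, printing its column names next to the values of the first record:

# Illustration only: list the columns of the public sample .csv so they can be
# compared against the columns you plan to define in the Dataset.
import csv
import io
import urllib.request

SAMPLE_CSV_URL = "http://samplecsvs.s3.amazonaws.com/SalesJan2009.csv"

with urllib.request.urlopen(SAMPLE_CSV_URL) as response:
    text = io.TextIOWrapper(response, encoding="utf-8")
    reader = csv.reader(text)
    header = next(reader)        # first row holds the column names
    first_record = next(reader)  # one sample record, just for reference

for column, value in zip(header, first_record):
    print(f"{column}: {value}")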

 

Procedure

Once you have the process, Dataset and definitions ready, follow these steps:

 

1. Download the Dataset connector.

The first step of the configuration is to download a Bizagi connector that already provides the connectivity information for the Dataset you want to send data to.

To do so, go into the given Dataset environment (for instance, Development):

 

Cloud_Datasets10alt

 

For that environment, go to the Security & Connectivity tab and click on Download connector.

 

Cloud_Datasets3

 

Notice that a .bizc connector file is downloaded locally to your machine; you may choose to rename it afterward:

 

Cloud_BizagiDataset5

 

Note:

If the download does not start automatically, you may need to ensure that pop-ups and downloads are authorized in your browser settings for https://datasets.bizagi.com.

 

Cloud_Datasets4

 

2. Obtain the service endpoint and its access keys to use the connector.

To prepare for the next steps of the connector's configuration, make sure you copy the service endpoint and access keys provided in that same tab.

You may rely on the Copy Key button to have at hand:

REST URL (the service endpoint that populates data into the Dataset):

 

Cloud_Datasets_RestURL

 

Access key 1 (username) and Access key 2 (password):

 

Cloud_Datasets_keys

 

Note:

DO NOT use the Generate New Keys option unless you are completely certain that you want to generate new access keys and eliminate the previous ones.

Note that once you eliminate the previous keys, you will not be able to look them up or use them again, which means that any connector configuration or application already using the previous pair of keys will no longer be able to connect to the service endpoint.

 

3. Configure the connector in Bizagi Studio.

The final step is configuring the use of the Bizagi connector at the relevant point of your process.

Before doing so, note that you need to manually install the connector by uploading the .bizc file (as with any other connector and as described at http://help.bizagi.com/bpm-suite/en/index.html?connectors_setup.htm).

 

Therefore, install the .bizc connector file as downloaded in the first step:

 

Cloud_BizagiDataset3

 

When installing the connector, configure the following:

In the Connector parameters, set the URL to point to the REST URL copied in the previous step.

Use Basic authentication and enter the copied Access key 1 and Access key 2 as the username and password, respectively.

 

Cloud_BizagiDataset4
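
For context only: Basic authentication simply sends the two access keys as a username:password pair in the request's Authorization header. Bizagi Studio builds this header for you once you fill in the fields above; the following sketch (with placeholder values, not real keys) merely illustrates what that pairing means on the wire:

# Illustration only: how HTTP Basic authentication combines the two access keys.
import base64

access_key_1 = "<access-key-1>"   # placeholder: Access key 1 (username)
access_key_2 = "<access-key-2>"   # placeholder: Access key 2 (password)

credentials = f"{access_key_1}:{access_key_2}".encode("utf-8")
authorization_header = "Basic " + base64.b64encode(credentials).decode("ascii")
print(authorization_header)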

 

 

Note:

Notice that the configuration for each connector allows you to specify different URLs for the different project environments (Development, testing and production).

 

Make sure you rely on these settings so that you use the matching URL for each environment (that is, use the Dataset's development environment URL for the project's development environment, the Dataset's testing environment URL for the project's testing environment, and the Dataset's production environment URL for the project's production environment).
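
As a simple way to picture this matching, the sketch below (placeholders only, since the actual URLs come from each Dataset environment's Security & Connectivity tab) keeps one REST URL per environment so that the correct endpoint is always selected:

# Illustration only: one Dataset REST URL per project environment.
DATASET_URLS = {
    "development": "https://<your-development-dataset-endpoint>",  # placeholder
    "testing": "https://<your-testing-dataset-endpoint>",          # placeholder
    "production": "https://<your-production-dataset-endpoint>",    # placeholder
}

current_environment = "development"  # whichever environment you are configuring
print(DATASET_URLS[current_environment])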

 

Then, to configure the actual use of the connector, create an Activity action of the Connector type at the point in the process where the information is final.

As mentioned in the Example section above, we will be integrating the process with the Dataset at the Ship item activity:

 

Cloud_BizagiDataset6

 

Set the configuration as shown below:

Using the addDatasetSample action:

 

Cloud_BizagiDataset0

 

Mapping the relevant information as inputs to send to the Dataset (Transaction date, Price, Payment (type), City/State/Country, Product, Customer (Name), and Longitude plus Latitude):

 

Cloud_BizagiDataset1

 

 

If considered necessary, mapping the result field as an output into your Bizagi data model (e.g., to track whether the connection was successful).

Note that this result field returns true (successful invocation) or false (failed invocation).

 

Cloud_BizagiDataset2

 

 

Click Finish and you are set.

At this point, you may run your process, and for each new case (once it completes the activity where the connector was set), business data will be sent automatically into the Dataset.

 

Note:

The Dataset collects data that is sent to this REST URL, which means that a single process does not have to be the only publisher sending data to a Dataset.

You may similarly configure other processes to feed the Dataset, as well as other applications different from Bizagi.

When having other non-Bizagi applications send information to the Dataset, you use the same access keys and program your application to target that same Dataset's REST URL (the service endpoint).
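
As a reference for such external feeders, the sketch below shows one possible way to send a single record from a non-Bizagi application. It is an illustration only: the REST URL and access keys are placeholders, the field names are merely those of this section's example, and the assumption that the endpoint accepts an HTTP POST with a JSON body should be verified against the actual contract used by the Dataset connector:

# Illustration only: a hypothetical external feeder posting one record to the
# Dataset's REST URL using the same access keys (all values are placeholders).
import base64
import json
import urllib.request

REST_URL = "https://<your-dataset-endpoint>"  # placeholder: copy from Security & Connectivity
ACCESS_KEY_1 = "<access-key-1>"               # placeholder (username)
ACCESS_KEY_2 = "<access-key-2>"               # placeholder (password)

# Illustrative record; use your own Dataset's column names and values.
record = {
    "Transaction_date": "2019-01-02 06:17",
    "Price": 1200,
    "Payment_Type": "Mastercard",
    "City": "Basildon",
    "State": "England",
    "Country": "United Kingdom",
    "Product": "Product1",
    "Name": "Jane Doe",
    "Latitude": 51.57,
    "Longitude": 0.48,
}

credentials = base64.b64encode(f"{ACCESS_KEY_1}:{ACCESS_KEY_2}".encode()).decode()
request = urllib.request.Request(
    REST_URL,
    data=json.dumps(record).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {credentials}",
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status)  # HTTP status of the invocation

Because the service receives one record per invocation (see the Notes below), an external feeder like this one would call the endpoint once per record.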

 

Notes

When mapping inputs into the Dataset, note that mapping from collections is not supported.

You may currently map information so that you send one and only one record at a time to the service offered by the Dataset.

 

Next steps

In case you want to verify that business data is correctly reaching the Dataset and being stored there, you may go into your Dataset environment's details and into the Explore data tab to take a quick look at stored data, as described at Working with dataset environments.

 

Regarding next steps and further possible use cases for your Dataset, refer to the links below:

To interpret the data stored in the Dataset from reporting tools or similar applications, refer to Consuming the dataset from external applications.

To use Bizagi's Artificial intelligence application for predictive analysis, refer to Bizagi Artificial intelligence.