Tutorial 06: Data Mapping

Introduction

This tutorial is relevant for

Automation experts who want to derive significant benefits from using the data mapping tools within PIPEFORCE as a low-code / no-code solution, which streamlines the process of integrating, transforming, and managing data across applications and systems.

What you will learn

In this tutorial, you will learn how to map data from one structure into another using a PIPEFORCE automation pipeline.

Let's assume this example: You receive a customer dataset from the CRM system, and you need to make sure that this dataset fits the structure expected by the ERP system.

For this, you need a way of converting the source data from the CRM system into the target format of the ERP system. To do so, you can use the data mapping tooling of an automation pipeline.

Let's assume the customer dataset from the CRM system looks like this:

{ "firstName":"Sam", "lastName":"Smith", "age":34 }

And let's assume you want to convert this input dataset from the CRM system into an output format for the ERP system, which expects the customer dataset to have a structure like this:

{ "customer": { "name":"Sam Smith", "age":34, "isLegalAge":true }, "mappingDate":"01.01.2022", "mappedBy":"someUsername" }

As you can see, several steps are required to transform from the source to the target format:

  • We have to nest all customer data inside the customer field.

  • We have to combine the first and last name into a single name field.

  • The target system expects an additional field isLegalAge, which doesn't exist in the source system. The value of this field must be set to true in case the age of the customer is >= 18, otherwise it must be set to false.

  • Finally, the target system expects a new field mappingDate, which contains the date of the mapping, and a field mappedBy, which contains the username of the user who did the mapping, for compliance reasons.

In the next steps of this tutorial you will learn how to map from one JSON to another.

Prerequisites

Step 1: Create the data mapping pipeline

  1. Login to the portal with your developer account.

  2. Navigate to AUTOMATION → Properties.

  3. Click the plus icon and create a new app with name: io.pipeforce.tutorial-06-data-mapping

  4. Select the node of your app and click the plus icon again.

  5. Now create a new automation pipeline with name: data-mapping

  6. The new automation pipeline has been created, and the content editor has opened for you.

  7. Copy and paste this YAML script into the editor, overwriting any existing content there. Then click SAVE:

pipeline:
  - data.mapping:
      input: { "firstName": "Sam", "lastName": "Smith", "age": 34 }
      rules: |
        firstName + ' ' + lastName -> customer.name,
        age -> customer.age,
        age >= 18 -> customer.isLegalAge,
        @date.now() -> mappingDate,
        @user.username() -> mappedBy

  8. In this pipeline snippet, we created a very simple data mapping configuration:

  • We used the data.mapping command, which allows mapping from one data structure to another.

  • The input parameter defines the source data, as a static JSON in this example. Besides a static JSON, this value could also be a Pipeline Expression (PEL) pointing to dynamic data in the vars section or in external services; see the sketch after this list item. In this example, we want to focus on the data mapping and keep the rest simple. In case the parameter input is not specified, the current value from the body is expected as input.
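
To show what a dynamic input could look like, here is a minimal sketch: it assumes a pipeline variable named customerData was set by an earlier step (both the variable name and that earlier step are hypothetical and not part of this tutorial):

pipeline:
  - data.mapping:
      # Resolve the input via a Pipeline Expression (PEL) instead of a
      # static JSON; the variable customerData is assumed to exist.
      input: "${vars.customerData}"
      rules: |
        firstName + ' ' + lastName -> customer.name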

  9. The rules parameter (or mappingRules in versions < 8.0) defines the mapping rules, which read from the input data and write to the output data. You can define as many mapping rules as you want. Each mapping rule is terminated by a comma and a line break (no comma is required after the last rule). The rules are applied from top to bottom. The input expression is defined on the left-hand side; it selects and prepares the input data for the mapping. The output expression is defined on the right-hand side; it specifies the location where to write the data in the output structure. Both expressions are separated by an arrow ->. Each side can use a Pipeline Expression (PEL), and therefore the full power of this language. It is not necessary to wrap a pipeline expression inside ${ and }. So the format of each rule looks like this:

  inputExpression -> outputExpression,

  • As a first rule, we concatenate (= combine) the first and last name from the input, separated by a space, and write the result to the location customer.name in the output:

  firstName + ' ' + lastName -> customer.name,

  • The second mapping rule copies the age field from the input to the nested customer.age field in the output:

  age -> customer.age,

  • The third rule is an expression which detects whether the age field of the input contains a value >= 18. Then, it writes the result to the output at the location customer.isLegalAge:

  age >= 18 -> customer.isLegalAge,

  • The fourth rule executes the pipeline util @date in order to return the current date. Then, it writes this value to the new field mappingDate at the top level of the output:

  @date.now() -> mappingDate,

  • The last rule is similar to the previous one and calls the pipeline util @user in order to return the username of the currently logged-in user. Then, it writes the result to the new field mappedBy at the top level of the output:

  @user.username() -> mappedBy

  • Not mentioned here because it is optional: the output parameter of the command data.mapping. Its value must be a Pipeline Expression (PEL) which points to the location (or a sub-path of it) to write the mapping result to (for example, a variable inside the vars scope). If not specified, the result is written to the body by default, which is the case in our example (see the sketch below).
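
To illustrate the optional output parameter, here is a minimal sketch that stores the mapping result in a pipeline variable instead of the body; the variable name mappedCustomer is made up for this example:

pipeline:
  - data.mapping:
      input: { "firstName": "Sam", "lastName": "Smith", "age": 34 }
      rules: |
        firstName + ' ' + lastName -> customer.name
      # Write the result to a variable in the vars scope instead of the
      # body; mappedCustomer is a hypothetical variable name.
      output: "${vars.mappedCustomer}"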

10. Click SAVE to save the pipeline.

Step 2: Execute the pipeline

11. Click RUN to execute the pipeline.

12. You should then see a result similar to this (mappingDate and mappedBy will contain the current date and your username):

{
  "customer": {
    "name": "Sam Smith",
    "age": 34,
    "isLegalAge": true
  },
  "mappingDate": "01.01.2022",
  "mappedBy": "someUsername"
}

13. Note that the nesting inside customer was automatically created (see the sketch below).
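
This auto-nesting should not be limited to one level. As a hedged sketch (the field names are made up, and it assumes the auto-creation of nested structures shown above works at any depth), a rule like the following would create two nested levels in the output:

pipeline:
  - data.mapping:
      input: { "age": 34 }
      rules: |
        age -> customer.details.age

The expected output would then be: { "customer": { "details": { "age": 34 } } }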

14. This data can now be used and sent to the ERP system, for example by appending an additional command to the pipeline, as sketched below.
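
For example, if your ERP system offers an HTTP endpoint, a follow-up step could post the mapping result to it. This is only a sketch: the http.post command, its url parameter, and the endpoint URL are assumptions here, so check the PIPEFORCE command reference for what is actually available in your version:

pipeline:
  - data.mapping:
      input: { "firstName": "Sam", "lastName": "Smith", "age": 34 }
      rules: |
        firstName + ' ' + lastName -> customer.name,
        age -> customer.age
  # Hypothetical follow-up step: sends the current body (= the mapping
  # result) to the ERP endpoint. Command name and parameters are assumed.
  - http.post:
      url: https://erp.example.com/api/customers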

Congratulations! You have executed your first data mapping from JSON to JSON.

For more details about data mapping and transformation, have a look here: Data Mapping and Transformation.
