Info

Data Profiles are configured in Setup / Extensibility / Data Management / Data Profile

Overview

Data profiles define how source data is mapped to target fields in the system. Data profiles also allow the source data to be altered (transformed) and validated. The act of importing data is performed on the data import screen; the profiles you set up here are selected when executing a data import.

Prerequisites

Data profiles can be set up as needed; there are no prerequisite configuration steps.

Data Profiles Panel

The Data Profiles panel towards the left of this screen allows you to select existing profiles to view and edit on the Edit Data Profile panel to the right. Actions are also available in this panel, shown as icons above the Filter text box. The available actions are described below.

Actions

  • Imports a JSON file that contains a data profile configuration

  • Exports the data profile configuration of the selected data profile to a JSON file

  • Deletes the selected data profile if it is not in use

  • Adds a new data profile

Edit Data Profile Panel

The information in this panel reflects the currently selected data profile. From here you can change profile details (fields will display a red left border if a value is required and purple text if the field is read-only and cannot be modified).

Data Profile Fields and Options:

Info

Note: Some of the options listed below only appear when a specific file type is selected.

  • Name: a descriptive name for the profile. Typically this would describe what data is being imported

  • Sample File: a sample of the import file. This file should contain the import file field structure but does not require the complete set of data to be imported. The sample file is added via the upload sample file (blue star) action button

  • Target: this indicates what entity the imported data will be written to. Typically, fields in the import file will contain the required data for the target. E.g. if the Target is set to 'Account - Package', the import data would include the Account Id, Package Name, Effective Date, Status Type Id, and Package Frequency Id (the required fields when adding a package to an account). However, target data can also be set using a transform (explained further below) without the need to include every required value in the source file (for instance, you can set the package status value to an active status for all imported account packages without the need for a 'Status' field in the import file)

  • Include Child Target: indicates child objects should be imported as well. For instance contacts are child elements of accounts

  • File Type: indicates the file type of the import file

  • First Row has Labels: indicates that the field names in the import file will be present on the first row of the source file. This option is displayed when selecting a file type of 'CSV'

  • Delimiter: the character that is used to separate values in CSV source file types. A lowercase \t can be entered for the Tab delimiter (a small sample sketch follows this list)

  • Envelopes Per Request: the number of JSON envelopes per request

  • Patches Per Envelope: the number of patch requests to include in each envelope

  • Stop On Error: if enabled, the data import will halt when an import error occurs; if unselected, the import action will continue even if errors are encountered

  • Lock Settings: prevent the settings for this data profile from being modified when the profile is selected in the data import configuration
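
To illustrate the Sample File, First Row has Labels and Delimiter options, the sketch below (TypeScript) uses a made-up two-row CSV based on the 'Account - Package' example above; the field values are hypothetical and the parsing is only a rough approximation of what the import performs:

    // A hypothetical sample file: labels on the first row, comma delimiter.
    const sampleFile =
      "Account Id,Package Name,Effective Date,Status Type Id,Package Frequency Id\n" +
      "1001,Gold,01/01/2025,2,1";

    const delimiter = ","; // a tab-delimited file would use \t instead

    // Split the file into rows, then split each row into fields.
    const [labels, firstDataRow] = sampleFile
      .split("\n")
      .map((line) => line.split(delimiter));

    console.log(labels);       // the field names taken from the first row
    console.log(firstDataRow); // ["1001", "Gold", "01/01/2025", "2", "1"]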

Layout tab

This tab details the source file layout when the 'Multi-Layout' file type is selected. If you do not select a Multi-Layout file type, this tab will not be displayed.

Schema tab

This tab is where the source (imported) fields are added. The source data type as well as any transforms and validations executed on data import are also configured here. Transforms are used to manipulate the incoming data before it is written to the target (case conversion, date format conversion, etc.). Validations are used to verify that the source data matches a regular expression and/or list of acceptable values. Data that does not match the validations configured will not be imported.

If transforms or validations are configured on fields on the Schema tab, the box above the Transforms/Validations text will display a check mark, and the specific transform or validation type that is set up will be listed underneath the text.

Example: a transform configured to trim whitespace before and after the source field value (full trim).
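
As a minimal sketch (TypeScript, with a hypothetical function name), a full trim removes the whitespace on both sides of the source value before it is written to the target:

    // Full trim: whitespace before and after the value is removed.
    function fullTrim(value: string): string {
      return value.trim();
    }

    console.log(fullTrim("  Los Angeles  ")); // "Los Angeles"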

How to add source data fields and configure their data types, transforms and validations is covered in the Adding Data Profiles guide below.

Visual Map tab

This tab is used to map the imported/source fields to the desired target fields in the system. When you add items to this tab the source field will be displayed on the left and the target (system) field will be shown on the right. If the source data is being modified or validated the transform/validation will be displayed here in between the source field and the target field.

When transforms are set up the source fields will display a '+' at the bottom of the field box.

Clicking this + will allow you to add additional source fields that can be used in the transform configuration (assuming any source fields remain to be configured on the visual map). The + at the bottom of the field box will also be available on the target data fields on the right so that you can import values to multiple target fields if desired.

Source and target fields can be removed by clicking the 'X' within the field.

Clicking on a transform field in this map will bring up an edit modal allowing you to view and edit the transform settings.

How to configure types, transforms and validations is covered in the Adding Data Profiles guide below.

Call Outs tab

This tab allows you to execute plugins (custom code) at different stages of the import process.


Guides

Adding Data Profiles

  1. At the top of the Data Profiles panel click the (blue star) action button

  2. Populate the fields as desired

  3. Next define the source data that is being imported and any transforms and validations you require by loading the Schema tab and following the steps below:

    1. To add a new source field click the (blue star) action button on the Schema tab and the new field will appear within the tab

    2. To edit the new source field click the (blue star) button to the right of the field. A pop-up modal will appear allowing you to change the source field Name, Field Type and Delimited Item (source field order within the source file) values. After modifying any of these values you can click Update to save the changes

    3. To delete a source field click the delete button for the field and then click the confirmation button that appears. Source fields that have not yet been saved to the data profile will be removed immediately; source fields that are configured on the profile will be shown in orange text and removed after you click the Save button

    Adding Import Transforms

    1. If you wish to modify (transform) the source data when it is imported, load the Schema tab and, for the desired field item, click the (blue star) above the 'Transforms' text in relation to the imported data you wish to transform. A check mark will be displayed if a transform is currently set up for the source data field. Note: additional transform options will be available later when configuring transforms on the Visual Map tab

    2. A pop-up modal will appear. Click the (blue star) action button on the modal to add a new transform

    3. On the modal, under the Add Transforms heading, you will need to specify the Transform Type (a short sketch of two of these types follows this list). The options are:

      • Case Conversion: converts the imported text into the desired case. The options are Lower, Title (capitalize the first character of each separate word) and Upper

      • Condition: allows you to transform the value if it meets a specific condition, e.g. if the source data 'Is Null' import the substitute value of '0'

      • Date Conversion: converts the imported data value into an alternate date format that you specify. E.g. if the imported date value is in a MM/DD/YYYY format you can convert the date to be imported with the day first instead of the month (DD/MM/YYYY)

      • Regex: permits you to transform source data using JavaScript-compatible regular expressions

      • Set Value: allows you to change the imported value to a static value you specify

      • Substitute: allows you to import a specific value based on the value of the target field in the system. E.g. if the target value to be imported is 'U.S.' substitute it with 'USA' instead. If the target value does not match then no substitution will occur

      • Time - AM/PM to 24 Hour: converts an imported 12 hour time value into a 24 hour time format

      • Trim: removes whitespace until a character is reached. Whitespace can be trimmed from the left, right or both the left and the right of the imported value
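
    As a rough illustration only (TypeScript, with hypothetical function names that are not part of the product), the sketch below shows the effect of the Date Conversion and Substitute transform types:

        // Date Conversion: MM/DD/YYYY -> DD/MM/YYYY
        function convertDate(value: string): string {
          const [month, day, year] = value.split("/");
          return `${day}/${month}/${year}`;
        }

        // Substitute: replace a specific value; otherwise leave the value unchanged.
        function substitute(value: string, match: string, replacement: string): string {
          return value === match ? replacement : value;
        }

        console.log(convertDate("04/23/2025"));           // "23/04/2025"
        console.log(substitute("U.S.", "U.S.", "USA"));   // "USA"
        console.log(substitute("Canada", "U.S.", "USA")); // "Canada" (no substitution)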

    Adding Validations

    1. Validations ensure imported data conforms to a standard you define. On the Schema tab click the (blue star) icon above the 'Validate' text in relation to the inbound data you wish to validate

    2. A pop-up modal will appear. To add a validation click in the Validation Type list in the bottom left of the pop-up modal. The validation options are listed below (a short sketch of both follows these steps):

      • In List: validate the imported value against a list of values you supply. To create the list:

        1. In the Validation Type field select In List

        2. Click the + New button

        3. In the upper right of the Add Validations section click the (blue star) action button.

        4. Enter the first list value in the Item field that appears

        5. To add additional list items click the (blue star) again

        6. Once the list is built click the Add button in the lower right

      • Regex: validate the imported data against a regular expression. To add the regex:

        1. In the Validation Type field select Regex

        2. Click the + New button

        3. In the Expression text box enter the regex

        4. Click the Add button
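
    For illustration only (TypeScript, with made-up values), the sketch below shows how the In List and Regex validation types accept or reject an imported value; data that fails validation is not imported:

        // In List: the imported value must appear in the supplied list.
        const allowedStatuses = ["Active", "Inactive", "Pending"];
        const passesInList = (value: string): boolean => allowedStatuses.includes(value);

        // Regex: the imported value must match the supplied expression
        // (here, a hypothetical five-digit check).
        const passesRegex = (value: string): boolean => /^\d{5}$/.test(value);

        console.log(passesInList("Active"));   // true  -> the value is imported
        console.log(passesInList("Archived")); // false -> the value is not imported
        console.log(passesRegex("90210"));     // true
        console.log(passesRegex("9021A"));     // false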

  4. The next step is to map the imported data to the target fields in the system. The mapping must follow the order of the fields in the import file. To map the fields do the following:

    1. Click the Visual Map tab

    2. To the right of the tab click the (blue star) action button

    3. On the pop-up modal that loads specify the Source Field from the import file. Note: you do not need to have a source field if you are automatically setting a target value via a transform. E.g. you can have a literal transform that writes the value 0 to multiple target fields with no source field specified

    4. If you wish to further transform the data you can do so here. After you select the transform option you desire and click Add to close the modal you will be able to configure the transform you have added by clicking on it on the Visual Map tab. The transform options are:

      • Combine: allows you to merge multiple fields from the imported file into one or more target fields in the system. See Configuring a Combine below for more information

      • Formula: Formulas are mathematical expressions that can modify the source data. Source field values can be referenced in the formula numerically using curly braces, for example {1} + {2} would add the first source value to the second source value and write that value to the target field. See Configuring a Formula below for more information

      • Literal: writes the value you specify to the target field. Literals can be used to provide data that is not present in the import file. E.g. with a literal transform you can write the value 'true' to an 'isActive' target field without the need to include a source field on the Visual Map

      • Match: matches the imported value to the target field data type. For instance, if you were importing a value called 'activated' to a StatusID target field, the value 'activated' would be compared against the list of StatusID values in the system. If 'activated' is found with an ID value of '2' then '2' would be written to the target StatusID field. This option is useful in scenarios where you don't know what the target ID value is (see the sketch after this list).
        Note: match transforms are case sensitive
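
      As a hedged illustration (TypeScript, with a made-up status list; in practice the values come from the target system), a match transform looks the imported text up against the target field's values and writes the matching ID:

        // Hypothetical list of status values and their IDs in the target system.
        const statuses: Record<string, number> = { activated: 2, deactivated: 3 };

        // Match: replace the imported text with the ID of the matching target value.
        function matchStatus(value: string): number | undefined {
          return statuses[value]; // case sensitive: 'Activated' would not match
        }

        console.log(matchStatus("activated")); // 2 -> written to the target StatusID field
        console.log(matchStatus("Activated")); // undefined (no match)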

    5. Specify the Target field to import the data to after any transforms or validations have been performed

    6. Click Add

  5. After you have mapped the imported data fields to the target fields using the (blue star) action, the Visual Map will be displayed. You can further configure the transforms you have set up by clicking on buttons and objects. These options are explained below:

    • Adding source or target fields: by clicking the + below the field you can add additional source fields or target fields.

      This approach is used when you need to combine data or in instances where the source data needs to be written to multiple target fields

    • Removing Fields: you can remove a field from the map by clicking the X to the right of the field name.

    • Configuring a Combine: after you have added the combine transform you can add additional source fields using the + below the existing source field on the left of the Visual Map. Once the source fields you want to combine are added, click on the combine transform on the Visual Map and the Edit Transform modal will load

      The Mask inputs (source fields) that can be used in the Combine Mask field will be listed. A simple combine example would be merging together location data from multiple source fields, for instance: Country, State and City. To combine these values into a forward slash separated format you would enter a Combine Mask of {1}/{2}/{3}. In this example, if you were importing a source file row that contained the country,state,city values of: USA,California,Los Angeles then the data written to the target field would be USA/California/Los Angeles

    • Configuring a Formula: after you have added the formula transform you can click on it on the Visual Map tab to load the Edit Formula modal. The Available inputs (source fields) will be listed along with the reference number you can use in the formula (i.e. {1}, {2}, etc.). A simple formula that would divide the source field value by 100 would be entered as: {1} / 100 (see the sketch after these bullets)

    • Configuring a Literal: after you have added the literal transform you can click on it on the Visual Map tab to load the Edit modal. In the Literal Value field supply the value that will be written to the target field. If you wish to populate the same literal value to multiple target fields simply click the + at the bottom of the target field on the Visual Map to add additional target fields
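
    For reference, a rough sketch (TypeScript, with a hypothetical helper) of how a combine mask and a formula use the numbered source field references described above:

        // Replace {1}, {2}, ... in a mask with the corresponding source field values.
        function applyMask(mask: string, inputs: string[]): string {
          return mask.replace(/\{(\d+)\}/g, (_, n) => inputs[Number(n) - 1]);
        }

        // Combine mask {1}/{2}/{3} applied to Country, State and City source fields:
        console.log(applyMask("{1}/{2}/{3}", ["USA", "California", "Los Angeles"]));
        // -> "USA/California/Los Angeles"

        // Formula {1} / 100 with a first source value of 2500:
        console.log(2500 / 100); // -> 25 would be written to the target field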

  6. Finally, verify your profile is in order by uploading a sample file using the (blue star) action button. After the file is uploaded you can verify that the source data is being read properly by clicking on the (blue star) action button. You can also verify the target data by clicking on the (blue star) action button. The target sample data displayed will include the effect of transforms, so viewing this sample output is a good way to ensure you are manipulating the inbound data as desired.


Editing Data Profiles

  1. Select the data profile on the left under the Data Profiles heading

  2. In the Edit Data Profile panel on the right modify the profile as needed

  3. Click Save at the bottom of the edit data profile panel


Deleting Data Profiles

Info

Note: Data profiles that are configured on data imports are in use and cannot be deleted. To delete a data profile that is in use, first determine which data imports are using it and then edit those imports to use an alternate data profile.

  1. Select the profile from the list in the Data Profiles panel

  2. Click the (blue star) action button

  3. Click Yes on the confirmation pop-up that appears