
Exporting adaptive model data for external analysis


25 mins

Beginner | Pega Customer Decision Hub '23 | English

Scenario

U+ Bank implements cross-selling of their credit cards on the web by using Pega Customer Decision Hub™. Self-learning, adaptive models drive the predictions that support the Next Best Actions for each customer. In the production phase of the project, you can export the data from the Adaptive Decision Management datamart to further analyze the performance of your online models over time, across channels, issues, and groups, in your external data analysis tool of choice.

To limit the scope and size of the data set, you can modify the data flows that export the data files. This allows you to select the data for the models that interest you.

Use the following credentials to log in to the exercise system:

Role             User name       Password
Data scientist   DataScientist   rules

Your assignment consists of the following tasks:

Task 1: Generate the monitoring export artifacts

In the Prediction Studio settings, configure the Monitoring database export to generate the monitoring export artifacts.

Task 2: Modify the generated ModelSnapshot data flow

Add a filter component to the ModelSnapshot data flow to select model snapshots for the Web Click Through Rate model. Create a data set that has .pyModelID as the only key, and then use this data set as a second destination in the ModelSnapshot data flow.

Note: You create an extra data set to select the appropriate predictor snapshots at a later stage.

Task 3: Modify the generated PredSnapshot data flow

Add a Merge component to the PredSnapshot data flow to select predictor binning snapshots for the Web Click Through Rate model.

Task 4: Confirm your work

Export the monitoring database and examine the content of the files.

 

You must initiate your own Pega instance to complete this Challenge.

Initialization may take up to 5 minutes, so please be patient.

Challenge Walkthrough

Detailed Tasks

1 Generate the monitoring export artifacts

  1. On the exercise system landing page, click Pega Infinity™ to log in to Customer Decision Hub.
  2. Log in as a data scientist:
    1. In the User name field, enter DataScientist.
    2. In the Password field, enter rules.
  3. In the navigation pane of Customer Decision Hub, click Intelligence > Prediction Studio to open Prediction Studio.
  4. In the navigation pane of Prediction Studio, click Settings > Prediction Studio settings to navigate to the settings page.
  5. Scroll down to the Monitoring database export section, and then click Configure to open the Export monitoring database dialog box.
  6. In the Export monitoring database dialog box, click Submit to generate the monitoring export artifacts.
  7. Close the Export monitoring database dialog box.
    Note: These monitoring export artifacts are typically generated in a development environment and deployed to higher environments through the enterprise pipeline.
  8. In the lower-left corner, click Back to Customer Decision Hub.

2 Modify the generated ModelSnapshot data flow

  1. In the search field, enter ModelSnapshot, and then press Enter to search for the ModelSnapshot data flow.
  2. In the search results, click the ModelSnapshot data flow to open the data flow canvas.
    The search results for ModelSnapshot
    Note: The name of the data flow contains a random identifier that is created during this exercise.
  3. In the upper-right corner, click Check out.
  4. Click the Add icon, and then click Filter to add a filter component to the data flow.
    The plus sign to add a component to the data flow
  5. Right-click the filter component, and then click Properties to configure the component.
    1. In the Filter configurations dialog box, in the Name field, enter Selected models only.
    2. In the Filter conditions section, click Add condition.
    3. Enter a condition that reads: When .pyConfigurationName = "Web_Click_Through_Rate_Customer".
      The Filter configurations dialog box
    4. Click Submit to close the Filter configurations dialog box.
      The filter component is added
  6. In the navigation pane of Customer Decision Hub, click Intelligence > Prediction Studio to open Prediction Studio.
    Note: You create an extra data set in Prediction Studio to select the appropriate predictor snapshots at a later stage. Then you return to the data flow and reference the new data set.
  7. In the navigation pane of Prediction Studio, click Data > Data sets to navigate to the data sets landing page.
  8. In the upper-right corner, click New to configure a new data set.
    1. In the Name field, enter OnlyModelIDs.
    2. In the Type field, select Decision Data Store.
    3. In the Apply to field, select Data-Decision-ADM-ModelSnapshot.
  9. Click Create to configure the data set.
  10. In the Keys section, delete the .pySnapshotTime and .pxApplication fields.
  11. In the upper-right corner, click Save.
  12. In the lower-left corner, click Back to Customer Decision Hub.
  13. On the ADM snapshot file repository data set destination component of the data flow, click Add branch.
  14. Right-click the new destination tile, and then click Properties to configure the component.
    1. In the Output data to section, select Data set as the destination.
    2. In the Data set field, enter or select OnlyModelIDs.
  15. Click Submit to close the dialog box.
    The completed model snapshot data flow
  16. In the upper-right corner, click Check in.
  17. In the Check in dialog box, enter appropriate comments, and then click Check in.
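The filter component you configured keeps only the records whose .pyConfigurationName matches the Web Click Through Rate configuration, and the OnlyModelIDs data set then holds just the model IDs of those records. A minimal Python sketch of that same selection, applied to exported snapshot records outside Pega (the field names come from the ADM datamart; the sample rows are hypothetical):

```python
# Hypothetical exported model-snapshot records; field names follow the
# ADM datamart (pyModelID, pyConfigurationName), values are made up.
snapshots = [
    {"pyModelID": "m-001", "pyConfigurationName": "Web_Click_Through_Rate_Customer"},
    {"pyModelID": "m-002", "pyConfigurationName": "Email_Click_Through_Rate"},
]

# The filter component: keep only the selected model configuration.
selected = [s for s in snapshots
            if s["pyConfigurationName"] == "Web_Click_Through_Rate_Customer"]

# The OnlyModelIDs data set: just the model IDs of the selected snapshots,
# keyed on pyModelID alone.
model_ids = {s["pyModelID"] for s in selected}
print(model_ids)
```

Keeping .pyModelID as the only key is what lets the data set act as a simple lookup of "models I care about" in the next task.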

3 Modify the generated PredSnapshot data flow

  1. In the search field, enter PredSnapshot, and then press Enter to search for the PredSnapshot data flow.
  2. In the search results, click the PredSnapshot data flow to open the data flow canvas.
    The search results for PredSnapshot
    Note: The name of the data flow contains a random identifier that is created during this exercise.
  3. In the upper-right corner, click Check out.
  4. Click the Add icon, and then click Merge to add a second source component.
    The plus sign to add a second source to the data flow
  5. Right-click the second source component, and then click Properties to configure the component.
    1. In the Source configurations dialog box, in the Source field, select Data set.
    2. In the Input class field, enter or select Data-Decision-ADM-ModelSnapshot.
    3. In the Data set field, enter or select OnlyModelIDs.
    4. Click Submit to close the dialog box.
      The merge component is added
  6. Click the Add icon on the OnlyModelIDs component, and then click Convert.
  7. Right-click the Convert component, and then click Properties to configure the component.
  8. In the Convert configurations section, in the Into page(s) of class field, enter or select Data-Decision-ADM-PredictorBinningSnapshot.
  9. In the Field mapping section, click Add mapping.
  10. In the Target field, enter or select .pyModelID.
  11. In the Source field, enter or select .pyModelID.
  12. Click Submit to close the dialog box.
    The Convert component is added
  13. Right-click the Merge component, and then click Properties to configure the component.
    1. In the Name field, enter Selected models only.
    2. In the Merge when all conditions below are met section, in the Data-Decision-ADM-PredictorBinningSnapshot field, select pyModelID.
    3. In the get predictor snapshots field, enter or select .pyModelID.
  14. Click Submit to close the dialog box.
    The completed predictor snapshot data flow
  15. In the upper-right corner, click Check in.
  16. In the Check in dialog box, enter appropriate comments, and then click Check in.
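Conceptually, the Merge component behaves like an inner join on .pyModelID: a predictor binning snapshot survives only if its model ID also appears in the OnlyModelIDs data set. A short sketch of that behavior in Python (field names from the ADM datamart; the sample data is hypothetical):

```python
# IDs produced by the modified ModelSnapshot data flow (hypothetical value).
only_model_ids = {"m-001"}

# Hypothetical exported predictor-binning snapshots; field names follow
# the ADM datamart, values are made up.
predictor_snapshots = [
    {"pyModelID": "m-001", "pyPredictorName": "Age"},
    {"pyModelID": "m-002", "pyPredictorName": "Income"},
]

# The Merge component: keep only snapshots whose pyModelID matches a
# selected model, i.e. an inner join on pyModelID.
merged = [p for p in predictor_snapshots if p["pyModelID"] in only_model_ids]
print([p["pyPredictorName"] for p in merged])
```

This is why the extra OnlyModelIDs data set was needed: the predictor snapshots carry no .pyConfigurationName of their own, so the model IDs are the bridge back to the selected configuration.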

4 Confirm your work

  1. In the navigation pane of Customer Decision Hub, click Intelligence > Prediction Studio to open Prediction Studio.
  2. In the upper-right corner, click Actions > Export > Export monitoring database.
  3. Click Export to start the process.
    Note: This action is typically performed in a Business Operations Environment. For this exercise, and for testing purposes in general, you can also run the export in a development environment like the exercise system used for this challenge.
  4. In the upper-right corner, click Notifications to confirm that the export has been initialized and then completed.
  5. On the exercise system landing page, click File Browser to open the repository.
  6. Log in to the repository:
    1. In the Username field, enter pega-filerepo.
    2. In the Password field, enter pega-filerepo.
  7. Open the datamart folder, and then open one of the ADM-ModelSnapshot ZIP files.
  8. Click Download to download the ZIP file to your local system, and then extract the ADM-ModelSnapshot file.
    Tip: The format of the ZIP file is compatible with the 7-Zip tool, but not with some other compression tools.
  9. Confirm that the ADM-ModelSnapshot file contains only the model snapshots for the adaptive models based on the Web Click Through Rate model configuration.
    The model snapshot file shows only relevant data
  10. Return to File Browser and extract one of the ADM-PredictorBinningSnapshot files to inspect the predictor binning snapshots.
    Note: The model snapshot and predictor binning snapshot files can be used for analysis in external tools.
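As a starting point for that external analysis, you can inspect a downloaded snapshot archive programmatically. The sketch below uses only the Python standard library and assumes the archive contains CSV data with a pyConfigurationName column; the file name and format are illustrative, so adjust them to match what you actually downloaded from the file repository.

```python
import csv
import io
import zipfile

def model_configurations(zip_path: str) -> set[str]:
    """Collect the distinct pyConfigurationName values found in every
    CSV member of an exported snapshot ZIP (assumed CSV format)."""
    configs = set()
    with zipfile.ZipFile(zip_path) as archive:
        for name in archive.namelist():
            with archive.open(name) as member:
                reader = csv.DictReader(io.TextIOWrapper(member, encoding="utf-8"))
                for row in reader:
                    configs.add(row.get("pyConfigurationName", ""))
    return configs

# After the filter you added, a hypothetical downloaded file such as
# "ADM-ModelSnapshot.zip" should report only the Web Click Through Rate
# configuration.
```

The same approach works for the predictor binning archive; once the data loads cleanly you can hand it to whatever analysis tool you prefer.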
