
Other background processing options

Pega Platform™ supports several options for background processing. You can use listeners, service-level agreements (SLAs), and Wait shapes to design background processing in your application.

Listeners

Use listeners to process incoming email messages, files, or inbound network requests and messages. In a high-availability environment, listeners are distributed across hosts. You can configure email listeners to run on specific node types.

File listener

Use a file listener with a file service to import and process files that come from another system or that application users create. For example, you can import data that is used to create a work object.

The file listener monitors the file directory. When files that match the pattern the listener is listening for arrive, the listener moves them into the work_<name of listener> directory and calls the file service. The file service uses a parsing rule to open and read the file, evaluate each input record, divide the record into fields, and write the fields to the clipboard. A service activity then processes the data.
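The following standalone Java sketch is for illustration only; it is not Pega rule syntax. It shows, conceptually, what the parse step does: read each record, divide it into fields, and stage the values for a processing activity. The file path, delimiter, and field names are assumptions.

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.Map;

public class OrderFileParser {
    public static void main(String[] args) throws Exception {
        // Assumed staging directory and file name; a file listener would move the file here.
        Path inbound = Paths.get("work_OrderListener/orders.csv");
        for (String record : Files.readAllLines(inbound)) {
            String[] fields = record.split(",");               // divide each record into fields
            Map<String, String> page = new LinkedHashMap<>();  // stands in for a clipboard page
            page.put("CustomerID", fields[0]);
            page.put("Product", fields[1]);
            page.put("Quantity", fields[2]);
            System.out.println("Ready for the service activity: " + page);
        }
    }
}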

Email listener

Email listeners provide Pega Platform with the information it needs to route incoming email messages to an email service rule (Rule-Service-Email). An email listener is configured with the email account name, the name of the mail folder to monitor, the message format of the incoming messages, and the email service rule to which to route the messages.

When an email listener routes a message with attached files, the listener creates a pyAttachmentPage in the Data-ServiceMessage class and puts the files on that page by using the pyAttachNames and pyAttachValues properties. When you use the Email wizard to configure inbound email, the system configures the generated service activity to extract files from the pyAttachmentPage and attach them to the work item as work item attachments.
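For illustration only, the following sketch uses the standard JavaMail API, not the Pega email listener, to show what monitoring a mail folder and collecting attachments involves. The host, account, credentials, and folder name are placeholders.

import java.util.Properties;
import javax.mail.BodyPart;
import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.Multipart;
import javax.mail.Part;
import javax.mail.Session;
import javax.mail.Store;

public class InboxAttachmentReader {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("mail.store.protocol", "imaps");
        Session session = Session.getInstance(props);
        Store store = session.getStore();
        store.connect("imap.example.com", "svc-intake", "secret"); // placeholder account
        Folder inbox = store.getFolder("INBOX");                   // folder to monitor
        inbox.open(Folder.READ_ONLY);
        for (Message message : inbox.getMessages()) {
            Object content = message.getContent();
            if (content instanceof Multipart) {
                Multipart multipart = (Multipart) content;
                for (int i = 0; i < multipart.getCount(); i++) {
                    BodyPart part = multipart.getBodyPart(i);
                    if (Part.ATTACHMENT.equalsIgnoreCase(part.getDisposition())) {
                        // In Pega, names and payloads like these populate the
                        // pyAttachNames and pyAttachValues properties on pyAttachmentPage.
                        System.out.println("Attachment found: " + part.getFileName());
                    }
                }
            }
        }
        inbox.close(false);
        store.close();
    }
}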

JMS listener

JMS listeners provide Pega Platform with the information it needs to route Java Message Service (JMS) messages from a specific topic or queue to a Pega Platform JMS service (Rule-Service-JMS). A JMS listener or JMS MDB listener data instance specifies which queue or topic contains the messages to consume and which JMS service rules process the messages.

Caution: JMS MDB listener rules are no longer actively developed and are considered for deprecation in later releases. Using JMS MDB Listener rules does not follow Pega development best practices. Consider other implementation options instead.
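To make the queue-consumption concept concrete, the following standalone sketch uses the standard JMS API directly, outside of Pega. The connection factory, queue name, and message handling are assumptions, and JNDI configuration is not shown.

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

public class ClaimQueueConsumer {
    public static void main(String[] args) throws Exception {
        // Assumed JNDI names; a real environment supplies these through jndi.properties.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("jms/ClaimQueue");

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(queue);

        // Route each incoming message to processing logic, much as a JMS listener
        // routes messages to the JMS service rules that process them.
        consumer.setMessageListener(message -> {
            try {
                if (message instanceof TextMessage) {
                    System.out.println("Processing: " + ((TextMessage) message).getText());
                }
            } catch (JMSException e) {
                e.printStackTrace();
            }
        });

        connection.start();   // begin message delivery
        Thread.sleep(60_000); // keep the consumer alive briefly for this demonstration
        connection.close();
    }
}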

SLAs

A service-level agreement establishes a work completion deadline. Organizations often establish SLAs to enforce on-time performance. These obligations range from informal response-time promises to negotiated contracts.

SLAs define the intervals of time that standardize how you resolve work in your application. You can apply a service-level agreement to cases, stages, steps, flows, and assignments.

Using SLAs is a viable alternative to using an agent in some situations. With an SLA escalation activity, you can invoke agent-like functionality without creating a new agent. For example, if you need to conditionally add a subcase at a specific time in the future, you can add a parallel step in the main case that incorporates an assignment with an SLA and an escalation activity to perform this action.

The standard connector error handler flow, Work-.ConnectionProblem, uses an SLA to retry failed connections to external systems.

Pega Platform can initiate an SLA in the context of a case only. Any delay in triggering an SLA impacts the timeliness of executing the escalation activity. Do not use SLAs for polling or periodic update situations.

The Assignment Ready setting of an SLA allows you to control when the assignment becomes ready for processing by the SLA agent. For example, you can create an assignment today but configure it for processing tomorrow. An operator can still access the assignment directly through a worklist or workbasket.

 

Wait shape

Wait shapes provide a viable solution in place of creating a new agent or using an SLA. The Wait shape applies only to a case at a specific flow step and pauses the case until a single event (a timer or a case status change) occurs before the case can advance. Single-event triggers in a case are the most suitable use case for the Wait shape: the desired case functionality occurs when the Wait shape completes at the designated time or status.

For example, the FSG Booking application for events uses a Timer Wait shape in a loop-back polling scenario. A user might want to run an operation immediately within the loop-back. In the following figure, the user wants to poll for the current weather forecast instead of waiting for the next automated retrieval. Implementation of this loop-back can occur in parallel with other tasks, such as flagging the completion of weather preparation setup and tear-down tasks.

Figure: Wait shape usage in the weather forecast case type.

Batch scenario questions

Question: As part of an underwriting process, an application must generate a risk factor for a loan and insert the risk factor into a Loan case. The generation of the risk factor is an intensive calculation that requires several minutes to run, and the calculation slows down the environment. You want all risk factor calculations to run automatically between 10:00 PM and 6:00 AM to avoid the slowdown during daytime working hours. How do you design a solution to support this requirement?

Answer: Use a delayed dedicated queue processor, and set the DateTime for processing to 10:00 PM. The case then waits for the queue processor to resume the flow and continue processing.

If the queue processor is enabled on other nodes, processing can be distributed across those nodes, reducing the time required to finish processing all loan risk assessments.

Question: You need to automate a claim adjudication process in which the system parses, verifies, and adjudicates files that contain claims. The system automatically creates claims that pass those initial steps for further processing. The application receives a single file with up to 1,000 claims daily before 5:00 PM. Claim verification is simple and takes a few milliseconds. Still, claim adjudication might take up to five minutes. How do you design a solution to support this requirement?

Answer: In an activity, invoke the Queue-For-Processing method for each claim.

Use the file service activity only to verify claims, and then offload adjudication to the queue processor; because queueing is fast, it does not significantly impact the intake process. The queue processor can also take advantage of multinode and multithreaded processing if available. Furthermore, the modular design of the tasks allows for reuse and extensibility if required in the future. In contrast, using the same file service activity for claim adjudication impacts the time required to process the file: processing is available only on a single node, and there is little control over the time frame in which the file service runs. Extensibility and error handling might also be more challenging.
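As a plain-Java analogy only (not Pega activity syntax), the sketch below illustrates the pattern of verifying each claim inline and queueing the slow adjudication for background workers. The claim data, verification check, and adjudication logic are placeholders.

import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ClaimIntake {
    public static void main(String[] args) {
        List<String> claims = List.of("CLM-0001", "CLM-0002", "CLM-0003"); // sample claims
        ExecutorService workers = Executors.newFixedThreadPool(4);         // stand-in for queue processor threads

        for (String claim : claims) {
            if (verify(claim)) {                          // fast check, done inline during intake
                workers.submit(() -> adjudicate(claim));  // slow work, queued for background processing
            }
        }
        workers.shutdown();
    }

    static boolean verify(String claim) {
        return claim != null && !claim.isEmpty();         // takes milliseconds
    }

    static void adjudicate(String claim) {
        System.out.println("Adjudicating " + claim);      // can take up to five minutes per claim
    }
}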

Take into consideration the time that a queue processor requires to complete the task. For example, a single queue processor that runs on a single node would need 5,000 minutes (about 83 hours) to adjudicate 1,000 claims, which makes it unsuitable for completing the task. A system with a queue processor enabled on multiple nodes with multiple threads can perform the off-hours task. An alternative solution is to split the file into smaller parts, which are then scheduled for different queue processors (assuming that enough CPU is available for each queue processor to perform its task).
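For illustration only, assuming a hypothetical topology of four nodes with five queue processor threads each (numbers not taken from the scenario), the arithmetic works out as follows:

  1,000 claims × 5 minutes per claim = 5,000 minutes (about 83 hours) on a single thread
  4 nodes × 5 threads per node = 20 claims adjudicated concurrently
  5,000 minutes ÷ 20 concurrent threads = 250 minutes (just over 4 hours), which fits an overnight window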

Question: ABC Company is a distributor of discount wines and uses Pega Platform for order tracking. There are up to 100 orders each day. Up to 40 different line items in each order specify the product and quantity. Up to 5,000 varieties of wines continuously change over time as new wines are added to and dropped from the list. ABC Company wants to extend the functionality of the order tracking application to determine recent hot-selling items by recording the top 10 items ordered by volume each day. This information is recorded in a table that users reference for historical reporting. How do you design a solution to support this requirement?

Answer: Use a job scheduler that runs after the close of business each day and performs the following tasks:

  • Opens all order cases for that day and tabulates the order volume for each item type.
  • Determines the top 10 items ordered and records them in the historical reporting table.

The activity uses a report to easily retrieve and sort the number of items ordered in a day. When recording values in the historical table, include commit and error handling steps in the activity.
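A minimal, standalone Java sketch of the nightly tabulation logic follows; it is illustrative only and not a Pega activity. The record shape, sample data, and Java 16+ record syntax are assumptions.

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class TopSellers {
    record LineItem(String product, int quantity) {}   // stand-in for an order line item

    public static void main(String[] args) {
        // Placeholder data; a real run would read the day's order cases from a report.
        List<LineItem> todaysLineItems = List.of(
                new LineItem("Merlot", 12), new LineItem("Pinot Noir", 30),
                new LineItem("Merlot", 18), new LineItem("Riesling", 6));

        // Sum the ordered quantity for each product.
        Map<String, Integer> volumeByProduct = todaysLineItems.stream()
                .collect(Collectors.groupingBy(LineItem::product,
                        Collectors.summingInt(LineItem::quantity)));

        // Keep the top 10 products by volume; these rows would be written to the
        // historical reporting table with commit and error handling.
        volumeByProduct.entrySet().stream()
                .sorted(Map.Entry.<String, Integer>comparingByValue().reversed())
                .limit(10)
                .forEach(entry -> System.out.println(entry.getKey() + " -> " + entry.getValue()));
    }
}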
