Automation of Loan Origination using Process and Decision Services

The financial services industry has long sought a truly digital mortgage, yet far too much of the industry still relies on manual, paper-based processing for tasks like loan origination. The pandemic put these deficiencies in sharp relief: reliance on manual processes created new problems on top of those the industry already faced, including a lack of consistency, auditability, accuracy, and efficiency. It is not an agile approach!

Learn how the Trisotech Digital Enterprise Suite (DES) lets you visually define processes and decisions that are directly automated to streamline loan origination, increasing productivity internally and satisfaction externally!

As presented by:

Brian Stucky, Quicken Loans, Team Lead – Rocket Technology Ethical AI; MISMO Residential Governance Board, Co-Chair of the Decision Modeling Community of Practice
Denis Gagné, CEO & CTO at Trisotech

View the Slides

Business Automation Services in Fintech

By Bruce Silver

Read Time: 7 Minutes

In client engagements, I am seeing growing interest in what Trisotech calls Business Automation as a Service. I am seeing it particularly in financial services, but I expect it applies in health care and other verticals as well. Financial services, for so long reliant on legacy applications, is now racing to create new cloud-based apps built on modern architecture, where business automation services built with BPMN and DMN are a great fit. This post will explain why, and how it’s done.

The pattern I am seeing starts with a software product idea from a subject matter expert. How the software is supposed to work is expressed as a collection of Excel tables that illustrate table updates in response to various business events. The goal is to create a web application that replaces Excel with SQL database tables and automates the table update logic based on the attributes of each event in combination with the existing tables.
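To make the pattern concrete, here is a minimal sketch in Python of the kind of update rule such Excel tables encode. Everything in it, including the event shape, table layout, and arithmetic, is a hypothetical illustration, not logic from any actual engagement:

    # Hypothetical sketch: an Excel-style table-update rule as code.
    from dataclasses import dataclass

    @dataclass
    class TradeEvent:
        asset_id: str
        quantity: float   # units bought (positive) or sold (negative)
        price: float      # unit price at the time of the event

    def new_position_rows(event: TradeEvent, positions: list[dict]) -> list[dict]:
        """Compute the new table rows implied by one business event,
        given the current contents of the positions table."""
        history = [p for p in positions if p["asset_id"] == event.asset_id]
        prior_qty = history[-1]["quantity"] if history else 0.0
        # Append a new row rather than updating in place (audit trail).
        return [{
            "asset_id": event.asset_id,
            "quantity": prior_qty + event.quantity,
            "trade_amount": event.quantity * event.price,
        }]

Generalizing rules like this for every event type, against table structures that are still changing, is exactly the work described below.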

Examples I have worked on range from loan underwriting to accounting. The client is typically a startup or a service provider getting started as a software provider. In the early stages their programming resources are thin and focused on the more conventional aspects of web application development: database views and reports, analytics, administrative functions, and the occasional manual update of the tables.

The core IP is focused on the automated table updates. BPMN and DMN play a key role there because numerous details of the logic typically remain to be worked out. Excel is great for creating examples, but when you need to generalize the logic for automation in response to any possible event, you often find you need to tweak the table structure or even add new tables. That’s why creating a spec for developers at this stage is a losing battle.

With BPMN and DMN — at least as implemented by Trisotech — the subject matter experts can create the executable logic themselves. If you can create complex models in Excel, you can create automated versions of those models and make the necessary adjustments yourself without getting in line for developer resources. Moreover, those automated models can be deployed as cloud services called by the “conventional” parts of the web app.

For example, I am currently working with a client developing an accounting app for companies that buy, sell, and hold a variety of financial assets: loans, securities, and such. Business events received from the trading system, in combination with time-based events such as accumulated interest and “mark-to-market” revaluation, result in table updates for each such asset, and these tables ultimately are rolled up into the company’s financial statements. If it sounds complicated, trust me, it is.

The solution method that I have found to work well in these situations is based on three basic elements:

Each type of business event, whether received from an external source or from user interaction, is associated with one decision service and one process service. Each instance of the event triggers the process service, which in turn calls the decision service.

OData is an OASIS standard for cloud-enabling databases. I wrote about it in a recent post. An OData service automatically provides an XML metadata file equivalent to the OpenAPI (Swagger) file normally used to define REST APIs. Upon import of that file, Trisotech instantly exposes Create, Retrieve, Update, and Delete (CRUD) operations for all the tables to BPMN and DMN, and converts the datatypes used in those operations to their FEEL equivalents. OData’s value here is that it allows subject matter experts to modify the table structures and instantly regenerate the REST APIs for all table operations. This is critical while the tables are not yet finalized and locked down. With Swagger, my experience has been that if you need to wait for the developers to modify the REST APIs, it’s hopeless: by then, things have changed again.
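For readers who have not used OData, here is roughly what the two operations this method relies on look like from code. This is a minimal sketch using Python’s requests library against a hypothetical OData v4 gateway; the base URL and table (entity set) names are invented:

    # Sketch of OData Find and Create against a hypothetical gateway.
    import requests

    BASE = "https://example.com/odata"  # hypothetical gateway URL

    def find(table: str, filter_expr: str) -> list[dict]:
        """OData Find: query a table with a $filter expression."""
        r = requests.get(f"{BASE}/{table}", params={"$filter": filter_expr})
        r.raise_for_status()
        return r.json()["value"]  # OData returns matching rows under "value"

    def create(table: str, row: dict) -> dict:
        """OData Create: insert one row; the gateway assigns its unique ID."""
        r = requests.post(f"{BASE}/{table}", json=row)
        r.raise_for_status()
        return r.json()  # the created entity, including its generated ID

    # For example, validating an incoming event against Table A:
    matches = find("TableA", "AssetId eq 'LN-001'")

In the Trisotech environment this mapping is done in FEEL boxed expressions rather than code, but the wire-level operations are the same.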

Below you see the basic application architecture:

Business Automation Services receives business events from both external sources and the web app. Each business event triggers a BPMN service that uses OData to retrieve table data, calls a DMN service to generate new table rows, and then uses OData again to insert the new table rows. The web app provides views, reports, and analytics on the table data.

The BPMN process looks like this:

In this simplified example, the process references Table A and manages updates to Tables B and C. Upon receipt of the business event, the process first validates it by matching it to an existing record in Table A. Assuming a match is found, the event is recorded in an event table. OData generates a unique ID for every record, and the ID of the business event is saved as a process variable used to establish identifiers for the records in Tables B and C. This example has just three tables, but typically there may be six to ten or more, and logging the business event is typically followed by OData Find (query) operations on other tables to obtain the data needed for the decision task.

Each process includes a single decision task that generates the new rows for all tables. In the process diagram you see the decision task’s inputs and outputs visualized as data associations from and to data objects (process variables). The decision task invokes a single DMN service, in which the data input associations in the process map to the input data elements, and the output decisions in the decision model map to the data output associations in the process. Following the decision task, the decision outputs become the inputs of multiple OData Create (insert) operations. For auditability, all table updates insert additional rows rather than updating existing ones. OData Create inserts a single record, so where multiple records are generated we use a multi-instance service task. As we have discussed in the past, Trisotech uses FEEL boxed expressions to map between service parameters and BPMN/DMN variables, so subject matter experts can do this without programming.
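Pulling the pieces together, the per-event logic just described looks roughly like the following in plain Python, reusing find and create from the OData sketch above. The decision service URL, payload shape, and table names are hypothetical; in the actual method each step is a BPMN service or decision task rather than hand-written code:

    # The per-event flow as plain Python (in the method itself these
    # steps are BPMN tasks). URL, payload, and table names are hypothetical.
    import requests

    DECISION_URL = "https://example.com/decision/generate-rows"  # hypothetical

    def handle_business_event(event: dict) -> None:
        # 1. Validate: the event must match an existing record in Table A.
        if not find("TableA", f"AssetId eq '{event['asset_id']}'"):
            raise ValueError("business event does not match any Table A record")

        # 2. Log the event; its gateway-generated ID keys the rows created below.
        event_id = create("EventTable", event)["ID"]

        # 3. A single decision service generates the new rows for all tables.
        r = requests.post(DECISION_URL, json={"event": event, "eventId": event_id})
        r.raise_for_status()
        outputs = r.json()

        # 4. Insert each generated row (the multi-instance task in BPMN).
        #    Rows are appended, never updated, preserving the audit trail.
        for row in outputs["tableBRows"]:
            create("TableB", row)
        for row in outputs["tableCRows"]:
            create("TableC", row)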

Here is how it works in practice.
  1. We start by defining the database tables. With MySQL, for example, we use phpMyAdmin.
  2. From the OData gateway, we download the XML metadata file for the database. Even though DMN is not calling any services, we import the metadata file into the DMN Operation Library as a way to capture all the table datatypes at once.
  3. The real work is developing the DMN model, which is where the logic generalizing the Excel examples is performed. The decision model must create an output decision for any table where new rows are inserted. After testing the logic, we define a decision service specifying its inputs and outputs, and publish it to the Trisotech Cloud. The service inputs define the data that must be supplied by the process to the decision task.
  4. In BPMN, again we import the metadata file to the Operation Library. The OData API supports 5 operations for every table: Find (query), Get (by ID), Create (insert), Update, and Delete. Our method requires just Find and Create. In BPMN, each service task is configured to a particular table and operation, and mapped to input and output data. We discussed how to use these operations in a previous post.
  5. The decision task is configured by linking to a deployed decision service in the Trisotech environment, and again providing data mapping to the process variables.
  6. You can now publish the process to the Trisotech Cloud, at which point it is an executable REST service that can be invoked like any other, as sketched below.
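Once published, invoking the process is an ordinary REST call. A minimal sketch, assuming a hypothetical endpoint path, credential, and payload shape; the actual URL and schema come from the publication page in the Trisotech environment:

    # Invoking the published process service. The endpoint path, token,
    # and payload shape are hypothetical placeholders.
    import requests

    resp = requests.post(
        "https://example.trisotech.com/execution/process/handle-trade-event",
        headers={"Authorization": "Bearer <api-key>"},  # placeholder credential
        json={"event": {"asset_id": "LN-001", "quantity": 100, "price": 99.5}},
    )
    resp.raise_for_status()
    print(resp.json())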

This is the basic recipe, and it is accessible directly to subject matter experts without getting stuck in the developer backlog. OData is especially valuable while the table structures and logic are still evolving. Once they are finalized and locked down, you can either stick with OData or hand the developers a final spec to create your own database APIs.

This method allows the vision for your cloud-based, event-driven app to be realized quickly and demonstrated to prospective clients and investors. The only new skill you need is DMN modeling to generalize and automate the Excel logic. And we have training for that! The great thing about DMN is that it is designed to be used by subject matter experts… so if you can create your fintech app by example in Excel, you can learn to bring it to life using Trisotech Business Automation services.

Build your business automation services

Follow Bruce Silver on Method & Style.


Not the New Normal for Mortgage Lending

By Sandy Kemsley

Read Time: 5 Minutes

I gave a presentation recently (virtually, of course), talking about how intelligent automation has become critical for business survival as we learn to live with disrupted supply chains, remote work, fluctuating demand and different markets. I finished that presentation with a thought that I want to start with here: the question is not, “is this the new normal”; rather, the question is “why weren’t we doing things this way before”?

As I look back at the past year, there really haven’t been any quantum leaps in automation technology during that time. What has changed, however, is the adoption of the technology. Companies that claimed that they couldn’t work remotely because of compliance, or couldn’t automate processes because of cost, or couldn’t perform online customer transactions because of regulations, have discovered that none of those reasons are really true. The pandemic has disrupted lives and businesses, but it has also transformed how business gets done. I don’t want to downplay the human tragedy that we have seen unfold over the past year, but in the spirit of making lemonade from lemons, let’s take the lessons that come out of this and use them to survive – even thrive – in the face of economic disruption.

If you look at the new way of working, you will find that leading-edge organizations, the ones that embraced and leveraged intelligent automation technologies, were already doing things this way. Here are some of the automation-fueled changes that will help you get through the current disruption and come out stronger on the other side.

The funny thing is that after a year of being forced to do things virtually as much as possible, we can see that some things actually work better that way. The key is to figure out which processes are best offered online, in person, or through a combination of the two.

There are many examples of technology-enabled “better ways” that have emerged, both in the creation of physical goods and in knowledge-based work.

Consider real estate:

I bought a new home in late 2020, and the transaction was completely contactless, using video calls with my lawyer, digital signatures on documents, online banking transactions, and exchange of keys using a lockbox. Having done such transactions the old way in the past, with multiple visits to the lawyer’s office and the bank and documents signed in triplicate, my immediate thought was that I never want to go back to the old way of doing this.

Real estate-related financial services, such as mortgage origination, have undergone significant changes to be able to serve this transaction model. A borrower’s financial situation can change in the weeks that it takes to close a deal, requiring underwriting to gather information in real time and assess the impact of changes as deadlines approach. There are new guidelines and regulations to be applied, and no financial institution wants to risk being out of compliance or underwriting a potentially bad loan.

Given this more complex environment, the only practical way to handle mortgage origination is to add some level of intelligent process and decision automation. This does not mean that origination will be completely automated: underwriters will still need to review information and make decisions that can’t be automated, but they will have access to all of the up-to-date information and be guided by best practices to ensure compliance and reduce risk. Adding process and decision automation also means that loans approved by an underwriter but failing compliance or risk rules can be routed for more senior review before final approval.

Lenders that don’t adopt intelligent automation in their origination processes will greatly increase their risk, and incur higher costs as highly-skilled underwriters spend their time collecting information rather than considering decisions. And if they’re not offering the full digital lending experience, they will lose out on an increasing amount of business as more consumers decide, like I did, that they just don’t want to do it the old way.

Follow Sandy on her personal blog Column 2.
