
Data Preparation to Enhance Strategy Design, Implementation & Testing

Blog Post created by jimpatterson on Apr 23, 2018

This blog series features the opinions and experiences of five experts working in various roles in global strategy design. They were invited to discuss the best practices outlined in this white paper, and also to add their own. With over 70 years of combined experience, they partner with various businesses, helping them use data, tools and analytic techniques to design effective decision strategies that drive success. The experts share their personal triumphs and difficulties; you may be surprised by the stark differences, and occasional similarities, in how these experts approach successful data-driven strategies across industries.

 

Jim Patterson is a Director of Business Consulting within the Credit Lifecycle Practice of Fair Isaac Advisors, where he has worked for the past 20 years. Jim focuses on the application of adaptive decision management and predictive analytic technologies within the financial services industry.

 

Data Preparation

Thoughtful data preparation is the key first step before any analytic strategy design effort. It is critical to identify which variables to include in the dataset by thinking through the specific business challenge. This should not be approached with tunnel vision, as many portfolio trends have a multi-decisional context. For example, rising losses may require new and innovative approaches not only in how a lender manages collections prioritization and treatment, but also in how it combines limit and authorizations management in the case of revolving products.

 

It is also important to consider how you will validate the design of the new strategy. You must ensure that relevant profiling variables are included in addition to the outcome variables that drive the design. This facilitates a critical part of the design and deployment process: selling your new approach to internal stakeholders. Lastly, select a performance window that is relevant to the decision context. For instance, credit limit increase strategy design typically leverages a longer performance window, say twelve months, since revenue gains appear quickly while risk-related impacts (e.g. delinquency and loss) take longer to emerge.
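
As a rough illustration, here is a minimal pandas sketch of constructing an outcome variable over a twelve-month performance window and joining it to an observation-date snapshot that carries the profiling variables. All file names, column names, and the 60-days-past-due "bad" definition are hypothetical stand-ins, not a prescribed data model.

```python
import pandas as pd

# Hypothetical observation date and window length -- adjust to your context.
OBSERVATION_DATE = pd.Timestamp("2017-04-30")
PERFORMANCE_MONTHS = 12  # longer window so risk impacts have time to emerge

# Monthly performance records: one row per account per month (assumed layout).
perf = pd.read_csv("monthly_performance.csv", parse_dates=["month_end"])

# Restrict to the twelve months following the observation date.
window = perf[
    (perf["month_end"] > OBSERVATION_DATE)
    & (perf["month_end"] <= OBSERVATION_DATE + pd.DateOffset(months=PERFORMANCE_MONTHS))
]

# Outcome variable: did the account ever reach 60+ days past due in the window?
outcome = (
    window.groupby("account_id")["days_past_due"]
    .max()
    .ge(60)
    .rename("bad_12m")
    .reset_index()
)

# Join the outcome onto the observation-date snapshot, which should also carry
# the profiling variables (utilization, tenure, segment, ...) you will need
# when selling the design to internal stakeholders.
snapshot = pd.read_csv("observation_snapshot.csv")
dev_data = snapshot.merge(outcome, on="account_id", how="left")
dev_data["bad_12m"] = dev_data["bad_12m"].fillna(False)
```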

 

Pick Your Sample

It often isn’t necessary to sample all available account records. In large portfolios with millions of customer accounts, doing so would simply extend the strategy development process with little to no benefit as compared with using a representative subset of account records. Applying a suitable sampling methodology can pay big dividends in terms of the efficiency of the strategy development process while yielding essentially the same output.

 

It is important to note, however, that the development dataset should not be constructed with a purely random sample. Imagine a pool of 100 marbles with 95 green marbles and 5 red marbles; a small pure random sample could easily miss the red marbles entirely. To avoid this situation when sampling your data, you must ensure an accurate representation of key account profiles. A common approach is stratified random sampling, which ensures that rare account profiles are accurately represented in the sampled dataset. In stratified random sampling, the full population is broken into smaller groups, called strata, that share important characteristics. These smaller data groups are then sampled at different rates.

[Figure: stratified random sampling]

Sampling must be done correctly in order for the development data to properly represent the true portfolio composition and produce usable, beneficial results. A practical example is sampling 100 percent of the scarce non-performing accounts (i.e. "bads") while sampling performing accounts at a much lower rate.
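
Here is a minimal sketch of that stratified approach in pandas, assuming a hypothetical performance_flag column that labels each account "good" or "bad"; the sampling rates shown are illustrative only:

```python
import pandas as pd

df = pd.read_csv("portfolio_accounts.csv")  # hypothetical input extract

# Per-stratum sampling rates: keep every non-performing ("bad") account,
# but only a fraction of the plentiful "goods". Rates are illustrative.
rates = {"bad": 1.00, "good": 0.10}

# Sample each stratum at its own rate, then stitch the strata back together.
sample = pd.concat(
    group.sample(frac=rates[flag], random_state=42)
    for flag, group in df.groupby("performance_flag")
)

# Record a sampling weight per stratum so downstream analysis can be
# re-weighted back to the true portfolio composition.
sample["weight"] = sample["performance_flag"].map(lambda f: 1.0 / rates[f])
```

Carrying the sampling weight alongside each record is what lets the smaller development dataset still reflect the true portfolio composition in later analysis.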

 

Don’t Neglect Your Business Intuition

The best designed strategies blend business expertise and judgement with the mathematical power of analytical software. The analyst’s industry knowledge should be a guiding force behind any strategy development effort. A process lacking business expertise, where the strategy builder is completely reliant on mathematical outcomes, introduces significant vulnerabilities. The lens of industry experience, as stressed by my colleague Stacey in her best practices, helps to identify unexpected patterns in the data that would otherwise go unnoticed; patterns missed for lack of applicable business understanding can compromise the integrity and performance of the resulting strategy.

 

This is not to say that valid data patterns are never surprising to industry veterans! However, if the unanticipated patterns or suggested strategy splits cannot be rationalized, take a step back and check the validity of your data. 

 

The combination of good statistics-based software and human business expertise for contextual wisdom is key to a successful data-driven strategy design project. On occasion, the analytic output will justify a particular course of action that is not implementable for practical business reasons.

 

Take, for example, a limit reduction strategy designed to manage exposure at risk. Credit bureau and/or behavior risk scores are common inclusions in the targeting criteria. If the proposed limit reduction requires an adverse action notification to the customer, consideration must be given to the customer communication. Specifically, the score reasons provided to the customer as justification for the adverse action must be reasonable and sensible. This level of explainability can help avoid indefensible customer complaints and operational disruptions in the form of unmanageable inbound call volumes.

 

It is good practice to balance the analytically derived strategy design with a review of live account examples that fall near the margins of the targeted score range for your treatment group. A manual review of the score reasons that will be communicated to customers may necessitate strategy modifications, so be prepared.
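
As a rough sketch of that manual review step, assuming hypothetical file, column, and cutoff values, the snippet below pulls accounts near the score cutoff and tabulates the adverse-action reason codes they would receive:

```python
import pandas as pd

accounts = pd.read_csv("scored_accounts.csv")  # hypothetical: scores + reason codes

CUTOFF = 620     # illustrative score cutoff for the limit-reduction treatment
BAND = 10        # review accounts within +/- 10 points of the cutoff

marginal = accounts[accounts["risk_score"].between(CUTOFF - BAND, CUTOFF + BAND)]

# Tabulate the top reason codes that would be communicated to these borderline
# customers; anything hard to defend is a signal to revisit the strategy split.
print(marginal["reason_code_1"].value_counts().head(10))

# Pull a handful of live examples for manual review.
review_sample = marginal.sample(n=min(25, len(marginal)), random_state=7)
print(review_sample[["account_id", "risk_score", "reason_code_1", "reason_code_2"]])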

 

Operational Agreement and Communication

While some business strategies are executed without reliance on operational areas (e.g. credit limit increases), the success and effective testing of other decision strategies depend as much on human execution as on the design of the strategy itself. A clear example is collection strategy design, where execution consensus is essential. When an operational center is responsible for carrying out the customer treatment prescribed by your new business rules, agreement among the relevant parties is imperative to ensure the test is actually applied in practice. It also ensures that back-end performance monitoring reflects the business strategy design, which is critical to the design-test-learn cycle.

 

Equally important is communication to the call centers that will be affected by shifts in the customer treatment approach. Call center agents should be coached on the anticipated customer reactions and how to address related concerns. Agents, along with call center statistics, can provide a valuable qualitative feedback loop once strategy tests are put into production: if there is an unexpectedly high volume of customer calls tied to your test, revisit your strategy design and treatment. Significant negative impact to the customer experience will manifest in complaints to the call center, so capture the “voice of the customer” in your strategy management procedures.

 

Pre-Implementation Testing

Once the strategy design process is complete, it is important to conduct pre-implementation simulations that assess the strategy-prescribed customer treatments against current-day portfolio volumes and distributions. Strategy development datasets often leverage observation dates a year or more in the past, and subsequent shifts in the portfolio makeup can lead to unpleasant production surprises; these can be avoided by running estimator reports prior to deployment. This acts as a final check of the strategy before rolling it out into a production test. Treatment distribution and volume shifts can be addressed, if needed, through a staggered rollout of the new strategy or minor changes to the strategy logic.
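
One way to approximate such an estimator report, sketched below under assumed file names and illustrative strategy rules, is to apply the strategy logic to both the development snapshot and a current portfolio extract and compare the resulting treatment mix:

```python
import pandas as pd

def assign_treatment(row):
    # Hypothetical stand-in for the deployed strategy logic.
    if row["risk_score"] < 620:
        return "decrease_limit"
    if row["risk_score"] >= 700 and row["utilization"] > 0.5:
        return "increase_limit"
    return "no_change"

dev = pd.read_csv("development_snapshot.csv")   # observation-date data
prod = pd.read_csv("current_portfolio.csv")     # today's portfolio extract

# Compare the share of accounts landing in each treatment, then and now.
report = pd.DataFrame({
    "dev_share": dev.apply(assign_treatment, axis=1).value_counts(normalize=True),
    "prod_share": prod.apply(assign_treatment, axis=1).value_counts(normalize=True),
}).fillna(0.0)
report["shift"] = report["prod_share"] - report["dev_share"]
print(report.sort_values("shift"))
```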

 

Have questions, comments or tips about data-driven strategy design practices? We can discuss in the comments below or in the TRIAD Community. If you're already a TRIAD user, please join us at our upcoming Customer Forum, May 23-24 in Atlanta, Georgia.
