Insight, analysis & opinion from Joe Paduda



Why your “predictive analytics” program isn’t working

I’m hearing more complaints and concerns about the lack of results from projects involving “big data”, analytics, predictive modeling and the like. These have me scratching my head, as effective use of data is critical to any enterprise these days.

I think I’ve figured out why some of these projects haven’t turned out the way sponsors want.

An excellent article on the effective use of analytics, identifying six keys to ensuring success, hit my inbox a bit ago. I’ve read it several times, passed it on to respected colleagues, and gotten their feedback.

Targets and accountability. The Central Analytics Business Unit (ABU) was set up as a centralized profit center with ambitious targets and direct reporting to the chief operations officer.

Support from the top. Obvious, critical, and worth repeating.

Incentive scheme alignment. The returns generated by the ABU’s analytics projects accrue to the departments, which do not contribute to the cost of the ABU. And the ABU team is paid via variable compensation, based on projects that have been fully implemented and on their ROI.

Rigorous assessment of results. The contribution of analytics is always measured and in some cases is reviewed by the accounting department.

Communicating with strategic goals in mind. The ABU emphasized communication.

The right people. Recruited employees had:
(1) significant quantitative strength;
(2) negotiating skills and diplomacy;
(3) the ability to communicate with the business lines; and
(4) entrepreneurial instincts. Recruiting this high-demand skill set was not easy.

Most of the initial ABU recruits were external hires, and several of them had little knowledge of the banking industry.

BUT…information without action is nothing but a waste of time and money.

This from a physician executive colleague:

One of the things they don’t discuss that I see as an issue throughout the insurance industry (commercial as well as WC) is that analytics often produce counter-intuitive results, and/or suggest conclusions that are at odds with what passes for traditional wisdom.  

An example – I had three years of analytics (pretty good ones, too) that demonstrated a 5:1 or 6:1 ROI from the medical directors’ department, and that included all costs, fully loaded salaries, etc. No one would believe it, and they dismantled the whole operation. So, what I’d add to the HBR piece is that the CEO championing (one of their six keys) has to include championing of business plans based on the analytics, no matter how uncomfortable that makes some people.

Think analysis of the true costs of network discount strategies is going to be well received anywhere?




6 thoughts on “Why your “predictive analytics” program isn’t working”

  1. Rising Medical Solutions’ Claims Benchmarking Study (annual since 2013) reports that predictive modeling is used by less than 30% of survey respondents. I wonder if the fundamental problem is a cultural resistance to very proactive claims management, as predictive analytics is valuable only if the claims dept. really works on the claims that score highly for trouble, meaning special action.

  2. Complete agreement with your physician colleague: the results of the data analysis must be embraced with an open and curious mind. Results can be upsettingly disruptive, especially when they run contrary to the business standards or “best practices” that so many have followed.

  3. In the case of medical care, when analytics (or evidence-based care) are actually applied at the end-user level, they are often counterintuitive to previous medical practice. It is essential to COMMUNICATE with the provider or facility to whom you are applying the analytics. Allow them to express their rationale, but also share the data and your reason for sharing it, i.e., appropriate, evidence-based care.

  4. Most businesses require evidence of an ROI sooner rather than later. Many predictive models require a great deal of time before results are evident, so our societal need for “immediate gratification” may cause the premature termination of well-intentioned models. We cannot overstate that many models simply require time that balance sheets may not permit.

  5. From a claims standpoint, calculating an ROI on a predictive model is extremely difficult. How do you prove that you handled a claim better based on the predictions? What would the cost have been (make up a number) versus what the claim actually closed for? How do you prove that you actually improved the outcome based on the modeling versus what would have happened anyway? It is trying to prove a negative. Some of this is simply a “leap of faith,” and those who require an exact ROI will never be satisfied or believe their investment was worth it.

    1. It’s impossible to calculate any ROI on a single claim. You have to examine the entire program and make comparisons to baseline measures. This assumes there is a large enough volume of claims to reasonably measure and draw conclusions from. Also, assuming there is any evidence of improved cost, the next challenge is ensuring the cost reductions are related to the predictive modeling and not to other factors like inflation, legislative changes, exposure base changes, etc. Needless to say, it’s not an easy calculation to make, and that may be contributing to the lack of solid evidence that predictive models work.
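    The program-level comparison described above can be sketched as a back-of-the-envelope calculation. Every figure below is hypothetical, and the sketch deliberately ignores the confounders the comment warns about (inflation, legislative changes, exposure base changes):

    ```python
    # Hypothetical sketch: program-level ROI for a predictive-modeling program,
    # measured as average claim cost during the program period versus a
    # pre-program baseline. All numbers are made up for illustration.

    baseline_costs = [42_000, 38_500, 51_000, 47_200, 39_800]  # closed claims, pre-program
    program_costs = [36_000, 33_400, 44_900, 41_000, 35_200]   # closed claims, program period
    program_spend = 25_000  # fully loaded cost of the modeling program

    baseline_avg = sum(baseline_costs) / len(baseline_costs)
    program_avg = sum(program_costs) / len(program_costs)

    # Gross savings attributed to the program across the measured claims
    gross_savings = (baseline_avg - program_avg) * len(program_costs)
    roi = gross_savings / program_spend

    print(f"Baseline avg cost:  ${baseline_avg:,.0f}")
    print(f"Program avg cost:   ${program_avg:,.0f}")
    print(f"Gross savings:      ${gross_savings:,.0f}")
    print(f"ROI: {roi:.2f}:1")
    ```

    A real measurement would need a far larger claim volume and some adjustment for the confounders noted above, e.g., a matched comparison group or a trend-adjusted baseline, before attributing any savings to the model.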

Comments are closed.

Joe Paduda is the principal of Health Strategy Associates




A national consulting firm specializing in managed care for workers’ compensation, group health and auto, and health care cost containment. We serve insurers, employers and health care providers.



© Joe Paduda 2019. We encourage links to any material on this page. Fair use excerpts of material written by Joe Paduda may be used with attribution to Joe Paduda, Managed Care Matters.

Note: Some material on this page may be excerpted from other sources. In such cases, copyright is retained by the respective authors of those sources.