Uses for big data and predictive analytics in workers’ comp are growing. But will adjusters buy in?
Big data is increasingly at the heart of the analytics solutions future-focused claims organizations in workers’ compensation are developing.
A 2019 Willis Towers Watson (WTW) survey found that in commercial lines, “claim analytics usage is highest in workers’ comp,” at 27%.
A 2020 follow-up study revealed that “more than half (54%) of workers’ compensation writers say they are using advanced analytics in claims, compared with 10% in commercial property.” WTW predicts that this will rise, although “all lines of business are behind the schedule they anticipated for claim analytics.”
Many in workers’ comp hope that as technology develops, predictive analytics will be able to flag things such as comorbidities and opioid use, which can drag out claims unnecessarily.
As this inevitable rise makes long-standing industry assumptions and manual processes obsolete, leaders in the space point to the dream — a happy medium between the convergence of data points and experienced adjusters that leads to massive workers’ comp savings.
What Solutions Are Out There Now?
Among the most commonly used predictive models and data analytics tools in workers’ comp are those that flag claims that could benefit from early intervention.
Tools designed to skim provider notes for terms like “opioid use” or common comorbidities can help identify claims that may need extra attention from adjusters early on in the process.
One Call, for example, developed an “at-risk” dictionary and uses natural language processing (NLP) and optical character recognition (OCR) to help flag claims. OCR converts handwritten notes into typed text, and NLP allows computers to parse and organize the unstructured language data that arrives with those notes.
“We developed an at-risk dictionary and used OCR identification to pull from the provider notes,” explained One Call SVP of client analytics, Tina Brletich. “We now use NLP, natural language processing, as well, because it’s a lot of unstructured data that’s coming in from items such as physical therapy evaluation notes.”
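The at-risk-dictionary approach Brletich describes can be sketched as a simple term-matching pass over digitized note text. The terms and categories below are illustrative assumptions, not One Call’s actual dictionary, and a production system would layer real NLP (negation handling, entity extraction) on top of this kind of lookup:

```python
import re

# Hypothetical at-risk dictionary mapping terms to risk categories
# (illustrative only -- not One Call's proprietary list).
AT_RISK_TERMS = {
    "oxycodone": "opioid use",
    "opioid": "opioid use",
    "diabetes": "comorbidity",
    "hypertension": "comorbidity",
    "obesity": "comorbidity",
}

def flag_note(note_text: str) -> set[str]:
    """Return the risk categories whose terms appear in a provider note."""
    flags = set()
    for term, category in AT_RISK_TERMS.items():
        # Whole-word, case-insensitive match against the unstructured text
        if re.search(rf"\b{re.escape(term)}\b", note_text, re.IGNORECASE):
            flags.add(category)
    return flags

note = "Pt reports chronic pain; currently prescribed oxycodone. Hx of hypertension."
print(sorted(flag_note(note)))  # ['comorbidity', 'opioid use']
```

In practice, OCR output of handwritten notes is noisy, so a real pipeline would also need fuzzy matching and confidence thresholds before routing a claim to an adjuster.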
Since launching the first-generation model in March 2021, the company has seen a 15% reduction in outlier cases. One Call has built a second-generation version with the goal of further improving the predictive models.
“We can track the performance of the dictionary and the model, identifying the high-risk cases to reduce the outliers,” Brletich said.
These tools have challenged the long-prevalent assumptions in workers’ comp that an older claim is more problematic than a newer one or that high-dollar claims need attention first, according to Matt Harmon, senior vice president of claims at the MEMIC Group.
“These new data tools allow us to identify which claims have potential to be severe or have poor outcomes earlier in the life cycle, not based on the date of injury, not based on what’s been reserved or paid in the past, so we can apply the right people earlier on in the claim to do what they do best — analyze and make decisions,” he said.
Tom Wiese, MEMIC’s vice president of claims, worked closely with the firm CLARA Analytics to develop those insights over four years, pushing for customizations of the tool to work for their frontline claims staff.
Full implementation took a year, and Wiese said the team never feels it has crossed the finish line on adoption. “We’ve recently gone through an analysis on return on investment related to changes in our results through utilization of the analytics tool,” Wiese said. “We’ve been able to show well over $5 million in claims savings over the last 18 months and a significant reduction in claim duration.”
Where Predictive Analytics Helps
Though predictive analytic tools that forecast which claims could go awry are only just being developed, workers’ comp professionals are already looking ahead to new ways big data can help the industry cut costs.
“As we look into the future, we’re hoping that the models become sophisticated to the point where I can make strategic decisions around staffing, around geography, around job fit, title and scope,” explained Virna Alexander Rhodes, senior vice president and manager, workers’ compensation claims at Liberty Mutual.
In the future, even more models deployed with even more guidance will be the norm. Rhodes said she expects integration to improve (currently the models run on a different system from the organization’s claims platform) and the breadth of information captured to grow. Two of the more distinctive models, subrogation and dual strategy, are already garnering wins for some of the company’s bigger clients.
Other companies are looking to use data and predictive analytics to prevent fraud, an application WTW expects to surge to 47% prevalence among commercial lines in 2021.
“We’re able to pull together pieces of information to spot potential red flags that suggest potential fraudulent billing practices and even kick-back schemes so that our dedicated fraud investigations department can investigate,” said Christina Ozuna, vice president of quality assurance, claims at EMPLOYERS.
“We’ve been able to catch double billing, billing for services never rendered, even fraud rings. Once identified, we work with prosecutors to bring unethical providers to justice, which ultimately benefits injured workers, policyholders, as well as the insurance industry more broadly.”
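The double-billing pattern Ozuna mentions lends itself to a simple grouping check: count submissions that share the same provider, claim, service code and date, and flag any that appear more than once. The `Bill` fields below are illustrative assumptions, not EMPLOYERS’ actual schema or detection logic:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical billing record; field names are illustrative only.
@dataclass(frozen=True)  # frozen makes instances hashable, so Counter can group them
class Bill:
    provider_id: str
    claim_id: str
    service_code: str
    service_date: str

def find_double_billing(bills: list[Bill]) -> list[Bill]:
    """Flag bills submitted more than once for the same claim, service and date."""
    counts = Counter(bills)
    return [bill for bill, n in counts.items() if n > 1]

bills = [
    Bill("PRV-1", "CLM-9", "97110", "2021-03-02"),
    Bill("PRV-1", "CLM-9", "97110", "2021-03-02"),  # duplicate submission
    Bill("PRV-2", "CLM-9", "99213", "2021-03-05"),
]
print(find_double_billing(bills))
```

Exact-match grouping like this only catches the crudest duplicates; spotting kickback schemes or fraud rings requires linking entities across many claims, which is where the larger analytics platforms come in.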
Adjuster Buy-in Still Needed
Though predictive analytics and big data have numerous uses in the industry, getting buy-in from adjusters can be difficult. Adoption of the technology, differences in nomenclature between tech companies eager to impress and claims leaders eager to establish rapid buy-in among staff, and the influx of data are constant battles.
However, when frontline staff sees the value, it can be transformative and bolster confidence in both people and machines.
“In the majority of cases, resolution managers [adjusters] follow the guidance from the machine learning model. In the relatively small fraction of cases where they don’t, they have a good reason. And that’s exactly the balance you want to achieve,” explained Joe Powell, senior vice president of data and analytics at Gallagher Bassett.
“If you’re placing your bets on ‘hey this is really interesting looking,’ but it doesn’t necessarily move the needle for our client — maybe you’re just looking at the needle rather than moving it. At the end of that investment period, you’re going to be exactly where you were; you’re just going to have a better view of where you are.”
Nicole Boodram, chief analytics officer at AmTrust Financial, said buy-in at her organization took time, but is gaining traction.
“The transformation to using analytics started with the senior executives in claims,” she said.
“Using the dashboards and scorecards in meetings demonstrated their support and adherence to using data and analytics.” The growing pains on the front lines were “trusting the data,” she added.
“The teams relied on manual reports as part of their daily process. Demonstrating over a period of time that the ‘new’ scorecards and reports yielded information they could trust helped the transition. Additionally, being able to have more frequent, automated results, so that they could spend more time understanding trends and taking action, helped disperse reluctance to the new process.”
Other leaders in the industry admitted the least successful parts of their programs came in the form of too much, too fast.
Ron Skrocki, senior vice president of product management and development at Genex, now Enlyte, whose career has spanned IT, analytics, operations and product over 30 years, put it simply: “One of the reasons why I love doing this, given that I’m a bit of a jack of all trades having done all of those jobs, is that what you’re developing, refining and bringing to the table has the right input and expertise, all due to those constituents.”
But there’s a caveat, he said: “Just because you have the data to do something doesn’t mean you should do it.”