Predictive Prudence

There’s more data for advanced analytical modeling, but innovation is moving carefully.

From new data sources and fresh modeling approaches to the emerging analytic insurance company structure, predictive modeling is cautiously generating new competitive opportunities.

Despite the game-changing success of predictive modeling, impediments to innovation remain. Many obstacles stem from the cautious nature of insurance companies.

Insurers are just as hesitant to move forward on new predictive modeling approaches as they were with generalized linear modeling (GLM) in the past, observed Roosevelt C. Mosley, principal and consulting actuary for Pinnacle Actuarial Resources. “The same companies that resisted GLMs 15 years ago,” Mosley said, “are now saying the same things to me about advanced modeling.”

Carriers are ramping up on research and development, said Claudine Modlin, who leads Willis Towers Watson’s P&C pricing and product management for the Americas. However, since insurance companies tend to focus on improving financial results in the short term, she said, it is difficult to get them thinking about innovation “that will not pay off in six months.” Insurance companies often prefer to wait and see whether an approach is tested and proven, and whether it will impact the bottom line, Mosley said.

Insurers also realize, explained Stephen J. Mildenhall, a professor at St. John’s University’s risk management and insurance department, that the competitive edge to be gained from predictive modeling innovation can be short-lived in this “quick-to-copy” industry.

Larger carriers often find it hard to support innovation within their business structures, Mildenhall said, “even if they know it is the right thing to do.” Strong program management is necessary for innovation, Modlin said, but only “a few companies in the industry excel around that.”

Obtaining the right data remains a concern even though there is more collection and availability of that data than ever before. “There continue to be data quality concerns just as there were 10 to 20 years ago,” said Louise Francis, founder of Francis Analytics and Actuarial Data Mining, Inc. This rings especially true for certain variables, such as injury type.

Many considerations go into collecting data, Christopher Monsour, vice president of analytics at CNA, pointed out. Insurers that sell commercial coverage through independent brokers need to consider the relationships with those brokers when deciding whether to collect additional customer information. “You also have a decision to make about agency relations if you are asking for information that your competitors might not be asking for,” he added, because the additional time commitment might encourage those brokers to sell a competitor’s coverage instead.

“This is one reason data vendors are so popular — they provide additional information without providing the agent with additional burdens, or at least, not with as many,” he said. Meanwhile, Francis is not convinced that “boat loads of data from external vendors” will rescue companies from their data challenges. Choosing a vendor with trustworthy data requires a careful approach.

Regulatory Considerations

Regulatory restrictions — whether real or perceived — can also hamper innovation. One challenge of implementing advanced machine-learning models is that they can appear as black boxes to regulators, making them difficult to explain and to understand, Mildenhall said.

Regulators are really concerned with how insurers use data, said Bob Miccolis, a former managing director for Deloitte Consulting. Insurance companies do not want their actions to be misconstrued since that can lead to inquiries or even to an expensive market conduct exam, he observed.

Meanwhile, regulators are looking at how to address the multitude of issues and questions about data and predictive modeling use through the National Association of Insurance Commissioners’ (NAIC’s) Big Data (D) Task Force. The task force, which began in 2016 as a working group, will likely issue recommendations that lead to a model state law.

The task force’s chair is Laura N. Cali Robison, Oregon’s insurance commissioner and an actuary. Currently, the task force focuses on understanding the landscape, she said. “We are trying to think differently in this new age of big data (and) to feel assured that we have the right information and tools to understand how the models are being used.”

Some of the data issues the task force is exploring include who owns the data and who should be held accountable for its accuracy and its use. “The reality is there are a lot of different sources of data on the internet and a lot of it [data] is public,” she said.

Regulators are also concerned about the effect of big data and models on consumers — and so are actuaries (see Data Ethics sidebar). The task force also wants to identify areas “where the current regulatory framework stifles innovation that could be beneficial to the public and the market,” she said. Most states require that rates not be excessive, inadequate or unfairly discriminatory. Cali Robison said the task force needs to explore whether those laws are sufficient to address potential concerns and opportunities for the use of big data in ratemaking.

The task force is also looking into the potential for how big data can affect other aspects of insurance, such as claim practices. “The environment has changed. I think there are ways data can be used to improve people’s experience with interacting on a claim, but the use of big data in claims handling may also carry risks,” Cali Robison observed.

Data and Analytics Business Model

Despite obstacles and regulatory unknowns regarding data and advanced analytics, the use of predictive modeling to reshape the traditional insurance business model is moving forward. The approach uses data-driven business rules in predictive models to provide decision-making options, Miccolis said. “It is redefining the business,” he added.

Unlike the current approach, which is based on a combination of business rules from past experience, he explained, the data-driven model is based on measurable information that can be put in a mathematical model. “The equation, or series of equations, gives you certain types of results, such as high or low probability of success as one kind of outcome,” Miccolis said.
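As a rough illustration of the kind of equation Miccolis describes, the sketch below fits a simple logistic regression that turns measurable inputs into a probability of an outcome. The feature names, values and library choice (scikit-learn) are illustrative assumptions, not any particular insurer's model.

```python
# Minimal sketch: measurable inputs feed a fitted equation that returns a
# probability of an outcome (e.g., a claim), rather than a yes/no judgment.
# Features and data are made up for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: years licensed, annual mileage in thousands (hypothetical features)
X = np.array([[2, 15], [10, 8], [25, 12], [5, 20], [30, 5], [1, 18]])
y = np.array([1, 0, 0, 1, 0, 1])  # 1 = claim filed, 0 = no claim

model = LogisticRegression().fit(X, y)

# The fitted model returns a probability for a new risk, not a verdict
new_risk = np.array([[3, 16]])
print(model.predict_proba(new_risk)[0, 1])  # estimated probability of a claim
```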

The advantage of applying analytics for decision-making is that the techniques provide an objective anchor, said John Lucker, whose titles at Deloitte Consulting include advisory principal, global advanced analytics market leader, and analytics strategist and evangelist. Without it, he said, “the best an organization can do is to have an average performance that is a function of the independent aggregated thinking of every person.” While some view the new approach as replacing people, Lucker believes it gives insurance companies a consistency that can otherwise be lost to employee turnover.

“Since most mainstream property-casualty insurance products are largely commoditized with companies struggling to differentiate themselves with distributors and customers,” he explained, “the analytics model allows insurers to address core functional problems and create a consistent and objective approach that should foster new ways to compete.”

Reaching that point requires operational changes. “Pursuing the data-driven analytics model requires multidisciplinary collaboration because insurance companies are siloed,” Modlin said.

Steve Lowe, a senior consultant with Willis Towers Watson, said that the transition from the traditional model to one that is data-driven often begins with combining actuaries and data scientists on innovation teams. “To some extent, the supply shortage forces you to concentrate the talent,” he explained.

As more quantitative professionals have a solid grasp of both disciplines and the supply shortage eases, Lowe explained, they will gradually be embedded in different departments such as claims, pricing, underwriting and marketing rather than working within a concentrated innovation team. Actuaries can learn data science techniques and data scientists can gain deeper industry knowledge through the iCAS program.

The transition to the data-driven insurance business model is experiencing resistance, as did the evolution of predictive modeling innovation in general. The reasons are also similar, especially the pushback from employees who are more comfortable with judgment and experience-based decisions.

Industry movement toward this new insurance management and decision-making approach is much like the Parable of the Sower: Some companies go into the process, persevere, and flourish; other companies find the ground not ready due to internal pushback. Numerous companies have, for example, invested in the technology to become more data-driven but then could not make the move, Miccolis said, while others adjusted incrementally and succeeded in the effort over time.

Still other insurers wait, adopt the “show me” approach and compare the results of models to human judgment, Lucker said, which is a very costly way to gain internal buy-in because of lost profit potential, potentially higher expenses and missed opportunities. Miccolis said that so far there are very few insurers that are comprehensively and holistically applying the data-driven modeling approach to improve their businesses.

The new model has its detractors. People tend to trust experience and educational qualifications more than data, Miccolis explained. On the other hand, he said, others see flaws in the traditional human judgment-based approach because people introduce cognitive biases into their decisions.

“Data is becoming more important than business relationships or clinical knowledge,” Lowe observed.

Conclusion

As actuaries experiment with meaningful data sources and discover appropriate applications for different predictive models, there are plenty of opportunities for fine-tuning model applications and even insurance functions. Realizing the promise of predictive modeling means addressing impediments on many fronts, from data quality to regulatory concerns.

As always, there is risk in an industry famous for caution, but if the past is a predictor of the future, predictive modeling will continue to challenge the status quo.


Annmarie Geddes Baribeau has been covering actuarial topics for more than 25 years. Her blog can be found at http://insurancecommunicators.com.

Data Sources and Their Usage Present Ethical Concerns

As personal consumer data becomes more plentiful and models less straightforward, concerns about data ethics are being more closely examined. Questions include: Should the insurance industry use this data and, if so, how should it do so appropriately?

Coupling insurance companies’ internal data with consumer preference information, for example, became controversial a couple of years ago when consumer groups successfully crusaded against price optimization for determining customer premiums.

So far, approximately 20 states have limited or banned the use of price optimization models. “I applaud putting the brakes on optimization rating methods,” said Louise Francis, founder of Francis Analytics and Actuarial Data Mining, Inc. She considers this modeling application to be “predatory capitalism.”

Another important consideration is the appropriate use of information gathered from social media. Only three years ago, insurance companies would not publicly admit to using social media information to learn about consumers. Now there is greater acknowledgement of its use.

The kind of social media data that should be allowable for marketing and other purposes has not been clearly defined, said Laura N. Cali Robison, Oregon’s insurance commissioner and chair of the National Association of Insurance Commissioners’ Big Data (D) Task Force. “People have the responsibility to think about what they put in public view,” she explained. But people do not expect that a post on Facebook will affect their insurance or a bank loan, she observed.

Even actuaries hold differing views on the use of information posted on social media. Using consumer internet breadcrumbs about life events to locate potential auto insurance buyers is one approach some insurers currently take.

Stephen J. Mildenhall, a professor at St. John’s University’s risk management and insurance department, offered that data gathered from social media would be unreliable for insurers because people can post whatever they wish to make themselves look good to insurance companies.

To Mildenhall, rating variables should be directly related to risk and ideally should be controllable, so that insureds understand how their behavior affects premiums. For example, he explained, instead of rating by age, insurers could rate (as required in Massachusetts) by the number of years a person has held a driver’s license, because it better reflects driving experience. Workers’ compensation experience rating is another good example of basing premium on an employer’s actual experience instead of on a proxy.
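For readers unfamiliar with experience rating, the sketch below shows a deliberately simplified, credibility-weighted version of the idea: an employer's own loss history adjusts its manual premium. The credibility weight and figures are hypothetical, and the rating plans actually in use are more involved.

```python
# Deliberately simplified sketch of experience rating: premium reflects the
# employer's own loss history rather than a proxy. The credibility weight and
# numbers are hypothetical; real rating plans are more involved.

def experience_mod(actual_losses, expected_losses, credibility=0.3):
    """Credibility-weighted ratio of actual to expected losses."""
    return credibility * (actual_losses / expected_losses) + (1 - credibility)

manual_premium = 100_000  # premium before the experience adjustment
mod = experience_mod(actual_losses=80_000, expected_losses=100_000)

print(f"Experience mod: {mod:.2f}")                      # 0.94
print(f"Modified premium: {manual_premium * mod:,.0f}")  # 94,000
```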

Factors that the insured cannot change — age, sex, ethnicity and, for health and life insurance, pre-existing conditions and genes, among others — should not be included in rating, he added.

Regulatory constraints, whether real or perceived, are not the only forces affecting what might be considered the appropriate use of personal data. Public perception will also affect how insurers use greater sources of data, Cali Robison said. “‘How will I explain this to my policyholders?’ That might be a new big thing (to think) about,” she said.

“Insurers need to think of ways to use the data that are acceptable and a win-win for companies and customers,” said Jim Guszcza, U.S. chief data scientist at Deloitte Consulting. “[Changing] behavior through data may be a new 21st century way of being an actuary by helping insureds to understand and manage risk better.”

The internet of things (IoT) has such potential for reducing loss, Mildenhall said. Home sensors to shut off water leaks and to measure air quality have the potential to lower costs and make homes safer, he said. Telematics also has great potential for loss mitigation.

Francis, who is a consumer privacy advocate, questions using data stemming from IoT’s greater connectivity. “It’s always discussed in a positive light without thinking of the implications of using personal data.” For example, smart meters may present risks from malfunctions, or the data they generate may include private personal data, she said. Some consumers would rather opt out, but some public utilities now require their customers to use them.

The public debate concerning the collection, distribution and use of such data will continue. Companies in insurance and other industries will need to examine how they protect the public and ensure ethical practices regarding data and its uses.