It’s Not Just Twitter’s Problem: What Insurers Need to Know about DDoS and the Snake in the IoT Garden of Eden

On Friday, October 21, a massive Distributed Denial of Service (DDoS) attack made over 1,000 websites unreachable, including Twitter, Netflix, and PayPal. Two cloud providers, Amazon Web Services and Heroku, reportedly also experienced periods of unavailability.

The attack was directed against a key part of the internet's infrastructure: a domain name system (DNS) provider, Dynamic Network Services, better known as Dyn. When a person enters a web address such as twitter.com into a browser, the browser needs an IP address (a string of numbers and periods) to actually connect to that site. Domain name system providers are a critical source of those IP addresses.
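
To make the dependency concrete, here is a minimal Python sketch (standard library only) of the lookup a browser effectively performs behind the scenes; if the DNS provider cannot answer, the connection never happens. The hostname is purely illustrative.

    import socket

    def resolve(hostname: str) -> str:
        """Ask the configured DNS resolver for an IP address, much as a browser would."""
        try:
            return socket.gethostbyname(hostname)
        except socket.gaierror as err:
            # If the DNS provider is unreachable (as during the Dyn attack),
            # the site is effectively down even though its own servers are fine.
            return f"lookup failed: {err}"

    print(resolve("twitter.com"))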

On Friday, Dyn was the target of perhaps the largest DDoS attack ever recorded, as its servers were overwhelmed by tens of millions of requests for IP addresses. Because Dyn could not provide the correct IP addresses for Twitter and the other affected sites, those sites became unreachable for much of the day.

It also appears that the attack was mounted using a widely available malware program called Mirai. Mirai scans the internet for connected IoT devices (such as digital video recorders and IP cameras) whose administrative interfaces can be taken over using simple default user names and passwords, such as ADMIN and 12345. Mirai can then mobilize those devices into a botnet that executes a directed DDoS attack.

There are a number of potentially serious implications for insurers:

  • An insurer with a Connected Home or Connected Business IoT initiative that provides discounts for web-connected security systems, moisture detectors, smart locks, etc. may be subsidizing the purchase of devices which could be enlisted in a botnet attack on a variety of targets. This could expose both the policyholder and the insurer providing the discounts to a variety of potential losses.
  • If the same type of safety and security devices are disabled by malware, homeowners and property insurers may have increased and unanticipated losses.
  • As insurers continue to migrate their front-end and back-office systems to the cloud, the availability of those systems to customers, producers, and internal staff may drop below acceptable levels for certain periods of time.

The Internet of Things will change insurance and society in many positive ways. But the means used to mount the October 21 attack highlights vulnerabilities that insurers must recognize as they build their IoT plans and initiatives.

The Evolving Role of Architects

In the last couple of weeks I've had the great opportunity to spend time with IT architects of various sorts, both inside and outside of the insurance industry. The discussions have been illuminating and offer differing visions both for the technology that supports insurers and for the future of the architecture function within them.

One of the main events that allowed for this conversation was a round table held in London with architects from insurers. The main topics were the relevance of microservices-style architectures to insurance, the role of architects in AI and InsurTech, and the future role of architects at insurers. Another event that offered an interesting contrast was the inaugural London Software Architecture Conference, which I'll call SACon below (Twitter feed).


I won't fully define microservices here, but briefly it's an approach to delivering software in which each service is built as its own application that can be deployed and scaled independently of the other services.
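
As a rough illustration only (the endpoint, port, and payload are invented, not drawn from any insurer's system), a single microservice can be as small as one independently deployable application exposing one capability over HTTP:

    # Minimal sketch of one microservice, using only the Python standard library.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class QuoteHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/quote":
                body = json.dumps({"product": "motor", "premium": 432.50}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        # Each service runs, is deployed, and is scaled on its own,
        # typically behind an API gateway or service registry.
        HTTPServer(("0.0.0.0", 8080), QuoteHandler).serve_forever()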

Microservices as a way of delivering software was the default approach at SACon. There were sessions in which architects shared stories about why you sometimes have to work with a monolith, or even made the case for not putting the services in discrete applications. Meanwhile, at the round table the monolith was still the default, with the case being made for microservices in some parts of the architecture.

There are use cases where microservices make a great deal of sense, particularly in already distributed systems where a great deal of data is being streamed between applications. Here the microservices infrastructure and the libraries supporting the Reactive Manifesto, such as Hystrix and the Rx* family (e.g. RxJava), come into their own, and indeed one insurer related their use of microservices to support IoT. Others discussed using this style of approach and the tooling surrounding these architectures to launch new products and increase change throughput, but in all cases these were far from replacing the core architecture.
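
One idea those libraries popularised is the circuit breaker: stop calling a failing downstream service for a cool-off period so that its failures do not cascade through a distributed system. Below is a minimal Python sketch of the pattern; the class name and thresholds are illustrative assumptions, not any particular library's API.

    import time

    class CircuitBreaker:
        """Minimal circuit-breaker sketch: after repeated failures, stop calling
        the failing dependency for a cool-off period rather than cascading."""

        def __init__(self, max_failures=3, reset_after=30.0):
            self.max_failures = max_failures
            self.reset_after = reset_after
            self.failures = 0
            self.opened_at = None  # timestamp when the circuit opened, or None

        def call(self, func, *args, **kwargs):
            if self.opened_at is not None:
                if time.time() - self.opened_at < self.reset_after:
                    raise RuntimeError("circuit open: dependency presumed down")
                # Cool-off elapsed: allow one trial call ("half-open").
            try:
                result = func(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.max_failures or self.opened_at is not None:
                    self.opened_at = time.time()  # (re)open the circuit
                raise
            self.failures = 0      # success: close the circuit
            self.opened_at = None
            return result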

For now microservices is not the default for insurer software, but it is certainly a tool in the box. An observation or two from SACon for those looking to adopt: first, the style doesn't answer the question of how big a service or a component should be, something architects still need to discuss and refine; and second, microservices need a great deal of automation to make them work, a topic covered in our DevOps report to be published shortly.

Architects and AI

I have a background, through both training and experience, in computer science, AI, and machine learning. One thing I noticed at the analytics conferences where AI is discussed is the absence of IT representation: plenty of actuaries, MI/BI folks, and marketing folks, but was this a place for architects?

Most insurers present at the round table had AI activity somewhere within the organisation. For the most part only data architects are involved in this discussion, AI being treated as distinct from business and applications architecture for now. It's my opinion that AI components will form part of the wider applications architecture in the future, becoming as commonplace as conventionally programmed ones.

Architects and InsurTech

Here is an area where architects can more immediately contribute in a meaningful way, both in reviewing the opportunities and unique capabilities offered by InsurTech firms and in discussing integration where acquisition, rather than investment, is the goal.

The challenge here, of course, is the age-old challenge for architects: to have a seat in the discussion, the architecture function needs to demonstrate the value it can bring and its internal expertise.

Finally, one amusing discussion I had was with a few architects from startups. As I discussed legacy systems, they related seeing legacy systems in their own organisations, albeit systems that were two or four years old rather than 20 or 40. The intriguing thing was that the reasons for these becoming legacy were the same as at insurers: availability of skills, supportability, and responsiveness to changing demands. It may hearten architects at insurers that startups aren't immune to legacy issues!



In search of a new ‘dominant design’ for the industry. What does insurtech have to offer?

There is little in the world of insurtech happening today that insurers couldn't arguably choose to do for themselves if they were motivated to do it. They have the capital to invest. They have resources and could hire to fill gaps in any new capabilities required. Importantly, they understand the market and know how to move with the trends. And yet, despite having all of these things, they readily engage with the start-up community to do the things that arguably they could do for themselves. So, why is that?

In Making the Most of the Innovation Ecosystem, Mike Fitzgerald observes the main cultural differences between insurers and the start-ups they court. These cultural differences give us a strong clue as to why insurers engage with start-ups, even though on paper they do not and should not need them.

Alongside these deep cultural differences, I believe that there is another angle worth exploring to help answer the question, and that’s the market’s maturity stage and, with it, the strategies required to succeed.

One model that helps explain this relates to the work of Abernathy and Utterback on dynamic innovation and the concept of the ‘dominant design’. To be relevant to this discussion, you first need to believe that we’re on the cusp of a shift from an old world view of the industry based upon a well-understood and stable design towards one where substantial parts of the insurance proposition and value network are up for grabs. You also need to believe that, for a period at least, these two (or more) worlds will co-exist.

So, here’s a quick overview of the model (in case you’re not familiar with it)…

Settling on a “Dominant Design”

In work first published in the mid-1970s and based upon empirical research (famously using convergence on the QWERTY keyboard as an example), Abernathy and Utterback observed that when a market (or, specifically, a technology within a market) is new, there is first a period of fluidity in which creativity and product innovation flourish. During this period, huge variation in approaches and product designs can co-exist as different players in the market experiment with what works and what does not.

In this early fluid stage, a market is typically small, and dominated by enthusiasts and early adopters. Over time, a dominant design begins to emerge as concepts become better understood and demand for a certain style of product proves to be more successful than others. Here, within an insurance context, you'd expect to see high levels of change and a preference for self-build IT systems in order to control and lower the cost of experimentation.

Once the dominant design has been established, competition increases and market activity switches from product innovation to process innovation – as each firm scrambles to find higher quality and more efficient ways to scale in order to capture a greater market share. This is the transitionary stage. 

Finally, at the specific stage, competitive rivalry intensifies, spurred on by new entrants emulating the dominant design; incremental innovation takes hold; and a successful growth (or survival) strategy switches to one that follows either a niche or a low-cost commodity path. Within an insurance context, outsourcing and standardisation on enterprise systems are likely to dominate discussions.

Applying the ‘dominant design’ concept to the world of insurance and insurtech

Building upon the co-existence assumption made earlier, within the world of insurtech today there are broadly (and crudely) two types of firm: (1) those focused on a complete proposition rethink (such as Trov, Slice and Lemonade); and (2) those focused on B2B enablement (such as Everledger, Quantemplate and RightIndem). The former reside in the 'Fluid' stage (where the new 'dominant design' for the industry has not yet been set and still may fail) and the latter in the 'Transitionary' stage (where the dominant design is known, but there are just better ways to do it).

Figure: Innovation, Insurance and the 'Dominant Design'


(Source: Celent, adapted from Abernathy and Utterback, 1975)

Outside of insurtech, within the 'Specific' stage, there is the traditional world of insurance (where nearly all of the world’s insurance premiums still sit by the way) that is dominated by incumbent insurers, incumbent distribution firms, incumbent technology vendors, and incumbent service providers.

So what? 

What I like about this model is that it starts to make better sense of what I believe we're seeing in the world around us. It also helps us to better classify different initiatives and partnership opportunities, and encourages us to identify specific tactics for each stage – the key lesson being "not to apply a 'one-size-fits-all' strategy to the firm".

Finally, and more importantly, it moves the debate on from one about engaging insurtech start-ups purely to catalyze cultural change (i.e. to effect the things that the incumbent firms cannot easily do for themselves) towards one that asks more strategic and structural questions: will a new 'dominant design' for the industry really emerge? What will be its time frame to scale? And what specific actions are required to respond (i.e. to lead, or to observe and then fast-follow)?

Going back to my original question, "What does insurtech have to offer?": insurers can do nearly all of what is taking place within insurtech as it exists today by themselves… but, as stated at the start of this blog, if, and only if, they are motivated to do so.

And there's the rub. Many incumbents have been operating very successfully for so long in the 'specific' stage, optimizing their solutions, that making the shift required to emulate a 'fluid' stage is a major undertaking – why take the risk? However, this is not the only issue holding them back. For me, the bigger question remains whether there is enough evidence of an emerging new 'dominant design' for the industry in the 'fluid' stage that will scale to a size that threatens the status quo. Consequently, in the meantime, partnering and placing strategic investments with insurtech firms capable of working in a more 'fluid' way may offer a smarter, more efficient bet.

In a way, what we're seeing happening today between insurers and insurtech firms is the equivalent of checking out the racehorses in the paddock before a race. Let the race begin!







Where is the innovation in Individual life and annuity?

I had the pleasure of attending an amazing event last week in Las Vegas. The InsureTech Connect event drew over 1,500 people, from insurers to vendors to investors. Given the unprecedented size of an inaugural event, I was very impressed with how well the event worked. The sessions were good, but for me, the opportunity to have individual meetings with key industry players was even better. Our own Oliver Wyman was the primary sponsor of the event.

As I cover individual and group products, plus health, and have experience in P&C, I personally got a lot out of the event. I did have one major observation which I think speaks to the individual life and annuity industry. While I did not do a scientific study, I would estimate that over 50% of the content was focused on P&C insurance. This is not particularly surprising, as they have all the cool technology like drones. My estimate was that the group insurers and health insurers accounted for about 45% of the content, with an emphasis on topics like wellness programs and direct-to-consumer exchanges.

If you do the math, that leaves only 5% of the content for individual life and annuity products, and even that may have been a stretch. There was one session on eliminating the health data gathering for underwriting, which was well done and well attended, but beyond that, not so much.

Some insurers are diversifying into Group or Wealth Management, but I would not characterize that as innovation.

So what is holding us back as an industry? There are many things, from risk aversion to the length of the application to the sheer amount of data required for underwriting. I could write pages and pages on the topic, which explains why the next blog post you read from me will likely discuss the report I am finishing on this exact topic.

The potential for disruption in the space is huge and the coveted Millennial buyer is looking for just such innovation. Let’s make it happen.

The Rise and Rise of Analytics in Insurance

As noted in our prior research, insurance has always been an industry that relies on advanced analytics and has always sought to predict the future (as it pertains to risk) based on the past. (For research on advanced analytics in insurers, see here, here and here.)

As observed in the last post here, analytics, AI, and automation have been a key focus of InsurTech firms, but do not assume that the investment is limited to newbies and start-ups. For a few years now I have been attending and following the Strata+Hadoop conferences and others focused on advanced analytics and the broad range of tools and opportunities coming out of the big data organisations. This last week I attended a conference focused on the insurance industry and was surprised to see that the two worlds have finally, genuinely overlapped – just take a look at the sponsors.

As Nicolas Michellod and I have noted in the past, insurers have already been investing in these technologies, but only in those that have made the effort to speak "insurance". What the conversations at Insurance Analytics Europe (twitter feed) demonstrated was a new focus on core data science tools and capabilities. This continued the theme from DIA Barcelona (twitter) earlier in the year.

The event followed InsTech London's meeting (Twitter) looking at data innovation and its opportunities for Lloyd's, the London market, and the TOM initiative. Here the focus was on InsurTech firms that would partner on analytics, sell data, or enable non-data scientists to benefit from advances in machine learning, predictive analytics, and other advanced analytics disciplines.

While this trend of democratising advanced analytics was discussed by analytics heads and CDOs at the analytics conference, the focus there was much more on communicating value, surfacing existing capability and tools within the organisation and, to put it bluntly, getting better at managing data.

In short, AI, analytics, machine learning, and automation were all hot topics at InsurTech Connect and similar events, but for the insurers out there: don't assume these are purely the domain of InsurTech. Insurers are increasingly investing in these capabilities, which in turn is attracting firms with a great deal to offer our industry. And for those big data firms that ruled out insurance as a target market a couple of years ago: look again, the appetite is here.

As a techy and AI guy of old I am deeply enthused by this focus and excited to see what new offerings come out of the incumbent insurers and not just InsurTech.

Do have a look at the aware machine report and the blog too. We're increasing our coverage in this area, so if you have a solution focused on this space please reach out to Nicolas, Mike, or me so we can include you; and for the insurers, look out for a report shortly.


Life Insurance Automated Underwriting – A 25 Year Journey

Automated underwriting has come a long way in the last 25 years. It may be surprising that there was automated underwriting 25 years ago. At that time, it was called ‘expert’ underwriting. The idea was right, but the timing was wrong. The underwriting engines were black box algorithms; there was no user interface; data was fed from a file to the system; programming was required to write rules; and specialized hardware was necessary to run the systems. Not surprisingly, this attempt at automating underwriting was dead on arrival.

The next major iteration occurred about ten years later. Automated underwriting systems included a user interface; rules were exposed (some programming was still required to change the rules); data interfaces were introduced to collect evidence from labs and the medical inquiry board; underwriting decisions could be overridden by the human underwriter; and workflow was provided. Some insurers chose to take a chance on this new technology, but it was not widely adopted. There were two strikes against it: cost and trust. The systems were expensive to purchase, and the time and costs involved in integrating and tailoring the systems to a specific company’s underwriting practice could not be outweighed by the benefits. The lack of benefits was partially because the underwriters did not trust the results. Many times this caused double work for the underwriters. The underwriters reviewed the automated underwriting results and then evaluated the case using manual procedures to ensure the automated risk class matched the manual results.

Moving ahead fifteen years to today, changes in the underwriting environment place greater demands on staff and management. Staff members are working from home, and contractors are floating in and out of the landscape, all while reinsurers are knocking on the insurer’s door. There are now state-of-the-art new business and underwriting (NBUW) systems that address the challenges associated with the new demands. The solutions do not just assess the risk but provide workflow, audit, and analytics capabilities that aid in the management process. Rules can be added and modified by the business users; evidence is provided as data so that the rules engine can evaluate the results and provide the exceptions for human review. Subjective manual random audits of hundreds of cases evolve into objective, data-driven perspectives from thousands of cases. Analytics provide insights on specific conditions and impairments over the spectrum of underwritten cases to provide a portfolio view of risk management. Underwriting inconsistencies become easy to find and specific training can be provided to improve quality.
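
To make the rules-engine idea above concrete, here is a minimal Python sketch of rules-based triage: evidence arrives as data, the rules flag exceptions for human review, and clean cases pass straight through. The field names, thresholds, and rule set are invented for illustration and are not drawn from any vendor's system.

    # Minimal sketch of rules-based underwriting triage (illustrative rules only).
    RULES = [
        ("bmi_out_of_range", lambda e: not 17 <= e["bmi"] <= 32),
        ("elevated_a1c", lambda e: e["a1c"] > 6.4),
        ("tobacco_mismatch", lambda e: e["declared_tobacco"] != e["lab_cotinine_positive"]),
    ]

    def triage(evidence: dict) -> dict:
        """Return an automated decision, or route the case to a human underwriter."""
        exceptions = [name for name, rule in RULES if rule(evidence)]
        if exceptions:
            return {"decision": "refer_to_underwriter", "exceptions": exceptions}
        return {"decision": "standard_issue", "exceptions": []}

    case = {"bmi": 27.0, "a1c": 5.6, "declared_tobacco": False, "lab_cotinine_positive": False}
    print(triage(case))  # {'decision': 'standard_issue', 'exceptions': []}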

In our report, Underwriting Investments that Pay Off, Karen Monks and I found that the differences between insurers that are minimally automated and those that are moderately to highly automated are substantial. For minimally automated insurers, the not-in-good-order (NIGO) rates are four times higher, the cycle times are 30% longer, and the case-manager-to-underwriter ratio is almost double the metrics for moderately to highly automated insurers. This outcome may not reflect your specific circumstances, but it is worth preparing a business case to understand the benefits. With the advances in the systems and the advantages provided for new business acquisition, there are few justifications for any company not to seek greater automation in its underwriting.

To learn more about the adoption of current NBUW systems and the functionality offered in them, please read our new report, What's Hot and What's Not: Deal and Functionality Trends and Projections in the Life NBUW Market, or join our webinar on this topic on Thursday, September 29. You can sign up here.



The Muslin is off the Lemon — Lemonade Launches

Today’s announcement by Lemonade provides an example of what actual disruption in insurance looks like. Disruption — the term is overused in the hype around innovation. In Celent’s research on innovation in insurance, we see that what is often tagged as disruptive is actually an improvement, not a displacement, of the existing business model.

The information released describes how Lemonade seeks to replace traditional insurance. Yes, they have built a digital insurance platform. Beyond that significant feat, they seek to replace the profit-seeking motive of their company with one based on charitable giving, acting as a Certified B-Corp (more info on B-Corps). They are also using the charitable motive as the guide to establish their risk sharing pools, thus creating the peer-to-peer dimension. Unlike other P2P efforts, Lemonade goes beyond broking the transaction and assumes the risk (reinsured by XL Catlin, Berkshire Hathaway and Lloyd’s of London, among others).

However, like other P2P models, such as Friendsurance, Lemonade faces a real challenge regarding customer education. The Celent report Friendsurance: Challenging the Business Model of a Social Insurance Startup — A Case Study details the journey of the German broker along a significant learning curve regarding just how much effort was required to teach consumers a new way to buy an old product.

The next few weeks will surface answers to the second-level questions about this new initiative, such as:

  • How/if their technical insurance products differ from standard home, renters, condo, and co-op contracts;
  • What happens to members of a risk sharing pool when the losses exceed funding;
  • Whether the bedrock assumption, that a commitment to charity will overcome self-interest, will hold and deliver the expected levels of fraud reduction.

It is refreshing to see some disruption delivered in the midst of all the smoke around innovation. Celent toasts Lemonade and welcomes this challenge to business as usual!


Changing the Landscape of Customer Experience with Advanced Analytics

That timeless principle – “Know Your Customer” – has never been more relevant than today. Customer expectations are escalating rapidly. They want transparency in products and pricing; personalization of options and choices; and control throughout their interactions.

For an insurance company, the path to success is to offer those products, choices, and interactions that are relevant to an individual at the time that they are needed. These offerings extend well beyond product needs and pricing options. Customers expect that easy, relevant experiences and interactions will be offered across multiple channels. After all, they get tailored recommendations from Amazon and Netflix – why not from their insurance company?

Carriers have significant amounts of data necessary to know the customer deeply. It’s there in the public data showing the purchase of a new house or a marriage. It’s there on Facebook and LinkedIn as customers clearly talk about their life changes and new jobs.

One of the newest trends is dynamic segmentation. Carriers are pulling in massive amounts of data from multiple sources, creating finely grained segments, and then using focused models to dynamically re-segment customers as their behaviors change.

This goes well beyond conventional predictive analytics. The new dimension is the dynamic nature of the segmentation. A traditional segmentation model uses demographics to place a customer into a broad tier and leaves them there. But with cognitive computing and machine learning, an institution can create finely grained segments and can rapidly change that segmentation as customer behaviors change, as sketched below.
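
As a rough illustration of what "dynamic" means here, the sketch below uses scikit-learn's MiniBatchKMeans to assign customers to fine-grained behavioral segments and then updates the model incrementally as new behavioral data arrives, so customers can move between segments over time. The features, cluster count, and data are invented for illustration and not drawn from any carrier's model.

    # Sketch of dynamic segmentation with incremental (continuously learning) clustering.
    import numpy as np
    from sklearn.cluster import MiniBatchKMeans

    rng = np.random.default_rng(0)

    # Columns (illustrative): monthly logins, recent claims, policy count, channel-mix score.
    initial_behavior = rng.random((5000, 4))

    model = MiniBatchKMeans(n_clusters=20, random_state=0)
    model.fit(initial_behavior)                     # build the initial fine-grained segments
    segments = model.predict(initial_behavior)

    # Later: a fresh batch of behavioral data arrives (e.g., an overnight feed).
    new_behavior = rng.random((500, 4))
    model.partial_fit(new_behavior)                 # update the segmentation incrementally
    updated_segments = model.predict(new_behavior)  # customers may land in different segments

    print(np.bincount(updated_segments, minlength=20))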

To pull off this level of intervention at scale, a carrier needs technology that works simply and easily, pulling in data from a wide variety of sources – both structured and unstructured.

The technology needs to be able to handle the scale of real-time analysis of that data and run the data through predictive and dynamic models. Models need to continuously learn and more accurately predict behaviors using cognitive computing.

Doing this well allows a carrier to humanize a digital interaction and, in a live channel, to augment the human so they can scale, allowing them to focus on what they do best: building relationships with customers and exercising judgment around the relationship.

Sophisticated carriers are using advanced analytics and machine learning as powerful tools to find unexpected opportunities to improve sales and marketing and to redefine the customer experience. These powerful tools allow carriers to go well beyond simple number crunching and reporting and improve their ability to listen to and anticipate the needs of customers.

The 2017 Model Insurer Nominations Start Now

It's been five months since we presented Zurich with our top distinguished award, Model Insurer of the Year, during our Innovation & Insight Day (I&I Day) on April 13. I&I Day has been growing and gaining recognition since its inception over 10 years ago. Over the last two years, more than 250 financial services professionals joined us in New York City at Carnegie Hall in 2015 and at The Museum of American Finance in 2016 to celebrate the Model Insurer winners.

From September 15, we will be accepting Model Insurer nominations. The window for new entries will close on November 30. We are looking forward to receiving your best IT initiatives. You may be announced as a Model Insurer at our I&I Day in 2017. The Model Insurer award program recognizes projects that essentially answer the question: What would it look like for an insurance company to do everything right with today’s technology? It awards insurance companies which have successfully implemented a technology project in five categories:

  • Data mastery and analytics.
  • Digital and omnichannel technology.
  • Innovation and emerging technologies.
  • Legacy transformation.
  • Operational excellence.

Some examples of initiatives that we awarded earlier this year are:

Model Insurer of the Year   

Zurich Insurance: Zurich developed Zurich Risk Panorama, an app that allows market-facing employees to navigate through Zurich’s large volumes of data, tools and capabilities in only a few clicks to offer customers a succinct overview of how to make their business more resilient. Zurich Risk Panorama provides dashboards that collate the knowledge, expertise and insights of Zurich experts via the data presented.

Data Mastery & Analytics

Asteron Life: Asteron Life created a new approach to underwriting audits called End-to-End Insights. It provides a portfolio level overview of risk management, creates the ability to identify trends, opportunities and pain points in real-time and identifies inefficiencies and inconsistencies in the underwriting process. 

Celina Insurance Group: Celina wanted to appoint agents in underdeveloped areas. To find areas with the highest potential for success, they created an analytics based agency prospecting tool. Using machine learning, multiple models were developed that scored over 4,000 zip codes to identify the best locations.

Farm Bureau Financial Services: FBFS decoupled its infrastructure by replacing point-to-point integration patterns with a hub-and-spoke architecture. They utilized the ACORD Reference Architecture Data Model and developed near real-time, event-based messages.

Digital and Omnichannel

Sagicor Life Inc.: Sagicor designed and developed Accelewriting®, an eApp integrated with a rules engine, which uses analytic tools and databases to provide a final underwriting decision within one to two minutes on average for simplified issue products.

Gore Mutual Insurance Company: Gore created uBiz, the first complete ecommerce commercial insurance platform in Canada by leveraging a host of technology advancements to simplify the buying experience of small business customers.

Innovation and Emerging Technologies

Desjardins General Insurance Group: Ajusto, a smartphone app for telematics auto insurance, was launched by Desjardins in March 2015. Driving is scored based on four criteria. The cumulative score can be converted into savings on the auto insurance premium at renewal.

John Hancock Financial Services: John Hancock developed the John Hancock Vitality solution. As part of the program, John Hancock Vitality members receive personalized health goals. The healthier their lifestyle, the more points they can accumulate to earn valuable rewards and discounts from leading retailers. Additionally, they can save as much as 15 percent off their annual premium.

Promutuel Assurance: Promutuel Insurance created a new change management strategy and built a global e-learning application, Campus, which uses a web-based approach that leverages self-service capabilities and gamification to make training easier, quicker, less costly, and more convenient.

Legacy Transformation

GuideOne Insurance: GuideOne undertook a transformation project to reverse declines in its personal lines business. They launched new premier auto, standard auto, and non-standard auto products, as well as home, renter and umbrella products on a new policy administration system and a new agent portal.

Westchester, a Chubb Company: Chubb Solutions Fast Track™, a robust and flexible solution covering core business functionality, was built to support Chubb’s microbusiness unit’s core mission of establishing a “Producer First,” low-touch mindset through speed, accessibility, value, ease-of-use and relationships.

Teachers Life: Teachers Life has achieved a seamless, end-to-end online process for application, underwriting, policy issue and delivery for a variety of life products. Policyholders with a healthy lifestyle and basic financial needs can get coverage fast, in the privacy of their own homes, and pay premiums online in as little as 15 minutes.

Operational Excellence

Markerstudy Group: Markerstudy implemented the M-Powered IT Transformation Program, which created an ecosystem of best-in-class monitoring and infrastructure visualization tools to accelerate cross-functional collaboration and remove key-man dependencies.

Guarantee Insurance Company: In order to focus on their core competency of underwriting and managing a large book of workers compensation business, Guarantee Insurance outsourced its entire IT infrastructure.

Pacific Specialty Insurance Company: In line with its vision to become a virtual carrier, in which all critical business applications are housed in a cloud-based infrastructure, PSIC implemented its core systems in the cloud while upgrading infrastructure to accommodate growth in bandwidth demands.

If you have completed a project during the last two years that you feel is a role model for the industry, don’t hesitate to send us your initiative here. You may be the next Model Insurer of the year.

For more information about the Model Insurer program click here, leave a comment, or email me directly; I'd be more than happy to talk with you. The Celent team and I are looking forward to hearing from you and meeting you in person at the 2017 Innovation & Insight Day.

See you there!

Using private consumer data in insurance: Mind the gap!

Insurance is no different from other industries when it comes to capturing valuable data to improve business decisions. At Celent we have already discussed how and where in their operations insurance companies can leverage the private consumer data they can find on social networks, blogs, and so on. For more information you can read a report I published this year explaining Social Media Intelligence in insurance.

There are various factors influencing insurers' decisions to actively use the private consumer data out there, including, among others, regulation, resource adequacy, and data access and storage. I think an ethical dimension will play a more important role going forward. More precisely, I wonder whether consumers' and insurers' perceptions about the use of private consumer data are divergent or similar:

  • What do consumers really think about insurance companies using their private data on social networks and other internet platforms?
  • What about insurers; does it pose an issue for them?

In order to assess this ethical dimension, we asked both insurers worldwide and consumers (in the US, UK, France, Germany, and Italy) for their views on this topic. We asked insurers what best described their opinion about using consumer data available on social networks (Facebook, Twitter, LinkedIn, etc.) and other data sources on the internet (blogs, forums, etc.). We asked consumers for their opinions about insurers using these open data sources to track people potentially engaged in fraud or criminal activity.

The following chart shows the result and indicates that there is a big gap between the two sides:


Overall, what is good for consumers is not necessarily good for insurers. In the same way, what insurers want is not always in line with what consumers expect from them. Going forward, the question for insurance companies will be to find the right balance between the perceived value of private consumer data and customer satisfaction. In addition, it will be tough for them to figure out the impact (pros and cons) of all the factors at play in the decision to invest in technologies allowing for the efficient use of private consumer data accessible on the internet.

At Celent, we are trying to define a framework that can help them structure their reasoning and make an optimal decision. So more to come in the coming weeks on this topic…