Lost in Innovation?

So, how do you avoid getting lost in innovation? The simple (and maybe glib) answer might be to buy a map and a compass, and start planning your route. However, what do you do when there is no map, no obvious path to take and no-one to follow?

The last 24 months have seen an incredible amount of activity across the sector in experimenting with novel proposition concepts fuelled by emerging technologies in the internet of things, distributed ledgers and bot-driven artificial intelligence. Although each new concept shows promise, we are yet to see a clear and obvious pattern for winning new clients or delivering a superior shareholder return using them. Many of the most exciting novel ideas (and many are genuinely exciting) are yet to see any real business volume behind them (see my earlier blog for additional context on what insurtech has to offer in defining the ‘dominant design’ for new tech-enabled propositions).

So, as an insurer faced with having to balance how much it should invest in these new concepts versus furthering the existing business in what is probably a highly successful and scalable model, two of the big questions we often hear from clients are: “Which of these nascent concepts are most likely to deliver real business value the fastest?” and “How much effort should I be devoting to exploring them today?” These are the questions that we looked to address at our latest event in London that we called ‘Lost in Innovation’, attended by just over 70 inquiring insurance decision makers.

Faced with uncertainty, we followed an agenda that focused on the things that an insurer can control, such as the innovation-led partnerships they enter, the skills they develop internally, the criteria used for measuring value, and the potential challenges ahead that they need to plan for.

Celent analyst Craig Beattie presenting on emerging software development approaches

Alongside presenting some of our latest research on the topic, we were joined on-stage by:

  • Matt Poll from NEOS (the UK’s first connected home proposition, in partnership with Hiscox), who shared his experience of the criteria for a successful partnership.
     
  • Jennyfer Yeung-Williams from Munich Re and Polly James from law firm Berwin Leighton Paisner, who shared their experience and views on some of the challenges standing in the way of further adoption, including the attitude of the regulator and the potential legal challenges presented by using personal data in propositions.
     
  • Dan Feihn, Group CTO at Markerstudy, who presented his view of the future and how they are creating just enough space internally to experiment with some radical concepts – demonstrating that you don’t always need a big-budget project to try out novel applications of new technologies.

So, what was the conclusion from the day? How do you avoid getting lost in innovation? Simply speaking, when concepts are so new that the direction of travel is unclear, a more explorative approach is required – testing each new path, collecting data and then regrouping to create the tools needed to unveil new paths further ahead until the goal is reached. Scaling concepts too early in their development (and before they are ready) may be akin to buying a 4×4 to plough through the scrub ‘on a hunch’ only to find quicksand on the other side.

Some tips shared to help feel out the way:

  • Partnerships will remain a strong feature of most insurers’ innovation activity over the next 12–24 months. Most struggle to create the space to try out new concepts. Realistically, many have neither the skills nor the time to experiment (given that their existing capabilities are optimised for the existing business). Consequently, partnerships create a way to experiment without “upsetting the applecart”.
     
  • Hiring staff from outside the industry can be a great way to change the culture internally and bring in fresh new ideas… however, unless there is an environment in place to keep them enthused, there remains a risk of them turning ‘blue’ and adopting the existing culture instead of helping to change it.
     
  • There are several ways to measure value created by an initiative. The traditional approach is a classic ‘Return on Investment’ (RoI). However, RoI can be hard to calculate when uncertainty is high. To encourage experimentation, other approaches may be better suited, such as rapid low-cost releases to test concepts and gather data to feel the way. Framing these in terms of an ‘affordable loss’ may be another way to approach it – i.e. “What’s the maximum amount that I’m willing to spend to test this out?” – accepting that there may not be an RoI for the initial step. Although no responsible insurer should be ‘betting the house’ on wacky new concepts, reframing the question and containing exposure can sometimes be all that’s required to create the licence to explore.
     
  • There’s still an imbalance between the promise of technology and the reality of just how far end-customers and insurers are willing to go in pursuit of value. The geeks (or ‘pathfinders’) have rushed in first – but will the majority follow? Regardless, to avoid getting lost in the ‘shiny new stuff’, a focus on customer value, fairness and transparency around how data is used needs to be at the heart of each proposition – plus recognition that the regulator will not be far behind.
     

In summary, the journey ahead needs to be less about the ‘what’ (with all of its bells, whistles and shiny parts) and more about the ‘how’ (deep in the culture of the firm and its willingness to experiment – even in small ways) – at least while the map to future value is still being drawn.

Celent continues to research all of these topics, including assessing the different technologies and techniques that insurers can use. Feel free to get in touch to discuss how Celent could assist your organisation further.

Celent clients will be able to access the presentations from the event via their Celent Account Manager.

How Insurity’s Acquisition of Valen Could Create a Virtuous Analytics Circle

It’s open season on insurance technology acquisitions in general, and for Insurity in particular. Today’s announcement of Insurity’s acquisition of Valen Analytics makes it Insurity’s fourth acquisition in a multi-year string: Oceanwide, Tropics, and in rapid succession Systema and Valen. The potential for cross-selling among the five customer bases is obvious. Less obvious, but of potentially even greater value, is Insurity’s ability to invite all of its insurer and other customers to use its Enterprise Data Solutions IEV solution as the gateway to Valen’s contributory database and Valen’s InsureRight analytic platform. Insurity now has the scale and the means to create a virtuous analytics circle: individual customers contributing large volumes of data through IEV to Valen and receiving back analytic insights to feed into their pricing, underwriting, and claims operations. Good move.

Conversation systems and insurance — one experience

To start with full disclosure, I am a huge fan of the Amazon Echo. We have them throughout the house, and have automated our home so Alexa can control most light switches, ceiling fans and more. We play music through them, ask for the weather, schedule appointments, and more.

All my kids are believers, from our five-year-olds on up. It’s fun to hear one of my five-year-olds ask Alexa to play the song YMCA and then burst into full song, including the dance. One personal recommendation: if you have an Echo and children, turn off voice purchases. I found out the hard way.

So I thought I would check out how Alexa does with insurance. My plan is to try all the skills and leverage them into a report. I may even have to purchase one of Google’s new Google Home devices just to compare them in this use case.

So I spent considerable time this morning trying to get an auto quote. Let’s just say the outcome was that I gave up. I won’t name the insurer, as I am sure that their Alexa skill works well in other areas such as information sharing and likely works for others to get a quote, but it sure did not for me. I do want to give credit to the insurer, as they are out on the bleeding edge doing these quotes.

First it asked me my birth year. It heard 1916. That’s not when I was born, but that’s what it heard. I tried to correct it, using the instructions it had provided, but no dice. I gave up and started over, only to be born in 1916 again. This time it was so stuck I had to unplug the Echo. I was surprised, as Alexa’s voice recognition amazes me.

I’m old, but I’m not 101 years old.

I finally made it through on the third try with very careful enunciation. Made it through my wife’s birth year and the fact we’re both married (apparently being married to each other wasn’t important).

Got to the question on what body style. I tried convertible, since, well, it is a convertible. That wasn’t an option. Since the app had prompted 2 door car as an example, I tried it. Um, no. That’s not supported. That seemed odd, but I tried car. Apparently car is OK.

Made it through miles driven a year.

Got to the age of the car. My car is a little older, but it’s no antique. However, apparently 12 years old is fatal, as the app crashed with “Sorry I am having trouble accessing your skill right now”.

OK, odd, but wireless sometimes blips, so no problem. Started over for the fourth time.

Worked my way through all the questions, enunciating very, very carefully and got to age of my car.

Yep. Crashed again.

At that point, I gave up and decided to write a blog instead.

Or I could have played a game of Jeopardy with Alexa.

CES 2017: Just How Smart Is AI Going to Make Connected Cars and Connected Homes?

Walking the exhibit halls and attending sessions at the mammoth Consumer Electronics Show, it was easy to identify the dominant theme: AI-enabled Intelligent Personal Assistants (IPAs).
  • Manufacturers and suppliers of connected cars and homes are betting big on IPAs: overwhelmingly favoring Amazon Alexa.
  • Impressionistically, Google Assistant, Siri, Cortana and others trailed some distance behind.
Natural language commands, queries and responses provide a vastly more intuitive UX. And these capabilities in turn make owning and using a connected home or car much more attractive. But there is a deeper potential benefit for the connected car and connected home sellers: developing context-rich data and information about the connected home occupants and the connected car drivers and passengers. This data and information include:
  • Who is in the house, what rooms they occupy—or who is in the car, going to which destinations
  • And what they want to do or see or learn or buy or communicate at what times and locations
Mining this data will enable vendors to anticipate (and sometimes create) more demand for their goods and services. (In a sense, this is the third or fourth generation version of Google’s ad placement algorithms based on a person’s search queries.) Here’s what this means for home and auto insurers:
  • As the value propositions of connected cars and homes increase, so does the imperative for insurers to enter those ecosystems through alliances and standalone offers
  • The IPA-generated data may provide predictive value for pricing and underwriting
  • IPAs are a potential distribution channel (responding to queries and even anticipating the needs of very safety- and budget-conscious consumers)
A note on terminology: the concept of “Intelligent Personal Assistants” is fairly new and evolving quickly. Other related terms are conversational commerce, chatbots, voice control, among others.

Guidewire makes blockbuster acquisition of ISCS

Long sought after by private equity firms, other insurers, and the occasional investment banker looking for a transaction, privately held ISCS has chosen to join Guidewire (NYSE:GWRE). ISCS adds its SurePower Innovation end-to-end suite to Guidewire’s existing InsuranceSuite end-to-end suite. This is a decided change of acquisition strategy for Guidewire. Up to now, all its acquisitions have fit into InsuranceSuite or added a single new element to it. Why? Well, if you are a publicly held company, growth is good. ISCS immediately brings more revenue and, more importantly, good market momentum with a solid sales pipeline. ISCS’ focus on small and midsize insurers brings a few other intriguing possibilities. One is that Guidewire and its SI alliance partners will now aim at the large and very large insurer market, leaving the small and midsize market to ISCS. A second is that ISCS will become a vehicle for small insurer growth outside of the US. A third is that ISCS’ more extensive cloud experience, especially with AWS, will step up Guidewire’s movement to the cloud. For now, Guidewire shareholders have a heckuva gift under their Christmas trees.

Have Electronic Applications Come of Age?

My first experience with an electronic application was in 2002. I was working with a major credit card company that included a flyer with the billing statement providing information about how to apply online for its term life insurance product. We didn't know how many applications to expect, but based on the wide distribution, we planned on a high number. Many months of effort went into developing the eApplication on the website and creating an interface for the collected data into the new business and underwriting system. This was cutting-edge technology at the time. The electronic application collected Part 1 of the application – demographic information. Part 2 – medical information – was collected by a third party. A whopping 523 applications were received from the first mailing. The campaign continued on an intermittent basis for a year, with just over 2,000 applications received. At the end of the year, we threw in the towel and quietly closed down the campaign.

Why did the campaign fail? There was nothing wrong with the process, and the technology, while primitive compared to today's, worked well. The problem was that the idea was ahead of its time. People were not ready to buy insurance on the internet. In fact, most of the applications received were declined or heavily rated. The people who applied were driven to do so by a less-than-stellar health history and had few other options available to them.

Flash forward to today; digitization of life insurance new business is a hot topic. Consumers are buying everything from mutual funds to groceries on the internet.  However, based on Celent’s recent new business and underwriting benchmarking report, Resetting the Bar: Key Metrics in Life Insurance New Business and Underwriting, nearly 52% of all insurance applications received are still in paper form.

There are a number of problems associated with paper applications, from missing forms to illegible handwriting, which create a tremendous impact on an insurer’s ability to process an application quickly and accurately. Industry benchmarks have placed NIGO (not in good order) rates at greater than 50%. Electronic applications essentially eliminate NIGO.
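The reason electronic applications eliminate NIGO is simple: an electronic form can refuse submission until every required field is present and well-formed, so errors are caught at the point of entry rather than weeks later by a case manager. As a minimal sketch only (the field names and rules are invented for illustration, not drawn from any real eApp product):

```python
# Hypothetical eApp validation: an application cannot be submitted
# "not in good order" because missing or malformed fields are flagged
# before submission ever happens.
REQUIRED_FIELDS = ["name", "date_of_birth", "face_amount", "signature"]

def nigo_errors(application: dict) -> list:
    """Return a list of NIGO problems; an empty list means 'in good order'."""
    errors = ["missing field: %s" % f for f in REQUIRED_FIELDS
              if not application.get(f)]
    face = application.get("face_amount")
    if face is not None and (not isinstance(face, (int, float)) or face <= 0):
        errors.append("face_amount must be a positive number")
    return errors

app = {"name": "Jane Doe", "date_of_birth": "1980-04-01", "face_amount": 250000}
print(nigo_errors(app))  # ['missing field: signature']
```

A paper form with the same omission would only surface in the new business department, after mailing and scanning; here the applicant is told immediately.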

Our research shows a significant reduction in new business cycle time for insurers between 2007 and 2016. For high face amount writers, the average cycle time decreased from 52 days to 44 days and from 42 days to 33 days for moderate face amount writers. When asked how the better results were obtained, the majority of insurers had seen a reduction in cycle time due to the use of technology. Some responses included “increase in eApp adoption and increased use of an automated UW engine,” “eApp, more skilled staff, cross-training with 60% automated underwriting, so huge reduction,” and “increase in auto-issue rate.” Obviously, the new business process is ripe for automation.

In Karen Monks’ and my new report, The Doorway to Straight-Through Processing: Life Insurance Electronic Applications 2016, we profile nine software vendors and their 10 electronic applications marketed to life insurers. The report focuses only on stand-alone solutions in North America. For each vendor, the solution is described in terms of customer base, data sources supported, functionality, and technology, as well as implementation and costs.

In 2002, the buying public wasn’t ready to shop for insurance online. That attitude is changing. An electronic application, along with an underwriting rules engine and electronic contract delivery to enable straight-through processing, will soon be the norm. The time for eApplications has arrived. An electronic application opens the door to transforming the insurance buying experience, increasing agent and customer satisfaction, and potentially selling more insurance.


It’s Not Just Twitter’s Problem: What Insurers Need to Know about DDoS and the Snake in the IoT Garden of Eden

On Friday, October 21, a massive Distributed Denial of Service (DDoS) attack made over 1,000 websites unreachable, including Twitter, Netflix and PayPal. Two cloud providers, Amazon Web Services and Heroku, reportedly also experienced periods of unavailability.

The attack was directed against a key part of the internet’s infrastructure: a domain name system (DNS) provider, Dynamic Network Services, aka Dyn. When a person enters a web address into a browser, such as google.com, the browser in turn needs an IP address (a string of numbers and periods) to actually connect with that web address. Domain name system providers are a critical source of IP addresses.
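The lookup step described above can be seen in a few lines of code. This minimal sketch uses only Python's standard library to ask the system's configured resolver for an address, which is exactly the step that fails when a DNS provider like Dyn is knocked out:

```python
import socket

def resolve(hostname: str) -> str:
    """Return an IPv4 address for a hostname via the system's DNS resolver.

    When a DNS provider is unreachable (as Dyn was on October 21), this
    lookup fails and the site becomes effectively unreachable, even
    though the site's own servers are still running.
    """
    return socket.gethostbyname(hostname)

# "localhost" resolves locally with no external DNS query; a public name
# like "google.com" would instead go out to a resolver such as Dyn's.
print(resolve("localhost"))  # 127.0.0.1
```

The point for insurers is that availability depends on this shared infrastructure: a site can be "down" for customers without anything on the insurer's own systems having failed.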

On Friday, Dyn was the target of perhaps the largest DDoS attack ever, when its site was overwhelmed by tens of millions of requests for IP addresses. Because Dyn could not provide the correct IP addresses for Twitter and the other affected sites, those sites became unreachable for much of the day.

It also appears that the DDoS was mounted using a widely available malware program called Mirai. Mirai searches the web for IoT-connected devices (such as digital video recorders and IP cameras) whose admin systems can be captured using simple default user names and passwords, such as ADMIN and 12345. Mirai can then mobilize those devices into a botnet which executes a directed DDoS attack.

There are a number of potentially serious implications for insurers:

  • An insurer with a Connected Home or Connected Business IoT initiative that provides discounts for web-connected security systems, moisture detectors, smart locks, etc. may be subsidizing the purchase of devices which could be enlisted in a botnet attack on a variety of targets. This could expose both the policyholder and the insurer providing the discounts to a variety of potential losses.
  • If the same type of safety and security devices are disabled by malware, homeowners and property insurers may have increased and unanticipated losses.
  • As insurers continue to migrate their front-end and back-office systems to the cloud, the availability of those systems to customers, producers, and internal staff may drop below acceptable levels for certain periods of time.

The Internet of Things will change insurance and society in many positive ways. But the means used to mount the October 21 attack highlights vulnerabilities that insurers must recognize as they build their IoT plans and initiatives.

The Evolving Role of Architects

In the last couple of weeks I’ve had the great opportunity to spend time with IT architects of various sorts both inside and outside of the insurance industry. The discussions have been illuminating and offer different visions and futures both for technology that supports insurers and for the future of the architecture function in insurers.

One of the main events that allowed for this conversation was a round table held in London with architects from insurers. The main topics were the relevance of microservices style architectures to insurance, the role of the architects in AI and InsurTech and the future role of architects at insurers. Another event that offered an interesting contrast was the inaugural London Software Architecture Conference which I'll call SACon below (Twitter feed).

Microservices

I won't fully define microservices here, but briefly, it’s an approach to delivering software where each service is built as its own application, which can be scaled independently from other services.
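To make that definition concrete, here is an illustrative sketch only (the service name, port and payload are invented): a microservice is typically just a small, self-contained application exposing one capability over HTTP, runnable and scalable on its own. This version uses nothing beyond Python's standard library:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

# A single-purpose "quote" service: it owns one capability and can be
# deployed, versioned and scaled independently of any other service.
class QuoteHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"service": "quote", "premium": 100.0}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging for the demo
        pass

def run(port: int = 8080) -> None:
    # Each service runs as its own process; scaling out means starting
    # more copies behind a load balancer rather than growing a monolith.
    HTTPServer(("127.0.0.1", port), QuoteHandler).serve_forever()
```

In a real estate of such services the interesting problems are the surrounding ones the round table raised: discovery, deployment automation and deciding how big each service should be.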

Microservices were the default way of delivering software at SACon. There were sessions where architects shared stories about why you sometimes have to work with a monolith, or even made the case for not putting services in discrete applications. Meanwhile, at the round table, the monolith was still the default, with the case being made for microservices in some parts of the architecture.

There are use cases where microservices make a great deal of sense, particularly in already distributed systems where a great deal of data is being streamed between applications. Here the infrastructure of microservices and the libraries supporting the reactive manifesto, such as Hystrix and Rx* (e.g. RxJava), come into their own – and indeed one insurer related their use of microservices to support IoT. Others discussed using this style of approach, and the tooling surrounding these architectures, to launch new products and increase change throughput, but in all cases these were far from replacing the core architecture.

For now, microservices are not the default for insurer software, but they are certainly a tool in the box. An observation or two from SACon for those looking to adopt: first, microservices don’t solve the question of how big a service or a component should be, something architects still need to discuss and refine; second, microservices need a great deal of automation to make them work, a topic covered in our DevOps report, to be published shortly.

Architects and AI

I have a background, with both training and experience, in computer science, AI and machine learning. One thing I noticed at the analytics conferences where AI is discussed is the absence of IT representation – plenty of actuaries, MI/BI folks and marketing folks – was this a place for architects?

Most insurers present at the round table had AI activity within the organisation. For the most part, only data architects are involved in this discussion – AI being distinct from business and applications architecture for now. It’s my opinion that AI components will form part of the wider applications architecture in the future, with AI components being as commonplace as programmed ones.

Architects and InsurTech

Here is an area where architects can more immediately contribute in a meaningful way both in reviewing opportunities and unique capabilities from InsurTech firms and in discussing integration where acquisition rather than investment is the goal.

The challenge here, of course, is the age-old challenge for architects: to have a seat in the discussion, the architecture function needs to demonstrate the value it can bring and its internal expertise.

Finally, one amusing discussion I had was with a few architects from startups. As I discussed legacy systems, they related seeing legacy systems in their own organisations – albeit systems that were 2 or 4 years old rather than 20 or 40. The intriguing thing was that the reasons for them becoming legacy were the same as at insurers – availability of skills, supportability and responsiveness to changing demands. It may hearten architects at insurers that startups aren’t immune to legacy issues!


In search of a new ‘dominant design’ for the industry. What does insurtech have to offer?

There is little in the world of insurtech happening today that insurers couldn’t arguably choose to do for themselves if they were motivated to do it. They have the capital to invest. They have resources and could hire to fill gaps in any new capabilities required. They importantly understand the market and know how to move with the trends. And yet, despite having all of these things, they readily engage with the start-up community to do the things that arguably they could do for themselves.  So, why is that?   

In Making the Most of the Innovation Ecosystem, Mike Fitzgerald observes the main cultural differences between insurers and the start-ups they court. These cultural differences give us a strong clue as to why insurers engage with start-ups, even though on paper they do not and should not need them.

Alongside these deep cultural differences, I believe that there is another angle worth exploring to help answer the question, and that’s the market’s maturity stage and, with it, the strategies required to succeed.

One model that helps explain this relates to the work of Abernathy and Utterback on dynamic innovation and the concept of the ‘dominant design’. To be relevant to this discussion, you first need to believe that we’re on the cusp of a shift from an old world view of the industry based upon a well-understood and stable design towards one where substantial parts of the insurance proposition and value network are up for grabs. You also need to believe that, for a period at least, these two (or more) worlds will co-exist.

So, here’s a quick overview of the model (in case you’re not familiar with it)…

Settling on a “Dominant Design”

First introduced in the mid-1970s and based upon empirical research (famously using convergence on the QWERTY keyboard as an example), Abernathy and Utterback observed that when a market (or, specifically, a technology within a market) is new, there first exists a period of fluidity where creativity and product innovation flourish. During this period, huge variation in approaches and product designs can co-exist as different players in the market experiment with what works and what does not.

In this early fluid stage, a market is typically small, and dominated by enthusiasts and early adopters. Over time, a dominant design begins to emerge as concepts become better understood and demand for a certain style of product proves to be more successful than others. Here, within an insurance context, you'd expect to see high levels of change and a preference for self-build IT systems in order to control and lower the cost of experimentation.

Once the dominant design has been established, competition increases and market activity switches from product innovation to process innovation – as each firm scrambles to find higher quality and more efficient ways to scale in order to capture a greater market share. This is the transitionary stage. 

Finally, at the specific stage, competitive rivalry intensifies spurred on by new entrants emulating the dominant design, incremental innovation takes hold and a successful growth (or survival) strategy switches to one that either follows a niche or low-cost commodity path. Within an insurance context, outsourcing and standardisation on enterprise systems are likely to dominate discussions.

Applying the ‘dominant design’ concept to the world of insurance and insurtech

Building upon the co-existence assumption made earlier, within the world of insurtech today, there are broadly (and crudely) two types of firm: (1) those focused on a complete proposition rethink (such as Trov, Slice and Lemonade); and (2) those focused on B2B enablement (such as Everledger, Quantemplate and RightIndem). The former reside in ‘Fluid’ stage (where the new ‘dominant design’ for the industry has not yet been set and still may fail) and the latter in the ‘Transitionary’ stage (where the dominant design is known, but there are just better ways to do it).

Figure: Innovation, Insurance and the 'Dominant Design'


(Source: Celent – Adapted from Abernathy and Utterback, 1975)

Outside of insurtech, within the 'Specific' stage, there is the traditional world of insurance (where nearly all of the world’s insurance premiums still sit by the way) that is dominated by incumbent insurers, incumbent distribution firms, incumbent technology vendors, and incumbent service providers.

So what? 

What I like about this model is that it starts to make better sense of what I believe we’re seeing in the world around us. It also helps us to better classify different initiatives and partnership opportunities, and encourages us to identify specific tactics for each stage – the key lesson being not to apply a ‘one-size-fits-all’ strategy to the firm.

Finally, and more importantly, it moves the debate on from one about engaging insurtech start-ups purely to catalyze cultural change (i.e. to effect the things that incumbent firms cannot easily do for themselves) towards one that begs more strategic and structural questions: will a new ‘dominant design’ for the industry really emerge? What will be its time-frame to scale? And what specific actions are required to respond (i.e. to lead, or to observe and then fast-follow)?

Going back to my original question, “What does insurtech have to offer?”: insurers can do nearly all of what is taking place within insurtech today by themselves… but, as stated at the start of this blog, if, and only if, they are motivated to do so.

And there’s the rub. Many incumbents have been operating very successfully for so long in the ‘specific’ stage, optimizing their solutions, that making the shift required to emulate a ‘fluid’ stage is a major undertaking – why take the risk? However, this is not the only issue holding them back. For me, the bigger question remains whether there is enough evidence of an emerging new ‘dominant design’ for the industry in the ‘fluid’ stage that will scale to a size that threatens the status quo. Consequently, in the meantime, partnering and placing strategic investments with insurtech firms capable of working in a more ‘fluid’ way may offer a smarter, more efficient bet.

In a way, what we’re seeing today between insurers and insurtech firms is the equivalent of checking out the racehorses in the paddock prior to a race. Let the race begin!


Life Insurance Automated Underwriting – A 25 Year Journey

Automated underwriting has come a long way in the last 25 years. It may be surprising that there was automated underwriting 25 years ago. At that time, it was called ‘expert’ underwriting. The idea was right, but the timing was wrong. The underwriting engines were black box algorithms; there was no user interface; data was fed from a file to the system; programming was required to write rules; and specialized hardware was necessary to run the systems. Not surprisingly, this attempt at automating underwriting was dead on arrival.

The next major iteration occurred about ten years later. Automated underwriting systems included a user interface; rules were exposed (some programming was still required to change the rules); data interfaces were introduced to collect evidence from labs and the Medical Information Bureau; underwriting decisions could be overridden by the human underwriter; and workflow was provided. Some insurers chose to take a chance on this new technology, but it was not widely adopted. There were two strikes against it: cost and trust. The systems were expensive to purchase, and the time and cost involved in integrating and tailoring the systems to a specific company’s underwriting practices could not be outweighed by the benefits. The lack of benefits was partly because underwriters did not trust the results, which often caused double work: underwriters reviewed the automated underwriting results and then evaluated the case using manual procedures to ensure the automated risk class matched the manual results.

Moving ahead fifteen years to today, changes in the underwriting environment place greater demands on staff and management. Staff members are working from home, and contractors are floating in and out of the landscape, all while reinsurers are knocking on the insurer’s door. There are now state-of-the-art new business and underwriting (NBUW) systems that address the challenges associated with the new demands. The solutions do not just assess the risk but provide workflow, audit, and analytics capabilities that aid in the management process. Rules can be added and modified by the business users; evidence is provided as data so that the rules engine can evaluate the results and provide the exceptions for human review. Subjective manual random audits of hundreds of cases evolve into objective, data-driven perspectives from thousands of cases. Analytics provide insights on specific conditions and impairments over the spectrum of underwritten cases to provide a portfolio view of risk management. Underwriting inconsistencies become easy to find and specific training can be provided to improve quality.

In our report, Underwriting Investments that Pay Off, Karen Monks and I found that the differences between insurers that are minimally automated and those that are moderately to highly automated are substantial. For minimally automated insurers, not in good order (NIGO) rates are four times higher, cycle times are 30% longer, and the case manager to underwriter ratio is almost double the metrics for moderately to highly automated insurers. This outcome may not reflect your specific circumstances, but it is worth preparing a business case to understand the benefits. With the advances in the systems and the advantages provided for new business acquisition, there are few justifications for any company not to seek greater automation in its underwriting.

To learn more about the adoption of current NBUW systems and the functionality offered in them, please read our new report, What’s Hot and What’s Not: Deal and Functionality Trends and Projections in the Life NBUW Market, or join our webinar on this topic on Thursday, September 29. You can sign up here.