Archives for October 2009

The least bad decision

As we have just entered the last quarter of the year, economists and business leaders are trying to anticipate what 2010 will bring. So are policy makers, and last week the Federal Reserve published its Beige Book. For those of you who are not familiar with the Beige Book, it is a report published eight times a year that gathers economic data from the 12 regional Federal Reserve districts, with the objective of summarizing the state of the US economy. The Beige Book is also an important tool the Federal Reserve relies on when making decisions about interest rates.

The October 2009 edition of the Beige Book delivers two important pieces of news: one good and one bad. Let’s start with the good one: the US economy seems to have reached its bottom, or at least it no longer shows signs of further deterioration. The bad news is that US consumption is weak, and since consumption represents 70% of US GDP, the recovery will be slow.

Overall there are two things to learn from this edition of the Beige Book. First of all, Americans seem to be changing their habits and behaviour. They tend to save more, and I strongly believe that this change is good for the world economic system in the long run, even though it does not favour a strong recovery in the short term. Indeed, US consumers could not keep spending so much on debt. The second interesting aspect relates to how the Federal Reserve will adapt its monetary policy going forward, knowing that US consumers are spending less. On the one hand, low interest rates facilitate speculation and contribute to weakening the US dollar. To a certain extent this situation could generate a new bubble (I wonder whether that is not already the case). On the other hand, hiking interest rates would slow down or even stop a recovery that is already slow.

In summary, there are no good alternatives to choose from. The US and world economies depend on choices between bad scenarios, and the challenge will be to choose the least bad. But when we think about where the world economy was last year, this is maybe not a bad situation to be in now.

A Toe-hold for Social Media in Insurance – Affinity Marketing

There is a lot of buzz about the use of social media, but most companies are only beginning to look at how to make it work in insurance. The experimentation continues, but I had not heard of an approach which blends this new opportunity with the realities of our industry — until today.

Almost all of the discussions about using these tools involve going direct to the consumer. For example, it is common to hear the suggestion that a company set up a Facebook page where insureds can sign up as “fans”. However, the majority of insurance in the U.S. is sold through an intermediary, usually an independent agent. How can an insurance company best leverage social media given the distribution realities of the marketplace?

One way is by providing services to agents who already have affinity marketing schemes in place. These services would include education for the agent (what is social media and why should I care?), strategy planning with them (where should my agency establish a presence: Facebook? LinkedIn?), and implementation assistance (how do I get started, and how does my agency use it on an ongoing basis to drive business?).

As an example, consider an agency with a program that sells personal insurance products to teachers. An insurance carrier can add value to its agency force and create goodwill with this agency by advising the distributor on where best to find teachers on the web, which tools will reach them most effectively, and how to follow up on the postings that will be made.

Most insurers already have the infrastructure in place to manage agents with affinity relationships, so this service can be an addition to that rather than a new department requiring significant investment. Some cross-functional knowledge transfer between IT and Marketing can establish a skill base that agents will find most valuable.

In these days when revenue is shrinking and funding is constrained, an innovative approach that is low cost and leverages existing relationships is a winner.

Is there such a thing as a mainframe monopoly?

As discussed in detail in The New York Times, the Justice Department is starting a preliminary antitrust inquiry into IBM. I can’t speak to whether certain actions taken by IBM or other companies were lawful or not, but I do think the investigation speaks more to current problems in the industry than it does to any particular wrongdoing. Much of the action seems to be driven by the fact that IBM has a near-monopoly on mainframes and that businesses rely on mission-critical code that can only run on these mainframes.

The issue here is not just IBM’s mainframe monopoly but, rather, the fact that companies are relying on code that was written twenty or thirty years ago. I say the mainframe monopoly is not the main issue because calling it a monopoly ignores the realities of modern computing. Customers clearly have options beyond the mainframe; most (if not all) consider and purchase modern servers for production systems that either run alongside mainframes or have replaced them. A mainframe monopoly is like a train monopoly: it might be a point of contention if one company owned all the trains, but that company would still be competing against all the very prevalent, modern options (trucks, boats, cars, planes) for shipping and travel.

One cited “proof point” of this monopoly is that modern server prices have fallen by over 40 percent since 2001 while mainframe prices have fallen less than 13 percent. But the server price is only one element of the total cost of ownership. The mainframe has a reputation for reliability, scalability, and throughput. At an enterprise level, servers are typically positioned as a mainframe replacement in part because enough of them can be bought cheaply to duplicate the mainframe’s value in volume. The total cost of ownership (which includes hardware, software, and services) for all those servers compared to a single mainframe is a point of much debate: some state that the total cost of ownership for the mainframe is lower, while server proponents claim the opposite. But if a single mainframe can potentially offer a lower total cost of ownership than many servers, it makes sense that its hardware price would remain higher. And regardless of whose numbers you believe, the modern server is positioned as a commodity relative to the mainframe, not because of any lack of monopoly status but simply because cheapness is one of its core values. If part of the server philosophy is cheap hardware, it makes sense that the price would decline more rapidly.

The real issue cited in the action is not a hardware monopoly but the fact that many companies have mission-critical systems running on these mainframes; systems they have invested too much money in to move. Any company that can never move off its existing base of code because it has invested “too much” in it will one day face extinction, if it does not face it already. From where I sit, the software and services industry is fueled by the multi-billion dollar business of selling companies modern solutions and consulting to migrate off legacy solutions. In fact, IBM, as a major force in professional services, is one of the companies helping customers move off legacy code to modern enterprise software. Has IBM prevented other vendors from creating options that would allow companies to retain their legacy code but run it on cheaper, modern servers? I am not going to comment on the truth or legality of such actions.
My concern is, and always has been, how to plan for the future IT direction of a company. Even if such legacy code alternatives have been prevented, they would only have been temporary measures anyway. If a company takes 30-year-old legacy code and moves it from a mainframe to a server, that company’s problem is not solved. It may mean lower-cost maintenance for the next few years (though, as discussed earlier, the total cost of ownership is in question), but that company still needs to consider its true next step. Otherwise this is sweeping a long-term problem under the rug with a short-term fix.
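To make the total cost of ownership argument above concrete, here is a minimal sketch in Python. Every figure and the tco helper are hypothetical placeholders invented for illustration; they are not data from the inquiry, from IBM, or from any published benchmark.

```python
# Hypothetical TCO comparison: one mainframe vs. a fleet of commodity
# servers. Every number below is an illustrative assumption, not real
# pricing data.

def tco(hardware, software_per_year, services_per_year, years):
    """Total cost of ownership over a planning horizon, in dollars."""
    return hardware + (software_per_year + services_per_year) * years

YEARS = 5

# One mainframe: expensive hardware, but a single box to license and run.
mainframe = tco(hardware=1_500_000,
                software_per_year=200_000,
                services_per_year=150_000,
                years=YEARS)

# Forty commodity servers: cheap individually, but licensing and
# administration costs scale with the number of boxes.
SERVERS = 40
fleet = tco(hardware=SERVERS * 10_000,
            software_per_year=SERVERS * 8_000,
            services_per_year=SERVERS * 9_000,
            years=YEARS)

print(f"Mainframe TCO over {YEARS} years: ${mainframe:,}")
print(f"Server fleet TCO over {YEARS} years: ${fleet:,}")
# With these made-up figures the fleet's recurring costs outweigh its
# cheap hardware, illustrating how a mainframe's sticker price can stay
# high while its total cost of ownership remains competitive.
```

Under these assumed numbers the single mainframe comes out around $3.25 million against $3.8 million for the fleet; swap in different assumptions and the comparison flips, which is exactly why the TCO debate remains unresolved.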

The World (of U.S. P&C pricing) Is Flat

SNL reported today on two benchmarks for Property and Casualty pricing in 2010. The bottom line is that revenue support is not going to come from a hardening market. At best, prices may stay flat.

According to SNL, a MarketScout report forecast flat to slightly higher prices, while a Willis study concluded that price increases would be difficult to come by.

All of this points to the importance of keeping business that is currently on the books and looking for ways to win business from other carriers. As noted in other blog postings, what changes in this economic environment is that growth will come from service, distribution channel management, and acquisitions. New-market and innovative-product initiatives will not be the engines that they have been in the past.

For systems professionals, this means lowering expenses and raising quality on what is already being done, and selectively investing in shorter-timeline projects that have a direct revenue impact.

Large UK Insurer Tackles the Legacy System Challenge

Reuters has reported a possible offshoring deal that will see Royal Bank of Scotland Insurance move its claims management system to India. RBSI, better known for its Direct Line and Churchill brands, is a very significant UK insurance operator. Although reports are unconfirmed at this stage, it is no surprise to those watching the industry that the Tier 1 players are investing even in the current climate. As discussed in last week’s post, where we commented on Axa’s IT investment here in the UK, there is significant momentum behind large IT investment decisions. Those legacy systems continue to impede transformative business strategies, and so the challenge must be addressed. These two companies highlight that there are different approaches, each suited to a certain set of circumstances. Either way, the resolute insurers are marching forward, and those without a similar plan will surely be left behind.

Tracking Insurer IT Investment Responses in Q3

We published our survey results from the second quarter discussing the insurance industry’s expectations and strategies for investment in the current economic climate. Opinions were gathered from over 120 insurance executives around the world, and the 30-second elevator pitch is that the future looks a hazy shade of pink. We’ve just opened the third-quarter survey to all insurers. Join in now (it only takes five minutes) and contribute to the view on how IT investment is faring.

12.3.09: Cloud Computing, Software as a Service, and Technology Outsourcing

Celent senior analyst Jeff Goldberg

This event is free to Celent clients and the media. Non-clients can attend for a fee of USD 249. Celent will contact non-clients for credit card information after they register.

Please click here for more information.