
The right thing to do: the past, the present and the future of data standards in the insurance industry

date: 23 November 2023
reading time: 9 min

Data standards in the insurance industry are nothing new. “I’m a broker, I have been data handling for years and did not realise it,” said James Livett, Associate Director at the London & International Insurance Brokers’ Association, during his presentation at InsurTalk, our October event in London.

James, who has been active in the London Market for the last 37 years, is also a member of the PPL Board, the ACORD London Advisory Board, the LIMOSS Board and many market committees, and his family have now been in the insurance business for four generations.

James’ presentation was a fascinating story of the past, present and future of data standards in the London insurance industry. Let us take you through it, starting with a quick history lesson.


Data Standards in the 20th century

The end of the past century is when data standardisation started to take off. Until the 1970s, some standards existed, but they tended to multiply: Lloyd’s marine insurance, for example, used slightly different variations from non-marine, as did aviation. The eighties brought more standardisation.

As James said: “In 1986 data standards had already started in the London Market. I was working for a small underwriter. As the losses and claims came in, we all filed them through the same standard forms: pink was property, blue was marine and yellow was liability. Capturing the same data is still a data standard.”

In 2000, the Electronic Placing Support (EPS), an early e-trading solution, was introduced. “It was green screen, it was torturous, it was ahead of its time. It was painful, it was forward thinking, and it was unwanted. It was a huge amount of re-keying, the technology was not stable, integration wasn’t even a word. Few people were actually using technology.” – James recalled – “But it did open my eyes – and I’m sure many others – to how things could move forward.”

The beginning of the 21st century brought further improvements to the standards with LMP 2001, which created the first London Market Processing slip. New versions of the slip followed until the Market Reform Contract (MRC) arrived in 2007.

All of these slip versions, James argued, were ultimately ways of capturing and standardising data. Coming up with these standards meant companies didn’t have to create special processes for different classes. “We’ve now started generating standards – standard contracts, standard referencing. Excellent. Now we’re starting with the IT catching on with us,” James stated.


Is the London Market capable of change?

In his presentation, James addressed the argument, posed by some, that the London Market wasn’t capable of change. He pointed to a few examples of major changes the industry has gone through:

  • changing from the old-school slip format to the MRC format, with more than 100,000 policies across hundreds of firms, classes and other areas moving to a standardised format in 12 months
  • changing the way premiums were calculated, with 300 brokers getting rid of physical paper and starting to send data and PDFs
  • changing the way claims were handled and processed, with ECF (Electronic Claims File) and data

The industry has had many successes, but what it tends to forget, James argued, are the failures.


We failed miserably…

“Yes, I have just used the magic word ‘failure’. I’m perfectly happy to stand here and say that we failed miserably on some of these things,” admitted James. “Why? Because we didn’t bother asking the end user,” he explained, discussing examples of things that could have gone better – for example, the DDM system, where brokers were keying the same data five times because they hadn’t been included in the design of the system.

Some systems were good but suffered from poor buy-in. “Why didn’t they get embraced? Because somebody came along and said ‘here’s a new shiny thing’,” James argued.

Along came further projects, like the 2010 Future Process, the 2016 CSRP and the 2023 Blueprint Two standard.


The curse of the legacy systems

“Somebody once said to me ‘we’re trying to play mp3s on a gramophone’. Ultimately what we’re trying to do is use XML messaging and APIs, but underneath it is the old COBOL mainframe that has been in place for around 40 years. It went live in 1986, same time I did – I’ve got old and creaky and so have those systems,” commented James.

He argued the market had a habit of being reactive rather than proactive, pointing out that change normally happens as a result of regulatory imperative.


What do the successes and failures prove?

The successful and failed attempts at data standardisation have proven two things, James argued.

First, things fail without long-term commitment and market buy-in.

Second, there are huge business benefits to modernisation. “By bringing it all to the same slip, we didn’t have to have all these specialists, we could be crossing over. That’s an operational efficiency – everybody using the same standard means we didn’t need to set standards for systems. We proved we can do it if we bother putting our minds to it”, James said.


Remember this thing called ‘the pandemic’?

What did the Covid-19 pandemic do to the industry?

Two things:

  1. people had to switch to using digital systems which may not have been as efficient as they wanted, but the systems worked, and
  2. communication became much easier with people moving to videoconferencing.

James commented: “The brokers and underwriters are now sitting at home thinking ‘how am I going to get my nice rubber stamp on this contract? I can’t. So, hang on, they forced me to do it through this system… and it really works. Alright, maybe not as efficiently and effectively as we really want, but it works. (…) 18 months on there’s a dramatic shift after the country shutting down and there’s us emerging, blinking into the light and… arise, Blueprint Two.”


Blueprint Two – what is it?

Blueprint Two is a programme aimed at digitalising the London insurance market to make it better, faster and cheaper. It is about transforming the journey of placing risk and making claims for open market and delegated authority business. The programme is set to be delivered in multiple phases, with the first two phases starting in 2024.

In phase one, central services will be moved to a new single digital platform and processing services for open market and delegated authority business. All market firms must adopt phase one by 1 July 2024, but the change will be limited.

Phase two, starting from September 2024, will offer a set of services that fully utilise the new digital processing platform. Market participants can move from phase one to phase two at a time that is right for them.


Blueprint Two – components and phasing

James then went through the Blueprint Two process in a bit more detail. First comes the data definition, the Core Data Record (CDR), now on version 3.2. Then comes gathering the data, with a mechanism called the Market Reform Contract version 3 (MRC v3) and other, slightly more technologically advanced solutions.
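
To make the idea of a core data record a little more concrete, here is a minimal sketch in TypeScript of what a standardised placement record could look like as a typed structure. The field names below are purely illustrative assumptions for this article; the actual CDR field list, formats and validation rules are defined by the market, not by this example.

// A purely illustrative sketch of a standardised "core data record" for a placement.
// Field names and types are assumptions for this article; the real CDR defines
// its own field list, formats and validation rules.
interface CoreDataRecordExample {
  uniqueMarketReference: string;  // a UMR-style contract reference
  insuredName: string;            // the (re)insured party
  classOfBusiness: string;        // e.g. "Property", "Marine", "Aviation"
  inceptionDate: string;          // ISO 8601 date, e.g. "2024-07-01"
  expiryDate: string;
  premiumAmount: number;
  premiumCurrency: string;        // ISO 4217 code, e.g. "GBP"
  brokerCode: string;
  carrierCodes: string[];         // every carrier on the slip
}

// With every firm capturing the same fields in the same shape, one shared
// validation step can replace separate per-class, per-firm checks downstream.
function hasCoreFields(record: CoreDataRecordExample): boolean {
  return record.uniqueMarketReference.length > 0
    && record.premiumAmount > 0
    && record.carrierCodes.length > 0;
}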

Work is now progressing on who is going to provide the data – the phase called ‘Process, Roles and Responsibilities’ – and as of 11 October 2023 the work has pivoted towards claims.

CDR version 3.2 has just been released. A function was put in place in October 2023 to constantly review and monitor the CDR and the MRC for changes. The next CDR to be published is the Claims CDR, followed by the Delegated Authority and the Treaty Reinsurance ones. The plan is to have them all done by February or March 2024.

On 1 July 2024, the old green-screen platforms are going to be switched off and everyone will move across to the new replacements – IPOS, ICOS and a gateway called IROS. Following this, in September, brokers and carriers should begin to be able to achieve full integration with messaging.


A goodbye to sellotape, string and chewing gum

“We’ve kicked this can down the road so much we no longer have the choice” – said James – “These systems are creaking and falling apart. Velonetic, DXC and others have done an amazing job over the last 10 years with sellotape, string, chewing gum, and all sorts of other things to hold these systems together. They have invested tens of millions in security software that sat over the top of these things to make sure your data is safe and things work. But at the same time these systems are creaking at the seams. So these systems and processes are way overdue for a review, and we’re doing it.”

James said he was optimistic that things would be done right this time, because Lloyd’s and the company market have come together and are getting their processes and data requirements closer together, which should result in scale and operational efficiency.


Innovation doesn’t stop: it’s change, it’s movement, it’s evolution

James brought up the question of ‘is it going to be easy?’ and admitted: “No, absolutely not. It’s going to be really difficult. And rather annoyingly it’s going to cost money. But we’re not giving the market the choice, by turning off the old stuff.” He continued: “People do know about standards; we’ve been doing them for years. We’ve just not necessarily done them technology-wise.” And he concluded: “If we’re going to take things away, it’s going to be difficult, it’s going to take time, there will be errors, it will cost, but it’s the right thing to do.”

As James noted: “We have negotiated a deal with ACORD – all LMA, IUA and LIIBA members have limited access to ACORD standards and support, purely for the London Market bureaus.”


An interview with James Livett

Following James’ presentation on the evolution of data standards in the insurance industry, we asked him a few follow-up questions about top priorities and the changes needed in the industry:


Future Processing: What is the number one priority for the insurance industry/London Market right now and why?

James Livett: LIIBA has just asked its members this question… and the general themes have been “Modernisation/Change” and “Cost”. A third is “Staff & Training”. Oddly, I see them as all related. Blueprint 2 and other modernisation challenges are going to cost money, and they will need skilled staff or staff training. A bit of a long-winded answer, but ultimately you cannot deliver change or progress without adequate funding and good staff.


FP: If you had a magic wand and you could change the industry in any way, what would you do and why?

JL: I have one on my desk… it does not work! If it did, I would give everyone an understanding of what everyone else does. So many mistakes are made because no one considers the other parties in the chain.


FP: How do you see the Insurance Industry evolving beyond the Blueprint Two programme?

JL: BP2 is just the start. Change is constant, but one thing I see is that by gathering standardised data we open ourselves up to a greater level of analysis that can provide a greater understanding of our clients’ needs: improved products and better risk management. This can only be achieved once we have the data.


FP: What role do you see for vendors in the digitalisation of the insurance industry and in the data standards drive?

JL: Vendors are absolutely central and essential to the delivery of all technological change. If the vendors are not at the heart of the move to digitisation, it will never happen. It is incumbent on the market to ensure that vendors have a clear understanding of the desired outcomes, but once that is done, it is their skill and expertise that will actually deliver the ability to achieve those outcomes.
