Implementing new data standards
The acceleration of digital technologies has brought the insurance industry to a pivotal juncture, where it is under more pressure than ever to modernise and enhance operational efficiency.
The sector is changing rapidly, and with that change comes the expectation that long-held processes and approaches will evolve – most notably through the adoption of standardised data practices, which aim to foster more cohesive and effective collaboration between the market’s many participants.
In the latest episode of our IT Insights InsurTalk by Future Processing, we met with James Livett, Executive Director of the London and International Insurance Brokers’ Association (LIIBA), to discuss implementing new data standards in the insurance sector.
With nearly four decades of experience in the industry, Livett shared his extensive knowledge of the current state of data standards in the London Market, the challenges faced, and the significant benefits that a unified approach to data standards offers brokers today. He emphasised the importance of collaboration between brokers and underwriters, and technology’s pivotal role in facilitating the transition to standardised data practices.
In this article, we explore the key takeaways from this engaging conversation and uncover how the insurance industry can successfully navigate the path toward data standardisation.
The current state of data standards in the London Market
Data standardisation is not a new concept in insurance – it has been around for years. The LPAN and the LCCF, for example, are data standards by definition, though not strictly digital ones.
Accounting & Settlement (A&S) and the Electronic Claim File (ECF) are digitised market processes used with Velonetic to submit premiums and claims; both have been available for more than 20 years and are built on the ACORD DRI standard. Insurance companies have been using standardised data for a long time, so the state of the market on standards for central service interaction is currently very positive.
The challenge is that very few organisations apply a common set of data standards within their own businesses: standards are used across the wider industry, but far less consistently inside individual companies.
The goal should be to use data standards not only across the insurance industry as a whole but also within the various departments of individual companies – an approach that has proved successful in projects such as the Ruschlikon Initiative, yet one that is not currently as widely adopted as it could be.
Another current effort to standardise data and analysis is Blueprint Two, a Lloyd’s of London initiative that seeks to bridge the gap between data systems and operations by providing a single, common data platform and approach for brokers, insurance companies and syndicates.
The insurance ‘lingua franca’
Blueprint Two, built on the ACORD data standard, could well become the insurance industry’s ‘lingua franca’ of the future. Key players in the world of insurance are already working on the platform to standardise and centralise databases and data analysis, so that groups of insurance brokers and companies can collate and analyse insurance data on one common platform.
As James Livett of LIIBA discusses, standardised data allows information – even something as common and as simple as a contract’s inception date – to be entered once, rather than being re-keyed separately by brokers, underwriters and claims teams.
This significantly reduces the chances of errors and streamlines processes, saving valuable time across tens of thousands of policies. In addition, having data in a consistent format enables more effective analysis and review of business trends and exposures, as disparate databases currently hinder comprehensive analysis.
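To make the single-entry idea concrete, here is a minimal sketch in TypeScript of a shared contract record that broker, underwriter and claims systems could all read from. The record shape and field names below are illustrative assumptions only – they are not drawn from the ACORD standard or any Blueprint Two specification.

```typescript
// Hypothetical shared contract record: the inception date is entered
// once at placement and every downstream system reads the same value.
interface ContractRecord {
  umr: string;            // Unique Market Reference for the placement
  inceptionDate: string;  // ISO 8601 date, keyed in exactly once
  expiryDate: string;     // ISO 8601 date
  insuredName: string;
}

// Any party can derive what it needs from the single shared record
// instead of maintaining (and mistyping) its own copy of the dates.
function daysUntilExpiry(contract: ContractRecord, today: Date): number {
  const expiry = new Date(contract.expiryDate);
  return Math.ceil((expiry.getTime() - today.getTime()) / 86_400_000);
}

const contract: ContractRecord = {
  umr: "B0000EXAMPLE2024",
  inceptionDate: "2024-01-01",
  expiryDate: "2024-12-31",
  insuredName: "Example Insured Ltd",
};

console.log(daysUntilExpiry(contract, new Date("2024-06-30"))); // 184
```

Because every system reads the same record, a correction to the inception date is made once and propagates everywhere, rather than having to be chased through several separate databases.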
Livett emphasises that data standards must be seamlessly integrated into operations, much like the interoperability between different mobile phone systems. Just as Android and iOS devices communicate using common standards despite running different software, insurance data standards should function invisibly, enhancing and supporting industry practices. This integration is crucial for achieving the operational efficiencies and analytical capabilities that modernisation promises.
Driving the adoption of data standards in the London Market
The responsibility for driving the adoption and definition of data standards in the London Market doesn’t fall solely on brokers or underwriters; it involves a collaborative effort across the entire ecosystem, including vendors, platforms and operations teams, all of whom play crucial roles.
The London Market’s unique advantage over other insurance markets lies in its centralised bodies, such as the London bureau, Velonetic, which handles a significant portion of London transactions (over 80%, according to James Livett).
If these major players in the London Market are encouraged to adopt standardised data and platforms, conforming to a common set of data standards and practices, then brokers and underwriters on both sides will begin to converge, working in the same direction to centralise and unify their data practices.
Once key players in the insurance industry (like the London Market) adopt these new processes and data standards, others will follow suit. It is the responsibility of those committed to progressing digital solutions to make the wider community aware of the benefits of standardised data and platforms, encouraging broader adoption for the good of the industry as a whole.
The role of the MRC and CDR committees
The Market Reform Contract (MRC) and Central Data Repository (CDR) committees play a pivotal role in standardising data practices within the London Market.
The MRC committee, which is owned and overseen by the London Market Group, has been actively addressing a backlog of issues and queries since its formation.
Meeting monthly, the committee tackles a variety of topics relating to insurance data standards and data handling, from minor details to significant structural questions, ensuring that client-facing documents are precise and comprehensive. This process involves extensive collaboration and discussion, reflecting the complex nature of these standards.
Simultaneously, the CDR committee focuses on the broader aspects of data standards, particularly those affecting premium and claims transactions.
Despite their different functions, both committees are crucial in shaping a unified approach to data management in the market. They encourage participation through platforms that allow stakeholders to submit questions and feedback, making the standardisation process more transparent.
The committees regularly interact to resolve overlapping issues, aiming to streamline processes and enhance operational efficiency by meeting market needs more closely and dealing effectively with practical challenges.
Read more about data standardisation in the London Market:
- London Market data standardisation – essential insights by Cassandra Vukorep
- Transforming data into success: the harmonisation advantage – insights from the panel discussion
- Navigating data standards in the London Market: essential insights by Mark Bennett
Data standards implementation – success stories from outside the London Market
One of the most notable examples of successful data standard implementation outside the London Market is the ACORD Ruschlikon Initiative, which has been in operation for over two decades. It has garnered significant traction among major reinsurance brokers and underwriters across Europe.
Industry experts like Simon Squires at AXA XL, Emma Ford at Liberty, Tim Pledger at Swiss Re, Richard Brame at WTW, and Terry Calthorpe at Gallagher Re have all attested to the initiative’s success, highlighting the substantial operational benefits and efficiency gains it provides. The Ruschlikon model showcases how effective data standardisation can streamline processes and improve overall market functionality.
In contrast, the London Market’s earlier attempt at similar standardisation through the eAccounts system did not achieve the same success. Although the few brokers who adopted it reported impressive results – such as a 90-95% cash match rate and significantly faster processes – the initiative ultimately foundered. Even so, those adopters’ experiences demonstrated that while new systems may seem daunting, their potential benefits in operational efficiency and error reduction make them well worth the effort.
While the London Market’s own standardisation initiative clearly left room for improvement, looking to success stories outside the market allows us to begin building a roadmap for implementing data standards that are effective and long-lasting.
The risks of not implementing effective data standards
The risks of not implementing data standards are significant and multifaceted. Firms that fail to adopt these standards will find themselves stuck in inefficient manual processes, producing PDF documents rather than leveraging automated systems.
This not only increases the likelihood of errors but also results in higher operational costs. Smaller firms, which may be reluctant to invest in new technology, might inadvertently adopt these standards through indirect usage. However, they will still face challenges if they cling to outdated, manual methods.
Mitigating the risks
To mitigate these risks, the London Market is likely to enforce a more uniform adoption of data standards. The presence of central platforms and the bureau will drive this change, reducing the chance of non-compliance.
Within the next five years, it’s anticipated that the market will phase out manual processes entirely, making it mandatory for practitioners to interact with central services using standardised data formats. This transition will be facilitated by ensuring that even portal interfaces conform to these standards, thereby easing the adoption process and minimising disruption.
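As a rough illustration of what standards-based interaction with a central service might involve, the sketch below validates a submission against a minimal ‘standard’ shape before sending it on. The fields, validation rules and endpoint are hypothetical assumptions for illustration – they are not taken from any actual ACORD or Velonetic specification.

```typescript
// Hypothetical premium submission shape; not a real market schema.
interface PremiumSubmission {
  umr: string;          // Unique Market Reference
  currency: string;     // ISO 4217 code, e.g. "GBP"
  grossPremium: number;
}

// Check the submission conforms before it ever reaches the bureau,
// so non-standard data is caught at source rather than downstream.
function validate(s: PremiumSubmission): string[] {
  const errors: string[] = [];
  if (s.umr.trim().length === 0) errors.push("umr: missing");
  if (!/^[A-Z]{3}$/.test(s.currency)) errors.push("currency: not an ISO 4217 code");
  if (!(s.grossPremium > 0)) errors.push("grossPremium: must be positive");
  return errors;
}

// Illustrative endpoint only – real central service interfaces differ.
async function submit(s: PremiumSubmission): Promise<void> {
  const errors = validate(s);
  if (errors.length > 0) {
    throw new Error(`Submission rejected: ${errors.join("; ")}`);
  }
  await fetch("https://central-service.example/premiums", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(s),
  });
}
```

A portal interface that conforms to the same standard would simply be another producer of `PremiumSubmission` records – which is what eases the transition for firms not yet ready to integrate system-to-system.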
Summary
The implementation of new data standards, both within the London Market and in the wider insurance industry as a whole, is not just an aspiration but a necessity. The benefits are clear:
- enhanced operational efficiency,
- reduced errors, and
- the ability to conduct comprehensive data analysis.
The collaborative efforts of brokers, underwriters, vendors and central bodies are pivotal in driving this transformation. Learning from successful models like Ruschlikon, the London Market can avoid the pitfalls of the past and embrace a future where data standards streamline processes and foster innovation.
The journey towards full adoption may have its challenges, but the end goal is a more efficient, accurate and competitive market. As the industry progresses, these standards will become an integral part of daily operations, benefiting all stakeholders involved. Embracing these changes now ensures that the London Market remains at the forefront of the global insurance industry, prepared to meet the demands of an increasingly digital world.
At Future Processing, we have over two decades of experience in data harmonisation, digital solutions, consulting, AI and cloud services. We are the ideal partner to bring your digital initiatives to market, working closely with you to identify your goals and bring them to life. If you are interested in learning more about Future Processing’s industry-leading digital services, contact us today and we will help you find the best solution for your unique business.