May 2016

Simplifying the Data Management Burden

By Bob Hirsch

As insurance companies continue to compete through product innovation, business acquisitions and new marketing channels, policy data becomes scattered across multiple business areas and systems. Many actuaries and finance professionals spend more time collecting and collating data for their organization and less time performing the analytical tasks they were originally hired to do. Insurance companies are also seeing higher turnover because of these additional data management activities. The problem has grown over time largely due to a lack of communication between Actuarial and IT departments. Actuaries often complain that IT is too slow in building the data structures they need and asks for detailed business requirements that are not always known at the time of the request. IT complains that actuaries maintain their own data and don’t share their business requirements with IT. Numerous insurance companies are embarking on actuarial systems transformations that can significantly reduce the amount of up-front data management required of actuaries. To be successful, IT and Actuarial departments must work together and implement best practices for managing policy-related data and its transformation.

Data governance between IT and the various business stakeholders is critical to the success of transformational efforts. IT is responsible for managing all of the policy-related systems and the flow of data between software applications. It is important to have a data governance group, or similar organizational structure and supporting processes, in place to understand the needs of the various data stakeholders (e.g., Operations, Finance and Actuarial) and develop a prioritized list of initiatives to meet those needs. The data governance group also manages changes to the data and notifies stakeholders of changes or tests being conducted. Creating a common data dictionary for policy data is also part of the data governance charter. It is acceptable for stakeholders to have different definitions for the same data element; the data dictionary should capture each definition and how it maps to the various systems in scope. Having a single source for the data dictionary saves countless hours otherwise spent hunting down system definitions or intended use.
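To illustrate the idea (this sketch and all of its names are hypothetical, not drawn from any particular company or product), a shared data dictionary can be as simple as a structure that records each stakeholder's definition of a data element alongside the field it maps to in each in-scope system:

from dataclasses import dataclass, field

@dataclass
class DataElement:
    """One business data element with per-stakeholder definitions and per-system mappings."""
    name: str
    definitions: dict = field(default_factory=dict)       # stakeholder -> definition text
    system_mappings: dict = field(default_factory=dict)   # system name -> field name

# Hypothetical example: "face amount" defined differently by Actuarial and Finance,
# and stored under different field names in two administration systems.
face_amount = DataElement(
    name="face_amount",
    definitions={
        "Actuarial": "Current death benefit used in reserve calculations",
        "Finance": "Contractual face amount reported in statutory filings",
    },
    system_mappings={
        "AdminSystemA": "FACE_AMT",
        "AdminSystemB": "policy_face_value",
    },
)

data_dictionary = {face_amount.name: face_amount}
print(data_dictionary["face_amount"].system_mappings["AdminSystemA"])  # FACE_AMT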

Another challenge is the number of administration systems and third party administrators (TPAs) relative to the number of policies. It’s not uncommon for an insurance company that has grown through acquisition to have 10, 20, 40 or more different administration systems and TPAs from which actuaries need to source data. Given the complexity of these systems, many of the business rules don’t reflect the current business model, and the data requires transformation. This situation also forces actuaries to pull data from multiple sources. One recommended solution is to implement Master Data Management (MDM) for policy data. The MDM centralizes basic policy information stored across all of the administration systems and is updated daily. Examples of policy data include policy number, face amount, issue state code, resident state code, beneficiary state code, etc. Several of these fields change over the life of the policy, and an MDM allows the changes to be tracked and managed in a central location. MDM systems are also mapped to the current Chart of Accounts (COA) in Finance so that transactions for a given policy can be accounted for in the General Ledger (GL). With MDM in place, transactions imported into a valuation system each month can be compared to the MDM policy data so that any data transformations can be made automatically before loading into the valuation system, ensuring transactions are categorized consistently with the GL. Insurance companies that have implemented MDM have found it dramatically reduces the manual effort actuaries spend cleaning up policy data each quarter and also reduces the time taken to respond to state audit requests.
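A minimal sketch of that reconciliation step might look like the following; the field names, policy numbers and mapping rules are illustrative assumptions, not the layout of any real MDM product:

# Hypothetical master policy records keyed by policy number (the MDM view).
mdm_policies = {
    "POL-1001": {"face_amount": 250_000, "issue_state": "NY", "coa_account": "4100-Life-Premium"},
    "POL-1002": {"face_amount": 100_000, "issue_state": "TX", "coa_account": "4200-Annuity-Deposit"},
}

def prepare_for_valuation(transactions):
    """Enrich each transaction with MDM policy data and its mapped GL account;
    flag any transaction whose policy is unknown to the MDM."""
    loaded, exceptions = [], []
    for txn in transactions:
        policy = mdm_policies.get(txn["policy_number"])
        if policy is None:
            exceptions.append(txn)  # route to a data-quality queue instead of the valuation system
            continue
        loaded.append({**txn,
                       "issue_state": policy["issue_state"],
                       "gl_account": policy["coa_account"]})
    return loaded, exceptions

txns = [{"policy_number": "POL-1001", "amount": 1200.00},
        {"policy_number": "POL-9999", "amount": 50.00}]
ready, bad = prepare_for_valuation(txns)
print(len(ready), "ready,", len(bad), "exceptions")  # 1 ready, 1 exceptions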

Moving and transforming the data each month and quarter for valuations is also time consuming and adds to the overall valuation delivery timeline. Working with IT on a comprehensive automated solution can reduce the time actuaries spend transforming and importing the data as well as reduce IT’s overall cost of moving it. A common practice at many companies is the Landing Zone approach, also called a Data Staging layer. These solutions are typically built on ETL tools such as Informatica or IBM’s DataStage, which are designed specifically to move data efficiently between multiple systems. A Landing Zone approach reduces the complexity of extracting data from the source administration systems and simplifies the loading of data into valuation systems by taking over the scheduling, data mapping and data transformation rules. Companies that implement both a Landing Zone and MDM gain an additional benefit: transactions flowing through the landing zone can be looked up against the MDM policy data while the data is moving between systems. A key characteristic of the Landing Zone approach is that data does not persist in the landing zone once it has reached its destination, making it easier to certify the data feeds. This approach also reduces the time it takes IT to create new feeds for the actuarial team, thus shortening project timelines.
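Assuming a very simplified feed, the landing zone cycle described above might be sketched as follows; the function names and transformation rule are placeholders and do not describe how Informatica or DataStage actually implement it:

def run_landing_zone_feed(extract_source, transform_rules, load_target):
    """One landing-zone cycle: stage, transform, load, then clear the staging area."""
    staging = list(extract_source())                        # 1. extract into a transient staging area
    transformed = [transform_rules(row) for row in staging] # 2. apply mapping rules (incl. MDM lookups)
    load_target(transformed)                                # 3. load into the valuation system
    staging.clear()                                         # 4. data does not persist once delivered

# Hypothetical stand-ins for the real source, rules and target.
def extract_admin_system():
    return [{"policy_number": "POL-1001", "txn_code": "PREM", "amount": 1200.00}]

def map_to_valuation_layout(row):
    return {"policy": row["policy_number"], "type": row["txn_code"].lower(), "amt": row["amount"]}

def load_valuation_system(rows):
    print(f"loaded {len(rows)} rows into valuation system")

run_landing_zone_feed(extract_admin_system, map_to_valuation_layout, load_valuation_system)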

All of these approaches help take data management activities out of actuarial work, but they also require actuaries to learn new ways of working with IT. In many organizations the Actuarial team has served as its own IT and data management shop for the valuation tools, turning to IT only when data extracts are required. The more IT can be involved in the data management and system management of the valuation system, the less of a burden these tasks place on the actuarial department at each month and quarter end. This takes time, as the two departments are often not used to working together and IT is typically not familiar with actuarial valuation systems. However, as IT’s role becomes clearly defined and it works more with the actuarial team, IT will be better able to take on the data management role successfully.

Bob Hirsch has 25 years of IT experience specializing in Enterprise Architecture, Integration, and large-scale IT Transformation projects. He can be contacted at bhirsch@deloitte.com.