Asset finance organisations that decide to upgrade their systems must go through the painful process of data migration. This means transferring vast quantities of data from years, if not decades, of business activity. Businesses must strike a balance between a data transfer that is simple and efficient, and one that is complex and rich.
Here Sam Fairhurst, Senior Project Manager at Alfa, explains what makes asset finance software migration so difficult, how to avoid mistakes, and why having the right tool for the job is so essential.
What do asset finance businesses stand to gain by migrating data?
It’s all well and good investing in a new, best-of-breed technology platform, but unless you’re a new startup, skipping a data migration means that on Day One your new system will have almost nothing on it. Without a data migration, most business processes remain stuck in the legacy practices you’re trying to move away from. Leaving that transition to natural churn is likely to take years, and the complexity of running multiple systems in parallel could be costly. Migrating data to the new system means a business starts reaping the benefits of the new functionality far sooner. The improved processes, automation and efficiency gains can benefit the whole portfolio, rather than being limited to the new business booked after go-live.
Why is migrating legacy data such a challenge for asset finance organisations?
The big challenge for many finance organisations is making sense of their legacy systems. Many have had their system in place for 10 or 20 years, perhaps more. There might be products in there that they no longer offer. Regulatory changes have come in that affect their old data, making it non-compliant. They might have client portfolios from an even older system, or perhaps purchased from a third party. Quite often the data isn’t as fully fleshed out as today’s world demands. So finance organisations need to work out what data they want to migrate, and often they’ll have to be very pragmatic to do it effectively.
How can businesses onboard portfolios then?
Users are nothing if not creative. When a system doesn’t fully support user processes, teams will find ways of bending it to fit. This might involve entering dummy values to clear validation on a property they’d rather leave blank, or ‘overloading’ a single property with multiple business meanings depending on the value or context. Combine these often-undocumented use cases with unintended issues such as incomplete or outdated addresses, typos and system defects, and the cleansing task can become quite involved.
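To make that concrete, here is a minimal sketch, in Python with pandas, of the kind of profiling a cleansing team might run to surface likely dummy values and overloaded properties. The file name, column handling and sentinel values are all assumptions for illustration; a real pass would be driven by knowledge of the specific legacy system.

```python
import pandas as pd

# Hypothetical legacy extract; real file and column names will vary.
contracts = pd.read_csv("legacy_contracts.csv", dtype=str)

# Sentinel values users often key in just to satisfy validation (assumed).
DUMMY_VALUES = {"N/A", "UNKNOWN", "TBC", "999999", "01/01/1900"}

# Flag properties that were probably filled with placeholders.
for column in contracts.columns:
    suspect = contracts[column].str.strip().str.upper().isin(DUMMY_VALUES)
    if suspect.any():
        print(f"{column}: {suspect.sum()} likely dummy value(s)")

# An 'overloaded' property often has a handful of distinct values, each
# carrying a different business meaning. Listing the value frequencies
# gives the business experts something concrete to interpret.
for column in contracts.columns:
    counts = contracts[column].value_counts(dropna=False)
    if len(counts) <= 10:  # low cardinality: candidate for overloading
        print(f"\n{column} value distribution:\n{counts}")
```

Nothing here fixes the data by itself; the output is a starting point for the people who know what the workarounds actually meant.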
What do we need to know about transferring datasets?
Most important is to recognise that data migration isn’t just a lift and shift from one place to another; it isn’t something the technical team can do on a whim. The most successful migrations are the ones where a business treats it as a cross-functional exercise, involving finance and accounting experts alongside business experts who can help interpret what the data is supposed to mean, and therefore what it should look like in the new system. Having that team in place means having experts on hand who can tackle difficult questions and keep things moving.
For acquisitions, the format and quantity of data are agreed at a high level, but the finer details of data quality are often left for the implementation team to uncover. I’ve even seen a scenario where the purchase came with a guarantee of no loss-making contracts, only to discover several of them in the portfolio provided. That finance director was less than pleased to hear the news.
How can asset finance businesses analyse legacy data effectively?
Much of this analysis is likely already being done somewhere in the business, so it’s often a case of talking to the right people. Understanding the ‘shape’ of the data, such as relative volumes of different products and geographies or contract status and age banding, isn’t a migration-specific requirement. There’s probably existing management information (MI) reporting within the business that can be leveraged to get a high-level view.
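Where the existing MI doesn’t cover a particular cut, a short profiling script fills the gap. A sketch in Python with pandas, using hypothetical column names for product, status and start date:

```python
import pandas as pd

# Hypothetical extract; parse the start date so contracts can be age-banded.
contracts = pd.read_csv("legacy_contracts.csv", parse_dates=["start_date"])

# Band contracts by age to understand the shape of the portfolio.
age_years = (pd.Timestamp.today() - contracts["start_date"]).dt.days / 365.25
contracts["age_band"] = pd.cut(
    age_years, bins=[0, 2, 5, 10, 100], labels=["0-2y", "2-5y", "5-10y", "10y+"]
)

# Relative volumes by product, status and age band.
profile = (
    contracts.groupby(["product", "status", "age_band"], observed=True)
    .size()
    .rename("contracts")
    .reset_index()
)
print(profile.sort_values("contracts", ascending=False).head(20))
```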
Similarly, when looking to identify data that may need cleansing, there’s no need to reinvent the wheel. The first port of call should be the list of open support tickets relating to the legacy system, as they’ll be a great indication of potential problem areas in the data. Talking to experienced business experts who use the legacy system day to day will also reveal a lot about challenges in the existing data. Inevitably, some ad hoc custom reporting will be required along the way, but the answers to many of the big questions are probably already out there if you know where to look.
For the mapping exercise, the key is to remember that the goal is not to take every data item and find a home for it in the new system. The goal is to identify the appropriate subset of existing data to migrate in support of future use cases. There will be redundant data that can be excluded, and it’s here that a good understanding of future processes, and of the real-world meaning of the data, is essential.
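One lightweight way to keep that sub-setting honest is a declarative mapping specification that records, field by field, what migrates, how it transforms, and what is deliberately left behind. A sketch with hypothetical legacy and target field names:

```python
# Hypothetical field-by-field mapping. Recording the decision, and the
# reason, for every legacy field keeps the scoping debate visible to
# both the technical and business sides of the team.
FIELD_MAPPING = {
    "CUST_NAME":    {"target": "customer.legal_name", "transform": "trim"},
    "PROD_CODE":    {"target": "contract.product", "transform": "product_lookup"},
    "FAX_NUMBER":   {"target": None, "reason": "redundant: no future use case"},
    "OLD_SYS_FLAG": {"target": None, "reason": "legacy workaround, meaning retired"},
}

def migratable_fields(mapping):
    """Return only the fields that actually move to the new system."""
    return {src: spec for src, spec in mapping.items() if spec["target"]}

print(migratable_fields(FIELD_MAPPING))
```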
Can you discuss the benefits of incremental migration tactics?
For me, the two main drivers here are accelerating the delivery of business value and de-risking the overall delivery. Large migrations will likely involve several associated dependencies, be they complex mapping or configuration exercises, system enhancements or integrations. Where these dependencies only block subsets of the portfolio, why keep the rest of the population waiting? Where it’s feasible, the business can start getting the benefits of having the simpler subsets migrated sooner. This also serves to familiarise business users with the new application and ensures any teething issues in the landscape are discovered and addressed before they risk impacting the full volume of contracts. Lessons learned from small, early deliveries can feed back continually to improve data quality, mappings and configuration for future phases.
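As a rough illustration of how waves might be carved out in practice, here is a sketch that partitions a portfolio by illustrative criteria; real wave planning would be driven by the project’s actual dependencies, and the product and status values below are invented.

```python
import pandas as pd

contracts = pd.read_csv("legacy_contracts.csv")

# Hypothetical: these products are waiting on mapping/configuration work.
BLOCKED_PRODUCTS = {"OPERATING_LEASE", "FLOORPLAN"}

def assign_wave(row):
    # Illustrative criteria only.
    if row["status"] == "TERMINATED":
        return "wave_3_archive"
    if row["product"] in BLOCKED_PRODUCTS:
        return "wave_2_blocked"
    return "wave_1_simple"

contracts["wave"] = contracts.apply(assign_wave, axis=1)
print(contracts["wave"].value_counts())
```

Migrating wave 1 first delivers value early, and anything learned there feeds into the mappings and configuration for the later waves.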
Why is the right tool for the job important?
There’s always a possibility that it just makes sense to do this manually. For clients with a very small pilot phase, or who have just acquired a small dataset, perhaps the most pragmatic thing is to key the contracts in manually.
For the middle tier of volume and complexity, we look towards an integration-based approach, either building an integration from another system or some form of upload, such as a spreadsheet. That can then make use of the rich web service API that Alfa Systems has on offer, and it will be much quicker than manual entry. Once the integration is in place, it’s available for repeat use and can be quite effective, particularly for acquisitions.
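As a rough sketch of the spreadsheet-upload flavour of this approach: the endpoint, payload shape and authentication below are placeholders for illustration, not Alfa’s actual API, but the loop structure is representative.

```python
import csv
import requests

API_URL = "https://example.internal/api/contracts"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}       # auth scheme assumed

with open("acquired_portfolio.csv", newline="") as f:
    for row in csv.DictReader(f):
        payload = {
            "customerReference": row["customer_ref"],
            "productCode": row["product"],
            "startDate": row["start_date"],
        }
        response = requests.post(API_URL, json=payload, headers=HEADERS)
        response.raise_for_status()  # fail fast so bad rows get investigated
```

Because the integration is scripted, rerunning it for the next acquisition is largely a matter of pointing it at a new file.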
For higher volume and higher complexity, where you want the data to look functionally rich, the tool we propose is the Alfa Migration Suite, which makes contracts look like they’ve been in the new platform from Day One. So you’ll have original start dates, origination charges, the payment history, historic invoices, summary details of any historic changes, the historic credit reference information, and a lot more. It’s very powerful.
Why is all of this so important in today’s economy?
The ability to build some flexibility into your project is important. What if the priorities of your project change? What if you need to pause the project, or go after a different acquisition? Having these agile approaches helps you stay flexible, pivot with changing market conditions, or prioritise other areas of the business without having to invest in a ton of work for no reward. A considered approach to software migration allows you to release business value periodically, then pivot and go after different elements. That’s key in challenging economic conditions.