With the ever-growing size and complexity of datasets, the need for accurate and timely reconciliations has never been more crucial. So how have reconciliation processes adapted over time?
Without accurate and timely reconciliation processes, companies can find themselves in serious trouble. Reconciliations are control processes that help detect and prevent fraud, keep a company from falling into debt, and ensure it adheres to its regulatory and compliance obligations. Typically, reconciliations spot data issues such as duplication, date errors, omissions or amount errors. The way reconciliations are undertaken has changed significantly over the last ten years. In 2013, 90% of financial services companies were using Excel for their reconciliation activities; in a 2021 Censuswide study, that figure had dropped to just 17%.
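To make those issue types concrete, here is a minimal sketch of a reconciliation check between two transaction lists. The datasets, IDs and issue labels are hypothetical examples, not any particular vendor's implementation:

```python
from collections import Counter

def reconcile(ledger, bank):
    """Compare two lists of (transaction_id, amount) records and
    report common reconciliation issues."""
    issues = []
    # Duplication: the same transaction ID appearing more than once on one side.
    for side, rows in (("ledger", ledger), ("bank", bank)):
        counts = Counter(tid for tid, _ in rows)
        issues += [("duplicate", side, tid) for tid, n in counts.items() if n > 1]
    led, bnk = dict(ledger), dict(bank)
    # Omission: a record present on one side but missing from the other.
    issues += [("omission", "bank", tid) for tid in led.keys() - bnk.keys()]
    issues += [("omission", "ledger", tid) for tid in bnk.keys() - led.keys()]
    # Amount error: matched IDs whose values disagree.
    issues += [("amount_error", "both", tid)
               for tid in led.keys() & bnk.keys() if led[tid] != bnk[tid]]
    return sorted(issues)

ledger = [("T1", 100.00), ("T2", 250.50), ("T3", 75.00), ("T3", 75.00)]
bank   = [("T1", 100.00), ("T2", 255.50), ("T4", 30.00)]
print(reconcile(ledger, bank))
# [('amount_error', 'both', 'T2'), ('duplicate', 'ledger', 'T3'),
#  ('omission', 'bank', 'T3'), ('omission', 'ledger', 'T4')]
```

Even this toy version shows why spreadsheets struggle at scale: each issue type is a separate pass over the data, and real reconciliations run these passes over millions of rows.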
The shift to automate previously manual reconciliation processes has been driven by the fact that automation eliminates many of the problems manual processes create, most notably the time taken to run the reconciliation and the quality of the data. The time saving is an obvious advantage: computers can compare different datasets in different formats and produce results far more quickly than people can. Furthermore, a correctly programmed computer does not make errors.
Despite these benefits, however, there is a degree of scepticism as to whether an automated solution can really replace a current manual reconciliation. In a 2021 study by the Aberdeen Group, over 44% of participants believed their datasets were too large and complicated for an automated solution to handle, and 44% believed the disruption to business-as-usual processes when implementing a new automated solution would be too great. Finally, the time taken to implement a new automated solution, and to train employees on what can be a complex artificial intelligence system, is seen as a significant barrier. Simply being process-efficient is not enough; the system needs buy-in from the teams who will use it, and those teams often do not have time to learn new systems.
The issue of data quality also creates a barrier to companies adopting automated reconciliation solutions. Poor data quality typically confounds any type of process automation, but reconciliation platforms in particular have needed to evolve their ability to handle complex datasets. Data creation is growing at an unprecedented rate, and so are the formats that data can arrive in. Traditionally, SWIFT (Society for Worldwide Interbank Financial Telecommunication) was the industry standard for financial data, but in recent years new formats have come onto the market (for example, FIRDS, BAI or ACH). Any potential reconciliation system would need to be able to both read and use these new formats, and it would need to be upgradeable to accommodate any formats that come to fruition later. Adding new data sources may require significant development work, which can take time to implement. Moreover, new regulatory standards such as MiFID II and Basel III have changed the role reconciliations play in financial companies, mandating that firms reconcile more data than ever before.
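One common way to keep a system upgradeable as formats multiply is a pluggable parser registry: each format gets its own parser, and the reconciliation engine stays unchanged when a new one is added. The sketch below is illustrative only; the format name and record layout are hypothetical, and a real SWIFT, BAI or ACH parser would naturally be far more involved:

```python
# Registry mapping a format name to its parser function.
PARSERS = {}

def register(fmt):
    """Decorator that registers a parser for a named format."""
    def deco(fn):
        PARSERS[fmt] = fn
        return fn
    return deco

@register("csv_simple")
def parse_csv_simple(text):
    # Hypothetical "reference,amount" lines -> list of (reference, float) records.
    return [(ref, float(amount)) for ref, amount in
            (line.split(",") for line in text.strip().splitlines())]

def load(fmt, text):
    """The engine calls this; supporting a new format means only
    registering a new parser, not changing the engine."""
    return PARSERS[fmt](text)

print(load("csv_simple", "T1,100\nT2,50.5"))
# [('T1', 100.0), ('T2', 50.5)]
```

Whatever the internal design, the point stands: bolting on a new data source is development work, which is why format coverage is a real implementation cost.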
Finally, it is not uncommon for data to need a clean-up exercise before it can be used in a reconciliation. Differing currencies and netting are just two complications that can prevent datasets from being compared directly, and any automated reconciliation solution needs to handle them. The latest generation of artificial intelligence (AI) and machine learning reconciliation tools looks to overcome these issues: they can integrate with a wide variety of data sources, handle large transaction volumes and cope with complex matching rules (one-to-one, one-to-many and many-to-many). Exception management is also made easier, as machine learning can learn how previous exceptions were resolved and apply the same fixes automatically.
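The clean-up and matching steps above can be sketched together: normalise currencies to a common base, net detail lines by reference, then match the netted totals many-to-one against statement lines. The FX rates, references and tolerance below are assumptions for illustration, not real market data:

```python
from collections import defaultdict

# Hypothetical FX rates to a common base currency (USD) -- illustrative only.
RATES = {"USD": 1.00, "EUR": 1.10, "GBP": 1.25}

def normalise(amount, currency):
    """Clean-up step: convert to the base currency so datasets are comparable."""
    return round(amount * RATES[currency], 2)

def match_many_to_one(details, statements, tolerance=0.01):
    """Many-to-one matching: net several detail lines against one statement line.
    Both inputs are lists of (reference, amount, currency) tuples."""
    netted = defaultdict(float)
    for ref, amount, ccy in details:
        netted[ref] += normalise(amount, ccy)   # netting step
    matches, breaks = [], []
    for ref, amount, ccy in statements:
        if abs(netted.get(ref, 0.0) - normalise(amount, ccy)) <= tolerance:
            matches.append(ref)
        else:
            breaks.append(ref)   # unexplained difference -> exception to investigate
    return matches, breaks

details = [("INV-9", 60.0, "EUR"), ("INV-9", 40.0, "EUR"), ("INV-7", 80.0, "GBP")]
statements = [("INV-9", 110.0, "USD"), ("INV-7", 90.0, "USD")]
print(match_many_to_one(details, statements))
# (['INV-9'], ['INV-7'])
```

The unmatched references ("breaks") are exactly the exceptions that machine learning tools aim to resolve automatically by learning from past fixes.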
The need for reconciliation tools over traditional methods such as spreadsheets has never been more apparent. With datasets continuously growing in size and complexity, automated tooling is needed if the reconciliation process is to remain efficient and scalable. The emergence of AI and machine learning tools has paved the way for automating large-scale, complex reconciliations that would be extremely challenging using traditional methods. Recent years have seen a shift from Excel-based reconciliations to more automated solutions, whether specialist reconciliation software or data analytics software such as Infogix. As datasets grow ever larger and more complex, this shift is set to continue.