Why intelligent data automation is important and how to harness it

Posted: January 5, 2022 at 8:58 am

By Douglas Greenwell, Chief Strategy Officer, Duco

The way a business manages its growing volumes of data can make the difference between clear insights that aid business growth and slow reporting processes that waste time and money and fall short of regulatory standards.

The accelerated momentum for improving data quality in financial services has been driven by the growth of digital services, all of which rely on high-quality data, combined with the pandemic shedding light on the issues of data integrity and data reconciliation.

Laws and regulations such as BCBS 239, Sarbanes-Oxley, Basel III, SFTR, IFRS17, GAAP and the new US depository regulations are also making compliance increasingly complex, requiring organisations to guarantee that appropriate systems and procedures are in place to effectively manage and control operational risk.

Despite the mission-critical reliance on good quality data, many financial services firms are still using outdated legacy systems and manual processes to manage their reconciliations. These systems are unable to handle complex and changing data formats, resulting in inconsistencies that require human intervention.
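To make the reconciliation challenge concrete, the following is a minimal, hypothetical sketch (not Duco's product or any specific vendor's API) of matching records between two feeds whose formats disagree; the field names, normalisation rules and sample values are illustrative assumptions only.

```python
# Illustrative sketch: reconcile an internal ledger against a counterparty feed
# whose formats differ. All field names and rules below are hypothetical.
from datetime import datetime

internal_ledger = [
    {"trade_id": "T-1001", "amount": "1,250.00", "date": "05/01/2022"},
    {"trade_id": "T-1002", "amount": "980.50",   "date": "05/01/2022"},
]

counterparty_feed = [
    {"ref": "T-1001", "value": 1250.0, "trade_date": "2022-01-05"},
    {"ref": "T-1003", "value": 430.0,  "trade_date": "2022-01-05"},
]

def normalise_internal(row):
    # Strip thousands separators and convert UK-style dates to ISO format.
    return {
        "id": row["trade_id"],
        "amount": float(row["amount"].replace(",", "")),
        "date": datetime.strptime(row["date"], "%d/%m/%Y").date().isoformat(),
    }

def normalise_counterparty(row):
    return {"id": row["ref"], "amount": row["value"], "date": row["trade_date"]}

ours = {r["id"]: r for r in map(normalise_internal, internal_ledger)}
theirs = {r["id"]: r for r in map(normalise_counterparty, counterparty_feed)}

# Records that agree on both sides versus breaks needing human investigation.
matched = [i for i in ours if i in theirs and ours[i]["amount"] == theirs[i]["amount"]]
breaks = sorted((ours.keys() | theirs.keys()) - set(matched))

print("matched:", matched)  # ['T-1001']
print("breaks:", breaks)    # ['T-1002', 'T-1003']
```

The point of the sketch is the normalisation step: every new feed format in a legacy environment means another hand-coded mapping, which is exactly the manual effort that changing data formats multiply.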

The current state of data reconciliation

Financial services organisations are at breaking point with their current legacy systems and manual processes.

Manual processes and legacy systems are not only costing organisations significant money and staff hours, they are also causing problems with transparency, which can have costly effects in the form of regulatory fines.

Firms have reached this breaking point because the volume and complexity of the data they now handle is unmanageable with their current systems and processes. This complexity leads to transparency issues, with the lack of a consolidated view of reconciliations being a major problem.

Concerns about data inconsistency and lineage issues are also at the front of companies' minds as they plan for future growth. Without automation, automated data lineage is not possible, meaning that instead of a holistic view of the data presented in a structured way, financial services firms are dealing with unstructured data silos, with teams and individuals unable to see how upstream data feeds affect them or which downstream business processes rely on their data.
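As a simple illustration of what automated data lineage provides, here is a hypothetical sketch of a dependency graph that traces which downstream processes are affected when an upstream feed changes; the dataset names are invented for the example.

```python
# Hypothetical sketch of data lineage: a directed graph recording which datasets
# consume each upstream source, so downstream impact can be traced automatically.
from collections import defaultdict

# upstream source -> datasets that consume it (names are illustrative)
lineage = defaultdict(set)
for upstream, downstream in [
    ("market_data_feed", "pricing_engine"),
    ("pricing_engine", "risk_report"),
    ("trade_capture", "risk_report"),
    ("risk_report", "regulatory_filing"),
]:
    lineage[upstream].add(downstream)

def downstream_impact(node, graph):
    """Return every dataset and report that ultimately depends on `node`."""
    seen, stack = set(), [node]
    while stack:
        for child in graph[stack.pop()]:
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

# If the market data feed changes, which downstream processes are affected?
print(downstream_impact("market_data_feed", lineage))
# {'pricing_engine', 'risk_report', 'regulatory_filing'}
```

In a siloed environment this graph simply does not exist, which is why a change to one feed can break reports no one knew depended on it.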

In fact, in a study conducted by Duco surveying 300 heads of global reconciliation utilities, chief operating officers, heads of financial control and heads of finance transformation at large financial services organisations, 42% of firms said they are currently struggling with poor data quality and data integrity within their organisations.

Why change is difficult

While agility and fraud prevention are big drivers of the move towards automation and away from legacy systems, often the biggest business case for change is simply cost control. When businesses assess the total cost of ownership of technology, from data normalisation, data preparation and infrastructure to hosting, running and upgrades, the opportunity for technology rationalisation becomes clear.

However, despite there being a compelling argument for changing the status quo, businesses are still finding it difficult to shake things up.

According to our survey, 44% believe that reconciliation without manual processes would be too challenging given the different types and sources of data they are dealing with. A further 42% believe that the risk of disrupting their business to improve data reconciliation outweighs the benefits of data automation.

But, while there is still some nervousness around the perceived disruption that a move to automation and machine learning will involve, the appetite to become more automated is strongly evident amongst financial services organisations.

Moving towards intelligent data automation

The pandemic, however, has provided the much-needed impetus for change, arriving just as intelligent data automation (IDA) has become commercially viable.

IDA is a data management strategy that uses no-code, cloud-based technology to automate and control all financial, operational and commercial data across an organisation, helping firms to cut costs significantly while reducing risk and improving compliance.

With its use of fully customisable, low-cost solutions that can sit alongside or on top of legacy systems, an IDA approach is key not only to successfully managing data, but to unlocking the full benefits of that data for the business.

By employing an over-arching, self-optimised level of automation, an IDA approach enables businesses to get a detailed view of data across the entire enterprise. With this level of insight, financial services organisations can better understand the performance of their operations, uncover and address weaknesses and identify new opportunities, all of which drives greater efficiency and agility across the organisation and improves the accuracy of regulatory reporting.

With internal and external factors pressurising firms to change, financial services organisations are beginning to look towards IDA as a tool to secure their prosperity in the long term.

Encouragingly, almost half (49%) of financial services firms surveyed say that intelligent data automation is the future, and organisations will need to embrace it to survive. Furthermore, 42% say they will investigate the use of more machine learning in 2021 for the purposes of intelligent data automation.

Covid-19 has accelerated the adoption of machine learning and data automation. We can expect this momentum to continue, driven by benefits to the business and the end user that make IDA a game-changer for financial services compliance, risk management and cost reduction.
