
Improving investment capability? Don’t forget data quality

Investment management is becoming progressively more complex as data volumes grow, data sources proliferate, and business and IT functions are outsourced. As a result, data management and data quality are becoming ever more important, and, implemented well, they can be a source of competitive advantage.

The drive for more data

Transactional data, market data, client-specific data and reference data, plus the history associated with each, are growing at an exponential rate, providing opportunities to better meet the needs of customers, improve efficiency, manage risk and sharpen decision making. As a consequence, there is an ever-growing effort to harness these opportunities using 'big data' tools and algorithms. One of the key enablers for a business seeking to exploit its data is ensuring the data is presented in a usable and understandable format, and as the volume of data grows this task becomes increasingly difficult.

The many sources of investment data

It is not just the volume of data that is the issue, however. Under a multi-sourcing model, investment managers receive data from a primary custodian and potentially multiple secondary custodians, plus the custodians of institutional mandate clients. This data is generally a back-office Accounting Book of Record view, which does not meet the requirements of the front and middle office, who need an Investment Book of Record view. Adding to the volume of both data and data sources, investment managers may also have to deal with market data from domestic and international markets and across asset classes. Having so many sources is problematic: data is likely to arrive in multiple formats and structures, or unformatted, unstructured and different from source to source. There may also be overlap, data gaps, and inconsistent or out-of-date information. An effective data management strategy ensures that information coming into the business remains consistent and provides an accurate view of key business metrics.
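To make the normalisation challenge concrete, here is a minimal Python sketch that maps two hypothetical custodian feeds, each with its own field names and date conventions, onto one common holding record. The feed layouts and field names are assumptions for illustration only, not any custodian's actual format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Holding:
    """Common internal record that every custodian feed is mapped onto."""
    security_id: str   # e.g. ISIN
    quantity: float
    as_of: datetime

def from_custodian_a(row: dict) -> Holding:
    # Hypothetical custodian A: lower-case field names, ISO dates.
    return Holding(
        security_id=row["isin"],
        quantity=float(row["units"]),
        as_of=datetime.strptime(row["valuation_date"], "%Y-%m-%d"),
    )

def from_custodian_b(row: dict) -> Holding:
    # Hypothetical custodian B: different field names, D/M/Y dates.
    return Holding(
        security_id=row["SecurityISIN"],
        quantity=float(row["HoldingUnits"]),
        as_of=datetime.strptime(row["AsAtDate"], "%d/%m/%Y"),
    )

# Both feeds now land in one consistent structure.
holdings = [
    from_custodian_a({"isin": "AU000000BHP4", "units": "1500",
                      "valuation_date": "2024-06-28"}),
    from_custodian_b({"SecurityISIN": "AU000000CBA7", "HoldingUnits": "900",
                      "AsAtDate": "28/06/2024"}),
]
```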

Outsourcing complicates data quality

Finally, there is a continued and growing trend of financial services organisations outsourcing their business and IT functions to enhance efficiency and reduce costs. This results in large amounts of unstructured, unformatted data flowing between many systems and many organisations, and as different systems and people touch this data its quality may degrade. A rigorous data management and governance framework enables information to be processed quickly and accurately and to reach key stakeholders in the shortest possible time.

A data quality framework is essential

A well-thought-out and structured data quality framework is an important piece of an organisation's overall data strategy. It ensures that information coming into and going out of the organisation is in the best possible state to support decision making. Four key themes emerge as part of a good data quality framework: timeliness, accuracy, accessibility and relevance.

Timeliness ensures data and information reach key decision makers within the required timeframe. It involves putting rigour around obtaining, processing and delivering information to the relevant stakeholders. To meet its timeliness targets, an organisation needs to eliminate all unnecessary tasks in the data delivery process; a regular audit of data management processes allows redundant tasks to be identified and removed.
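One concrete form this rigour can take is monitoring each feed against an agreed delivery deadline, so late arrivals are flagged and the upstream steps causing delay can be investigated. The feed names and SLA times in this Python sketch are hypothetical.

```python
from datetime import datetime, time

# Hypothetical delivery deadlines (local time) agreed for each feed.
FEED_DEADLINES = {
    "primary_custodian_positions": time(7, 0),
    "market_data_close_prices": time(8, 30),
}

def is_on_time(feed_name: str, arrived_at: datetime) -> bool:
    """Flag whether a feed arrived within its agreed delivery window."""
    return arrived_at.time() <= FEED_DEADLINES[feed_name]

# Example: a positions file landing at 07:15 breaches its 07:00 deadline.
print(is_on_time("primary_custodian_positions",
                 datetime(2024, 6, 28, 7, 15)))  # False
```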

Accuracy is probably the most important aspect of a data quality framework. This element of the framework puts steps in place to ensure the data is in the correct state for its intended use. Checks can be applied to data flowing into the organisation to confirm that key accuracy criteria are met before the data is used, with the criteria determined by prioritising data points according to their relevance and importance.
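In practice these checks often take the form of simple validation rules run against each incoming record before it is released for use. The rules and field names in the sketch below are illustrative assumptions rather than a prescribed standard.

```python
# A minimal sketch of rule-based accuracy checks on an incoming price record.
def validate_price_record(record: dict) -> list[str]:
    """Return a list of accuracy failures; an empty list means the record passes."""
    failures = []
    if not record.get("isin"):
        failures.append("missing security identifier")
    price = record.get("close_price")
    if price is None or price <= 0:
        failures.append("close price missing or non-positive")
    if record.get("currency") not in {"AUD", "USD", "SGD"}:
        failures.append(f"unexpected currency: {record.get('currency')}")
    return failures

record = {"isin": "AU000000BHP4", "close_price": -1.0, "currency": "AUD"}
for failure in validate_price_record(record):
    print(failure)  # close price missing or non-positive
```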

Accessibility refers to the ease with which stakeholders can access information. An organisation needs to determine who the users of the data are and the medium through which the information is to be delivered. Accessibility is one of the common pitfalls organisations face: many people obtain the same data from different sources, resulting in inconsistencies. Data warehouses and reporting/Business Intelligence platforms overcome these problems by ensuring information is drawn from a single central source.
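A lightweight way to express the single-source idea is to route every consumer through one accessor over one central store, rather than letting each team pull its own extract. The sketch below uses an in-memory SQLite database and invented table and function names purely for illustration.

```python
import sqlite3

# A single central store (here an in-memory SQLite database for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE positions (security_id TEXT, quantity REAL, as_of TEXT)")
conn.execute("INSERT INTO positions VALUES ('AU000000BHP4', 1500, '2024-06-28')")

def get_positions(as_of: str) -> list[tuple]:
    """Every consumer retrieves positions through this one accessor,
    so front office, middle office and reporting all see the same data."""
    return conn.execute(
        "SELECT security_id, quantity FROM positions WHERE as_of = ?",
        (as_of,),
    ).fetchall()

print(get_positions("2024-06-28"))  # [('AU000000BHP4', 1500.0)]
```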

Finally, the relevance of data determines which data is to be stored and which is to be discarded. In a world with an ever-increasing amount of data, a rigorous process for determining relevant information is an important part of reducing 'noise' and IT costs. To ensure the data kept is relevant, an organisation should link data points to business processes and objectives; information that is not linked should be discarded.
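The 'link data points to business processes' test can itself be made mechanical. The mapping below is a hypothetical example: any field that supports no current business process becomes a candidate for discarding.

```python
# Hypothetical mapping from data fields to the business processes they support.
PROCESS_LINKS = {
    "close_price": ["portfolio_valuation", "performance_reporting"],
    "trade_timestamp": ["compliance_monitoring"],
    "legacy_internal_code": [],  # linked to no current process
}

def relevant_fields(links: dict[str, list[str]]) -> set[str]:
    """Keep only the fields that support at least one business process."""
    return {field for field, processes in links.items() if processes}

print(relevant_fields(PROCESS_LINKS))  # {'close_price', 'trade_timestamp'} (order may vary)
```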

Think about the data first

The bottom line: when you are considering enhancing your investment management, performance or risk reporting capabilities, you should treat a data quality framework as an essential enabler.

 

Faraz Mahmoodian
Senior Consultant
