Decision-Making in the Age of Imperfect Data
CIOs have long grappled with the best ways to slice and dice information in order to shape the smartest investment strategies. That challenge has only grown more complex as the volume of data and the range of investment vehicles available today seem to multiply daily.
Allocators often struggle to glean proactive value from the sheer volume of data pouring in from every channel: it arrives from multiple sources, in every format, at varying levels of depth and lag, and is stored in silos across the organization.
Even the most intelligent investors need help filtering this vast, ever-changing body of data to discern which information is actually meaningful and useful.
Let’s delve a bit further into some of the most common data challenges.
Too Much Data
Simply put, there is more information out there than ever before, with more being generated every second. Beyond sheer volume, the aftermath of the 2008 financial crisis has left CIOs with far more data to identify and track. As the amount of data they contend with expands, so too do the parameters for identifying what is relevant. Keeping up with the ever-increasing stream of information is a massive challenge in and of itself. On top of this, allocators are also contending with lagged data, varying data formats, and differing degrees of information depth.
Data lag is one of the biggest obstacles allocators face in making sound investment decisions. Different types of investment vehicles are held to different reporting standards, meaning that CIOs often receive information with varying lag times. An endowment fund and a private equity vehicle will have vastly different reporting structures and requirements, leaving allocators to make decisions from information captured at different points in time.
Emails. Information requests from online portals. Excel workbooks. Unstructured PDFs. Comparing data points that arrive in so many different formats is like comparing apples to sneakers. It is difficult to unearth relevant, and perhaps unexpected, connections between data sets that cannot “talk” to each other in the same language.
Data transparency also varies, and for many in the space, consistency in this regard is still considered a “nice to have” rather than a necessity. Many factors contribute to this, from the size of the investor to the liquidity of the investment, but the real challenge for allocators is how to compensate for the varying degrees of transparency and detail in the data they receive.
These challenges can certainly be overwhelming, but allocators should avoid the paralysis that often comes from not knowing where to begin. My best advice is to start small. In many firms, up to 90% of processes overlap. Start with cleaning up duplicate data, then consider the other simple “blocking and tackling” efforts that can be undertaken to fully understand your data and what you need from it.
Another helpful exercise is identifying the three to five main capabilities you need today: the most relevant ways in which data can be used to make better decisions. These can be people-related, such as breaking down silos and opening up access to information across different parts of the organization. They can also be output-driven, such as the ability to create reports or scorecards. The takeaway is to start small and build up, and starting small can be as simple as identifying where the gaps in information lie within your organization.
Looking at the big picture, CIOs need tools that help their teams sift through and organize data in a normalized fashion so that it can be interpreted in ways that add value proactively. The end goal should be a centralized repository that allows data to be analyzed - and this is key - both quantitatively and qualitatively.
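For readers who want a concrete picture, here is a minimal sketch of what normalizing heterogeneous reporting into a common record might look like. All field names, sources, and figures below are hypothetical illustrations, not a prescribed schema or any vendor’s actual implementation:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class HoldingRecord:
    """One normalized record, regardless of how the data arrived."""
    manager: str
    as_of: date            # reporting date, so lag is always explicit
    nav: Optional[float]   # quantitative: net asset value, if reported
    notes: str = ""        # qualitative: free-text commentary

def from_portal(payload: dict) -> HoldingRecord:
    """Normalize a JSON-style payload from an online portal (hypothetical keys)."""
    return HoldingRecord(
        manager=payload["fund_name"],
        as_of=date.fromisoformat(payload["report_date"]),
        nav=float(payload["nav_usd"]),
        notes=payload.get("commentary", ""),
    )

def from_spreadsheet(row: list) -> HoldingRecord:
    """Normalize a spreadsheet export row: [name, yyyy-mm-dd, nav]."""
    return HoldingRecord(
        manager=row[0],
        as_of=date.fromisoformat(row[1]),
        nav=float(row[2]) if row[2] else None,
    )

# Once normalized into one repository, records from different
# channels can be compared side by side.
repository = [
    from_portal({"fund_name": "Fund A", "report_date": "2021-03-31",
                 "nav_usd": "105.2", "commentary": "Soft-closed to new capital."}),
    from_spreadsheet(["Fund B", "2021-01-31", "98.7"]),
]

# Example of proactive use: flag holdings whose data lags the
# most recent report by more than 30 days.
latest = max(r.as_of for r in repository)
stale = [r.manager for r in repository if (latest - r.as_of).days > 30]
```

The point of the sketch is simply that once every source maps to the same record shape, questions like “which data is stale?” or “what does the commentary say alongside the numbers?” become one-line queries rather than manual reconciliation exercises.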
We know that every allocator’s decision-making process is idiosyncratic. As such, CIOs may be well served to build a framework that marries qualitative and quantitative data and normalizes it in order to support decision-making.
Chad Erwin is SVP of Asset Owners at Backstop Solutions
© The Sortino Group Ltd
All Rights Reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or scanning or otherwise, except under the terms of the Copyright, Designs and Patents Act 1988 or under the terms of a licence issued by the Copyright Licensing Agency or other Reprographic Rights Organisation, without the written permission of the publisher.