A case study that looks at University of California Investments’ reporting of illiquid asset valuations shows how large institutional investors can use data science to improve operational efficiency.
Focusing on innovation in the fair valuation of illiquid assets, the case study details how fair value measurements can be improved and extended for limited partners (LPs) in accuracy, timeliness and granularity. But the larger purpose of the case study is to show the value of bringing sophisticated data tools inside a long-term investor.
In the paper, “Data Science: a case for innovation in valuation”, the authors, from the University of California, Stanford University and FEV Analytics, seek to demonstrate how “adoption of advanced data science techniques can move organisations past the current unsatisfactory state of the art, to an unprecedented level of operational finesse”.
Sheridan Porter, head of marketing at FEV Analytics and a researcher on the paper, says LPs need to get started with data science.
“It is inevitable that the industry will move towards using data science,” she says. “LPs need to get started or they will be at a disadvantage to their peers in five years.”
The paper provides LPs with clarity on what data is needed, how it should be handled, and the business case for its rigorous application.
The case study
The $120 billion University of California Regents has about 9 per cent allocated to illiquid assets and, like many of its peers, under-developed technical tools. As such, the authors say, the organisation could achieve considerable improvement with small changes.
The case study looks at how data technology can streamline and strengthen portfolio fair valuation of illiquid assets and produce additional benefits for the investment team outside of operations.
General partners (GPs) typically report asset values quarterly, but their reports often arrive with months of lag. This creates an opaque and complex environment for LPs, who are required to report the fair value of their investments objectively and ahead of their GPs.
The commonly used “roll forward”, which takes the most recently reported GP estimate of fair value and adjusts it for the accounting period, is flawed – not least because it is a manual process.
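In outline, a roll forward starts from the last GP-reported NAV, adjusts it for market movement over the stale period, and layers on interim cash flows. A minimal sketch of that arithmetic in Python follows; the beta-times-index market adjustment and all names are illustrative assumptions, not the paper's methodology:

```python
# Illustrative roll forward of a GP-reported NAV to a later reporting date.
# The simple beta-times-index market adjustment is an assumption for
# illustration; it is not the method published in the paper.

def roll_forward(last_nav, contributions, distributions, index_return, beta=1.0):
    """Roll a stale GP-reported NAV forward to the current period.

    last_nav:      most recent GP-reported fair value
    contributions: capital paid in since the report date
    distributions: capital returned since the report date
    index_return:  return of a chosen public-market proxy over the gap
    beta:          assumed sensitivity of the fund to that proxy
    """
    adjusted = last_nav * (1 + beta * index_return)  # market adjustment
    return adjusted + contributions - distributions  # cash-flow adjustment

# Example: a NAV reported two quarters ago, rolled to today.
nav_today = roll_forward(last_nav=100.0, contributions=5.0,
                         distributions=2.0, index_return=0.04, beta=1.2)
print(round(nav_today, 2))  # 107.8
```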
The new technology the case study showcases makes the roll forward more exacting and systematic, overcoming the limitations that make fund valuations unreliable. Where richer data is available, such as in co-investment portfolios, the technology automatically expands the set of outputs, creating a valuation independent of GP reporting.
“The conventional roll forward procedure is a practical workaround to a difficult issue, but its manual nature makes it unreliable,” Porter says. “As a result, it’s typically done just once a year as a reporting procedure, rather than quarterly or monthly as a monitoring or risk-management procedure. The innovation moves roll forward into another realm, where systematic daily valuation is possible.”
The coded statistical approach to the roll forward, introduced in the paper, operates at the holdings level; the accuracy and reliability of each holding-level estimate are explicitly captured and aggregated to the portfolio level. Data pathways automatically adjust the methodology according to the information available on each fund, meaning LPs don't need to make data uniform across a portfolio before operationalising it.
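How holdings-level estimates with explicit error bands might roll up to the portfolio, and how a data pathway might switch methodology on the data available, can be sketched as below. The pathways, the placeholder multiples and the independence assumption in the error roll-up are all illustrative; the paper's actual statistical machinery is not reproduced here:

```python
import math

def value_holding(h):
    """Choose a valuation pathway from whatever data the holding carries.
    Returns (estimate, standard_error). Pathways and multiples here are
    placeholders for illustration only."""
    if "financials" in h:                       # rich data: value directly
        est = h["financials"]["revenue"] * h["financials"]["multiple"]
        return est, 0.05 * est                  # tighter error band
    # sparse data: fall back to a rolled-forward GP NAV
    est = h["last_nav"] * (1 + h.get("market_adj", 0.0))
    return est, 0.15 * est                      # wider error band

def aggregate(holdings):
    """Aggregate holding-level estimates and errors to the portfolio.
    Treats errors as independent, so variances add; an assumption."""
    estimates = [value_holding(h) for h in holdings]
    total = sum(est for est, _ in estimates)
    var = sum(se ** 2 for _, se in estimates)
    return total, math.sqrt(var)

portfolio = [
    {"financials": {"revenue": 40.0, "multiple": 2.5}},  # co-investment
    {"last_nav": 60.0, "market_adj": 0.03},              # fund position
]
value, err = aggregate(portfolio)
print(f"portfolio fair value {value:.1f} +/- {err:.1f}")
```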
“It’s a massive computation, done within hours,” Porter explains. “It makes all parts of the roll forward procedure – including the market adjustment – systematic and automated. These are necessary conditions for it to scale and be accountable to quality standards and stakeholders.
“As portfolios and the expectations of in-house teams evolve, this type of measurement is useful because it connects a valuation process to a risk-management process.”
The automated approach eliminates the lag that is common in private asset reporting. It’s not out of the question for a valuation to have a six-month lag, which can have a big impact on the portfolio if the projected and actual valuations differ significantly.
Practical advantages for LPs
“Measuring asset value, independent of the GP, gives the LP some very practical tools,” Porter says. “For example, a difference in our measure and the GP’s estimate might indicate a stale NAV [net asset value], the largest of which can be prioritised by the LP for discussion with the GP. We can work the technology to streamline operational processes and leverage that for the LP to speak to the GP.”
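A stale-NAV screen of the kind Porter describes could look something like the sketch below: compare the independent estimate against the GP-reported NAV, and rank the gaps so the largest land on the agenda first. The field names, the figures and the 10 per cent threshold are assumptions for illustration:

```python
# Illustrative screen for stale NAVs: compare an independent estimate with
# the GP-reported NAV and rank the gaps so the largest can be raised with
# the GP first. Names, figures and the threshold are assumptions.

def flag_stale_navs(funds, threshold=0.10):
    """Return funds whose independent estimate diverges from the GP NAV
    by more than `threshold`, largest divergence first."""
    flagged = []
    for f in funds:
        gap = (f["independent_value"] - f["gp_nav"]) / f["gp_nav"]
        if abs(gap) > threshold:
            flagged.append((f["fund"], gap))
    return sorted(flagged, key=lambda x: abs(x[1]), reverse=True)

funds = [
    {"fund": "Fund A", "gp_nav": 100.0, "independent_value": 118.0},
    {"fund": "Fund B", "gp_nav": 250.0, "independent_value": 255.0},
    {"fund": "Fund C", "gp_nav": 80.0,  "independent_value": 66.0},
]
for name, gap in flag_stale_navs(funds):
    print(f"{name}: {gap:+.1%} vs GP NAV")  # Fund A +18.0%, Fund C -17.5%
```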
The paper shows there are also other benefits for LPs in using measurements that do not lag.
“If LPs can see granular portfolio valuations on a more frequent basis, other portfolio measures related to liquidity, market sensitivity and performance also become available on a more rolling basis,” Porter explains. “Portfolio manipulation and look-through are core components of investment operations.
“For CIOs, this brings the [alternatives] portfolio in off its island, makes it synchronous with other asset classes. It has real implications for asset allocation as well as operations.”
The paper, written by Arthur Guimaraes from the University of California Investments, Ashby Monk from Stanford University and Sidney Porter from FEV Analytics, will be published in the fall edition of the Journal of Portfolio Management.
It can be accessed here: Improving investment operations through data science: a case study of innovation in valuation