
Q&A: Alexander Lamb, The Technancial Company

Alexander Lamb, Head of Business Development Americas and Head of Marketing for The Technancial Company Ltd., which provides the JANUS™ suite of risk management and surveillance technology, discusses risk monitoring and big data value in the post-MiFID/MAR era.

AW: Alexander: What have the key issues been in getting to and getting through the fairly broad requirements that were set out in MiFID?

AL: The biggest challenge that we have found in working with clients and in exploring the needs of prospects has been one of bringing all of the required data into one place and format, whilst maintaining its integrity and timeliness. The diversity that has evolved in how trading data is captured and what is in each record is enormous; it is almost as if new languages have been invented for different asset classes, different trading types, different back office systems and maybe even different risk management systems. By this, we mean not only the trades that have taken place, but also all the orders that precede each trade – in many cases, multiple orders per executed trade – so the volumes of ‘transactions’ are enormous. Added to this is the novelty of the art, rather than the science, of observing and figuring out exceptions, because there really isn’t enough information yet to have a ‘handbook’ that can be used as a starting standard.

AW: Why is risk monitoring needed?

AL: Risk monitoring has always been needed; what, and how, to monitor have until recently been the choice of each business, so no regulations typically meant almost no monitoring. Being able to prohibit, or at least deter, bad actors is one of the advantages that these mandates for monitoring have brought. Monitoring has, even in its infancy, been eye-opening, but more importantly it has raised awareness by huge leaps. Knowing what, and how, trades are occurring, without necessarily the why (secret strategies etc.), can give the risk or trading manager insight not only into how these activities are perceived by the outside world but also a sense of their relative impact on the firm’s own infrastructure and the potential pressures that can build when markets change their behaviour. An example is that margin has now become a key element to be monitored and validated to ensure prudent exposure management for firms – and until recently the real nature of the market risk exposure being generated during the day has been woefully underestimated. Identifying the source of risk when it is approaching is better than recognising it on its way past, once the damage has been done.

AW: What are the value points for a CTA, and what unknowns, or surprises, could be revealed?

AL: A CTA typically deploys a fraction of the money in a fund to trade markets – sometimes selecting markets that may have triggered signals indicating increases in volatility (strong directional movements) – so deploying some of the customer money into positions with larger initial margin requirements might mean tailoring those exposures, or reducing others, to limit the amount of capital that is deployed for a strategy. Figuring out the margin effect of new orders is something that the CTA can do before those orders are placed, both to ensure that cash on hand is adequate for the strategy and that the new position doesn’t explode the initial margins in play.
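The pre-trade margin check described here can be sketched in a few lines. Everything below – the field names, the figures, and the flat per-lot margin model – is an illustrative assumption; a production system would use the clearing house's actual margin methodology (e.g. SPAN or a VaR-based model) and account for portfolio offsets.

```python
# Illustrative pre-trade margin check. Figures and the flat per-lot
# margin model are hypothetical, for demonstration only.

def projected_initial_margin(positions, new_order, margin_per_lot):
    """Estimate total initial margin if new_order were filled.

    positions: dict of instrument -> net lots held (signed)
    new_order: (instrument, lots) tuple, signed for buy/sell
    margin_per_lot: dict of instrument -> initial margin per lot
    """
    instrument, lots = new_order
    projected = dict(positions)
    projected[instrument] = projected.get(instrument, 0) + lots
    # Margin is charged on absolute net exposure per instrument here,
    # ignoring inter-instrument offsets for simplicity.
    return sum(abs(qty) * margin_per_lot[inst] for inst, qty in projected.items())

def order_is_affordable(cash, positions, new_order, margin_per_lot, buffer=0.2):
    """Reject the order if projected margin would breach cash less a safety buffer."""
    required = projected_initial_margin(positions, new_order, margin_per_lot)
    return required <= cash * (1 - buffer)

positions = {"CL": 10, "GC": -5}
margins = {"CL": 6_000, "GC": 11_000}
print(order_is_affordable(150_000, positions, ("CL", 5), margins))
```

Running the check before routing the order, rather than after the fill, is the point Lamb makes: the CTA sees the whole-book margin consequence while there is still a choice.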

Technancial's Alexander Lamb

Monitoring everything from the Initial Margins to the patterns or behaviours of the traders generates information that gives colour to the otherwise monochrome trading landscape. Are margins high on all the exposures? Did the margins change while the positions didn’t? Is margin compared to open position changing rapidly? Do the orders and trades reach the monitoring system with consistent speed? Are there delays or bottlenecks with some trading venues? Are many orders being placed and cancelled? Are any of these orders likely to be construed as orders placed without intention to trade, such as attempts to trade out of one position into another while leaning on the bid in the first?
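A couple of the checks in that list can be expressed as simple threshold rules. The thresholds and function signatures below are illustrative assumptions, not recommended values.

```python
# Sketch of two of the checks mentioned above: a rapidly changing
# margin-to-open-position ratio, and a high order-to-trade ratio.
# Thresholds are illustrative, not recommended values.

def margin_ratio_alert(margin_now, margin_prev, open_pos_now, open_pos_prev,
                       threshold=0.25):
    """Flag if margin per unit of open position moved by more than `threshold`."""
    ratio_now = margin_now / open_pos_now
    ratio_prev = margin_prev / open_pos_prev
    return abs(ratio_now - ratio_prev) / ratio_prev > threshold

def order_to_trade_alert(orders_placed, orders_filled, max_ratio=20):
    """Flag accounts placing far more orders than they execute --
    a pattern that could be construed as orders placed without intent to trade."""
    if orders_filled == 0:
        return orders_placed > 0
    return orders_placed / orders_filled > max_ratio
```

In practice such rules would run continuously against the order and trade stream, with the thresholds tuned per instrument and per trading style.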

What becomes visible that was previously unknown is the holistic effect of adding orders, or of reducing or increasing individual instrument positions, on the many variables that constitute the whole book. The alert configuration isn’t simply there to identify wrongdoing; it is also a tool to help the trader see concentration, get an idea of liquidity risk, and set his or her metrics to monitor the market-to-order (or order-to-market) relationship that may indicate what the quality of execution is likely to be if the average trade size for an order in a particular instrument starts to change dramatically.

AW: How does this help the clearers that hold CTA positions?

AL: Given that CTAs are independent decision makers, it helps in many ways. One is to look at the ‘worst net exposure’ potential in the instruments where the orders are placed. Whilst that’s reasonably easy to see on an outright basis today, viewing these in isolation, without looking at the margin exposure that a portfolio assessment will give, could lead to either too-relaxed or over-cautious behaviour on the part of the person who is aware of this. Ranking accounts by multiple criteria – collateral utilised or not utilised, ratio of initial margin to collateral, instrument class offsets where margin offsets are either nonexistent or minimal, as well as overall position limits – when seen in concert may build a clearer picture than simply banging all the positions from the previous night against a stress test of a certain number of standard deviation market moves. Yes, such a test may expose a position that could cause a problem, but only if all the positions are the same the next day, and markets don’t all move in concert – some move more violently than others. How do we action such information? Surely, knowing what is happening now, with a view of exactly where the risk concentration is, makes more sense. Comparing that exposure with the current market – its price, depth and the likely execution price – would help the clearer’s risk managers make more sensible decisions and reduce risk where the risk is greatest, first. This could avoid increasing risk exposure by reducing the biggest contract position while missing the biggest volatility risk, which continues to accelerate because that position didn’t seem very large.
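The multi-criteria ranking described here can be sketched as a composite score. The field names and weights are hypothetical assumptions for illustration; a real clearer would calibrate both to its own risk policy.

```python
# Hypothetical composite ranking of clearing accounts, combining criteria
# mentioned above: margin-to-collateral ratio, collateral utilisation and
# relative position size. Weights and field names are illustrative.

def rank_accounts(accounts, weights=(0.5, 0.3, 0.2)):
    """accounts: list of dicts with 'name', 'initial_margin', 'collateral'
    and 'open_positions'. Returns account names sorted riskiest-first."""
    w_ratio, w_util, w_pos = weights
    max_pos = max(a["open_positions"] for a in accounts) or 1

    def score(a):
        margin_ratio = a["initial_margin"] / a["collateral"]
        utilisation = min(margin_ratio, 1.0)   # collateral utilised, capped at 100%
        pos_share = a["open_positions"] / max_pos
        return w_ratio * margin_ratio + w_util * utilisation + w_pos * pos_share

    return [a["name"] for a in sorted(accounts, key=score, reverse=True)]

accounts = [
    {"name": "A", "initial_margin": 90, "collateral": 100, "open_positions": 50},
    {"name": "B", "initial_margin": 40, "collateral": 100, "open_positions": 100},
    {"name": "C", "initial_margin": 120, "collateral": 100, "open_positions": 20},
]
print(rank_accounts(accounts))
```

Note that account C ranks riskiest despite holding the smallest position, because its margin exceeds its collateral – exactly the kind of signal a position-size-only view would miss.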

AW: Are there alerts that a system can give that offer an advantage in position management decisions?

AL: Absolutely. As with a trading system, the risk monitoring system can be configured to look for what the trading manager or risk manager would see as challenges – using experience to imagine scenarios that could occur and setting several parameters to highlight those scenarios as early warning signals. Imagine a trader who manages a myriad of commodity positions and, in order to maintain the strategy through several monthly delivery cycles, needs to avoid rolling the contracts when front month liquidity dries up and spreads become unpredictable. They monitor the spread levels, receive alerts if they change, monitor the average daily volume relative to the positions in the front and next month, and if the ratio is favourable, they trade. Let the system inform you actively, rather than devoting a lot of time to manually monitoring all the different instruments, or paying a programmer to build programs that do it for you but then need to be maintained going forward.
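The roll-timing alert in that example can be sketched as a small rule set. The parameter names and thresholds below are illustrative assumptions; in a real configuration they would be set per contract from the trader's own experience.

```python
# Sketch of the roll-timing check described above: alert when front-month
# average daily volume looks thin relative to the position held, or the
# calendar spread is wider than tolerance. Names and thresholds are
# illustrative assumptions only.

def roll_alert(position_lots, front_adv, spread_ticks,
               min_adv_ratio=10, max_spread_ticks=3):
    """Return a list of alert strings; an empty list means conditions look OK."""
    alerts = []
    if front_adv < position_lots * min_adv_ratio:
        alerts.append("front-month ADV thin relative to position")
    if spread_ticks > max_spread_ticks:
        alerts.append("calendar spread wider than tolerance")
    return alerts

# Example: a 200-lot position against thin front-month volume and a wide spread.
print(roll_alert(position_lots=200, front_adv=1500, spread_ticks=5))
```

The point is that the rules fire on the trader's behalf, rather than the trader polling spread and volume screens contract by contract.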

AW: Do you expect monitoring to become a standard for most participants in the marketplace, or is the challenge of implementing these kinds of systems going to limit it to the sell side or FCM community?

AL: Monitoring, whilst seen as an annoying and expensive burden by some, continues to drop in cost as hardware improves. More importantly, as shared environments with differing performance demands develop, we are seeing the expensive all-you-can-eat approach being replaced by a menu-driven one, with the technology and software components being developed in an increasingly modular way.

Mining the mountains of data out there can be very expensive, particularly as the mountain continues to grow, but being in a shared environment, where the data mining technology is paid for by many while remaining flexible enough to be user-specific, means that the process begins to add masses of business intelligence value to each participant.

Alexander Lamb is Head of Business Development Americas and Head of Marketing for The Technancial Company

© The Sortino Group Ltd

All Rights Reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or scanning or otherwise, except under the terms of the Copyright, Designs and Patents Act 1988 or under the terms of a licence issued by the Copyright Licensing Agency or other Reprographic Rights Organisation, without the written permission of the publisher. For more information about reprints from AlphaWeek, click here.