More coverage of our recent victory in the Risk.Net Markets Technology Awards, where we were named Electronic Trading Support Product of the Year 2020. See the article below, which describes why we were selected by Risk.Net: The Devil in the Data.
We are delighted that our unique method of collecting, pre-processing and aggregating enormous quantities of data to provide financial services businesses with real-time actionable intelligence is getting the recognition it deserves.
Advances in technology are often a double-edged sword. For banks, the past couple of years have brought tremendous leaps in the ability to capture, manipulate and apply large volumes of data to everyday decisions; but they have been accompanied by a step-change in demand and expectations.
Whether those expectations are being set by regulators, by clients, or by businesses trying to get ahead of their competitors, banks are feeling the pressure – and, in some cases, are cracking.
These pressures were a key theme in this year’s Markets Technology Awards, with many of the 180 submissions focusing – at least in part – on the creative ways in which vendors are trying to help. (A full list of this year’s winners – and judges – as well as a summary of the awards methodology, can be found below.)
Regulatory reporting is one obvious trouble spot. On November 26, Citigroup’s UK entities were fined £44 million for misreporting their capital and liquidity positions to the UK’s Prudential Regulation Authority – a record fine for the watchdog. The accompanying 44-page notice contained a litany of failings, with data management a key theme.
At one point, the PRA explains why it cares: “The failure to provide accurate and timely regulatory data can indicate a range of weaknesses in a firm’s ability to manage its business prudently. Experience shows firms that do not produce timely, complete and accurate data during periods of relative stability are less likely to produce it under stress.”
During the misreporting period, Citi was trying to up its game. It built an internal data repository to feed a new regulatory reporting platform, but the combination of the two only came online this year.
Other banks are also turning to tech to solve their problems. London-based Kaizen Reporting gives the example of a global investment bank operating a complex trading book. Concerned about the quality of the bank’s reporting, the UK’s Financial Conduct Authority – which fined both Goldman Sachs and UBS for reporting breaches this year – had commissioned an independent review. Following two reviews, the FCA was still not satisfied.
Fearing a formal sanction, the bank brought in Kaizen. The firm’s software was able to quantify the exact number of reporting errors and what they were, says Dario Crispini, chief executive officer of Kaizen, which won the regulatory reporting product of the year category.
Reconciliation of the bank’s back-office systems with its transaction reports identified over-reporting and enabled the firm to estimate the excess costs this incurred. Ultimately, the bank was able to identify the back-office system that was the source of its errors and make the required changes.
In another instance, a US investment bank discovered under-reported data for Dodd-Frank Act reporting – in part because some counterparties were not being correctly classified as ‘US persons’, which are treated differently by the regime – but was unable to quantify the scale of the problem and the extent of the errors.
Here, Kaizen’s software “was able to determine the US person status of all counterparties as well as take raw source system data and independently apply Part 43 and Part 45 rules to determine reportable trades. It also reconciled reportable trade volumes against reported trades to ascertain over- or under-reporting figures, finding hundreds of thousands of trades were under-reported in foreign exchange asset classes alone,” says Crispini.
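The reconciliation step Crispini describes can be illustrated with a minimal sketch. Everything here is hypothetical – the field names, the sample records and the toy `is_us_person` rule stand in for Kaizen's actual data model and the far more involved CFTC Part 43/45 logic – but the core idea is the same: derive the set of trades that should have been reported from raw source-system data, compare it with what was actually reported, and classify the differences as under- or over-reporting.

```python
# Hypothetical sketch of a transaction-reporting reconciliation: compare
# trades that *should* have been reported (derived from source systems)
# with trades that *were* reported, and classify the gaps.

def is_us_person(counterparty):
    """Toy classification rule -- the real CFTC 'US person' tests are far
    more involved (place of incorporation, principal place of business...)."""
    return counterparty.get("domicile") == "US"

def reconcile(source_trades, reported_ids):
    """Return (under_reported, over_reported) sets of trade IDs."""
    reportable_ids = {
        t["trade_id"] for t in source_trades
        if is_us_person(t["counterparty"])  # stand-in for Part 43/45 rules
    }
    under = reportable_ids - reported_ids  # should have been reported, wasn't
    over = reported_ids - reportable_ids   # reported, but not reportable
    return under, over

source_trades = [
    {"trade_id": "T1", "counterparty": {"domicile": "US"}},
    {"trade_id": "T2", "counterparty": {"domicile": "UK"}},
    {"trade_id": "T3", "counterparty": {"domicile": "US"}},
]
reported_ids = {"T1", "T2"}  # T3 was missed; T2 was reported unnecessarily

under, over = reconcile(source_trades, reported_ids)
print(under, over)  # {'T3'} {'T2'}
```

Run at scale over full source-system extracts rather than samples, this set-difference approach is what lets every report, rather than a sample, be tested.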
These cases, and others where the company has helped banks that have attracted regulatory scrutiny, are often down to institutions “marking their own homework”, says Crispini. “Testing approaches delivered in-house typically break down between sample-based testing and scenario-based testing. Both approaches have significant shortcomings as they are unable to test every single report and detect all reporting errors.”
In some cases, new reporting regimes are driving institutions to explore near-real-time monitoring and controls. A case in point is the requirement in Europe’s revised Markets in Financial Instruments Directive for firms and venues to timestamp transactions with microsecond granularity or less. The benefits can go beyond simple compliance.
“Trades are data in motion,” says Nick Gordon of Beeks Analytics, which won the electronic trading support category in this year’s awards. “If a bank can capture, track and correlate the data as it moves across its systems, it can get full end-to-end visibility that can be used by various business elements.”
The company’s platform collects all trade and associated data, including market, trade, risk, hedging, middle-office, netting, settlement and payment information. For some clients, this amounts to monitoring up to 30 million events a day at each of 250 data points.
Velocimetrics does this directly and in real time while the data is in transit, rather than waiting for the underlying systems to report it. The company has devised 200 data-harvesting mechanisms to ensure there is minimal impact on the underlying operational systems. For example, it can pick up Fix, exchange or other messages as they flow past network switches, or eavesdrop on conversations on publish-and-subscribe systems, such as WebSphere MQ or Tibco messaging. It extracts and normalises the business information, centralises and correlates it, making visible the transaction flow and enabling various monitoring analytics to be applied. Users can check where a trade or data is, where it is going, whether it has arrived, what its quality is, and whether there are problems with it.
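The capture-and-correlate pattern described above can be sketched in a few lines. This is an illustrative toy, not Velocimetrics' actual implementation: it parses passively captured FIX-style messages (with `|` standing in for the SOH delimiter), groups them by order ID (tag 11), and measures how long each order took from first to last sighting across the monitored systems.

```python
# Illustrative sketch (not the vendor's implementation): correlate
# passively captured FIX messages by order ID to measure each order's
# end-to-end transit time across the systems being monitored.

def parse_fix(raw, sep="|"):
    """Parse a FIX message into a tag->value dict ('|' stands in for SOH)."""
    return dict(field.split("=", 1) for field in raw.strip(sep).split(sep))

def correlate(messages):
    """Group captured (timestamp, raw_fix) pairs by ClOrdID (tag 11) and
    return per-order latency from first to last observation, in seconds."""
    flows = {}
    for ts, raw in messages:
        order_id = parse_fix(raw).get("11")
        if order_id:
            flows.setdefault(order_id, []).append(ts)
    return {oid: max(ts) - min(ts) for oid, ts in flows.items()}

captured = [
    (0.000100, "35=D|11=ORD1|55=EURUSD|"),  # new order seen at the gateway
    (0.000450, "35=8|11=ORD1|39=0|"),       # ack seen leaving the OMS
    (0.002300, "35=8|11=ORD1|39=2|"),       # fill seen on the return path
]
latencies = correlate(captured)
print(latencies)  # ORD1 took roughly 0.0022 seconds end to end
```

The same correlation, applied continuously to millions of events, is what lets users ask where a trade is, whether it arrived, and where the slow hops are.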
“Banks are beginning to ask how they can use this data to make their business more profitable,” says Gordon. “They are asking what the data tells them about how they might reduce costs, or where they should invest in their systems to get better performance.”
One bank has used Velocimetrics technology to identify and replace a slow forex pricing platform, for example, turning the business round from loss to profit.
Banks looking to avoid the Fundamental Review of the Trading Book’s costly capital add-ons for risk factors deemed ‘non-modellable’ – applied in markets where data is thin or patchy – are investigating pooling their trade data to achieve the required price history. Several organisations are looking to offer third-party data pooling services in this area, but there are obstacles. Some large banks with bigger trade datasets argue they would be gaining less benefit from the arrangement than smaller participants – and do not want to give their rivals a leg-up – while regional banks are also reluctant to give up their wealth of local market data to international competitors. One solution is for banks to form their own pools.
First out of the blocks is DataVault Innovations – an initiative of six top Canadian banks: BMO Nesbitt Burns, CIBC World Markets, National Bank Financial, RBC Capital Markets, Scotia Capital and TD Securities. Two key factors have enabled the group to get its data pool up and running before others. First, the banks already had an organisational framework in place through CanDeal, an online exchange for Canadian dollar debt securities that the banks own jointly, along with TMX Group. And, second, they deployed the data pooling platform from big data technology specialist Ticksmith.
The platform anonymises the banks’ data, pools the price observations, enhances the data with ratings and sector buckets, and processes modellability reports. DataVault Innovations was announced in March and plans to cover all over-the-counter asset classes. Already, participants have been able to increase the number of modellable instruments they trade, says Francis Wenzel, chief executive of Montreal-based Ticksmith. Furthermore, the banks have been able to monetise certain pools of OTC data and are planning to add evaluated pricing to the service.
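The modellability arithmetic behind pooling can be shown with a simplified sketch. The rule encoded here is the original 2016 FRTB test – at least 24 real price observations in a year, with no gap of more than roughly a month between consecutive observations – and the instrument name, feed format and anonymisation step are all hypothetical; later revisions of the rule, and a real pool's privacy machinery, differ.

```python
# Hedged sketch of FRTB-style data pooling: two banks that individually
# fail the observation-count test clear it once their (anonymised)
# observations are pooled. The rule here is the simplified 2016 test.
from datetime import date, timedelta

def pool(*bank_feeds):
    """Merge per-bank {instrument: [observation dates]} feeds into one
    pooled view, de-duplicating same-day observations."""
    pooled = {}
    for feed in bank_feeds:
        for instrument, dates in feed.items():
            pooled.setdefault(instrument, set()).update(dates)
    return pooled

def is_modellable(dates, min_obs=24, max_gap_days=31):
    """At least min_obs observations, no gap longer than max_gap_days."""
    ds = sorted(dates)
    if len(ds) < min_obs:
        return False
    return all(b - a <= timedelta(days=max_gap_days)
               for a, b in zip(ds, ds[1:]))

# Each bank alone sees only 13 observations -- too few -- but their
# interleaved observations together clear the 24-observation hurdle.
start = date(2019, 1, 7)
bank_a = {"CAD-IRS-10Y": [start + timedelta(days=28 * i) for i in range(13)]}
bank_b = {"CAD-IRS-10Y": [start + timedelta(days=14 + 28 * i) for i in range(13)]}

pooled = pool(bank_a, bank_b)
print(is_modellable(bank_a["CAD-IRS-10Y"]))  # False: too few observations
print(is_modellable(pooled["CAD-IRS-10Y"]))  # True: pooled history suffices
```

This is why larger banks see less marginal benefit than smaller ones: the more observations a participant already has, the fewer instruments the pool flips from non-modellable to modellable for them.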
“Clients install our software in their infrastructure and remain in control of their content. They are not sending it to a third party”
Francis Wenzel, Ticksmith
Other groups of banks are now talking to Ticksmith about setting up similar pooling operations. “There is a big appetite to replicate what the Canadians have done,” says Wenzel.
The company may be an attractive partner not just because of its technology, but also because the banks can retain ownership of their data, Wenzel argues: “We are a software vendor and we are not after the banks’ data. Clients install our software in their infrastructure and remain in control of their content. They are not sending it to a third party.”
Data pooling is just one module of Ticksmith’s big data platform. CME uses the platform to manage and distribute 450 terabytes of historical market data, some of it going back to the 1970s. Other applications include real-time transaction cost analysis and powering back-testing engines for algorithmic trading. “What these applications have in common is scale – the large amount of data, number of instruments or number of calculations,” says Wenzel. The next growth area for big data technology will be in AI, particularly managing data to feed machine learning applications, he says.
AI 4 XVA
AI is also being drafted in to help with the challenges of calculating multiple valuation adjustments (XVAs) to trade prices. Measuring XVAs in a way that not only satisfies regulatory requirements, but also gives a competitive edge, is so demanding that banks are throwing a whole basket of leading-edge technologies at it.
“In essence, valuation adjustments allow you to more accurately price credit risk, funding risks and regulatory capital – all major industry focuses since the financial crisis,” says Andrew Bateman, executive vice-president and group president for capital markets buy-side business at FIS.
When managed properly, XVAs are a tremendous source of competitive advantage for front-office pricing. “However, XVAs are complex by nature and require large-scale simulations that instantly differentiate firms with the most mature and sophisticated systems,” he says.
Often in these kinds of computationally demanding situations, organisations have faced a trade-off between accuracy and speed. However, the deployment of certain techniques and technologies in pursuit of XVA pricing accuracy means users don’t necessarily have to wait for results. Among the techniques and technologies FIS deploys in its XVA solution are adjoint algorithmic differentiation (AAD) to calculate price sensitivities, parallel processing through the vectorisation of code and use of graphical processing units, and running analytics in memory.
“Accuracy can go hand in hand with performance – these qualities aren’t mutually exclusive,” says Bateman. “Take AAD, which is not only 1,000 times faster than the industry-standard finite difference approach, but also much more accurate, providing mathematically exact derivatives, rather than an approximation.” Similarly, using an in-memory aggregation engine for accuracy and transparency in analytics is also much faster than many traditional cube technologies, claims Bateman.
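Bateman's exactness point can be demonstrated with a toy example. This sketch uses forward-mode dual numbers rather than adjoint (reverse-mode) accumulation – AAD runs the same calculus in reverse, which is what makes thousands of sensitivities cheap – but the contrast with finite differences is identical: algorithmic differentiation yields the derivative to machine precision, while bumping leaves a truncation and rounding error. The `price` function is an arbitrary stand-in, not a real pricer.

```python
# Algorithmic differentiation gives machine-exact derivatives where finite
# differences only approximate. This toy uses forward-mode dual numbers;
# AAD proper accumulates in reverse (adjoint) order, but the exactness
# argument is the same.
import math

class Dual:
    """Number carrying its value and derivative: val + eps * dot."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __mul__(self, other):          # product rule
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    def exp(self):                     # chain rule through exp
        e = math.exp(self.val)
        return Dual(e, e * self.dot)

def price(x):
    """Toy 'pricer': f(x) = x * exp(x), so f'(x) = (1 + x) * exp(x)."""
    return x * x.exp() if isinstance(x, Dual) else x * math.exp(x)

x0 = 1.0
exact = price(Dual(x0, 1.0)).dot                     # AD: exact derivative
h = 1e-6
bumped = (price(x0 + h) - price(x0 - h)) / (2 * h)   # central finite diff

analytic = (1 + x0) * math.exp(x0)
print(abs(exact - analytic))   # 0.0 -- exact to machine precision
print(abs(bumped - analytic))  # small but nonzero approximation error
```

In a real XVA engine the payoff function has thousands of inputs, which is where adjoint mode's single reverse sweep delivers the claimed orders-of-magnitude speed-up over bumping each input in turn.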
In the search for even greater efficiency in managing XVA, FIS, which won the XVA product of the year category, is now working with one client on the application of artificial intelligence to calculate XVAs more efficiently. Machine learning models are being trained on large datasets to identify characteristics from the daily credit exposure profiles generated through Monte Carlo simulation and to identify anomalies due to either counterparty behavioural changes or errors in inputs or calculation.
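The anomaly-detection idea can be sketched simply. This is not FIS's model: a trained machine learning model would replace the crude per-bucket z-score used here, and the exposure profiles are made-up numbers, but the shape of the task is the same – compare today's simulated credit-exposure profile for a counterparty against its recent history and flag tenor buckets that jump, whether from a behavioural change or an input error.

```python
# Hedged sketch (not FIS's model): flag tenor buckets in today's simulated
# exposure profile that depart sharply from the recent history of profiles
# for the same counterparty, using a simple per-bucket z-score.
import statistics

def profile_anomalies(history, today, threshold=4.0):
    """history: past daily exposure profiles (lists of floats, one per
    tenor bucket); today: today's profile. Returns anomalous bucket indices."""
    flagged = []
    for bucket, value in enumerate(today):
        past = [profile[bucket] for profile in history]
        mu = statistics.fmean(past)
        sigma = statistics.stdev(past) or 1e-12  # guard a flat history
        if abs(value - mu) / sigma > threshold:
            flagged.append(bucket)
    return flagged

history = [[10.0 + 0.1 * d, 20.0 - 0.1 * d, 5.0] for d in range(30)]
today_ok = [11.0, 18.5, 5.0]
today_bad = [11.0, 18.5, 9.0]  # last bucket jumps: behaviour change,
                               # or an error in inputs or calculation?
print(profile_anomalies(history, today_ok))   # []
print(profile_anomalies(history, today_bad))  # [2]
```

A flagged bucket is only a prompt for investigation; distinguishing a genuine counterparty behavioural change from a bad input is exactly where the trained models earn their keep.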
Markets Technology Awards 2020
Murex was the big winner of this year’s awards, scooping a total of five categories, including the coveted award for systems support and implementation.
Bloomberg took three prizes, while FactSet and IHS Markit won two apiece.
Many of this year’s winners were established vendors, rather than recent start-ups – the honourable exceptions being BestX, Droit Financial Technologies and Kaizen Reporting. The judging panel added a special award for innovation, to recognise the progress made by another start-up – Beacon Platform – in establishing itself in the fiercely competitive market for traded risk systems.
An overview of the awards methodology, and a list of the judges, can be found below.
JUDGES’ AWARD FOR INNOVATION:
Beacon Platform
TRADED RISK TECHNOLOGY:
Market liquidity risk product of the year: Bloomberg
Market risk management product of the year: Murex
Counterparty risk product of the year: Murex
XVA calculation product of the year: FIS
FRTB product of the year: IHS Markit
Mifid II product of the year: Droit Financial Technologies
Regulatory reporting product of the year: Kaizen Reporting
Solvency II product of the year: Moody’s Analytics
Pricing and analytics, commodities: Lacima
Pricing and analytics, structured products and cross-asset: Bloomberg
Trading systems, commodities: Ion RightAngle
Trading systems, FICC: Murex
Trading systems, structured products and cross-asset: Murex
Best execution product of the year: BestX
Buy-side ALM product of the year: RiskFirst
Buy-side market risk management product of the year: FactSet
Buy-side trading system of the year: FactSet
Performance attribution product of the year: StatPro
CCP clearing support product of the year: Nasdaq
Collateral management and optimisation product of the year: Bloomberg
DATA AND OTHER SPECIALIST CATEGORIES:
Alternative data vendor of the year: IHS Markit
Best vendor for systems support and implementation: Murex
Electronic trading support product of the year: Velocimetrics
Risk data repository and data management product of the year: Xenomorph Software
Methodology and list of judges:
Technology vendors were invited to pitch their products and services in 30 categories covering traded risk, front-office regulation, pricing and trading, buy-side technology, back office, data and other specialist areas. Candidates were required to answer a set of questions within a maximum word count about how their technology met industry needs, its differentiating factors and recent developments. A total of 180 entries were received and shortlisted, a slight increase on last year’s total.
A panel of 11 industry experts and Risk.net editorial staff reviewed the shortlisted entries, with judges recusing themselves from categories or entries where they had a conflict of interest or no direct experience. The judges individually scored and commented on the shortlisted entrants, before meeting in October to review the scores and, after discussion, make final decisions on the winners.
In all, 25 awards were granted this year. Awards were not granted if a category had not attracted enough entrants, or if the judging panel was not convinced by any of the pitches.
This year’s judging panel consisted of:
Peter Burgess, independent adviser
Sean Coppinger, head of risk technology, Standard Chartered Bank
Sid Dash, research director, Chartis Research
Clive Davidson, contributor, Risk.net
Ian Green, chief executive officer, eCo Financial Technology
Ahimsa Gounden, head of market risk, capital and regulatory, Absa
Jenny Knott, chief executive officer, Fintech Strategic Advisors
Peter Quell, head of portfolio analytics for market and credit risk, DZ Bank
Hugh Stewart, independent adviser
Tom Wilson, chief risk officer, Allianz
Duncan Wood, global editorial director, Risk.net
Copyright Infopro Digital Limited. All rights reserved.