Last week I took part in a panel discussion on Fintech trends in Capital Markets at the International Derivatives Expo in London. Joining me on the panel were Umesh Patel, Global Head of Strategic Alliances at Symphony, Jerome Kemp, President at Baton Systems, and Liam Huxley, CEO and Founder at Cassini Systems. The central question we were tasked with answering was: what impact have transformational technologies such as artificial intelligence, chatbots, distributed ledger and cloud computing had on the working practices of the financial services sector over the last 18 months?
In short, out of necessity, we have seen seismic shifts in the way financial institutions work, simply because no one could rely on working in the same physical location anymore. Naturally, technology has helped fill the void and meant the industry could carry on largely without interruption. There is no debate to be had: the pandemic has marked an inflection point in the rate of digitisation within our industry. The growth in chat platforms, AI and cloud is universal. However, I wanted to build on this truism by sharing some of the insights from the discussion.
Data is the lowest common denominator
What surprised me most was that every panel member was making the case for data, or more specifically, data standardisation. Why? Because data standardisation makes interoperability a reality, and it's data that fuels these transformative technologies and justifies their very existence. With standards comes the ability to share data far more easily across back, middle and front office systems, using modern technologies such as Symphony bots to distribute data out to users. It becomes cyclical: more data drives up demand for AI and for a myriad of channels through which to distribute that data. In turn, wider deployment of these technologies increases the demand for more data to support decision making and how business gets done. The growing demand for data is fuelling digitisation just as much as the pandemic has.
The screen estate has shrunk
At a point when consumption of data is growing and the feeds (read: applications) for data are increasing, we've had to get used to having fewer screens. Working from home has meant ditching the traditional cinematic screen estate most enjoyed in the office in favour of a solitary laptop or desktop screen. This has driven a need to consolidate the many applications that each serve up only one or two sources of data into a single application, a single source of truth. The challenge has been rapidly developing solutions that don't require coding and can be quickly deployed to user desktop estates. This is one area where ipushpull have seen phenomenal growth, acting as the hub for a myriad of data feeds and making viewing and re-sharing far easier from one screen.
Real-time data is the new currency
We've seen pre- and post-trade processes morph into a common risk management thread. Accessibility of data and workflow in real time has become the number one challenge to solve for. This has been significantly improved through the use of bots, APIs and data services.
Email is dead, long live email!
The biggest challenge holding back the industry until now has been inertia, or more specifically, old habits when it comes to data sharing. Time and time again, organisations fall back on what they know: typically email or Excel-based file sharing. The challenge with this approach is that it's slow and error-prone. The last 18 months have helped accelerate the move away from these traditional sharing approaches; however, there is still much work to do to retire these deeply ingrained habits.
Pushing for data standards
I was buoyed by the rallying cry from the panel. Pushing for data standardisation benefits the entire industry. Only then will it become easier to share real-time data seamlessly across different systems of record, market data providers, chat platforms, Excel and other applications. Standards make working collaboratively far easier and help future-proof against further unforeseen global disruptions.
Moreover, concurrent standardisation of related protocols and syntax will allow the data to be more easily consumed and interpreted by chatbots, which can enable the use of Natural Language Processing (NLP) and broader Machine Learning applications. This will contribute towards greater efficiency and productivity across the trade life cycle.
For more information, why not read this article from Markets Media? Please get in touch if you would like to discuss further.