Disruption in capital markets is nothing new. But now, technological advances have dramatically accelerated the pace of disruption.
The cloud offers cheap computing power with almost unlimited scalability. Organizations are building data science and machine learning capabilities to maximize the business value of their data. Meanwhile, rapid developments in AI are enabling machines to take over manual tasks, freeing humans to focus on more complex, higher-value work.
Together, these advances are increasing the value and speed of insights, helping ambitious business leaders disrupt their domains. The technological laggards, however, face an increasing risk of irrelevance.
In capital markets, incumbents face stiff competition from fintechs and technology companies. These newcomers are agile in their approach, disruptive in their products, and innovative in their execution.
Transforming processes with conversational AI
Part of the challenge for capital markets firms is that many of their core processes are still heavily manual, iterative, and conversational. Most of these interactive processes, including negotiations and sales, haven't changed in decades and suffer from high processing costs, inconsistent outcomes, and inefficient operations. Digitizing these front office processes with smarter technologies can reduce operating costs and save time.
One emerging technology in particular — conversational AI — can have a profound impact on a firm’s ability to automate traditional processes. In this article, we’ll examine the most promising use cases for conversational AI in capital markets, and outline the key principles to guide your transformation strategy.
What is conversational AI?
Conversational AI uses natural language processing (NLP), a branch of machine learning focused on understanding and generating natural human language. It gives machines the ability to hold meaningful dialogs with humans in spoken or written form.
Few fields in AI have made such amazing progress in recent years as conversational AI. Personal assistants such as Siri and Alexa have rapidly grown in popularity, both with consumers and organizations, as their performance has continuously improved.
One of the keys to the boom in conversational AI was the invention of the transformer architecture. Transformer-based models such as Google’s BERT and OpenAI’s GPT-2 and GPT-3 are trained on huge volumes of unlabeled text from across the internet, producing very large language models, sometimes with billions of parameters. BERT now powers almost every English-language search on Google.
These tools remove the expensive manual process of labeling training data, and they also provide a massive performance boost for NLP technology that underpins conversational AI.
So, how can capital markets firms take advantage of these new levels of conversational AI performance and accessibility? Let’s explore the potential use cases.
Conversational AI use cases in capital markets
Collateral and risk management
The margin call and settlement process for over-the-counter (OTC) derivatives trades can be highly iterative and interactive when counterparties dispute margin calls. Operations teams send emails back and forth until an agreement is reached. Counterparties are expected to resolve disputes promptly; if disputes remain unresolved, banks face increased capital charges on the disputed amounts.
The recent COVID-19 related market volatility has dramatically increased collateral volumes and highlighted the challenges in this area. In 2020, there was a 76% increase in the average number of margin calls, and the value of margin calls grew by over 200%, massively increasing the workload of collateral managers.
Conversational AI systems that can read emails and automatically resolve simpler disputes can help reduce operations teams’ workload. These AI systems, such as IVP Treasury and Synechron Automated Margin Call Management, can also detect patterns in disputes to help improve future trade structures and contracts.
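To make the idea concrete, here is a deliberately simple sketch of the triage step such a system might start from: separating dispute emails that could be settled automatically from those that need a human collateral manager. The keyword rules and category names are illustrative assumptions, not the logic of any product named above; a production system would use a trained NLP classifier instead.

```python
# Illustrative sketch of margin-call dispute triage (assumed rules, not a
# real product's logic): route each dispute email to automatic resolution
# or to the operations team.
import re

AUTO_RESOLVABLE = [
    r"\btiming difference\b",       # settlement timing mismatches
    r"\brounding\b",                # small rounding discrepancies
    r"\bduplicate (call|email)\b",  # duplicated margin calls
]

ESCALATE = [
    r"\bvaluation\b",               # disputed trade valuations
    r"\blegal\b",                   # legal or documentation issues
    r"\bthreshold\b",               # disputed CSA thresholds
]

def triage(email_body: str) -> str:
    """Return 'auto' for disputes a bot may settle, else 'ops_team'."""
    text = email_body.lower()
    if any(re.search(p, text) for p in ESCALATE):
        return "ops_team"
    if any(re.search(p, text) for p in AUTO_RESOLVABLE):
        return "auto"
    return "ops_team"  # default to human review when unsure
```

Note the design choice: anything ambiguous falls through to the operations team, so automation only ever removes the safest, most repetitive work.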
Investment research
With the Markets in Financial Instruments Directive (MiFID II) requiring sell-side firms to charge for research separately, many firms are under growing pressure to reduce research costs while enhancing quality. Some are even reducing the number of research analysts, which has made producing and consuming research more challenging on both the sell side and the buy side.
In response to these challenges, we’re now beginning to see increased use of conversational AI to make high-quality research more cost-effective. Morgan Stanley, for example, has developed a chatbot to help its analysts and sales teams search the more than 50,000 reports it generates each year. And JP Morgan has made its Corporate and Investment Banking research available over Alexa.
These technologies reduce costs on both the sell side and the buy side, and they also streamline research content, enhancing the quality of research and increasing productivity.
Sales and trading
Wider adoption of Straight Through Processing (STP) has driven increasing end-to-end optimization of trading operations. However, structured securities trading remains largely manual and high touch, especially during pre-sales.
Structured products generate high commissions for investment banks, but at high costs. With conversational AI, firms can now optimize structured trading operations to increase margins.
Conversational AI systems can capture semi-structured request-for-quote (RFQ) conversations into a schema that firms can analyze across the lifecycle of many trades. These systems can provide real-time insights to help sales and trading teams optimize processes and standardize frequently traded bespoke products.
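The capture step can be sketched in a few lines. The schema fields, notional conventions, and message format below are illustrative assumptions; a real system would use trained entity extraction rather than regular expressions.

```python
# A minimal sketch of capturing a free-text RFQ message into a structured
# schema. Field names and parsing rules are assumptions for illustration.
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rfq:
    side: Optional[str]        # "buy" or "sell"
    notional: Optional[float]  # in currency units
    currency: Optional[str]
    underlying: Optional[str]

def parse_rfq(message: str) -> Rfq:
    text = message.lower()
    side = "buy" if "buy" in text else ("sell" if "sell" in text else None)
    m = re.search(r"(\d+(?:\.\d+)?)\s*(m|mm|mio)\b", text)  # e.g. "25m"
    notional = float(m.group(1)) * 1_000_000 if m else None
    ccy = re.search(r"\b(usd|eur|gbp|jpy)\b", text)
    und = re.search(r"on\s+([a-z0-9 .&-]+?)(?:,|$)", text)
    return Rfq(side,
               notional,
               ccy.group(1).upper() if ccy else None,
               und.group(1).strip() if und else None)
```

Once RFQs land in a schema like this, they can be aggregated across desks and counterparties, which is what makes the downstream analytics and standardization possible.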
Corporate actions processing
The entire corporate actions lifecycle — from the announcement of a corporate action event to entitlements being made available to shareholders — is based on manual processes that create significant inefficiencies.
Event announcements are not always made in standardized formats, and processing a huge and ever-increasing volume of manual and electronic responses to corporate action events is complex and tedious. Failure to process a single event can lead to losses in the tens of millions of dollars, along with reputational damage to investment management firms. The cost of corporate actions processing also runs into the hundreds of millions of dollars.
While conversational AI may not be able to help resolve all the issues in corporate actions processing, it has the potential to optimize some areas. For example, AI can gather and normalize corporate actions data from multiple sources such as newswires, the internet, and data providers. Another potential area for efficiency gains is processing customer responses to event notifications. A conversational AI system could read electronic responses received from customers and process them, sending only the complicated, incomplete, inconsistent, and incorrect responses to the operations team for manual processing.
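The response-processing idea reduces to a routing decision: clean responses flow straight through, while anything incomplete, inconsistent, or incorrect goes to the operations team. The field names and validation rules below are illustrative assumptions, not a vendor workflow.

```python
# Hedged sketch: routing customer responses to a corporate action event.
# Required fields and valid elections are assumed for illustration.
REQUIRED_FIELDS = {"account_id", "event_id", "election", "quantity"}
VALID_ELECTIONS = {"cash", "stock", "mixed"}

def route_response(response: dict) -> str:
    """Return 'auto_process' for clean responses, 'ops_team' otherwise."""
    if REQUIRED_FIELDS - response.keys():
        return "ops_team"                      # incomplete response
    if response["election"] not in VALID_ELECTIONS:
        return "ops_team"                      # invalid election choice
    qty = response["quantity"]
    if not isinstance(qty, (int, float)) or qty <= 0:
        return "ops_team"                      # inconsistent quantity
    return "auto_process"
```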
A staged approach to launching conversational AI
To successfully introduce conversational AI in your organization, alignment with the business is critical. Choose use cases that will create real business value and then start small with minimum viable products (MVPs) to help manage the risk of introducing a new technology.
Continuously assess your MVPs against meaningful metrics to determine if you’re achieving the desired business value and to identify how and when you should adapt, scale, or decommission them.
To ensure smooth collaboration between your data science, machine learning, and IT teams, use a frictionless delivery process for conversational AI to bring high-quality models into production safely, reliably, and continuously.
In general, it’s helpful to take an approach that breaks your conversational AI program into three stages: Initiate, Extend and Enhance.
In the Initiate stage, build up NLP know-how in your data science and ML teams. In the Extend stage, your teams can use their new capabilities to extend NLP use cases to classify texts like emails, scanned letters, or legal documents and trigger appropriate workflows. In the Enhance stage, your teams should begin using advanced NLP techniques and tools, including transformers, to enable use cases such as identifying deadlines or reporting obligations in legal contracts and performing follow-up actions automatically.
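An Extend-stage task can be pictured as a toy classifier that maps an incoming text to a document category and triggers the matching workflow. The categories, vocabularies, and workflow names are illustrative assumptions; a real system would replace the keyword-overlap scoring with a trained NLP model.

```python
# Toy sketch of the Extend stage: classify a text, trigger a workflow.
# Categories, vocabularies, and workflow names are assumed examples.
from collections import Counter

TRAINING = {
    "margin_call":  "margin call collateral dispute exposure csa",
    "kyc_document": "passport identity proof address onboarding kyc",
    "legal_notice": "contract clause termination notice obligation",
}

WORKFLOWS = {
    "margin_call":  "start_collateral_workflow",
    "kyc_document": "start_onboarding_workflow",
    "legal_notice": "start_legal_review_workflow",
}

def classify(text: str) -> str:
    """Score each category by keyword overlap and pick the best."""
    words = Counter(text.lower().split())
    scores = {label: sum(words[w] for w in vocab.split())
              for label, vocab in TRAINING.items()}
    return max(scores, key=scores.get)

def trigger(text: str) -> str:
    """Map the classified document to its downstream workflow."""
    return WORKFLOWS[classify(text)]
```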
Succeeding with chatbots in your organization
Chatbots are the first step in your organization’s conversational AI implementation, conducting dialogs with users in natural language to complete specific tasks. Sophisticated chatbots can sustain a conversation with a user until the task is completed. They create a new form of user interface for your customers and employees, helping to reduce operational costs and deliver a better user experience.
All the major public cloud providers offer ready-to-use frameworks to set up and operate chatbots. However, it’s still useful to have some skills in NLP, ML, MLOps, and dialog design to scale up from your early, simple use cases to an organization-wide deployment of chatbots for numerous use cases.
Prioritize user experience
To ensure user adoption, you need to create a chatbot that your customers and employees are happy to use and that offers them value. For example, an apparently incremental improvement in language understanding accuracy — from 95% to 99% — can make the difference between people barely using a chatbot or using it all the time.
It’s also crucial to design your chatbot’s personality very carefully. Your chatbot is a “face” of the firm, and its personality should be consistent with your company’s brand.
User experience designers can define your chatbot’s personality using pairs of opposing character traits, moving a slider between each pair to build up a nuanced character that dialog designers can then use to shape the bot’s responses.
Designing the personality of a chatbot
Chatbot technical components
The diagram below shows the technical composition of a chatbot. In the first step, the intent recognizer tries to understand the intent of the input. For example, in the sentence “My phone number is 120 1234 567”, the intent is “provide phone number”.
Next, the named entity recognizer identifies named entities in the sentence — in our example, that’s the phone number. Then the dialog execution module decides the next actions during the dialog, based on the status of the conversation. These next actions could be asking for missing information, an API call to an external system, or a finishing message when the task has been accomplished. These actions are then performed by the action resolver.
In the last step, the natural language generator formulates the response to the user.
The technical components of a chatbot
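The pipeline described above can be sketched end to end in a few functions. Each stage is reduced to simple rules here purely for illustration; in a production chatbot, the intent recognizer, entity recognizer, and language generator would be trained models, and the action resolver would call real backend APIs.

```python
# Minimal sketch of the chatbot components described above: intent
# recognizer, named entity recognizer, dialog execution, action resolver,
# and natural language generator. All rules are illustrative assumptions.
import re

def recognize_intent(utterance: str) -> str:
    if re.search(r"\bphone number\b", utterance.lower()):
        return "provide_phone_number"
    return "unknown"

def recognize_entities(utterance: str) -> dict:
    m = re.search(r"(\d[\d ]{6,})", utterance)  # crude phone-number pattern
    return {"phone_number": m.group(1).strip()} if m else {}

def decide_next_action(intent: str, entities: dict) -> str:
    # Dialog execution: choose the next step from the conversation state.
    if intent == "provide_phone_number" and "phone_number" in entities:
        return "save_contact"
    if intent == "provide_phone_number":
        return "ask_for_number"
    return "clarify"

def resolve_action(action: str, entities: dict) -> str:
    # Action resolver: would call an external system; stubbed out here.
    if action == "save_contact":
        return f"saved:{entities['phone_number']}"
    return action

def generate_response(result: str) -> str:
    # Natural language generation, via fixed templates in this sketch.
    if result.startswith("saved:"):
        return "Thanks, I've noted your phone number."
    if result == "ask_for_number":
        return "What is your phone number?"
    return "Sorry, could you rephrase that?"

def chatbot_turn(utterance: str) -> str:
    intent = recognize_intent(utterance)
    entities = recognize_entities(utterance)
    action = decide_next_action(intent, entities)
    return generate_response(resolve_action(action, entities))
```

Running the example sentence from above, “My phone number is 120 1234 567”, through `chatbot_turn` exercises every stage: the intent and the phone-number entity are recognized, the dialog module selects a save action, and the generator formulates the confirmation.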
A continuous delivery platform is essential
To improve the quality of your chatbot during its development and operation, it’s vital to use an agile development process based on continuous delivery principles, such as MLOps or Continuous Delivery for Machine Learning (CD4ML).
The diagram below shows the continuous development and deployment process for a chatbot project. This process uses a continuous delivery platform to closely monitor the chatbot’s behavior in production and optimize it continuously. With this platform, you can automate and optimize most chatbot development tasks, including testing, deploying, monitoring, and patching.
The continuous development and deployment of a chatbot
S3 refers to Amazon Simple Storage Service, an object storage service built to store and retrieve any amount of data from anywhere.
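One concrete element of such a delivery process is a quality gate: a new chatbot model is promoted to production only if it meets minimum evaluation metrics on a held-out test set. The metric names and thresholds below are assumptions for the sketch, not a prescribed standard.

```python
# Illustrative CD4ML-style quality gate: block deployment of a chatbot
# model that underperforms on held-out evaluation data. Metric names and
# thresholds are assumed examples.
def promote_model(metrics: dict,
                  min_intent_accuracy: float = 0.95,
                  max_fallback_rate: float = 0.10) -> bool:
    """Return True only if the candidate model clears every threshold."""
    return (metrics.get("intent_accuracy", 0.0) >= min_intent_accuracy
            and metrics.get("fallback_rate", 1.0) <= max_fallback_rate)
```

A gate like this is what lets the pipeline deploy "safely, reliably, and continuously": humans set the bar once, and the platform enforces it on every candidate model.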
Become a disruptor with conversational AI
It’s unlikely that conversational AI applications such as chatbots will ever achieve 100% automation, but they do have the power to drive operational efficiencies, particularly in areas dealing with heavily customized products and manual processes. Expertise in using conversational AI is becoming an important differentiator for investment banks and we expect to see the adoption rates rise.
Introducing radically new technology can be challenging, but the guiding principles and incremental journey we’ve outlined can help you begin using conversational AI in a simple and safe manner.
Invest in conversational AI now, and you can uncover valuable insights, achieve cost savings, improve customer experiences — and stay ahead in delivering innovation.
This article was published on April 8, 2021