
How is ChatGPT Making Inroads on Wall Street?

July 15, 2024 | By: Ivy Schmerken

From summarizing earnings calls to coding data queries, generative AI tools have arrived.


With the rapid growth of artificial intelligence, Wall Street is experimenting with ChatGPT from OpenAI and other generative AI-based tools to save time on tedious tasks and find information in massive data sets.

In April, the New York Times reported that investment banks have adopted AI to automate the work of analysts who pull all-nighters to assemble PowerPoint presentations and crunch data in Excel spreadsheets. Major banks are testing AI tools with code names like Socrates to supplement the work of junior analysts, the Times wrote, predicting that such tools could eventually upend the need to hire armies of college graduates to perform “boring tasks” as a rite of passage.

There is a lot of buzz around generative AI systems and their ability to produce content and derive new data, but no one is replacing higher-level jobs in trading or investment management. Financial firms are looking to take advantage of large language models (LLMs) and chatbots like ChatGPT, which can synthesize vast amounts of information and provide responses in question-and-answer format.

Some predict that LLMs can play a transformative role in research, investing and trading, but firms are proceeding cautiously. In the initial stages, firms are focused on using ChatGPT to make things more efficient and to boost productivity while keeping humans in the loop as decision makers.  

“What we’re seeing emerge is using genAI to act as an agent for you where it can execute some pre-commanded instructions to help create efficiencies in ongoing repetitive processes,” said Brad Ahrens, Senior Vice President, Advanced Analytics at the Financial Industry Regulatory Authority, speaking on the FINRA Unscripted podcast. “We’ve heard from the largest to the smallest firms wading into this space that they are taking a conservative and dialed approach,” said Ahrens, who noted that no customer information is involved.

In November 2023, FINRA surveyed 100 broker-dealers and included two questions about their plans for generative AI, whether vendor-supported, internally developed or open source. The biggest implementation was question-and-answer retrieval over procedure manuals, which can run a couple hundred pages, or employee handbooks, where firms can get back an immediate response.

More broadly, FINRA saw firms focused on a few key areas that simplify human processes, such as building a tool that reviews a corporate 10-K filing, extracts key pieces of information and pushes them to other parts of the organization. Others were using genAI tools to listen to quarterly earnings calls and report back, or to create presentations from a specified set of data. Among the more innovative use cases was training an avatar to speak on a topic, with the model generating the voiceover in place of a human speaker. Firms have also applied chatbots to surveillance, where they can review massive amounts of data and leave a smaller subset for humans to monitor.

Research Buried in the Inbox

In addition to tackling repetitive tasks, firms are looking at generative AI tools to find information more efficiently within the enterprise.

Analysts and portfolio managers who cover different industries often waste a lot of time searching for research notes, files and emails hidden in their inboxes.

“The data proliferation in our industry is insane. Our Wall Street brokers send us research all the time, which lands in our inboxes. It’s hard to find, and a lot of it goes unread until it’s needed,” said Andrew Meister, a securities analyst for the past 19 years who is currently a sector analyst at Thrivent Asset Management in Appleton, WI. “I said to myself, there’s got to be another way to do this,” said Meister, who programmed DoTadda with a few buddies as a side project.

DoTadda’s AI tool, based on GPT-4 Turbo, enables securities analysts and PMs to find, store and discover their research reports, press releases and news articles stored in their email inboxes or in research management systems. It also ingests tweets, websites and YouTube videos, or anything with a URL link.

“What we do on ingestion, we use AI to read the article, summarize it and auto-tag it to then make searchable pills, organized by topic or calendar date. Each data item is stored in a vector database, which can reside in the cloud or on premises,” said Meister.
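To make that flow concrete, here is a minimal sketch of the kind of ingestion pipeline Meister describes: each item is summarized, auto-tagged into searchable “pills” and embedded before being stored. The helper names and the in-memory list standing in for a cloud or on-premises vector database are illustrative assumptions, not DoTadda’s actual code.

```python
# A minimal sketch of an ingestion flow: summarize, auto-tag, embed, store.
# The helpers below are placeholders for real model calls.
from dataclasses import dataclass
from datetime import date

@dataclass
class ResearchItem:
    source_url: str
    text: str
    summary: str
    tags: list[str]
    embedding: list[float]
    ingested_on: date

def summarize_with_llm(text: str) -> str:
    # Placeholder: in practice a GPT-4-class model would produce the summary.
    return text[:200]

def auto_tag(text: str) -> list[str]:
    # Placeholder: an LLM or classifier would assign topic "pills".
    return ["Boeing"] if "Boeing" in text else ["general"]

def embed(text: str) -> list[float]:
    # Placeholder: an embedding model would return a dense vector.
    return [float(len(w)) for w in text.split()[:8]]

vector_store: list[ResearchItem] = []   # stands in for the vector database

def ingest(url: str, text: str) -> None:
    vector_store.append(ResearchItem(
        source_url=url,
        text=text,
        summary=summarize_with_llm(text),
        tags=auto_tag(text),
        embedding=embed(text),
        ingested_on=date.today(),
    ))

ingest("https://example.com/note", "Boeing delivered 30 aircraft in June.")
```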

Functioning like a data librarian on steroids, DoTadda allows an analyst or PM to search by keyword, so someone who follows Boeing can digest all the related news articles. The newer capability is semantic search, where the user can ask a question such as: “Did Elon Musk discuss layoffs in the Q3 conference call?” Another query could be: “What is the outlook for the Brazilian corn crop in the fourth quarter?” said Meister.
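The two search modes can be sketched in a few lines: keyword search filters on tags, while semantic search compares the question against stored notes and ranks them by similarity. The bag-of-words matching below is a crude, purely illustrative stand-in for the learned embedding model a real system would use.

```python
# Sketch of keyword search over tags vs. semantic-style search over notes.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Placeholder "embedding": token counts instead of a dense vector.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

notes = [
    {"tags": ["Boeing"], "text": "Boeing Q3 call: management discussed deliveries."},
    {"tags": ["Tesla"],  "text": "Tesla Q3 call: Elon Musk discussed layoffs and margins."},
]

# Keyword search: everything tagged "Boeing".
boeing_notes = [n for n in notes if "Boeing" in n["tags"]]

# Semantic-style search: rank stored notes by similarity to the question.
question = "Did Elon Musk discuss layoffs in the Q3 conference call?"
best = max(notes, key=lambda n: cosine(embed(question), embed(n["text"])))
print(best["text"])   # the Tesla note scores highest
```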

DoTadda also offers a retail product called DoTadda Knowledge, which can access and summarize earnings calls for portfolio managers and analysts, allowing them to ask pre-canned questions of the conference call transcript. “That would take about an hour to do, and this takes 30 seconds,” said Meister.

Virtual Assistants on the Trading Desk

Geoff Smith, Data Scientist at FlexTrade, predicts that virtual assistants trained to search for data in an order-and-execution management system (OEMS) will be among the main applications on the institutional trading desk.

In June, FlexTrade began to roll out a beta version of FlexA (short for FlexTrade Assistant) to its buy-side clients. “The idea is that you can interact with the EMS or OEMS using natural language and you can basically control the trading system through a chat space,” said Smith. As a sample query, an equities trader could ask FlexA: “Show all my Swiss orders.” A trader could also type: “How many orders do I have over 10% ADV [average daily volume]?” said Smith.
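The general pattern behind this kind of assistant can be sketched as translating a natural-language request into a structured filter that runs against the blotter. The hard-coded translate function below stands in for the language-model call, and the field names are hypothetical, not FlexTrade’s schema.

```python
# Sketch of the natural-language-to-filter pattern behind a trading assistant.
orders = [
    {"symbol": "NESN", "country": "CH", "pct_adv": 12.5},
    {"symbol": "AAPL", "country": "US", "pct_adv": 3.0},
    {"symbol": "UBSG", "country": "CH", "pct_adv": 8.0},
]

def translate(request: str):
    # Stand-in for the LLM: map a phrase to a filter over the order blotter.
    if "swiss" in request.lower():
        return lambda o: o["country"] == "CH"
    if "over 10% adv" in request.lower():
        return lambda o: o["pct_adv"] > 10.0
    return lambda o: True

for query in ["Show all my Swiss orders", "How many orders do I have over 10% ADV?"]:
    matches = [o for o in orders if translate(query)(o)]
    print(query, "->", [o["symbol"] for o in matches])
```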

Instead of interns being hired to research and code database queries, ChatGPT can now perform these tasks for the buy-side trader. In FlexA’s case, the AI-based search tool is not replacing the trader, who would still review the output and make the trading decision. “This is like Microsoft Copilot and GitHub Copilot; these kinds of tools are assistants. I see FlexA in the same space as those tools but for the OEMS,” said Smith.

Trading systems are typically complex, and each trader has different workflow patterns, so the ability to interact with the system via FlexA in natural language allows the trader to get immediate results with fewer button clicks.

Recently, the OEMS was integrated with a third-party market data provider so that the user can search for information not held in the OEMS, such as: “Who is the CEO of Nvidia?” In addition, FlexA is now integrated with FlexData, a separate database, enabling a buy-side trader to query their historical orders. Now the trader “can input any kind of aggregate-level query at the broker level or algo level, such as when did I last trade this stock, or how much did we trade in September using the AlgoWheel,” said Smith.
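The kind of aggregate question described can be illustrated with a toy order-history table; the column names and figures below are assumptions for illustration, not FlexData’s data model.

```python
# Toy illustration of aggregate queries over an order history with pandas.
import pandas as pd

history = pd.DataFrame({
    "symbol":   ["NVDA", "NVDA", "AAPL"],
    "strategy": ["AlgoWheel", "DMA", "AlgoWheel"],
    "notional": [1_200_000, 500_000, 800_000],
    "date":     pd.to_datetime(["2023-09-05", "2023-09-12", "2023-10-02"]),
})

# "How much did we trade in September using the AlgoWheel?"
september_wheel = history[(history["date"].dt.month == 9) &
                          (history["strategy"] == "AlgoWheel")]
print(september_wheel["notional"].sum())        # 1,200,000

# "When did I last trade this stock?"
print(history.loc[history["symbol"] == "NVDA", "date"].max())
```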

Data Considerations

On a recent Data Minds webcast hosted by Wall Street Horizon, industry experts offered different perspectives on some of the unique challenges posed by generative AI and large language models.

“AI for finance is really its own beast with its own lexicon in the broader AI ecosystem,” said Chris Petrescu, founder and CEO of CP Capital.

According to Petrescu, who advises hedge funds about data strategies, some of the top concerns for traders are data accuracy, safeguarding their data inputs into AI models and protecting against information leakage by AI engines.

Feeding data into AI models “is garbage in, garbage out, and needs to be pristine and needs to be free of bias and needs to be auditable in some way,” said Petrescu. “If someone wants an AI model to predict the top three metrics for anticipating bankruptcies but feeds it only the names of companies that are alive today, the AI doesn’t know what to predict.”

“It’s important to make sure the data is free of survivorship bias and political bias,” he said. If a user asks ChatGPT who will win the presidential election but feeds in left-wing or right-wing publications, the answer will be one-sided, he noted.
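Petrescu’s survivorship-bias point can be made concrete with a toy dataset: if every company in the training set is still listed, the bankruptcy label is all zeros and a model has nothing to learn from; including delisted firms restores the signal. The sketch below is illustrative only.

```python
# Toy illustration of survivorship bias in a bankruptcy-prediction dataset.
companies = [
    {"name": "AliveCo",    "listed": True,  "went_bankrupt": False},
    {"name": "SurvivorCo", "listed": True,  "went_bankrupt": False},
    {"name": "GoneCo",     "listed": False, "went_bankrupt": True},   # delisted
]

biased = [c for c in companies if c["listed"]]   # survivors only
unbiased = companies                             # includes delisted firms

print(sum(c["went_bankrupt"] for c in biased))    # 0 -> no positive examples to learn from
print(sum(c["went_bankrupt"] for c in unbiased))  # 1 -> the label carries signal
```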

Petrescu suggested that firms put guardrails around the prompts they send to ChatGPT, which could go to OpenAI’s servers and be used to train the model. He also noted that some data analytics vendors are building their own internal models, taking a base version of ChatGPT, expanding it for finance and closing it off from the outside world. “It’s important that AI tools being built within organizations are not leaking your information back into the world,” he said.
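One simple form such a guardrail could take is scrubbing confidential identifiers from a prompt before it leaves the firm, as in the sketch below; the blocked terms and the account-number pattern are hypothetical examples, not any vendor’s actual controls.

```python
# Sketch of a prompt guardrail: redact sensitive terms before sending a
# prompt to an external model. Patterns and terms are illustrative only.
import re

BLOCKED_TERMS = ["Project Falcon", "client 4471"]        # hypothetical examples

def scrub_prompt(prompt: str) -> str:
    cleaned = prompt
    for term in BLOCKED_TERMS:
        cleaned = cleaned.replace(term, "[REDACTED]")
    # Mask anything that looks like an internal account number.
    cleaned = re.sub(r"\bACCT-\d{6}\b", "[REDACTED]", cleaned)
    return cleaned

prompt = "Summarize Project Falcon exposure for ACCT-123456 before the call."
print(scrub_prompt(prompt))
# Summarize [REDACTED] exposure for [REDACTED] before the call.
```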

Beyond prompt hygiene, there are ways to put controls around ChatGPT-based applications so that confidential data does not leak.

In FlexA’s case, the buy-side firm’s EMS data is walled off from the outside. When a trader types in a query, it is sent to OpenAI’s model hosted on Microsoft Azure’s cloud infrastructure, which sends back the query in the form of code that runs on the desktop within the EMS. “No order execution data gets shared with ChatGPT. It’s all internal to our system. The code will run in the EMS locally, so none of the data leaves your EMS,” emphasized Smith.
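The control Smith describes can be sketched as a pattern: only the question and a description of the data schema go out to the hosted model, and what comes back is code that executes locally against the firm’s own data. The canned response and restricted eval below are illustrative assumptions; a production system would validate any generated code before running it.

```python
# Sketch of the "code comes back and runs locally" pattern: the remote model
# never sees order rows, only the question and the column names.
local_orders = [                      # stays inside the EMS
    {"symbol": "NESN", "country": "CH", "qty": 5_000},
    {"symbol": "AAPL", "country": "US", "qty": 1_200},
]

def ask_remote_model(question: str, schema: list[str]) -> str:
    # Stand-in for the call to the Azure-hosted model; in this sketch it
    # returns a canned snippet of "generated code".
    return "[o for o in orders if o['country'] == 'CH']"

generated = ask_remote_model("Show all my Swiss orders", ["symbol", "country", "qty"])
# The generated code runs locally, so the order data never leaves the desktop.
result = eval(generated, {"__builtins__": {}}, {"orders": local_orders})
print(result)
```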

What’s Ahead

With generative AI models changing every few months, the capabilities are constantly evolving.

FINRA’s Ahrens noted that a new model like GPT-4 is trained on over a trillion parameters, whereas earlier models were trained on a few million. “In less than five years AI companies have used 100 times more data to train the new models,” he said.

According to FlexTrade’s Smith, it takes about two seconds to make a request to ChatGPT. “That will come down over the next year or two because there is a lot of competition for hardware in that space. It’s moving very fast. The capabilities of the large language models and the performance are improving,” said Smith.

Going forward, speakers on the webcast said firms will look to apply generative AI and large language models to investing and trading.

“We are trying to incorporate it into real processes with AI thematic driven funds,” said Jeremy Schwartz, chief investment officer of WisdomTree Asset Management, which runs $108 billion across the US and Europe in ETF structures.

“There are definitely ways to use [AI] from an investment perspective as well as a day-to-day tasks perspective,” said Schwartz. “On the personal productivity side, firms also need to figure out how to improve their day-to-day lives, such as coding.”

Microsoft’s Copilot, Google’s Gemini and OpenAI’s ChatGPT have all brought out enterprise versions for businesses. Schwartz said his firm uses the enterprise version of Copilot to improve writing drafts and to summarize transcripts.

“The right firms are going to try to get the most out of their employees and capital and technology and this is no different,” said Schwartz.
