JPMorgan is not your typical investment bank. It is setting itself apart by adopting an AI-first strategy. Most recently, the US bank announced the launch of a generative AI tool for its employees – an in-house, ChatGPT-style assistant that can perform tasks typically handled by research analysts.
Named the LLM Suite, the new large language model platform will be available to employees in JPMorgan’s asset and wealth management division. It supports tasks such as writing, idea generation, and document summarisation, drawing on third-party models.
Earlier this year, JPMorgan began rolling out the LLM Suite to select areas within the bank, reaching approximately 50,000 employees, or about 15% of its workforce.
The bank developed its own proprietary LLM platform internally because its employees are prohibited from using consumer AI chatbots such as Anthropic’s Claude, OpenAI’s ChatGPT, or Google’s Gemini for work. Strict regulations in the financial sector require that client data remains within secure, in-house servers.
With numerous open-weight models now available, such as Llama 3.1 and Mistral Large 2, which rival OpenAI’s GPT-4o, financial institutions have the flexibility to build their own AI systems rather than relying solely on OpenAI’s APIs.
“AI is real. We already have thousands of people working on it, including top scientists around the world like Manuela Veloso from Carnegie Mellon’s Machine Learning Department,” said JPMorgan chief Jamie Dimon, adding that AI is already a living, breathing entity.
“It’s going to change, there will be all types of different models, tools, and technologies. But for us, the way to think about it is in every single process—errors, trading, hedging, research, every app, every database—you’re going to be applying AI,” he predicted. “It might be as a copilot, or it might be to replace humans.”
Dimon said the bank has already integrated generative AI into several of its services, primarily for idea generation and note-taking. While taking notes, he explained, the AI provides additional insights or highlights important points. For example, it can flag potential client interests or identify recurring issues related to errors or customer service.
Dimon also acknowledges that AI may replace some jobs. “What AI’s gonna do is know more about you, learn more about you, look at patterns, and look at successful things in the past. AI will be a huge aid to things like that,” he said.
JPMorgan’s Generative AI Offerings
One of the standout offerings from JPMorgan is Quest IndexGPT, a tool powered by OpenAI’s GPT-4 model for thematic investing. IndexGPT analyses news articles and generates keyword-driven investment themes, facilitating more informed decision-making for clients.
“In the past, the process of finding stock portfolios that track themes such as cloud computing or cybersecurity was complicated. Now, we use AI to systematically generate the keywords that help us identify the relevant stocks,” explained Deepak Maharaj, who heads the Equities Strategic Indices team at JPMorgan.
“With GPT-4, the keyword generation is superior to older models, and therefore our clients benefit from a potentially more accurate representation of the theme,” he added.
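In outline, the workflow Maharaj describes is straightforward to sketch: ask a GPT-4-class model for keywords that characterise a theme, then use those keywords to surface relevant stocks from news coverage. The snippet below is a minimal illustration of that pattern using the OpenAI Python client; the prompt wording and the news-scoring helper are assumptions for illustration, not JPMorgan’s production pipeline.

```python
# Minimal sketch of a keyword-driven thematic workflow (illustrative only).
from openai import OpenAI

def generate_theme_keywords(theme: str, n: int = 20) -> list[str]:
    """Ask a GPT-4-class model for keywords that characterise an investment theme."""
    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    prompt = (
        f"List {n} short keywords associated with the investment theme "
        f"'{theme}', one per line, with no numbering."
    )
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    lines = response.choices[0].message.content.splitlines()
    return [line.strip("-• ").strip() for line in lines if line.strip()]

def theme_relevance(news_text: str, keywords: list[str]) -> int:
    """Naive relevance score: count keyword occurrences in a company's recent news."""
    text = news_text.lower()
    return sum(text.count(keyword.lower()) for keyword in keywords)

# Usage: rank candidate stocks by how strongly their recent news matches the theme.
keywords = generate_theme_keywords("cloud computing")
# ranked = sorted(candidates, key=lambda c: theme_relevance(c.news, keywords), reverse=True)
```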
In 2023, JPMorgan Chase developed an AI tool to analyse US Federal Reserve speeches and statements and determine whether they contain detectable trading signals. With the tool, analysts can spot policy shifts and turn them into trading signals.
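The underlying idea can be illustrated with a toy “hawk-dove” scorer: measure how hawkish or dovish a statement reads, then map that score to a directional signal. The lexicon and thresholds below are invented for illustration and bear no relation to JPMorgan’s actual model; a production system would use a trained language model rather than keyword counts.

```python
# Toy hawkish/dovish scoring of a central-bank statement (illustrative only).
HAWKISH = {"inflation", "tightening", "restrictive", "raise", "hikes"}
DOVISH = {"accommodative", "easing", "cuts", "stimulus", "downside"}

def hawkish_dovish_score(statement: str) -> float:
    """Return a score in [-1, 1]: +1 is maximally hawkish, -1 maximally dovish."""
    words = [w.strip(".,") for w in statement.lower().split()]
    hawk = sum(w in HAWKISH for w in words)
    dove = sum(w in DOVISH for w in words)
    total = hawk + dove
    return 0.0 if total == 0 else (hawk - dove) / total

def signal(score: float, threshold: float = 0.3) -> str:
    """Map the score to a crude policy expectation."""
    if score > threshold:
        return "expect tighter policy"   # e.g. position for higher rates
    if score < -threshold:
        return "expect looser policy"    # e.g. position for lower rates
    return "neutral"

print(signal(hawkish_dovish_score(
    "The Committee judges inflation risks warrant a restrictive stance and further hikes.")))
```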
In-House Research
Alongside its use of GPT-4, JPMorgan’s in-house AI research is advancing rapidly. The firm now employs over 2,000 AI and machine learning experts, and AI is used in more than 400 areas within the bank, including marketing, fraud detection, and risk management. Dimon said generative AI is now being explored for customer service, operations, and software engineering.
According to AIM Research, JPMorgan is aggressively hiring in AI, with over 75 open positions in the field. Salaries are competitive, with roles like applied AI/ML senior associate offering between $129,250 and $195,000 annually, depending on location and experience.
Earlier this year, JPMorgan’s AI team published a paper on DocGraphLM, a new tool for document analysis. DocGraphLM enhances information extraction (IE) and question-answering (QA) for complex documents by integrating pre-trained language models with graph semantics. It features a joint encoder architecture and an innovative link prediction method for reconstructing document graphs.
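The PyTorch sketch below illustrates, at a very high level, what a joint text-and-graph encoder with a link-prediction head of the kind DocGraphLM describes might look like. The dimensions, layer choices, and names are assumptions for illustration, not the paper’s implementation.

```python
# Illustrative joint text-plus-graph encoder with a link-prediction head.
import torch
import torch.nn as nn

class JointDocEncoder(nn.Module):
    def __init__(self, text_dim: int = 768, hidden_dim: int = 256, num_directions: int = 8):
        super().__init__()
        self.text_proj = nn.Linear(text_dim, hidden_dim)     # project LM embeddings
        self.graph_proj = nn.Linear(hidden_dim, hidden_dim)  # mix in neighbour features
        # Link prediction: relative direction (classified) and distance (regressed)
        self.direction_head = nn.Linear(2 * hidden_dim, num_directions)
        self.distance_head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, text_emb: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # text_emb: (num_nodes, text_dim) segment embeddings from a pre-trained LM
        # adjacency: (num_nodes, num_nodes) binary matrix of document-graph edges
        h = torch.relu(self.text_proj(text_emb))
        degree = adjacency.sum(dim=-1, keepdim=True).clamp(min=1)
        neighbours = adjacency @ h / degree                  # mean over linked nodes
        return h + torch.relu(self.graph_proj(neighbours))   # fused representation

    def predict_link(self, h: torch.Tensor, i: int, j: int):
        pair = torch.cat([h[i], h[j]], dim=-1)
        return self.direction_head(pair), self.distance_head(pair)

# Usage on a toy document graph with four text segments:
encoder = JointDocEncoder()
emb = torch.randn(4, 768)
adj = torch.tensor([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=torch.float32)
nodes = encoder(emb, adj)
direction_logits, distance = encoder.predict_link(nodes, 0, 1)
```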
Building on this, the bank also introduced DocLLM, a generative language model designed for multimodal document understanding. DocLLM stands out as a lightweight extension to LLMs for analysing enterprise documents, spanning forms, invoices, reports, and contracts that carry intricate semantics at the intersection of textual and spatial modalities.
Furthermore, JPMorgan has launched FlowMind, a system that uses LLMs to automate workflow generation.
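The general pattern, in which an LLM is first briefed on the available APIs and then asked to generate workflow code for a user request, can be sketched as follows. The internal API names and prompt wording are hypothetical, and the sketch assumes the OpenAI Python client; in practice, any generated workflow would be reviewed or sandboxed before execution.

```python
# Hedged sketch of LLM-driven workflow generation (illustrative only).
from openai import OpenAI

API_LECTURE = """
You can call these (hypothetical) internal APIs:
- get_portfolio(client_id) -> list of holdings
- get_price_history(ticker, days) -> list of prices
- send_report(client_id, text) -> None
Write a Python function `workflow(client_id)` that fulfils the user's request
using only these APIs.
"""

def generate_workflow(user_request: str) -> str:
    """Ask the model to produce workflow code for the given request."""
    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": API_LECTURE},
            {"role": "user", "content": user_request},
        ],
    )
    return response.choices[0].message.content  # generated workflow code

if __name__ == "__main__":
    print(generate_workflow(
        "Summarise last month's performance for client 42 and send a report."))
```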
What Are the Others Doing?
Goldman Sachs plans to complete the rollout of its initial generative AI tool for code generation to thousands of developers across the company. Goldman Sachs’ generative AI platform, the GS AI Platform, evolved from an existing machine-learning system and serves as the central hub for all generative AI initiatives at the company.
Goldman has partnered with Microsoft, an OpenAI backer, to use the GPT-3.5 and GPT-4 models, and with Google for its Gemini model. The platform also incorporates open-source models such as Meta’s Llama.
Meanwhile, Citigroup provided its 40,000 coders with access to generative AI. For example, when federal regulators released 1,089 pages of new capital rules for the US banking sector, Citigroup used generative AI to meticulously review the document.
Wells Fargo has integrated generative AI into its virtual assistant, ‘Fargo’, to enhance customer interactions. The AI-driven assistant can answer queries, provide account information, and assist with transactions in a conversational manner. The recently launched app is built on Google Dialogflow and uses Google’s PaLM 2 LLM.
Morgan Stanley also recently launched its second generative AI application for financial advisers, opting for in-house solutions instead of pre-built tools from AI startups. The new application, AI @ Morgan Stanley Debrief, is designed to summarise video meetings and create follow-up email drafts.
This addition comes after the September 2023 debut of its AI knowledge assistant tool. Developed in partnership with OpenAI, it aids advisers in swiftly locating information from Morgan Stanley’s research.
Finally, Bloomberg recently introduced BloombergGPT, a model with over 50 billion parameters trained on a large corpus of financial data. It assists users in interpreting financial documents, reports, and invoices.