TRANSCRIPT: How ChatGPT and Chatbots are changing the landscape in Financial Markets

Matthew Cheung: Hi, this is Matthew Cheung. I'm CEO of ipushpull, and thanks for joining us on today's podcast, where we're gonna be talking about ChatGPT. We have Imran Ahmed joining me. Imran is the newest and latest employee of ipushpull, and he's come in as our web development lead. Hi Imran.

Imran Ahmed: Hi Matt. So I joined ipushpull back in January as a lead web developer to try and put some focus on the UI and UX.

Imran Ahmed: So my background is I've been working in FinTech and financial services for about 25 years. I've always worked as a developer, so I've done several years over at UBS, Credit Suisse and the London Stock Exchange, and I've got a wide understanding of technology, although I'm quite heavily focused on front-end stuff.

Imran Ahmed: So I've got a good grasp of AI, crypto, and all the latest web dev technologies as well.

Matthew Cheung: And you've also got a very famous son. Tell us about him quickly.

Imran Ahmed: Okay, so my son Ben, he's going to turn 14 in March, but when he was 12 years old he released his NFT collection called Weird Whales, and at the time he was, like, the youngest person in the world to release an NFT collection. This was right bang in the middle of the whole bull run that had just kicked off around NFTs and stuff.

Imran Ahmed: So he was very, very early, and that's traded around 5 million for something that he built at home in his spare time. And then off the back of that he got loads of press coverage: BBC, New York Times, Daily Mail, Sun, practically everywhere, if you go and search his name on the internet. So that's kind of how I got pulled into NFTs, but I always had an interest in crypto and stuff as well.

Imran Ahmed: So I've got a good convergence of stuff that's happening in my professional life around AI, crypto, NFTs, and financial services.

Matthew Cheung: That's how we ended up meeting, and you eventually joining ipushpull, through your son. So today we're going to talk about what ChatGPT is, who OpenAI is, and what use cases there are in financial markets for ChatGPT.

Matthew Cheung: And then from that, we'll look at where chatbots are today, and then what people can do to get started. Let's kick off and just look at ChatGPT itself. It's been a phenomenal rise of technology, or at least it looks like it's happened very, very rapidly. It was launched back at the end of November.

Matthew Cheung: ChatGPT then hit 1 million users after a week, and then hit 100 million users after 1 month. That's the fastest growing app of all time, but remember, it's based on artificial intelligence, which has actually been around since the end of World War Two. So it's not a technology that suddenly appeared out of nowhere.

Matthew Cheung: It's a technology that's slowly grown through many, many decades and has seen lots of what they call AI winters, where the technology was looking very hopeful and then kind of disappeared again. Now it has resurfaced in what's very much an interface moment, where ChatGPT is essentially a large language model which can interpret natural language and give responses as if it were a human.

Matthew Cheung: And implicitly it's developed a theory for meaningful sentences and can understand the fundamental principles of human language. I mean, Imran, you and I have been using it a lot in the last, you know, month, couple of months. We've even just subscribed to ChatGPT Plus, which is $20 a month, allows you to access it at any time, and is a lot faster.

Matthew Cheung: But we can see lots of benefits already, right?

Imran Ahmed: Yeah, I mean, even before ChatGPT I was using GitHub Copilot in my professional life, which is also built on the GPT model, and that was trained against loads of public commits on open-source projects. And that's been really good.

Imran Ahmed: One thing we have to remember with these tools is that they're not replacements at the moment; they're very, very good as assistants. So you still need high-level professionals, but what they do is massively reduce the amount of boilerplate stuff you need to do. So I'm specifically using it in day-to-day development, although I have had a play around in my personal life as well, like getting it to write poems and creating images using DALL-E and stuff like that. But professionally,

Imran Ahmed: I use it predominantly for coding. And as you say, these technologies have been around for like 30, 40 years, but I think what's different with this iteration is it actually feels like intelligence. It's very interactive when you converse with it. People are so used to searching the web, but when you use this even just as a search tool, and you can prompt it and then re-prompt it again based upon the information that it gives you, you really start to understand how powerful this stuff is.

Matthew Cheung: And talking about power: the GPT large language models, if you look at GPT-3, that's a hundred times larger than GPT-2, with 175 billion parameters. And GPT-4, which apparently has been alpha tested out on the West Coast amongst some close friends of OpenAI, is apparently going to be 500 times the size of GPT-3.

Matthew Cheung: GPT-4 should be coming sometime this year. That would give you 100 trillion parameters, versus 175 billion with GPT-3, so that's as many parameters as the brain has synapses. Some of the tweets we've seen from people like Sam Altman and others are apparently hinting that GPT-4 is able to pass the Turing test.

Matthew Cheung: The Turing test is essentially a test of deception: fooling someone, so an AI could theoretically pass the test without possessing intelligence in a human sense. It's quite narrow because it's exclusively focused on linguistics, but apparently that's something GPT-4 can pass, which would be a big milestone for AI.

Matthew Cheung: One of the things with AI is obviously that it's an exponential technology, and exponential technologies at the beginning are quite deceptive: they look like they move quite slowly, which is probably what's happened over the last 30, 40 years. But then quite quickly everything is doubling, if not more than doubling, with GPT-4 being 500 times the size of GPT-3's model, so you can quite quickly see how things are going to progress.

Matthew Cheung: What are your thoughts on that?

Imran Ahmed: Yeah, I mean, I think the progression in human intelligence, not just in technology, is going to be really profound. Sam Altman, who's one of the founders of OpenAI along with Elon Musk, the company that actually built these models, tweeted the other day that we could have something similar to Moore's law around human intelligence.

Imran Ahmed: Moore's law was a prediction made by Gordon Moore, the co-founder of Intel, around 1965, when he stated that the number of transistors on a microchip would double approximately every two years, leading to an exponential increase in compute power and a decrease in the cost of electronics.

Imran Ahmed: So if you imagine something like ChatGPT out in the wild now, everybody's using it, from school students to professionals. We could have, say, a doubling of human intelligence every two years or something, and that's really mind-blowing when you think of it in that way. And there's another reason why I believe this technology will spread really quickly.

Imran Ahmed: It's being built on existing technological infrastructure. The cloud infrastructure is already there; it's been around for about 10 years and it's well settled. So you can distribute these models to millions of enterprises and billions of users around the world at lightning-quick speed and at very, very low cost as well.

Imran Ahmed: So, I mean, if you'd had this model, say, 30 years ago, we wouldn't have been able to distribute it as efficiently as we can today. I think that's a really important point to note about why this will spread so quickly: it's being built upon existing infrastructure, which we didn't have 20 years ago.

Matthew Cheung: And that's very much why Microsoft has invested in OpenAI: you know, they invested a billion dollars a couple of years ago, and then last month, in January 2023, they invested $10 billion, bringing OpenAI's valuation to $29 billion. And remember, OpenAI actually started as a non-profit. There was Sam Altman, who's the CEO and who also previously ran Y Combinator,

Matthew Cheung: which is a very big, famous incubator. There were also people from Stripe, Reid Hoffman, the LinkedIn co-founder, Peter Thiel, and then Elon Musk, like you just mentioned, Imran. I think Elon's story is quite interesting as well, because he actually stepped away from OpenAI in 2018, even though he was one of the co-founders.

Matthew Cheung: He's no longer a stakeholder or anything to do with it, and some of that was the conflict around the work he was doing with Tesla. But more recently, we've seen Elon concerned about OpenAI because there are a number of guardrails in place that stop it from making controversial responses, racial slurs and sexist remarks.

Matthew Cheung: That on its own doesn't sound too bad, but there was a tweet thread, which Elon Musk participated in, where a user was talking to ChatGPT and it refused to utter a racial slur after being presented with an odd hypothetical scenario in which saying the slur would have prevented a nuclear bomb from going off, and Elon tweeted that it was concerning.

Matthew Cheung: Elon is now recruiting AI researchers to create his own version of OpenAI, because he thinks it's become a closed-source, maximum-profit company controlled by Microsoft. But nonetheless, Microsoft is probably one of the reasons it's going to be accessible inside enterprises, so that people in financial market firms like ours, and lots of other firms, can start using it.

Matthew Cheung: There's also been a funny few months where obviously Google is now very concerned about losing its dominance in the search space. Because Microsoft has invested in ChatGPT and brought it into Bing, Bing is now seeing the biggest rise in user numbers in all of its history. So there are lots of interesting events happening

Matthew Cheung: already this year, and we'll continue to see that when GPT-4 comes out later in the year. Have you got anything to add to that, Imran?

Imran Ahmed: Yeah, definitely. I mean, the integration stuff is going to be really powerful. This is where I find it quite ironic when they start banning it in certain places, because it's obvious that the next version of Microsoft Office will have this embedded

Imran Ahmed: into it, whether as an add-on module or something that comes by default. We don't know yet, but it's going to work its way in (it already has into Bing): it'll be in Microsoft Word, it will be in Excel, it will be in PowerPoint. So I think rather than, you know, banning this stuff, you've really got to embrace it and see how

Imran Ahmed: it can be useful as an assistance tool. I think it's also worth mentioning that I don't see this as being a false start. We had a false start around the web back in the, you know, nineties, when there was the whole dot-com boom and we had insane valuations around pets.com and stuff. We've had several false starts around crypto.

Imran Ahmed: I think where AI differs is that what the web and crypto tried to do was fundamentally redefine business processes. So if you take something like the job of an accountant, the web comes along and says, ah, actually your old model of, you know, doing everything in your office is not going to work.

Imran Ahmed: You need to be online. And then crypto comes along and goes even further: you need to be looking at tokenization, you need to be looking at on-chain strategies, you need to be looking at NFTs. But AI is a lot more subtle. What it says is, look, you keep your existing business process as it is, and I can enhance it.

Imran Ahmed: So it doesn't try to redefine what an accountant does or how they operate their business; it just makes them a lot more efficient. I think that aspect of AI is really powerful, and it's one of the reasons why I believe it'll take off a lot quicker than the internet and crypto did.

Matthew Cheung: Exactly. It's all about productivity and acting as a copilot.

Matthew Cheung: In terms of what you were just saying about Microsoft rolling it out into different applications, they're due to roll out what's called intelligent meetings into Microsoft Teams any day now. So you'll have an AI bot that sits in the meeting and at the end will provide you with meeting notes, next steps, tasks and so on.

Matthew Cheung: So the same stuff that someone in a meeting would be jotting down on a notepad can all be done for you, which means you're not worrying about taking notes and instead you can participate in the meeting on a much greater level. So let's move on to use cases for ChatGPT in the financial markets.

Matthew Cheung: We've really just touched on some of the general uses around productivity and efficiency, so that could be anything from helping to draft emails to helping to create blogs for a website, at a more general level. And then from a developer's perspective, how would you be using a tool like this at the moment, Imran?

Imran Ahmed: So I always see development as part technical and part art; it's a mixture of science and art. The science bit can be completely commoditized by these AI tools, so all the developer really has to focus on is the art aspect of it: the architecture of applications, integration with various different systems, making sure systems are well tested, stable, performant, scalable. But then there's the whole generating-boilerplate-code side.

Imran Ahmed: Like, if I need to write a depth-first search or a binary search, yeah, you know, I've done it several times, but it might have been 10 years since I've written one. So rather than go to Google and search for it, which will then take me to Stack Overflow, where I'll be browsing through the answers to see various different implementations,

Imran Ahmed: I can just get ChatGPT to throw it out. I can just give it a high-level prompt, so, you know, create a depth-first search or a binary search in JavaScript, TypeScript, Python, whatever language, and it creates the bulk of it. It's not always 100 percent perfect, you do get some edge cases and some performance issues around them, but it's very, very good at those sorts of problems.
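As an illustration of the kind of boilerplate being described here, a minimal sketch of an iterative binary search in Python. This is illustrative code written for this transcript, not actual ChatGPT output:

    def binary_search(items, target):
        """Return the index of target in a sorted list, or -1 if absent."""
        low, high = 0, len(items) - 1
        while low <= high:
            mid = (low + high) // 2
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return -1

    # Example: binary_search([1, 3, 5, 7, 9], 7) returns 3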

Imran Ahmed: If you've seen, you know, the Google-type interviews where you have to do very algorithmic-type questions, where you have to implement search algorithms and stuff, all of that can just be chewed up by ChatGPT. So it kind of shows you how deficient those interview processes really are, because you can just commoditize all that stuff.

Imran Ahmed: So it's really good with that. What I've been using it for myself: sometimes I will write a function and I'll say, write the unit tests for that function, and it just writes all the boilerplate stuff for me so I can just grab those unit tests. You can even give it existing code and say, look, how would I optimize this, convert this to a one-liner, you know, make it more condensed, make it more terse.
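A minimal sketch of that unit test workflow, in Python with pytest. The function and test names are made up for illustration; they are not ipushpull code or ChatGPT output:

    import pytest

    # A small function you might hand over...
    def mid_price(bid: float, ask: float) -> float:
        """Return the mid price of a bid/ask quote."""
        if bid <= 0 or ask <= 0:
            raise ValueError("bid and ask must be positive")
        return (bid + ask) / 2

    # ...and the kind of test boilerplate that typically comes back.
    def test_mid_price_simple():
        assert mid_price(99.0, 101.0) == 100.0

    def test_mid_price_rejects_bad_quote():
        with pytest.raises(ValueError):
            mid_price(-1.0, 101.0)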

Imran Ahmed: And we've already seen, I think, Stack Overflow traffic is down by 15 percent per month or something like that since ChatGPT came out. I see this completely replacing Stack Overflow, because it's interactive, you can have a dialogue with it, it's very specific, and code in itself is very deterministic.

Imran Ahmed: So as long as you feed it the correct API documentation and syntax, and given it's trained on a huge amount of open-source coding data, as a coding assistance tool it's unbelievably powerful. I mean, you could reduce teams from around 10 to 4 or 5 people who are really good with prompt engineering and stuff.

Imran Ahmed: I mean, there was a tweet from one of the Tesla engineers a while back where he said that 85 percent of his coding is now being done by AI tools.

Matthew Cheung: Yes, it's very, very disruptive for every single industry. Looking specifically at financial markets, where a lot of our listeners will be, I was actually asking ChatGPT earlier where it thought there were use cases, and it picked out, I think, two areas.

Matthew Cheung: One was around analysis, doing analysis for users, and the other was around sweeping through documents or data; two sides of the same coin. On the analysis side, it was talking about sentiment analysis, looking at news articles when they're released, looking at social media, just enhancing

Matthew Cheung: some of the tools that are already out there to do that stuff, but then being able to handle the natural language piece of it, and using it for things like pattern spotting, which we've seen lots of algo funds already do. Lots of black boxes and HFT funds and what have you are already using

Matthew Cheung: technologies like this, but this gives you, again, the ability to have that natural language element on top of it as well. And then that feeds into looking at documents. In the OTC trading world you have swap agreements and legal documents, and having ChatGPT able to scan those for you

Matthew Cheung: is much quicker than a human doing it on their own. Then you have the world of compliance as well, so things like trade surveillance, looking for unusual activity, and risk management. That was just some of what ChatGPT itself threw up when I was talking to it and prompting it in the right way.

Matthew Cheung: And then the obvious one is around client interaction, where a client is talking to the service provider. On the retail banking side you can see those tools already in action, where they're quite simple and essentially a decision-tree type workflow, and after you go so far with it, the bot may put you in touch with a human. Whereas having a ChatGPT bot inside those chats means that

Matthew Cheung: the bot is able to handle much more complex queries and respond in natural language rather than a predefined syntax. Any other use cases you can think of besides those, Imran?

Imran Ahmed: Yeah, I mean, in terms of data processing, you know, Microsoft recently did a deal with the London Stock Exchange, so I'm sure they're going to be wanting to get

Imran Ahmed: stacks of financial data in there. At the moment it doesn't have any real-time data, I mean, GPT-3 doesn't, so you can't get it to tell you the current price of Bitcoin or a Tesla stock, but I'm sure that's going to be coming down the line as well. In terms of building out quantitative models and stuff, if you want to write some code around building an interest rate swap, it's really good at doing stuff like that.

Imran Ahmed: You can ask it, what is an interest rate swap? Can you code me the skeleton of this class? In any language you want as well. The other thing I found it's really good at is actually translation from language to language. I have a friend who's a Turkish developer; I threw some problems at it, then we asked it to translate the answers into Turkish, and he was really impressed with the translation.

Imran Ahmed: He said it had picked up the subtleties of the language quite well. So remember, this is a global tool; it's not catered only to the English-speaking world, it's useful for pretty much anyone out there. So I think the language aspect of it was really quite impressive as well.
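To make the interest rate swap example above concrete, a minimal sketch of the kind of class skeleton being described, in Python. The field names and the simple coupon method are illustrative assumptions, not a production pricing model:

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class InterestRateSwap:
        """Skeleton of a plain-vanilla fixed-for-floating interest rate swap."""
        notional: float
        fixed_rate: float             # e.g. 0.03 for 3%
        floating_index: str           # e.g. "SOFR" or "EURIBOR-6M"
        start_date: date
        maturity_date: date
        payment_frequency_months: int = 6

        def fixed_leg_coupon(self) -> float:
            """Fixed coupon per period (simple accrual, for illustration only)."""
            year_fraction = self.payment_frequency_months / 12
            return self.notional * self.fixed_rate * year_fraction

    # Example:
    # swap = InterestRateSwap(10_000_000, 0.03, "SOFR", date(2023, 3, 1), date(2028, 3, 1))
    # swap.fixed_leg_coupon() returns 150000.0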

Matthew Cheung: Just picking up on your comment about Microsoft investing in LSEG,

Matthew Cheung: because that's quite a good segue into talking about chat as it is today, kind of the world before ChatGPT and where things are. So Microsoft invested $2 billion, or 4% of LSEG's market cap. LSEG over the years has acquired lots of other companies, one of them being Refinitiv, which was formerly Reuters.

Matthew Cheung: So in the chat world there is Refinitiv Messenger. Looking at the chat landscape you have at the moment, you've got the big players in financial markets, the likes of Bloomberg chat, which obviously has been around for a long time. You've got people like Symphony, which has come up over the last 7, 8 years and is beginning to get good traction across their customers.

Matthew Cheung: But the one people are looking at now is Microsoft Teams, because Microsoft Teams is pretty much available on anyone's desktop, as it's part of the Office suite. And, you know, looking at users of Microsoft Teams over the last couple of years since COVID, it went from 10 million to 25 million to in the hundreds of millions already.

Matthew Cheung: So if you start bringing ChatGPT into Teams, like we were talking about earlier, you get some of the features and functionality to help you use Teams in meetings and so on. But the other reason people use Teams is for chat, not just video calls. And potentially, in the chat ecosystem, you might have

Matthew Cheung: Microsoft Teams and LSEG combined. Like you said, LSEG has all of this financial market data, they've got trading applications and systems, and they've got an external directory of users with Refinitiv Messenger. So this year should see some interesting responses from Microsoft and LSEG in that world, to compete with Bloomberg chat, to compete with Symphony.

Matthew Cheung: And then you've got all the other ones out there as well, you know, like Slack, ICE Messenger, WhatsApp and WeChat, and we've even had some inquiries about Skype. I don't know if that's still around, but there are some users in South America still using Skype for trading workflows. You're getting to this point now where there's actually a proliferation of chat;

Matthew Cheung: there are kind of too many things there. Taking one piece of that aside, one of the interesting areas ipushpull works a lot in is chatbots. Now, put ChatGPT aside, park that for one moment, and just think about chatbots and what a chatbot is today. So, how we're doing it at ipushpull:

Matthew Cheung: you know, we allow a bot, a machine user, to sit in a chat room and connect into our service, and then we can connect into financial market data or another system or another service. It means you're allowing a user in a chat room to use the bot to go and fetch information quickly, either sharing information or consuming information.

Matthew Cheung: And one of the things where ChatGPT becomes really interesting is that up until now, we've had to use a predefined syntax or command-line kind of interface so that people can use the bot to query data, get responses and so on. Our service sits in the middle and can map and transform chat messages into, say, FIX messages, for example.

Matthew Cheung: But one of the things ChatGPT gives you is the ability for an ipushpull bot that's sitting in, you know, Symphony and Slack and any other platform to understand those natural language inputs and then generate responses in a human-like way, which means you don't need to be relying on syntax.
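A minimal sketch of that idea, assuming the OpenAI chat completions API (the original openai Python client interface) is used to turn a free-text chat message into a structured bot command. The prompt, the JSON command schema and the function name are illustrative assumptions, not ipushpull's actual bot framework:

    import json
    import openai  # pip install openai; the original client interface is assumed here

    openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

    SYSTEM_PROMPT = (
        "You turn trading-chat requests into JSON commands for a data bot. "
        'Reply with JSON only, for example {"action": "get_price", "symbol": "EURUSD"}.'
    )

    def parse_chat_message(message: str) -> dict:
        """Map a free-text chat message onto a structured bot command."""
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": message},
            ],
            temperature=0,  # keep the output as deterministic as possible
        )
        return json.loads(response["choices"][0]["message"]["content"])

    # Example: parse_chat_message("what's eurusd trading at?")
    # might come back as {"action": "get_price", "symbol": "EURUSD"}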

Matthew Cheung: So that's going to be an interesting thing we'll be looking at over the next year. And you've been looking into how you can use the APIs available directly from OpenAI, and actually the ones from Microsoft.

Imran Ahmed: So the API is something that OpenAI released recently, and I think, as I mentioned before, this stuff is really scalable in a big way because you're only really paying for tokens.
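A rough sketch of what paying per token looks like, using the tiktoken library to count tokens. The price per 1,000 tokens below is an illustrative figure only, so check OpenAI's current pricing:

    import tiktoken  # pip install tiktoken

    # cl100k_base is the encoding used by the gpt-3.5-turbo family of models
    encoding = tiktoken.get_encoding("cl100k_base")

    def estimate_cost(prompt: str, price_per_1k_tokens: float = 0.002) -> float:
        """Rough cost estimate for a prompt at an illustrative per-token rate."""
        n_tokens = len(encoding.encode(prompt))
        return n_tokens / 1000 * price_per_1k_tokens

    # Example: estimate_cost("What is an interest rate swap?")
    # returns a tiny fraction of a cent for a prompt of a handful of tokens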

Imran Ahmed: To actually use the API, the infrastructure is all hosted in the cloud, so I think this is going to be a big win. But just to get back to your point about using AI within banks, and investment banks specifically: when you mention AI in investment banks, people automatically assume it's going to be used as some kind of trading tool, or can I use this to trade the markets?

Imran Ahmed: Can I make money off this? But as you know, that's only a small percentage of what banks actually do. They have big, convoluted middle office and back office processes, departments like HR, legal, compliance, procurement, accounting, product. AI can be used across the whole of that stack in the middle and the back office as well.

Imran Ahmed: So I think that's the one place where we will see lots of efficiency over the next five to 10 years, a lot of cost cutting, a lot of headcount reduction. Because even in something like mergers and acquisitions, they have to draft legal documents, reams of them. If you can imagine getting the whole of the LexisNexis data into something like ChatGPT, you know, you could reduce the number of lawyers that you're

Imran Ahmed: hiring within a bank by at least 50, 60, 70%. So I think that's one thing: people shouldn't just focus on whether they can use AI as a trading tool to play the markets, because that's just a small part of what AI can do within a bank.

Matthew Cheung: And it improves the user experience for them, because if you can give your clients the ability to self-serve information without having to bother the salespeople or the traders, it just helps make everyone more efficient. I think that's probably the crux of what we're talking about here: efficiency.

Matthew Cheung: AI is a tool to make anyone using it more efficient and more productive in whatever they're doing. So let's finish up by looking at how people can start to use and play around with chatbots and ChatGPT. On the ChatGPT side, one question which I'm sure a lot of people listening will be thinking about is: where's my data going?

Matthew Cheung: I spoke to someone from Microsoft a few weeks ago, and they said that's the main question after people ask about, you know, how can I connect into it, which is fairly straightforward: you don't need Azure hosting, you just need an Azure account and you can hook into it. But the next question is, where does my data sit?

Matthew Cheung: And what we saw actually yesterday, the 1st of March (it's March today), is OpenAI came out and said that you can opt out of having your data included for training the OpenAI models, or rather that you now have to opt in if you want it included. Obviously, for most financial institutions there's a lot of sensitive financial information that might be going over these sorts of channels, so that can be opted out of.

Matthew Cheung: And then, like any third-party service sitting on the cloud, the usual security measures will apply around confidentiality of data. Some things will be fine to use ChatGPT for, others may not be. So that's something to think about if you want to start consuming ChatGPT and the APIs and so on. The other area, I suppose, is where you can start using chatbots now. Already, people that are listening will be using Bloomberg chat and Symphony, Teams, Slack, all of those major chat platforms.

Matthew Cheung: With ipushpull, with one connection to our service, you're able to access all of these different chat platforms through one bot framework. With that one bot framework, you can configure a chatbot and it will work in all of the chat platforms that are connected. And one of the things Imran is exploring at the moment is, you know, how we can use ChatGPT

Matthew Cheung: to enhance the interface and the natural language element of that. How's that been going so far, Imran?

Imran Ahmed: Yeah, I mean, it's going really well. For our use case it works well because we've got a well-defined, deterministic API. If we could feed that into some kind of AI model, a lot of our end clients could, you know, develop their bots and applications in a way that is

Imran Ahmed: more natural in language, rather than having to code away with it. So I think that's what's going to be very powerful: when you can take these models and start feeding in your own company data, docs and APIs that you don't really want to put into a publicly accessible model.

Matthew Cheung: And it's that ability to query all of that information. I think one of the use cases we'll be looking at at ipushpull is the ability to have these kinds of predictive analytics. One of the things we're looking at initially is whether it's something we need to start building ourselves, or whether we can use a model off the shelf.

Matthew Cheung: And that's exactly what is now available via, you know, Microsoft and OpenAI. So there are lots of tools that people can use straight away. Generative AI, which this all falls into, is a big thing at the moment, with lots of VC funding going into the area, but ChatGPT's not the only thing. There are thousands of platforms and applications out there where you can use AI and generative AI.

Matthew Cheung: But ChatGPT, because of its easy-to-use interface, that's the main reason it took off to a hundred million users by one month after it was launched. So we'll just finish up now. This was Matthew Cheung, CEO of ipushpull, and I was joined by Imran Ahmed, our lead web developer.

Matthew Cheung: And if there are any further questions from listeners, then please get in touch with us. Thanks for listening.

Imran Ahmed: Thank you, Matt.