ChatGPT. OK, it's cool, but what is it for? This is the question I'd be asking if I were a banking executive. Oh, and of course: What are the risks of using it?
There is huge excitement about this bright new toy, but what it mainly does is produce content on demand, distilled from information scraped off the internet. To my mind, what makes it seem smart is its ability to produce language in a convincing voice, not the substance of what it is telling you.
So why are banks banning it inside their businesses? The answer is in what bankers might use it for. Bank of America Corp. and Goldman Sachs Group Inc. have joined JPMorgan Chase & Co. in telling staff they mustn't use it for business purposes.
Those business purposes could include generating a draft of a pitch document or research report, just as people have tried using it to write parts of academic papers, press releases or even entire novels. Maybe senior bankers think their juniors will get lazy. More likely, the compliance departments are fretting about the risks involved, especially after being fined by regulators over bankers' use of WhatsApp.
ChatGPT and other large language models have been shown to get things wrong, and even to hallucinate, inventing non-existent fields of scientific enquiry, for example. If a sell-side analyst's research report turned out to describe plausible but entirely fictitious sectoral developments threatening or benefiting a listed company, I assume that would look bad.
Also, as ChatGPT goes around pulling information from the web, there's a danger that it might end up outright plagiarising someone else's work. Again, if you're a bank, or any information-centered business where reputation and trust matter, this would be damaging.