Since their introduction by OpenAI, ChatGPT and generative artificial intelligence (AI) have had a significant impact on the financial industry. The AI chatbot attracted a million users within five days of its launch and reached 100 million users by January 2023, sending shockwaves through the technology industry. It marked a turning point in AI research, accelerated the development of large language models (LLMs), and quickly became the most significant AI news event of the year.
It is intriguing (and fittingly meta) to ask ChatGPT itself for a definition of generative AI. Posed the question "What is generative artificial intelligence?", it answers:
“Generative artificial intelligence (AI) is a fascinating field within AI that has the incredible ability to generate fresh data or information by analyzing and understanding patterns and structures present in existing data. It’s like having a creative mind that can recognize underlying patterns and use them to create something entirely new. By leveraging past data, generative AI can produce innovative content, offering unique insights and novel perspectives.” The goal of this branch of AI is to develop models that closely approximate human creativity and intelligence.
Generative AI models are typically built with deep learning techniques and trained on large sample datasets. Once trained, these models can produce new data, such as images, music, or text, that closely matches the tone and structure of the training set.
How does ChatGPT, a large language model, actually work?
How exactly do generative AI models like ChatGPT operate? Stephen Wolfram, creator of the computational engine Wolfram Alpha, argues that large language models such as ChatGPT are essentially sophisticated auto-completion models. In other words, ChatGPT is a predictive model that, given an input prompt, generates the text output with the highest probability of being a “coherent continuation” based on its training data.
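To give a rough feel for what "auto-completion by probability" means, the toy sketch below uses a word-level bigram model, invented purely for illustration: it counts which word follows which in a tiny made-up corpus, then greedily emits the most likely continuation. Real LLMs instead use neural networks over subword tokens and sample from a probability distribution, but the basic idea of predicting a likely continuation is the same.

```python
# Toy "auto-completion" sketch: count word-to-word transitions in a tiny
# corpus, then repeatedly append the most probable next word. The corpus
# and function names are invented for this example.
from collections import Counter, defaultdict

corpus = "the market is up and the market is up and the market is down".split()

# Bigram table: how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def continue_text(prompt_word, length=4):
    """Greedily append the most probable next word, one step at a time."""
    words = [prompt_word]
    for _ in range(length):
        counts = following.get(words[-1])
        if not counts:  # no known continuation; stop
            break
        words.append(counts.most_common(1)[0][0])
    return " ".join(words)

print(continue_text("the", 5))  # a plausible continuation of "the"
```

Even this crude model produces locally "coherent" continuations of its training data, which is the intuition behind Wolfram's characterization; what LLMs add is vastly richer context and scale.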
This is quite different from how humans use language, which is part of what makes it fascinating. For us, language is a tool of communication: it encodes our thoughts, feelings, experiences, and intentions. Human language is also governed by syntax, a set of rules that specifies how words can be combined into coherent, grammatical sentences.
None of these characteristics, however, is innately present in LLMs. LLMs appear far more adept at mastering grammar than at grasping language's symbolic force. This is not surprising: syntax (usually) follows a rigid structure, whereas the meaning of a phrase, let alone more complex devices such as subtext and metaphor, is much harder for LLMs to capture. After all, their knowledge is restricted to their training data.
Natural language processing models like ChatGPT occasionally produce sentences that are syntactically correct but semantically wrong, leading to claims that sound believable but are untrue. ChatGPT struggles most when citing the work of experts, as David Smerdon, Professor of Economics at the University of Queensland, pointed out in a Twitter thread.
What effects is generative AI having on the finance sector?
Although the technology is still in its infancy (and highly contentious, having sparked concerns about the perils of artificial intelligence), there have been long-running conversations about its application in financial services. While machine learning and AI have been used extensively in finance for more than a decade, conventional systems rely on historical data to produce predictions, whereas generative AI creates new material.
In this regard, I believe the most important application of generative AI in financial services is improving the customer experience. Financial firms hold enormous volumes of data about their customers, which positions them to do something genuinely innovative; like any AI system, generative AI is practically useless without data. One notable example is Bloomberg's recent release of BloombergGPT, the financial data company's own ChatGPT alternative.
I can see financial companies using third-party products, or building their own technology as Bloomberg has, to improve the customer experience: giving investors an easy way to track their investment portfolios, quick answers to their most pressing financial questions, and straightforward financial advice. While such a system cannot fully replace seasoned financial experts, it may complement their efforts, serving as a junior assistant or "financial aide" to both individuals and experienced financial advisors. Because financial decisions are deeply personal and shaped by emotion, people are unlikely to fully trust AI without some human involvement.
It is only 2023; with technology advancing this quickly and our relationship with it still evolving, a great deal could happen in the next 20 to 30 years.