2024 will be the year that generative AI faces the litmus test of delivering on its promise.
The initial hype generated in late 2022 and early 2023 around this groundbreaking technology has been massive. While much of that hype is deserved, reason and pragmatism are now on every executive's mind. Many companies are busy developing a blueprint for where this technology can be applied within their enterprise and how to get there.
The flurry of new large language models (LLMs), a kind of generative AI model that uses deep learning and huge data sets to recognize, summarize, generate, and predict new content, that popped up almost every week in Q2 of 2023 is settling down. The field of frontline generative AI vendors is narrowing, which may help CIOs deal with decision fatigue.
On the cloud infrastructure side, the big three of AWS, Microsoft Azure, and Google Cloud Platform have all responded adequately to this new trend. Microsoft's bet to invest in and partner with OpenAI appears to be paying off as ChatGPT remains a popular LLM-based service. AWS is investing in Anthropic, which gives it frontline access to the Claude 2 model. AWS also has its Bedrock service, which lets customers choose from a bouquet of models and makes this technology easier for enterprises to consume. Google is powering ahead with its PaLM family of models and focusing on vertical-specific models such as Med-PaLM for healthcare. Meta is not far behind with its Llama models. Investment in these models will continue at large scale, and they will become more reliable and performant in 2024.
The dilemma facing CIOs, whether to consume generative AI as a service from one of these vendors or to build a purpose-built LLM specific to their business and run it locally, is also getting resolved. The capital expenditure required to build infrastructure that can host these models is high, while cloud vendors are promising customers data security and exclusivity: they give assurances that a given customer's enterprise data will be excluded from training the generic LLM and handled strictly privately. This will lead to a trend in 2024 where generative AI is consumed as a service in the cloud rather than provided from a local, in-house deployment.
While generative AI models are very good at natural language processing (NLP), building an enterprise application around them often involves other crucial components, such as vectorizing company data and using an orchestration framework, which acts as the interface between the end user, the model, and the data. LangChain is emerging as a popular framework. You will start to see many packaged, purpose-built plugins emerge in 2024 that combine all these ingredients to solve industry-specific problems.
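The pattern described above can be sketched in a few lines of plain Python. This is an illustrative toy, not a real implementation: production systems use learned embedding models and frameworks such as LangChain, whereas here the "vectors" are simple word-count tallies and the final LLM call is left out. All function names and the sample documents are hypothetical.

```python
# Toy sketch of the orchestration pattern: vectorize company data, retrieve
# the most relevant document for a question, and assemble the prompt that an
# orchestration layer would send to an LLM. Illustrative only; real systems
# use embedding models, a vector database, and a framework like LangChain.
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Stand-in 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, docs: list[str]) -> str:
    """Return the document most similar to the question."""
    q = vectorize(question)
    return max(docs, key=lambda d: cosine(q, vectorize(d)))

def build_prompt(question: str, docs: list[str]) -> str:
    """Orchestration step: combine retrieved context with the user question."""
    context = retrieve(question, docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical enterprise documents.
docs = [
    "Invoices are processed within 30 days of receipt.",
    "Employees accrue vacation at 1.5 days per month.",
]
prompt = build_prompt("How fast are invoices processed?", docs)
print(prompt)
```

In a real deployment, the prompt built in the last step would be sent to a hosted model (for example, via Bedrock or Azure OpenAI), and the framework would also handle chunking documents, caching embeddings, and post-processing the model's answer.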
Generative AI will not result in a massive reduction of existing jobs in 2024, although in the next three to five years it will impact a range of professions, including content creators, translators, copywriters, reviewers, editors, and some administrative positions. First, however, the technology needs to permeate further into people's daily lives, and that will accelerate next year. The overall productivity of the workforce will increase manyfold due to generative AI in 2024. In daily work life, new assistants such as Microsoft Copilot will help you write emails, draft designs, summarize the meetings you may have missed, and remind you of follow-up actions. Software engineers using a copilot-style assistant to write code will increasingly be the norm. Generative AI will also penetrate sales, marketing, and customer self-service use cases. Even high-value engineering jobs, like product design and data science, will see increased automation and productivity with generative AI. AI-focused roles, such as prompt engineering, will become a sought-after and highly paid career path.
Overall, 2024 will be the year that many companies execute their GenAI strategy. The early movers will have an advantage, as they will be able to harvest the increased productivity and higher quality that come with generative AI. If you have not yet taken note of this trend and drafted a plan for adopting generative AI in your business, now is a good time.
Vignesh Subramanian is Vice President of Product Management at Infor.