The rise of API-powered NLP apps: Hype Cycle, or a New Disruptive Industry?

Last Updated on March 19, 2023 by Editorial Team

Author(s): Nikola Nikolov

Originally published on Towards AI.

Image generated with Stable Diffusion.

Large Language Models (LLMs) have come a long way in recent years. From fluent dialogue generation to text summarisation and article generation, language models have made it extremely easy for anyone to build an NLP-powered product. As a result, hundreds of apps have been popping up every day, predominantly relying on APIs such as OpenAI, Cohere, or Stable Diffusion.

Looking at these developments, one might wonder: what is the disruptive potential of such apps? Are they poised to deliver transformative results to all industries? Or will their impact be limited to certain narrow use cases?

Furthermore, what challenges do developers and business owners need to be aware of in order to make a lasting impact in this space?

The rise of LLM-centred product development

Large Language Models (LLMs) have seen significant advancements in the last year, primarily due to the development of techniques that better align them to human preferences. This has resulted in an impressive capacity for generating fluent text in a wide range of styles, and for different purposes, with significantly greater precision, detail, and coherence than what was previously possible.

The capacity of LLMs to follow instructions, and to learn from examples presented in their context, has made it possible to tackle virtually any NLP task with an LLM, at least in principle. All that is needed is a carefully constructed prompt that extracts the required functionality from the LLM. The LLM itself can be conveniently accessed through a simple API call.
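As a rough illustration, here is what such a call might look like in Python, using the OpenAI client for a simple summarisation task. The model name, prompt, and parameters are purely illustrative, not a recommendation:

```python
# A minimal sketch of accessing an LLM through an API call, assuming the
# OpenAI Python client (pip install openai) and an OPENAI_API_KEY set in
# the environment. Model name and prompt are illustrative only.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

def summarise(article_text: str) -> str:
    """Wrap a summarisation task in a single, carefully constructed prompt."""
    prompt = (
        "Summarise the following article in three bullet points, "
        "keeping only the key facts:\n\n" + article_text
    )
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # any instruction-following model would do
        messages=[{"role": "user", "content": prompt}],
        temperature=0.3,  # lower temperature for more focused output
    )
    return response.choices[0].message.content

print(summarise("Large Language Models have advanced rapidly in recent years..."))
```

The prompt is the only "product-specific" part of this snippet; everything else is provided by the API.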

The progress in LLMs, as well as their general availability, has led to an explosion of LLM-based apps targeting diverse use cases: from blog post generation and email response drafting to summarisation of articles and meetings, fluent dialogue, and code generation.

Most of these apps focus on a narrow user workflow, essentially abstracting away certain functionalities of the underlying LLM. They typically operate by charging a premium on top of API fees.

Challenges with API-centric AI product development

When it comes to solving tangible problems that users are willing to pay for, some of the weaknesses of the API LLM approach quickly come to light. Let’s explore a few of those.

Challenge 1: No specialization

Image generated with Stable Diffusion.

LLMs are general-purpose models trained on the whole of the internet. For general-purpose tasks that only require creative suggestions, this is typically fine: for example, suggesting a title for a blog post.

Many real-world problems, however, require significant or even full automation. Often, the problem lies within a narrow domain, such as extracting information from biomedical or legal articles. Because of this lack of specialization, an LLM is unlikely to fully match user expectations out-of-the-box in these narrow domains, where its outputs require additional manual checking to ensure they meet quality expectations.

This problem can be alleviated to a certain extent: for example, prompt engineering, in-context learning, and model fine-tuning might help to better align the base LLM with user expectations. However, this is certainly not a quick and easy task, since it requires deep domain knowledge, carefully constructed human feedback datasets, as well as a deep understanding of the underlying tech. These are issues that are unlikely to be resolved by simply calling an API but would require a more fundamental approach to the problem.
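As a rough sketch, in-context learning can be as simple as prepending a handful of labelled examples to the prompt. The biomedical extraction task and examples below are hypothetical; a real system would need carefully curated, expert-labelled data and proper evaluation:

```python
# A sketch of in-context (few-shot) learning for a narrow domain task:
# extracting drug mentions from biomedical text. The examples are
# illustrative placeholders, not a validated dataset.
FEW_SHOT_EXAMPLES = [
    ("Patients received 20 mg of atorvastatin daily.", "atorvastatin"),
    ("No adverse events were linked to metformin use.", "metformin"),
]

def build_extraction_prompt(sentence: str) -> str:
    """Prepend labelled examples so the LLM can infer the task and output format."""
    lines = ["Extract the drug names mentioned in each sentence."]
    for text, drugs in FEW_SHOT_EXAMPLES:
        lines.append(f"Sentence: {text}\nDrugs: {drugs}")
    lines.append(f"Sentence: {sentence}\nDrugs:")
    return "\n\n".join(lines)

prompt = build_extraction_prompt(
    "The trial compared ibuprofen with a placebo over 12 weeks."
)
# `prompt` can then be sent to any instruction-following LLM API,
# as in the earlier sketch.
```

Even with such techniques, the quality ceiling in a specialised domain is set by the examples and feedback you can collect, not by the prompt alone.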

API-based apps that fail to deliver the expected level of automation risk becoming just another tool that is fun to play around with but never gets integrated into existing workflows, and therefore never generates revenue or makes a lasting impact.

Challenge 2: No differentiation

Image generated with Stable Diffusion.

LLM APIs have significantly sped up the process of AI prototyping. A prototype that might have taken years to develop can now be built within a week. All that is needed is a pretty interface that encapsulates the desired user workflow: the API does the rest of the heavy lifting.

The speed and flexibility the APIs provide are certainly amazing: we are seeing so many creative applications of LLMs. However, on a more fundamental level, it becomes uncertain which solutions really have something that significantly separates them from the baseline LLM, which is accessible by anyone. In other words, is there something more fundamental under the hood, or are customers essentially paying to access a polished prompt behind a pretty interface?

While there will certainly be areas where this approach will work and will even lead to user traction, the question is, what happens when 5 similar apps come out next week? Is it really possible to build a sustainable business relying purely on such APIs?

Ultimately, the winners of the LLM app race are likely to be the ones that manage to quickly capture user traction, understand where the fundamental value lies, and capitalize on that traction to build custom solutions that set them apart from the competition.

Challenge 3: Lack of ownership

Image generated with Stable Diffusion.

The reliance on LLM APIs also creates a number of business risks. The API provider could, at any point in time, decide to make changes that have a significant business impact. For example, they could change the API's pricing or fee structure, revise the terms and conditions, or even change the underlying model.

Many users might also have concerns regarding data regulation and security since their data is handed over to third parties.

This could potentially be a recipe for disaster. What happens to your app if an API no longer works as expected or if there is an outage? Since you don’t own the tech, you have no backup option. You could switch to a different API provider, but would everything work exactly as before? Furthermore, how can you establish trust with your users that the data is handled properly and will be handled properly in the future?
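One way to soften this risk, sketched below, is to wrap every API call in a thin layer with retries and a backup provider. The provider functions here are placeholders; in practice, switching providers also means re-testing prompts and output quality:

```python
# A sketch of reducing single-provider risk: try the primary provider,
# then a backup, with simple exponential backoff. The two provider
# functions are placeholders to be replaced with real client calls.
import time

def call_primary(prompt: str) -> str:
    raise NotImplementedError  # e.g., wrap the OpenAI call shown earlier

def call_backup(prompt: str) -> str:
    raise NotImplementedError  # e.g., a second provider or a self-hosted model

def generate(prompt: str, retries: int = 2) -> str:
    for provider in (call_primary, call_backup):
        for attempt in range(retries):
            try:
                return provider(prompt)
            except Exception:
                time.sleep(2 ** attempt)  # back off before retrying
    raise RuntimeError("All LLM providers failed")
```

A fallback layer keeps the app alive during an outage, but it does not solve the deeper issue: you still do not own the model your product depends on.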

The lack of control and transparency with APIs is certainly an aspect to seriously consider, especially if your whole business case is built around them.

Conclusion and Outlook

We are witnessing a technology revolution driven by the availability of powerful AI tools. We’re still at the tip of the iceberg in terms of possible applications, and there are many unknowns regarding what the next 6–12 months will look like. One thing is certain: LLMs are here to stay, and they will have a significant impact on our society over the coming years.

LLM APIs certainly have a role to play: they provide an interface to powerful LLMs that make it extremely easy for anyone to build AI products or add some AI features to an existing product, all without having to invest in any infrastructure or development. In the short term, many of these products are likely to gather some traction. However, the challenges outlined in this article make it difficult to go beyond the prototype stage in the long term as user expectations rise and competition increases.

The winning companies in the space are likely to be the ones who: (1) Capitalise on APIs for fast prototype iteration and for data and feedback collection, while determining where the true value lies within a niche; (2) Build proprietary tech as quickly as possible: curate custom datasets, and develop and train custom models that convert the prototype into a robust solution to a real-world problem.

Thanks for reading! If you enjoyed this article, you should consider subscribing to my NLP newsletter or YouTube channel.
If you are looking for state-of-the-art expertise in Natural Language Processing, you should check out our services at The Global NLP Lab.
