
Creating Impact: A Spotlight on 6 Practical Retrieval Augmented Generation Use Cases

In 2023, RAG became one of the most widely used techniques in the domain of Large Language Models. In fact, it is hard to find an LLM-powered application that doesn’t use RAG in one way or another. Here are 6 use cases in which RAG plays a pivotal part.


If you’re interested in finding out more about retrieval augmented generation, do give my blog a read - Context is Key: The Significance of RAG in Language Models



Document Question Answering Systems using Retrieval Augmented Generation

By giving an LLM access to proprietary enterprise documents, its responses can be grounded in, and limited to, what those documents contain. A retriever searches for the most relevant documents and passes that information to the LLM. Check out this blog for an example —
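As a rough illustration of the pattern, here is a minimal sketch in Python. The sample documents, the keyword-overlap retriever, and the llm_complete placeholder are assumptions made purely for this example; a real system would retrieve from a vector store and call an actual LLM API.

```python
# Minimal document QA sketch: retrieve the most relevant documents,
# then ask the LLM to answer strictly from that context.

DOCUMENTS = [
    "Refund policy: items can be returned within 30 days of purchase.",
    "Shipping: standard delivery takes 5 to 7 business days.",
    "Warranty: hardware is covered for one year from the date of sale.",
]

def llm_complete(prompt: str) -> str:
    # Placeholder for a real LLM API call.
    return f"[LLM answer based on a {len(prompt)}-character prompt]"

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query and keep the top k."""
    query_words = set(query.lower().split())
    ranked = sorted(
        docs,
        key=lambda d: len(query_words & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def answer(query: str) -> str:
    context = "\n".join(retrieve(query, DOCUMENTS))
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )
    return llm_complete(prompt)

print(answer("How long do I have to return an item?"))
```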



Conversational Agents

Using RAG, LLMs can be customised to product/service manuals, domain knowledge, guidelines, etc. The agent can also route users to more specialised agents depending on their query, as in the sketch below. SearchUnify has an LLM+RAG-powered conversational agent for its users.
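One simple way to picture the routing step: match the user’s query against keyword sets for each specialised agent and hand off accordingly. The agent names and keywords below are illustrative assumptions, not SearchUnify’s implementation.

```python
# Toy query router: picks a specialised agent by keyword overlap,
# falling back to a general agent when nothing matches.

AGENT_KEYWORDS = {
    "billing_agent": {"invoice", "refund", "payment", "charged"},
    "technical_agent": {"error", "crash", "install", "configure"},
}

def route(query: str) -> str:
    words = set(query.lower().split())
    best_agent, best_score = "general_agent", 0
    for agent, keywords in AGENT_KEYWORDS.items():
        score = len(words & keywords)
        if score > best_score:
            best_agent, best_score = agent, score
    return best_agent

print(route("I was charged twice on my last invoice"))  # billing_agent
print(route("What are your opening hours?"))            # general_agent
```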


Real-time Event Commentary

Imagine a live event such as a sports match or a breaking news story. A retriever can connect to real-time updates and data via APIs and pass this information to the LLM to create a virtual commentator. These can further be augmented with text-to-speech models. IBM leveraged the technology for commentary during the 2023 US Open.
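The flow might look roughly like this: poll a live-score API, then feed the latest event to the LLM as context for a commentary prompt (and optionally pass the result to a text-to-speech model). The endpoint and llm_complete below are hypothetical placeholders, not IBM’s actual US Open pipeline.

```python
# Sketch of real-time commentary: fetch live event data, then prompt an LLM with it.

import json
import urllib.request

def fetch_live_event(url: str) -> dict:
    """Pull the latest event data from a (hypothetical) live-score API."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)

def llm_complete(prompt: str) -> str:
    # Placeholder for a real LLM API call.
    return f"[commentary generated from: {prompt[:60]}...]"

def commentate(url: str) -> str:
    event = fetch_live_event(url)
    prompt = (
        "You are a live tennis commentator. Describe this moment in two "
        f"energetic sentences:\n{json.dumps(event)}"
    )
    return llm_complete(prompt)

# Example with a made-up endpoint:
# print(commentate("https://api.example.com/live/match/123"))
```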


Content Generation

The widest use of LLMs has probably been in content generation. Using RAG, the generation can be personalised to readers, incorporate real-time trends, and be contextually appropriate. Yarnit is an AI-based content marketing platform that uses RAG for multiple tasks.
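A simple way RAG supports this is by retrieving a reader profile and a few current trend snippets and injecting them into the generation prompt. The profile, trends, and llm_complete below are invented for illustration and do not describe how Yarnit works internally.

```python
# Sketch of personalised content generation: retrieved reader context and
# trend snippets are injected into the prompt before generation.

def llm_complete(prompt: str) -> str:
    # Placeholder for a real LLM API call.
    return f"[draft generated from a {len(prompt)}-character prompt]"

reader_profile = {"segment": "early-career data scientists", "tone": "practical, upbeat"}
trend_snippets = [
    "Retrieval augmented generation is trending in enterprise AI discussions.",
    "Short, example-driven posts are getting higher engagement this week.",
]

def generate_post(topic: str) -> str:
    prompt = (
        f"Write a short LinkedIn post about {topic}.\n"
        f"Audience: {reader_profile['segment']}. Tone: {reader_profile['tone']}.\n"
        "Work in these current trends where relevant:\n- "
        + "\n- ".join(trend_snippets)
    )
    return llm_complete(prompt)

print(generate_post("retrieval augmented generation"))
```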




Personalised Recommendations

Recommendation engines have been a game changer in the digital economy. LLMs are capable of powering the next evolution in content recommendations. Check out Aman’s blog on the utility of LLMs in recommendation systems.


Virtual Assistants

Virtual personal assistants like Siri, Alexa, and others are planning to use LLMs to enhance the user experience. Coupled with more context on user behaviour, these assistants can become highly personalised.




If you’re someone who follows Generative AI and Large Language Models, let’s connect on LinkedIn — https://www.linkedin.com/in/abhinav-kimothi/


 

Also, you can grab a free copy of my notes on Large Language Models — https://abhinavkimothi.gumroad.com/l/GenAILLM


RAG & Large Language Models

I write about Generative AI and Large Language Models. Please follow https://medium.com/@abhinavkimothi to read my other blogs


