The problem of hallucinations, false information, and fabricated data is well known to searchers, and it discourages people from trusting search results when generative AI is involved. As Large Language Models (LLMs) proliferate, not only on the open web but also in enterprise search, the reliability of search results becomes especially critical. One potential way to mitigate hallucinations is Retrieval Augmented Generation (RAG), an AI framework that enhances the quality and relevance of generated text. Hear about the latest developments in this webinar.
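For readers new to the pattern, here is a minimal sketch of the RAG idea: retrieve supporting passages first, then ask the model to answer from that context rather than from memory alone. The toy corpus, keyword-overlap retriever, and call_llm stub are illustrative placeholders, not the technology stack demonstrated in the webinar.

```python
# Minimal RAG sketch: retrieve passages, then ground the LLM's answer in them.
# Corpus contents, scoring, and the call_llm stub are illustrative placeholders.

CORPUS = [
    "PoolParty builds knowledge graphs from enterprise taxonomies.",
    "Graft provides a platform for deploying AI on unstructured data.",
    "Retrieval Augmented Generation grounds LLM answers in retrieved text.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Rank passages by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        CORPUS,
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Augment the question with retrieved context so the model answers
    from the supplied passages instead of guessing."""
    context = "\n".join(f"- {p}" for p in passages)
    return (
        "Answer using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"
    )

def call_llm(prompt: str) -> str:
    """Placeholder for a call to a hosted or local generative model."""
    return "<model response grounded in the retrieved context>"

if __name__ == "__main__":
    question = "How does Retrieval Augmented Generation reduce hallucinations?"
    print(call_llm(build_prompt(question, retrieve(question))))
```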

In this webinar, you will learn from Graft and PoolParty about:

– Using AI to clean and enrich data for better knowledge retrieval
– Tackling inaccuracies and inconsistencies with RAG
– Combining Knowledge Graphs and generative AI (LLMs) for knowledge retrieval (see the sketch after this list)
– Leveraging generative AI to create and maintain a Knowledge Graph
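
As a rough illustration of the knowledge graph item above, the sketch below pulls facts about an entity from a tiny in-memory triple store and supplies them to a generative model as grounded context. The triples, entity names, and ask_llm stub are hypothetical and do not represent PoolParty's or Graft's products.

```python
# Illustrative pairing of a knowledge graph with an LLM: graph facts about the
# entity in the question become grounded context for the prompt.
# All data and the ask_llm stub are hypothetical placeholders.

TRIPLES = [
    ("KMWorld", "publishes", "KMWorld magazine"),
    ("PoolParty", "specializes_in", "knowledge graphs"),
    ("Graft", "offers", "an AI platform for unstructured data"),
]

def facts_about(entity: str) -> list[str]:
    """Return subject-predicate-object facts that mention the entity."""
    return [
        f"{s} {p.replace('_', ' ')} {o}"
        for s, p, o in TRIPLES
        if entity.lower() in (s.lower(), o.lower())
    ]

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to a generative model."""
    return "<answer grounded in the graph facts>"

question = "What does PoolParty specialize in?"
grounding = "\n".join(facts_about("PoolParty"))
prompt = f"Facts:\n{grounding}\n\nQuestion: {question}\nAnswer from the facts only."
print(ask_llm(prompt))
```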

Don't miss this live event on Wednesday, May 29, at 11:00 AM PT / 2:00 PM ET. Register now to attend the webinar, "Optimizing LLMs with RAG: Key Technologies and Best Practices."



SPEAKERS
Adam Oliner, CEO & Founder, Graft
Sebastian Gabler, Chief Customer Officer, PoolParty

MODERATOR
Marydee Ojala, Editor-in-Chief, KMWorld magazine