
Semantic Search: A Deeper Meaning

At the heart of semantic search is a vector database that uses data points in multidimensional space to indicate how close a word or concept is to the search term or question. SearchBlox creates the vector database, generating the vector numbers as it imports each file. The numbers assigned to the words depend on the LLM being used. Transforming words into vectors is referred to as “word embedding,” or simply “embedding,” and is a process used by all LLMs.
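The idea of closeness in multidimensional space can be sketched with cosine similarity, the standard way to compare embedding vectors. The vectors below are hypothetical 4-dimensional examples; real embedding models produce hundreds or thousands of dimensions, and this is not SearchBlox's actual scoring code.

```python
import math

def cosine_similarity(a, b):
    # Measures how closely two embedding vectors point in the same
    # direction: 1.0 means identical direction, 0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for a query and two documents.
query = [0.8, 0.1, 0.3, 0.5]
doc_close = [0.7, 0.2, 0.4, 0.5]   # semantically similar content
doc_far = [-0.6, 0.9, -0.2, 0.1]   # unrelated content

print(cosine_similarity(query, doc_close) > cosine_similarity(query, doc_far))
```

A semantic search engine ranks documents by this kind of similarity score rather than by exact keyword matches, which is what lets it surface conceptually related results.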

The parameters of an LLM reflect patterns of word usage in the content that has been ingested and determine the behavior of the AI model. An LLM has many parameters, but bigger isn’t always better; there is a cost in both dollars and performance. The parameters reflect the weighting and strength of connections across the vectors. A token, the unit of information the LLM analyzes, is a string of characters, a word, or sometimes several words. After considerable testing of various LLMs, SearchBlox selected Llama as the best model for its platform. Within SearchBlox, usage of the LLM is a fixed and predictable cost because it is an integral part of the platform.
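How text breaks into tokens can be illustrated with a toy greedy longest-match tokenizer. The tiny vocabulary here is invented for the example; production tokenizers (such as the BPE variants used by Llama-family models) learn vocabularies of tens of thousands of entries from data.

```python
def tokenize(text, vocab):
    # Greedy longest-match: repeatedly take the longest vocabulary entry
    # that prefixes the remaining text; unknown characters become <unk>.
    tokens = []
    i = 0
    while i < len(text):
        match = None
        for j in range(len(text), i, -1):
            if text[i:j] in vocab:
                match = text[i:j]
                break
        if match is None:
            tokens.append("<unk>")
            i += 1
        else:
            tokens.append(match)
            i += len(match)
    return tokens

# Hypothetical vocabulary entries: whole words, subwords, and whitespace.
vocab = {"semantic", "search", "sem", "antic", " ", "ing"}
print(tokenize("semantic searching", vocab))
```

Note how "searching" splits into "search" + "ing": a word absent from the vocabulary is still representable as smaller known pieces, which is why tokens can be characters, words, or multi-character fragments.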

SearchBlox also includes retrieval-augmented generation (RAG) capability. RAG incorporates enterprise content from outside the LLM to produce a model that is more tailored to the organization. It might include technical documents that were unlikely to have been part of the LLM’s training data, for example, which improves the accuracy of results. In addition, RAG provides citations that let users review the source material on which a response was based. “We were seeing people struggling with RAG and wanted to make it a part of our platform to increase ease of use,” continued Selvaraj. “It can be implemented with just one click.” Also built into the platform and easily deployed are chatbots and agents, which rely on the same semantic search that users do in order to bring accurate information to customers.
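The retrieve-then-cite pattern behind RAG can be sketched as follows. This is an illustrative minimum, not SearchBlox's implementation: the word-overlap `score` function stands in for real vector similarity, and the document IDs and texts are invented.

```python
def score(query_words, doc_text):
    # Stand-in for vector similarity: fraction of query words in the document.
    words = set(doc_text.lower().split())
    return sum(w in words for w in query_words) / len(query_words)

def build_prompt(query, documents, top_k=2):
    # Rank enterprise documents, keep the top_k, and assemble a prompt
    # that grounds the LLM in those sources and preserves citations.
    q_words = [w.lower() for w in query.split()]
    ranked = sorted(documents, key=lambda d: score(q_words, d["text"]),
                    reverse=True)
    context = ranked[:top_k]
    sources = "\n".join(f"[{d['id']}] {d['text']}" for d in context)
    prompt = (f"Answer using only the sources below; cite them by id.\n"
              f"Sources:\n{sources}\nQuestion: {query}")
    return prompt, [d["id"] for d in context]

docs = [
    {"id": "policy-7", "text": "Refund requests are approved within 14 days"},
    {"id": "spec-3", "text": "The valve assembly tolerates 40 bar of pressure"},
    {"id": "faq-1", "text": "Refund forms are on the customer portal"},
]
prompt, citations = build_prompt("refund approval days", docs)
print(citations)
```

Returning the citation IDs alongside the prompt is what lets the interface link each answer back to its source material for review.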

Semantic Support for Customer Service

Grazitti Interactive is a digital services provider that operates in four major areas: AI, online communities, marketing services, and CRM. After having built more than 100 online communities, knowledge bases, and partner portals, the company observed that many enterprises with multiple disparate content sources were struggling to put relevant content in the hands of the end user.

This awareness led to the evolution of SearchUnify Cognitive Search, which is now part of an enterprise agentic platform for self-service and customer support. Agents and customers alike need accurate and contextual information that may be scattered across numerous data sources, including technical documents, company policies, and financial databases. “Robust LLM integration across our suite of products, coupled with federated retrieval-augmented generation architecture,” said Taranjeet Singh, head of AI for SearchUnify, “provides fine-tuned, contextual, and intent-driven conversational experiences at scale.”

Grazitti Interactive advocates a bring-your-own (BYO) approach to LLMs; it supports major LLMs and benchmarks them against global standards. The recommendation for the optimal LLM is based on specific needs, such as summarization or conversational tasks, considering both cost and performance. “This approach ensures we provide the best solution tailored to each customer’s requirements,” noted Singh. “We support vector embeddings with a variety of models and algorithms such as approximate nearest neighbors (ANN) and re-ranker techniques, enabling hybrid search and other advanced techniques to fully utilize vector search capabilities.”
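Hybrid search, mentioned above, blends lexical and semantic evidence so a document strong on either signal ranks well. The sketch below assumes a simple weighted-sum blend with precomputed vector similarities; it is a generic illustration of the technique, not SearchUnify's actual ranking algorithm.

```python
def keyword_score(query, text):
    # Lexical signal: fraction of query words that appear in the document.
    q = set(query.lower().split())
    t = set(text.lower().split())
    return len(q & t) / len(q)

def hybrid_rank(query, docs, alpha=0.5):
    # alpha weights keyword vs. vector evidence; tune per workload.
    scored = []
    for doc in docs:
        kw = keyword_score(query, doc["text"])
        blended = alpha * kw + (1 - alpha) * doc["vector_sim"]
        scored.append((blended, doc["id"]))
    return [doc_id for _, doc_id in sorted(scored, reverse=True)]

# Invented documents; "vector_sim" stands in for an ANN lookup result.
docs = [
    {"id": "a", "text": "reset your account password", "vector_sim": 0.40},
    {"id": "b", "text": "credentials recovery guide", "vector_sim": 0.90},
    {"id": "c", "text": "billing and invoices", "vector_sim": 0.10},
]
print(hybrid_rank("password reset", docs))
```

Document "b" shares no keywords with the query yet still ranks highly on semantic similarity alone, which is exactly the failure mode of pure keyword search that the hybrid blend addresses; a re-ranker model would then reorder the short list further.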
