Semedy’s KMS search capabilities are driven by its Semantic Reasoner, an application that allows users to retrieve not just keyword-based matches through lexical search, but also contextually relevant information that reflects the underlying relationships among the entities. “Users can construct complex queries that go beyond basic knowledge retrieval, addressing multiple properties and dependencies,” Lagor pointed out, “and the system can provide answers that reflect deeper insights.”
For example, a user who needs to retrieve PDF documents related to a particular medication could start with a lexical search to identify all the PDFs containing that medication’s name. A semantic search could then refine the results by identifying, via tagged properties, those PDFs that discuss the use of the medication in the context of a clinical trial.
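The two-step pattern can be pictured with a short illustrative sketch. This is not Semedy’s API; the documents, tags, and property names below are invented solely to show a lexical pass followed by a semantic refinement over tagged properties.

```python
# Illustrative only -- not Semedy's API. Step 1 is a plain keyword match;
# step 2 narrows the hits using tagged semantic properties.
documents = [
    {"id": "doc1", "text": "Metformin dosing guidelines (PDF)",
     "tags": {"type": "pdf", "context": "clinical_guideline"}},
    {"id": "doc2", "text": "Metformin arm of a phase III clinical trial (PDF)",
     "tags": {"type": "pdf", "context": "clinical_trial"}},
    {"id": "doc3", "text": "Insulin titration protocol (PDF)",
     "tags": {"type": "pdf", "context": "clinical_trial"}},
]

def lexical_search(docs, keyword):
    """Step 1: keyword match over the document text."""
    return [d for d in docs if keyword.lower() in d["text"].lower()]

def refine_by_property(docs, **properties):
    """Step 2: keep only documents whose tags match the requested properties."""
    return [d for d in docs
            if all(d["tags"].get(k) == v for k, v in properties.items())]

pdf_hits = lexical_search(documents, "metformin")                     # doc1, doc2
trial_hits = refine_by_property(pdf_hits, context="clinical_trial")  # doc2
print([d["id"] for d in trial_hits])
```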
The primary pain point for organizations is not retrieval per se, according to Lagor, but representing knowledge in a way that makes it retrievable. Users may need help managing different releases of taxonomies and ontologies, or managing metadata that is diverse and constantly changing. “A concern that some customers have is managing definitions that would be used to query real-time data,” noted Lagor. “For example, they might want to specify characteristics that could be used to identify patients who were eligible for a particular clinical trial.”
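The kind of reusable definition Lagor mentions can be thought of as a small piece of data stored in the knowledge base and later evaluated against live patient records. The sketch below is hypothetical; the criteria, field names, and thresholds are invented for illustration.

```python
# Hypothetical example of a stored eligibility definition evaluated against
# real-time patient data. Fields and thresholds are invented for illustration.
eligibility_definition = {
    "diagnosis": "type_2_diabetes",
    "min_age": 40,
    "max_hba1c": 9.0,
}

def is_eligible(patient, definition):
    """Evaluate one patient record against the stored definition."""
    return (patient["diagnosis"] == definition["diagnosis"]
            and patient["age"] >= definition["min_age"]
            and patient["hba1c"] <= definition["max_hba1c"])

patient = {"id": "p-17", "diagnosis": "type_2_diabetes", "age": 58, "hba1c": 7.8}
print(is_eligible(patient, eligibility_definition))  # True
```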
Having an integrated knowledge base has been a great benefit to Semedy’s customers. “All our customers had struggled with multiple knowledge silos that were not integrated,” Lagor said, “or taxonomies that were from different sources and were not mapped to one another.” After implementing Semedy’s KMS, the knowledge sources were in the same knowledge base and were connected through appropriate semantic relationships. “The KMS allows users to search a single source of truth with a powerful semantic search engine,” he concluded.
SearchBlox SearchAI is an integrated RAG and search platform that contains all of the components needed for developing both lexical and semantic search solutions. “We wanted an integrated solution for multiple reasons,” said Timo Selvaraj, co-founder and chief product officer for SearchBlox. “One was for ease of use, to have everything at hand that is needed. Another was security. When you have to send your data out to an external LLM endpoint such as OpenAI in order to get it processed, there is a security risk. Finally, there is cost. People are realizing that the use of LLMs for GenAI is producing some runaway costs; we are offering a predictable cost.”
Content is added to the knowledge base by selecting the file or folder and importing it into SearchBlox, which defines it as a “Collection.” The system can ingest 300-plus different types of data sources for Collections. As the data is imported, it is processed by an LLM, which allows the system to generate better document descriptions and titles. The platform then indexes the data. “SearchBlox can automatically create additional metadata using LLMs,” explained Selvaraj, “or can add tags defined by the user.”
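The ingestion flow can be pictured with a generic sketch. This is not SearchBlox’s actual API; the helper names and fields below are placeholders that simply show the stages described: importing files into a collection, enriching each record with LLM-generated metadata and user-defined tags, and handing the result to the index.

```python
# Generic illustration of the ingestion flow -- not SearchBlox's API.
from pathlib import Path

def generate_metadata(text):
    # Placeholder for an LLM call that returns a better title and description.
    first_line = (text.splitlines() or [""])[0]
    return {"title": first_line[:80], "description": text[:200]}

def ingest_folder(folder, collection, user_tags=None):
    """Import every text file in a folder into a 'collection', enriching each record."""
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(encoding="utf-8")
        record = {"source": str(path), "text": text}
        record.update(generate_metadata(text))   # metadata created by the LLM
        record.update(user_tags or {})           # tags defined by the user
        collection.append(record)                # stands in for indexing
    return collection

collection = ingest_folder("./docs", [], user_tags={"department": "cardiology"})
```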
When a search is conducted, the Admin search interface shows the lexical results on the left side of the screen and the semantic results on the right. The semantic search results are driven by the LLM. A sliding bar adjusts the weighting between the two to emphasize lexical versus semantic search, and a number associated with each title indicates the document’s ranking. “Searches can be done using just one or the other,” Selvaraj noted, “or sequentially, to first get a precise piece of information and then seek an answer from the semantic search that might not use a particular term but conveys the concept.”
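The effect of the sliding bar can be approximated as a weighted blend of two relevance scores. The sketch below is illustrative only, not SearchBlox’s scoring code; it assumes the lexical and semantic scores have already been computed and normalized to a 0–1 range.

```python
# Illustrative hybrid ranking: a slider weight blends lexical and semantic scores.
def blend(lexical_scores, semantic_scores, weight):
    """weight=0.0 -> purely lexical ranking; weight=1.0 -> purely semantic."""
    doc_ids = set(lexical_scores) | set(semantic_scores)
    combined = {
        doc: (1 - weight) * lexical_scores.get(doc, 0.0)
             + weight * semantic_scores.get(doc, 0.0)
        for doc in doc_ids
    }
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)

lexical = {"doc1": 0.92, "doc2": 0.40, "doc3": 0.10}
semantic = {"doc1": 0.55, "doc2": 0.88, "doc3": 0.70}
print(blend(lexical, semantic, weight=0.5))  # slider set to the middle
```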