Semantic Legal Search Platform

3 March 2026

Tags: Semantic search · RAG · On-Premise · LLM · Legal · AI · Elasticsearch

“OSMOZ-IT & MATICOM were able to understand our confidentiality constraints and offer us a modern AI solution without compromising on the security of our data. Hybrid search will transform the way we work.”

Context

The challenge: FBT, a Swiss law firm, faced a growing volume of documents (case law, legal doctrine, legal texts, contracts) that made keyword search increasingly time-consuming. Identified needs:

  • Intelligent search with specialized legal vocabulary
  • Absolute confidentiality of client data (on-premise hosting mandatory)
  • Traceability of sources cited for each result
Solutions

We designed and deployed a 100% on-premise RAG (Retrieval-Augmented Generation) platform enabling semantic search across the firm's entire document corpus. Architecture implemented:

  • Vector search engine: Elasticsearch with 768-dimension embeddings
  • Local LLM: Ollama (Mistral) for contextualized response generation
  • Embedding model: nomic-embed-text, optimized for French legal text
  • Secure API: Spring Boot with OAuth2/Keycloak authentication
  • Dedicated infrastructure: High-performance server (AMD Ryzen 9, 30 GB RAM allocated)
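As a sketch of how the hybrid leg of such an architecture can be wired, the helper below builds an Elasticsearch request body combining a BM25 `match` query with a `knn` clause over a 768-dimension dense-vector field. The field names (`content`, `embedding`, `title`, `source`) are illustrative assumptions, not the platform's actual index mapping.

```python
def build_hybrid_query(question: str, query_vector: list[float], k: int = 5) -> dict:
    """Assemble an Elasticsearch request body mixing lexical (BM25) and
    semantic (kNN over a 768-dim dense_vector) retrieval.

    Field names here are illustrative, not the real index mapping.
    """
    if len(query_vector) != 768:
        raise ValueError("expected a 768-dimension embedding")
    return {
        # Lexical leg: classic keyword relevance (BM25).
        "query": {"match": {"content": {"query": question}}},
        # Semantic leg: nearest neighbours in embedding space.
        "knn": {
            "field": "embedding",
            "query_vector": query_vector,
            "k": k,
            "num_candidates": 10 * k,
        },
        # Keep the source reference so every hit can be cited.
        "_source": ["title", "content", "source"],
        "size": k,
    }
```

Elasticsearch scores both legs and merges the results; a body like this would be sent with `POST /<index>/_search`.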

Key features:

  • Search by semantic similarity (not just keywords)
  • Automatic indexing of new documents
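To illustrate the source-traceability requirement, a minimal sketch: number each retrieved excerpt so the local LLM can cite it as [n] in its answer. The function and variable names are hypothetical; only Ollama's `/api/generate` endpoint in the comment is the library's real route.

```python
def build_rag_prompt(question: str, chunks: list[tuple[str, str]]) -> str:
    """chunks: (source reference, excerpt text) pairs from the search step.

    Numbering the excerpts lets the model cite [1], [2], ... so each
    statement in the generated answer stays traceable to a document.
    """
    context = "\n".join(
        f"[{i}] ({source}) {text}"
        for i, (source, text) in enumerate(chunks, start=1)
    )
    return (
        "Answer the question using only the numbered excerpts below, "
        "and cite them as [n].\n\n"
        f"{context}\n\nQuestion: {question}\nAnswer:"
    )

# The prompt would then go to the local Ollama instance, e.g.:
# requests.post("http://localhost:11434/api/generate",
#               json={"model": "mistral",
#                     "prompt": build_rag_prompt(question, hits),
#                     "stream": False})
```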
Results

  • Response time: < 10 seconds
  • Confidentiality: 100% on-premise, no data outsourced
  • Relevance: hybrid semantic and lexical (keyword) search

CONTACT US

Do you have a similar project?

Would you like to leverage AI to get more value from your documents while keeping full control of your data?
