// Our consulting

We offer a wide variety of consulting services.

Sease offers ad hoc consulting services on various topics:
Apache Lucene/Solr, Elasticsearch, OpenSearch, Vespa, Search Relevance tuning, AI and Machine Learning integration and log monitoring.

Apache Lucene/Solr

Solr is the popular, blazing-fast, open source enterprise search platform built on Apache Lucene.

Solr is highly reliable, scalable and fault tolerant, providing distributed indexing, replication and load-balanced querying, automated failover and recovery, centralized configuration and more. Solr powers the search and navigation features of many of the world’s largest internet sites.

The Sease team has specialised in Solr since 2010, often contributing support, ideas, bug fixes and new features to the project.
We can design and develop search applications for you, implement new features for your use cases and organize training classes for your employees.


Elasticsearch

Elasticsearch is the world’s leading free and open search and analytics solution.

Elasticsearch is a distributed, RESTful search and analytics engine capable of addressing a growing number of use cases. As the heart of the Elastic Stack, it centrally stores your data for lightning fast search, fine‑tuned relevancy, and powerful analytics that scale with ease.

The Sease team has specialised in Elasticsearch since early 2010, working on a variety of projects to build search solutions.
We can design and develop search applications for you, implement new features for your use cases and organize training classes for your employees.


OpenSearch

OpenSearch is a community-driven, open source search and analytics suite derived from Apache 2.0 licensed Elasticsearch 7.10.2 & Kibana 7.10.2.

Sease can help you easily ingest, secure, search, aggregate, view, and analyze data with a secure, high-quality search and analytics suite. This is useful in use cases such as application search, log analytics, and more.


Vespa

Vespa is a full-featured text search engine and supports both regular text search and fast approximate vector search (ANN).

Sease can help you:
– create high-performing search applications at any scale
– build recommendation, personalization and targeting features
– provide direct answers to questions
– implement semi-structured navigation

AI and Machine Learning Integration

Integrating Machine Learning and semantic technologies with your search application can give a strong boost to your search ecosystem.

Sease can help you integrate:
– Natural Language Processing techniques (including Named Entity Recognition and Linking)
– Deep Learning for Search (language modelling, BERT)
– Learning to Rank
– Document classification
– Collaborative filtering / content-based recommender engines.
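To make the last item concrete, here is a minimal, self-contained sketch of a content-based recommender using TF-IDF and cosine similarity. The catalogue data is invented for illustration; a production system would lean on the search engine's own analyzers and vectors.

```python
import math
from collections import Counter

def tf_idf_vectors(docs):
    """Build simple TF-IDF vectors for a list of tokenized documents."""
    n = len(docs)
    # document frequency of each term
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors (dicts)."""
    dot = sum(a[t] * b.get(t, 0.0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(liked_idx, docs, top_k=2):
    """Rank the other documents by similarity to the one the user liked."""
    vecs = tf_idf_vectors(docs)
    scores = [
        (i, cosine(vecs[liked_idx], v))
        for i, v in enumerate(vecs) if i != liked_idx
    ]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# Toy catalogue of tokenized product descriptions (hypothetical data).
catalogue = [
    "red leather running shoes".split(),
    "blue leather hiking boots".split(),
    "red running shorts".split(),
    "espresso coffee machine".split(),
]
# A user liked item 0; item 2 ranks first because it shares "red" and "running".
print(recommend(0, catalogue))
```

Collaborative filtering would instead score items from the behaviour of similar users; the ranking-by-similarity skeleton stays the same.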

Search Quality Evaluation

Evaluation is fundamental in every scientific domain. That's why we are actively working on many aspects of the topic and have contributed important milestones to the community, such as:

- The Interleaving module for Learning to Rank in Apache Solr
- Rated Ranking Evaluator, an open-source relevance testing library for Elasticsearch and Apache Solr
- Rated Ranking Evaluator Enterprise, a complete application that guides you from collecting user feedback and judgments to evaluating your system's quality in a few clicks.
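Offline evaluation with rated judgments, as in the Rated Ranking Evaluator, typically reduces to ranking metrics such as NDCG. A minimal sketch of the metric follows; the judgment lists are invented for illustration.

```python
import math

def dcg(relevances):
    """Discounted Cumulative Gain for a ranked list of relevance grades."""
    return sum(
        (2 ** rel - 1) / math.log2(rank + 2)  # rank is 0-based
        for rank, rel in enumerate(relevances)
    )

def ndcg(relevances, k=None):
    """NDCG: DCG normalised by the DCG of the ideal (sorted) ranking."""
    rels = relevances[:k] if k else relevances
    ideal = sorted(relevances, reverse=True)[:k] if k else sorted(relevances, reverse=True)
    ideal_dcg = dcg(ideal)
    return dcg(rels) / ideal_dcg if ideal_dcg else 0.0

# Human judgments (0 = irrelevant .. 3 = perfect) for the top results
# of two query variants (hypothetical data):
baseline = [3, 0, 1, 2]
tuned = [3, 2, 1, 0]
print(round(ndcg(baseline), 3), round(ndcg(tuned), 3))  # → 0.936 1.0
```

Comparing such scores across query variants is what turns relevance tuning from guesswork into measurement.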

Search Relevance Tuning

Search relevance is a hard topic and is strictly domain-dependent.

Sease can help you:
– model your data and text analysis
– design better queries to satisfy your requirements
– tune search components and features to target your objectives
– integrate advanced boosting techniques to improve the quality of results
– evaluate relevance improvements in a precise and user-friendly manner.
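As one example of query-time tuning, boosting can be expressed directly in the query. Below is a minimal sketch that builds an Elasticsearch-style `bool` query with field and phrase boosts; the field names and boost values are hypothetical and would need tuning per domain.

```python
def boosted_query(text, phrase_boost=2.0):
    """Build an Elasticsearch-style bool query that boosts title matches
    and exact phrase matches over plain body matches (field names are
    hypothetical)."""
    return {
        "query": {
            "bool": {
                "should": [
                    # title matches count three times as much as body ones
                    {"match": {"title": {"query": text, "boost": 3.0}}},
                    {"match": {"body": {"query": text}}},
                    # reward documents containing the exact phrase
                    {"match_phrase": {"body": {"query": text, "boost": phrase_boost}}},
                ],
                "minimum_should_match": 1,
            }
        }
    }

print(boosted_query("leather running shoes"))
```

The same idea maps to Solr's edismax `qf`/`pf` parameters; in either engine, the boost values are exactly what an evaluation loop should validate.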

Log monitoring/search

Properly managing the logs of your services can be crucial to quickly identify and solve issues or provide meaningful analytics.

Sease can help you model your log messages in machine-readable formats, then search and monitor them through powerful frameworks such as the ELK stack (Elasticsearch, Logstash, Kibana).
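As an illustration of the first step, turning a free-text log line into a machine-readable document can be as simple as a small parser. The log format and field names here are invented for the example.

```python
import json
import re

# Hypothetical application log format:
#   2024-05-01T12:03:44Z ERROR payment-service Timeout calling gateway
LOG_PATTERN = re.compile(
    r"(?P<timestamp>\S+)\s+(?P<level>[A-Z]+)\s+(?P<service>\S+)\s+(?P<message>.*)"
)

def to_machine_readable(line):
    """Turn a free-text log line into a JSON document that a pipeline
    such as Logstash could ship to Elasticsearch for search and dashboards."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return None  # unparseable lines can be routed to a dead-letter queue
    return json.dumps(match.groupdict())

print(to_machine_readable(
    "2024-05-01T12:03:44Z ERROR payment-service Timeout calling gateway"
))
```

Once every line carries structured fields like `level` and `service`, filtering, aggregating and alerting on them in Kibana becomes trivial.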

// Can't find what you are looking for?