// our training

Learning to Rank
[Solr OR Elasticsearch]

In this Learning to Rank training you will solve a ranking problem by integrating a machine learning system with your search engine. You will learn how to build a training set, train your model, and test it both online and offline.
The training covers either Apache Solr integration or Elasticsearch integration.

Would you like to schedule a training for your team?

Our trainings are designed for different audiences (business, developer, and everything in between) and are delivered individually, in small groups, or classroom-based.
We offer different levels of flexibility to make sure the training course is a perfect fit for your requirements.




// book now

£840

If you can’t participate, you can easily arrange a private training by contacting us!
If you are looking for different topics to train on, you can browse our full training schedule and find whatever suits you best!

// pay now

1x Learning to Rank – Training
Price: 840 GBP

By purchasing a ticket you accept our training’s Terms and Conditions.

Skills You Will Gain

• How to integrate machine learning with your search engine to tune your relevance function;
• How to gather user feedback and prepare your training set;
• The ranking model life-cycle (training and deployment);
• How to test your ranking models offline and online.


Prerequisites

• Basic understanding of Search Engines and Machine Learning.

Intended Audience

Software Engineers, Data Scientists, and Machine Learning enthusiasts.

// learning to rank

Our Trainer

Alessandro Benedetti


Alessandro Benedetti is the founder of Sease Ltd.
Alessandro has been involved since the beginning in the development of the Apache Solr Learning To Rank plugin, carried out by Bloomberg.
Over the years he has worked on various Learning To Rank projects and solutions, including contributing the interleaving capability to Apache Solr.

// learning to rank

Full Programme

Introduction to LTR

  • Offline Learning to Rank Techniques
    • Core Concepts
    • Algorithms
    • State of the art
  • Online Learning to Rank
    • Core Concepts
    • Algorithms
    • State of the art

How to Build your Training Set

  • Implicit Feedback
  • Explicit Feedback
  • Feature Engineering
    • Feature level
    • Feature type
    • Categorical Features
    • Missing values
  • Relevance Label Estimation
    • Click Modelling
  • Train/Test/Validation Split
  • Hands On Exercises
    • Categorical Encoding
    • Missing Values Count
    • From interactions to training set
    • Let’s split the training set
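To give a flavour of the exercises above, here is a minimal sketch (not the course material itself; the data and column names are invented) of one-hot encoding a categorical feature and splitting a toy training set:

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Toy interaction data: one row per (query, document) pair.
data = pd.DataFrame({
    "query_id": [1, 1, 2, 2, 3, 3],
    "doc_category": ["book", "dvd", "book", "cd", "dvd", "cd"],  # categorical
    "clicks": [3, 0, 1, 2, 0, 5],
})

# Categorical encoding: expand doc_category into one binary column per value.
encoded = pd.get_dummies(data, columns=["doc_category"])

# Train/test split (a plain 80/20 split here; in real LTR work the split is
# usually done per query, so a query never appears in both sets).
train, test = train_test_split(encoded, test_size=0.2, random_state=42)
print(encoded.columns.tolist())
```

In practice the split would be stratified by `query_id`, one of the pitfalls this part of the course addresses.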

How to Train your Model

  • Libraries Overview
    • Ranklib
    • XGBoost
  • Hands On Exercises
    • Let’s train a model using XGBoost

Evaluation and Explainability

  • Offline Model Evaluation
    • Metrics
    • Open Source Tools
  • Online Model Evaluation
    • A/B Testing
    • Interleaving
  • Explain your Model
    • Overview
    • Open Source Libraries
  • Hands On Exercises
    • Let’s explain a model using TreeSHAP

Open Source Search Engines Integration

  • Apache Solr Integration OR Elasticsearch Integration

    • Features Management
    • Ranking Models Management
    • How to rerank search results
    • Extract features from the results
    • Interleaving (Apache Solr only)
  • Hands On Exercises
    • Upload Features definition and Models
    • Run a re-ranking query
    • Interleave two models in the results
    • Extract features from the results
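For the Apache Solr flavour of these exercises, a re-ranking query looks roughly like the following sketch (the Solr URL, collection name `techproducts`, and model name `myModel` are placeholders; the request is built but not sent):

```python
import requests

params = {
    "q": "laptop",
    # Re-rank the top 100 results with a previously uploaded LTR model;
    # efi.* parameters feed query-time values into the feature definitions.
    "rq": "{!ltr model=myModel reRankDocs=100 efi.user_query=laptop}",
    # [features] adds the extracted feature values to each returned document.
    "fl": "id,score,[features]",
}

# Build (without sending) the request, to show the final query URL.
request = requests.Request(
    "GET", "http://localhost:8983/solr/techproducts/query", params=params
).prepare()
print(request.url)
```

The `rq` re-rank parser and the `[features]` transformer are the two Solr LTR building blocks exercised in this module; Elasticsearch exposes the equivalent through its LTR plugin's `sltr` query and logging specification.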

War Stories

The training has prepared me well to tackle my own project. It helped me to understand how to set up the project and which tools or algorithms I can use for it. The content of the training is quite compact, but not overloaded, so that there was also time for individual questions. I particularly liked the fact that Alessandro shared his experiences from older projects, which allowed him to point out potential problems.
Julia Silberberg

No public training available at the moment

    Feel free to contact us
