Unfortunately, you missed this class! But don't despair: check out our upcoming public trainings or inquire about private training for your team.

Over two days, this training gives you a solid foundation in applied Information Retrieval with Solr.

From the team that wrote Relevant Search, and co-taught by Eric Pugh and Dan Worley, this course teaches you a practice for building smarter, more relevant search experiences with Solr: from measurement, to TF*IDF, to semantic search and Learning to Rank.

Agenda

Day One - Managing, Measuring, and Testing Search Relevance

This day helps the class understand how working on relevance requires different thinking than other engineering problems. We teach you to measure search quality, take a hypothesis-driven approach to search projects, and safely ‘fail fast’ towards ever-improving business KPIs.

  • What is search?
  • Holding search accountable to the business
  • Search quality feedback
  • Hypothesis-driven relevance tuning
  • User studies for search quality
  • Using analytics & clicks to understand search quality
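
To give a small flavor of the measurement topics above, here is a minimal illustrative sketch (not course material) of scoring one query's results against relevance judgments; the document ids, grades, and result list are all made-up example data:

    # Hypothetical judgments: document id -> relevance grade (0 = bad, 2 = great)
    judgments = {"doc_1": 2, "doc_2": 0, "doc_3": 1, "doc_4": 0, "doc_5": 2}

    # Hypothetical ranked results returned by the search engine for one query
    results = ["doc_3", "doc_2", "doc_1", "doc_9", "doc_5"]

    def precision_at_k(result_ids, judgments, k=10):
        """Fraction of the top-k results judged relevant (grade > 0)."""
        top_k = result_ids[:k]
        relevant = sum(1 for doc_id in top_k if judgments.get(doc_id, 0) > 0)
        return relevant / k

    print(precision_at_k(results, judgments, k=5))  # 3 of the top 5 are relevant: 0.6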

Day Two - Engineering Relevance with Solr

This day demonstrates relevance tuning techniques that actually work. Relevance can’t be achieved just by tweaking field weights: we cover boosting strategies, synonyms, and semantic search. The day closes with an introduction to machine learning for search (aka “Learning to Rank”).

  • Getting a feel for Solr
  • Signal Modeling (data modeling for relevance)
  • Dealing with multiple, competing objectives in search relevance
  • Synonym strategies that actually work
  • Taxonomy-based Semantic Search in Solr
  • Introduction to Learning to Rank
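
For a taste of the Day Two material, here is a minimal sketch of combining field weights with a boost query via Solr's edismax parser. It assumes a local Solr instance with a hypothetical "products" collection and "title", "description", and "popularity" fields, plus the Python requests library:

    import requests

    params = {
        "q": "wireless headphones",
        "defType": "edismax",            # extended DisMax query parser
        "qf": "title^3 description",     # weight title matches above description matches
        "bq": "popularity:[50 TO *]^2",  # additive boost for popular products
        "fl": "id,title,score",
        "rows": 10,
    }

    resp = requests.get("http://localhost:8983/solr/products/select", params=params)
    for doc in resp.json()["response"]["docs"]:
        print(doc["score"], doc["title"])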

What You’ll Get Out Of It

  • A practice for using Solr to improve relevance
  • Incorporating user and analytics feedback into relevance tuning
  • Measuring relevance, proving it has a business impact
  • Combining different ranking signals
  • Using taxonomies and synonyms to build semantic search
  • Bringing machine learning resources to bear on Learning to Rank

Your Trainers: Experienced Relevance Experts

Eric Pugh and Dan Worley are experienced Solr relevance thought leaders. Eric has solved tough search problems at more than 100 organizations and has been involved in the Solr project since 2007. He wrote the first book on Apache Solr, “Solr Enterprise Search Server.” Dan Worley led OpenSource Connections’ efforts to help improve Wikipedia search relevance. During this time, he helped develop the Elasticsearch Learning to Rank plugin, which was inspired by much of the functionality in the Solr Learning to Rank plugin.

Style of Training: Small Group Workshop

Our trainers are not ‘stock tech trainers’ from central casting mindlessly reading slides. They expect to problem-solve in real time, and we want to hear your tough problems. As OpenSource Connections’ mission is to ‘empower search teams’, we see training as a central component of that mission. Our training is ‘workshop style’: much of the value comes from the interaction and knowledge sharing between the small class and the two trainers.

Who This Training is For

Some basic exposure to Solr is recommended, but not required. Even those with extensive Solr experience will get value from this training, as we teach Solr from the relevance perspective. Roles that would get value out of this training include:

  • Search engineers
  • Data scientists
  • Data engineers
  • Machine learning engineers
  • Relevance engineers
  • Search product owners

Quotes From Past Attendees:

“'Think Like a Relevance Engineer' has helped me think differently about how I solve Solr & Elasticsearch relevance problems.”

Matt Corkum, Disruptive Technology Director, Elsevier

“What a positive experience! We have so many new ideas to implement after attending 'Think Like a Relevance Engineer' training.”

Andrew Lee, Director of Engineering for Search, DHI

Want training for your team or town? Check this out!