Over two days, this training gives you a solid foundation in applied Information Retrieval with Elasticsearch.
From the team that wrote Relevant Search, co-taught by Max Irwin and Bertrand Rigaldies. We teach you a practice for building smarter, more relevant search experiences with Elasticsearch, from measurement to TF*IDF to semantic search and learning to rank.
Day One - Managing, Measuring, and Testing Search Relevance
This day helps the class understand how working on relevance requires different thinking than other engineering problems. We teach you to measure search quality, take a hypothesis-driven approach to search projects, and safely 'fail fast' towards ever-improving business KPIs.
- What is search?
- Holding search accountable to the business
- Search quality feedback
- Hypothesis-driven relevance tuning
- User studies for search quality
- Using analytics & clicks to understand search quality
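As a taste of the measurement techniques covered on this day, search quality is often scored against relevance judgments with metrics like precision@k. A minimal sketch (the judgment data and cutoff here are made up purely for illustration):

```python
def precision_at_k(results, relevant, k=10):
    """Fraction of the top-k returned documents judged relevant."""
    top_k = results[:k]
    if not top_k:
        return 0.0
    return sum(1 for doc_id in top_k if doc_id in relevant) / len(top_k)

# Hypothetical judged query: documents 1, 3, and 7 are relevant.
results = [1, 2, 3, 4, 5]   # doc ids returned by the search engine
relevant = {1, 3, 7}        # doc ids a human rater marked relevant
print(precision_at_k(results, relevant, k=5))  # 2 of 5 -> 0.4
```

Tracking a metric like this per query before and after a change is what makes hypothesis-driven tuning possible.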
Day Two - Engineering Relevance with Elasticsearch
This day demonstrates relevance tuning techniques that actually work. Relevance can't be achieved by just tweaking field weights: boosting strategies, synonyms, and semantic search are discussed. The day closes with an introduction to machine learning for search (aka "Learning to Rank").
- Getting a feel for Elasticsearch
- Signal Modeling (data modeling for relevance)
- Dealing with multiple, competing objectives in search relevance
- Synonym strategies that actually work
- Taxonomy-based Semantic Search in Elasticsearch
- Introduction to Learning to Rank
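To give a flavor of the boosting discussed on this day, Elasticsearch's multi_match query can weight one field above another using the `field^boost` syntax. A minimal sketch (the index fields "title" and "body" and the boost value are hypothetical):

```python
import json

def boosted_query(terms, title_boost=10):
    """Build an Elasticsearch multi_match query body that weights
    matches on "title" more heavily than matches on "body"."""
    return {
        "query": {
            "multi_match": {
                "query": terms,
                # "title^10" scores title matches 10x relative to body.
                "fields": [f"title^{title_boost}", "body"],
            }
        }
    }

print(json.dumps(boosted_query("relevant search"), indent=2))
```

As the course stresses, a weight like `^10` is a hypothesis to be tested against measured search quality, not a setting to be guessed at.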
What You’ll Get Out Of It
- A practice for using Elasticsearch to improve relevance
- Incorporating user and analytics feedback into relevance tuning
- Measuring relevance, proving it has a business impact
- Combining different ranking signals
- Using taxonomies and synonyms to build semantic search
- Bringing machine learning to bear on search with Learning to Rank
Your Trainers: Experienced Relevance Experts
Bertrand Rigaldies and Max Irwin are experienced search relevance experts. Bertrand is a regular speaker at Haystack and other search events. He's led organizations on a number of projects using agile Test-Driven Relevance methodologies with Quepid, OpenSource Connections' search relevance tool bench. Max Irwin previously led Wolters Kluwer's search center of excellence, where he consulted with Wolters Kluwer's dozens of search teams to improve search quality. At OpenSource Connections, he is a regular speaker at Haystack and has consulted with Fortune 50 companies on search quality.
Style of Training: Small Group Workshop
Our trainers are not 'stock tech trainers' from central casting mindlessly reading slides. They expect to problem solve in real-time, and we want to hear your tough problems. As OpenSource Connections' mission is to 'empower search teams', we see training as the central component of our mission. Our training is 'workshop style', where much of the value is the interaction and knowledge sharing between the small class and the two trainers.
Who This Training is For
Some basic exposure to Elasticsearch is recommended, but not required. Even those with extensive Elasticsearch experience will get value from this training, as we teach Elasticsearch from the relevance perspective. Roles that would get value out of this training:
- Search engineers
- Data scientists
- Data engineers
- Machine learning engineers
- Relevance engineers
- Search product owners
Quotes From Past Attendees:
"'Think Like a Relevance Engineer' has helped me think differently about how I solve Solr & Elasticsearch relevance problems"
"What a positive experience! We have so many new ideas to implement after attending 'Think Like a Relevance Engineer' training."