Over two days, this course gives you a solid foundation in applied Information Retrieval with Solr.
From the team that wrote Relevant Search, and taught by practising relevance consultants, this course teaches you a practice for building smarter, more relevant search experiences with Solr, from measurement to TF*IDF to semantic search and Learning to Rank.
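As a taste of where the course starts, here is a minimal sketch of classic TF*IDF scoring. This is illustrative only: the toy corpus and the helper function are invented for this example, and Solr's default similarity is in fact BM25, a refinement of this idea.

```python
import math

# Toy corpus standing in for an index (invented for illustration).
docs = [
    "solr relevance tuning",
    "solr boosting strategies",
    "semantic search with taxonomies",
]

def tf_idf(term, doc, corpus):
    """Score one document for one term: term frequency times
    inverse document frequency over the corpus."""
    tf = doc.split().count(term)                      # raw term frequency
    df = sum(1 for d in corpus if term in d.split())  # document frequency
    idf = math.log(len(corpus) / df) if df else 0.0   # rarer terms weigh more
    return tf * idf

# Documents mentioning "solr" score above those that do not.
scores = [tf_idf("solr", d, docs) for d in docs]
```

The key intuition, which the course builds on, is that a term's weight depends not just on how often it appears in a document but on how rare it is across the whole collection.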
Day One - Managing, Measuring, and Testing Search Relevance
This day helps the class understand how working on relevance requires different thinking than other engineering problems. We teach you to measure search quality, take a hypothesis-driven approach to search projects, and safely ‘fail fast’ towards ever-improving business KPIs.
- What is search?
- Holding search accountable to the business
- Search quality feedback
- Hypothesis-driven relevance tuning
- User studies for search quality
- Using analytics & clicks to understand search quality
Day Two - Engineering Relevance with Solr
This day demonstrates relevance tuning techniques that actually work. Relevance can’t be achieved by just tweaking field weights: boosting strategies, synonyms, and semantic search are discussed. The day closes with an introduction to machine learning for search (aka “Learning to Rank”).
- Getting a feel for Solr
- Signal Modeling (data modeling for relevance)
- Dealing with multiple, competing objectives in search relevance
- Synonym strategies that actually work
- Taxonomy-based Semantic Search in Solr
- Introduction to Learning to Rank
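To give a flavor of the Day Two material, here is a toy sketch of query-time synonym expansion, one of the synonym strategies the day covers. The synonym map and function are invented for illustration; in a real deployment this would live in Solr's synonym filter or a curated taxonomy.

```python
# Hypothetical synonym map (a real system would load this from
# Solr's managed synonyms or a taxonomy, not hard-code it).
SYNONYMS = {
    "tv": ["television"],
    "couch": ["sofa", "settee"],
}

def expand_query(query):
    """Expand each query term with its synonyms before searching."""
    terms = []
    for term in query.lower().split():
        terms.append(term)                    # keep the original term
        terms.extend(SYNONYMS.get(term, []))  # add any known synonyms
    return terms

expanded = expand_query("TV couch")
```

Expanding at query time rather than index time is one of several trade-offs discussed in the synonym strategies session.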
What You’ll Get Out Of It
- A practice for using Solr to improve relevance
- Incorporating user and analytics feedback into relevance tuning
- Measuring relevance, proving it has a business impact
- Combining different ranking signals
- Using taxonomies and synonyms to build semantic search
- Bringing machine learning resources to bear on Learning to Rank
Your Trainers: Experienced Relevance Experts
Our trainers are experienced search relevance experts and regular speakers at Haystack and other search events. They lead organizations on a number of projects using agile Test-Driven Relevance methodologies with Quepid, OpenSource Connections’ search relevance tool bench, and have consulted with Fortune 50 companies on search quality.
Style of Training: Small Group Workshop
Our trainers are not ‘stock tech trainers’ from central casting mindlessly reading slides. They expect to problem-solve in real time, and we want to hear your tough problems. As OpenSource Connections’ mission is to ‘empower search teams’, we see training as the central component of our mission. Our training is ‘workshop style’, where much of the value comes from the interactions and knowledge sharing between the small class and the two trainers.
Who This Training is For
Some basic exposure to Solr is recommended, but not required. Even those with extensive Solr experience will get value from this training, as we teach Solr from the relevance perspective. Roles that would get value out of this training:
- Search engineers
- Data scientists
- Data engineers
- Machine learning engineers
- Relevance engineers
- Search product owners
Quotes From Past Attendees:
"'Think Like a Relevance Engineer' has helped me think differently about how I solve Solr & Elasticsearch relevance problems"
"What a positive experience! We have so many new ideas to implement after attending 'Think Like a Relevance Engineer' training."