Generative AI Consulting

Generative AI is revolutionizing the landscape of search and information - but to take advantage of these new techniques and build AI systems you need to ground them in truth and be able to effectively measure their quality.

Get the answer, not just more questions

Generative AI unleashes new ways to access information: instead of a list of results, your users can be taken directly to a single answer. Powered by vector databases and the new wave of Large Language Models (LLMs), Generative AI techniques such as Retrieval Augmented Generation (RAG) can be used to generate summaries, extract information from multiple sources and create & enhance content.

Taking AI from proof of concept to production can be challenging: there are many concepts to understand, technical choices to make, risks to identify and reduce, and processes to create and manage.

The importance of measurement

OSC believes that to succeed at Generative AI, you need to be able to measure quality at multiple points in the system. Developing effective Generative AI systems and mitigating known problems like hallucination requires a deep understanding of the technical options available, careful system design, high data quality and a great user experience. Our information retrieval expertise and practical experience of building AI-powered systems will help you build AI that works.


How we can help

Our generative AI consulting helps you understand how generative AI really works and how to apply AI to solve your business problems.

With our guidance and support you can take AI out of the lab and into production:

  • Audit your organisation for AI readiness
  • Choose models, vector databases and APIs to suit your needs and capabilities
  • Implement data pipelines and transformations to feed AI the high quality data it needs
  • Create AI Proofs of Concept
  • Create a quality measurement process for AI, both offline and online
  • Deploy models to production (MLOps)

What is Retrieval Augmented Generation?

It’s well known that LLMs like ChatGPT can hallucinate – create plausible-sounding content that isn’t truthful. That’s because they’re trained on public content, not the information your business depends on, and when they don’t have the right information they can make it up!

RAG lets you ground LLMs in truth – by using a vector index of your own, curated data, and then asking the LLM to use results from searching this index as the basis for its answers.

We’re already building systems like this for clients as part of our generative AI consulting practice.
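
To make the idea concrete, here’s a minimal sketch of a RAG loop in Python. It assumes an embedding model, a vector index and an LLM client are already in place – the model name, `vector_index.search` and `llm.complete` calls are illustrative placeholders, not a specific implementation.

```python
# Minimal RAG sketch: retrieve grounding documents from your own index,
# then ask the LLM to answer using only those documents.
# vector_index.search() and llm.complete() are assumed, illustrative APIs.
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model

def answer(question, vector_index, llm):
    # 1. Retrieve: embed the question and search the curated vector index
    query_vector = embedder.encode(question)
    documents = vector_index.search(query_vector, top_k=5)

    # 2. Augment: build a prompt that grounds the LLM in the retrieved text
    context = "\n\n".join(doc.text for doc in documents)
    prompt = (
        "Answer the question using only the documents below. "
        "If the answer is not in the documents, say you don't know.\n\n"
        f"Documents:\n{context}\n\nQuestion: {question}"
    )

    # 3. Generate: ask the LLM for an answer based on the retrieved context
    return llm.complete(prompt)
```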


Pragmatic AI-powered search

Hear from OSC’s Charlie Hull on the new landscape of search and AI – this talk won Best Paper at the British Computer Society’s Search Solutions 2023 conference!

What we’ll do

Get you AI ready

We can audit your organisation for AI readiness as part of our Proven Process for search & AI quality improvement and help you develop a roadmap to AI success, with clear recommendations and guidance.

AI promises to transform your business – but before you start you need to be sure you have the processes in place to supply the data AI needs and deploy new technologies effectively and fast.

Create a solid foundation of quality measurement

We can help you set up measurement systems using tools like Quepid to measure AI systems at multiple points – from the results of a vector search to the summaries produced by LLMs. With this foundation you will be able to continually verify the performance of your AI project.
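
As a simple illustration of offline measurement, you can score retrieval results against human relevance judgments of the kind a tool like Quepid helps you gather. The sketch below computes a basic precision@k; the search function is a placeholder for your own retrieval call.

```python
# Offline evaluation sketch: score search results against human judgments.
# judgments maps each test query to the set of document IDs judged relevant;
# search_fn is a placeholder for your own retrieval call.
def precision_at_k(retrieved_ids, relevant_ids, k=5):
    top_k = retrieved_ids[:k]
    return sum(1 for doc_id in top_k if doc_id in relevant_ids) / k

def evaluate(search_fn, judgments, k=5):
    scores = [
        precision_at_k(search_fn(query), relevant_ids, k)
        for query, relevant_ids in judgments.items()
    ]
    return sum(scores) / len(scores)  # mean precision@k over the test queries
```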

Choose & tune models

There are hundreds of thousands of language models available on sites like Hugging Face – but which is best for you? You don’t always need a large, expensive-to-run model – sometimes smaller models are enough. You can find models trained for particular languages, industry sectors or tasks, and fine-tune models on your own data for increased accuracy. Our experimental approach will help you balance performance and cost, and we can help you navigate a complex licensing landscape.
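
For example, before committing you might compare a small embedding model against a larger one on your own data. The sketch below loads two common Hugging Face models via the sentence-transformers library – the model names are examples, not recommendations, and you’d score each candidate with your own evaluation set.

```python
# Sketch: compare a small and a larger Hugging Face embedding model.
# Model names are examples only – pick candidates suited to your language,
# domain and budget, then score each one against your offline evaluation set.
from sentence_transformers import SentenceTransformer

small = SentenceTransformer("all-MiniLM-L6-v2")    # ~22M parameters, 384 dims
larger = SentenceTransformer("all-mpnet-base-v2")  # ~110M parameters, 768 dims

sentences = ["How do I reduce hallucination?", "Grounding LLMs with RAG"]
for name, model in [("small", small), ("larger", larger)]:
    embeddings = model.encode(sentences)
    print(name, embeddings.shape)  # (2, 384) for the small model, (2, 768) for the larger
```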

Depend on vector search

Some AI systems such as RAG incorporate vector databases – these provide source data, chat history and context. With our deep knowledge of information retrieval (IR) we can build powerful and scalable vector stores to power AI. Great AI is grounded in great search!
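
Under the hood, a vector search ranks documents by the similarity of their embeddings to the query embedding. Here’s a minimal in-memory sketch using cosine similarity – a production system would use a dedicated vector database or a search engine with vector support instead.

```python
# In-memory vector search sketch: rank documents by cosine similarity of
# their embeddings to the query embedding. A real deployment would use a
# vector database or a search engine with vector support.
import numpy as np

def cosine_search(query_vector, doc_vectors, top_k=5):
    # Normalise so that a dot product equals cosine similarity
    q = query_vector / np.linalg.norm(query_vector)
    d = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)
    scores = d @ q
    best = np.argsort(scores)[::-1][:top_k]
    return best, scores[best]  # indices and similarities of the top matches
```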

Engineer your prompts

Some LLMs can be asked to carry out tasks in plain language – we’ll help you discover prompt engineering and evolve the best prompts for your task, reducing errors and hallucinations:

“For each document check whether it is related to the question. Only use documents that are related to the question to answer it. Ignore documents that are not related to the question. If the answer exists in several documents, summarize them. Only answer based on the documents provided. Don’t make things up.”

A section of an example LLM prompt
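
In a RAG pipeline, an instruction like the one above is combined with the retrieved documents and the user’s question at query time. Here’s a minimal sketch of that assembly step – the exact wording and formatting are illustrative.

```python
# Sketch: combine a grounding instruction with the retrieved documents and
# the user's question to form the final prompt sent to the LLM.
INSTRUCTION = (
    "For each document check whether it is related to the question. "
    "Only use documents that are related to the question to answer it. "
    "Only answer based on the documents provided. Don't make things up."
)

def build_prompt(question, documents):
    numbered = "\n".join(f"[{i + 1}] {text}" for i, text in enumerate(documents))
    return f"{INSTRUCTION}\n\nDocuments:\n{numbered}\n\nQuestion: {question}"
```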

Build your AI team

We know how to structure a search or AI team for success – let us help you create and fill the roles you’ll need, develop effective processes and foster collaboration. As you build an effective team, let us fill the gaps.

Questions we are asked during Generative AI consulting:

  • What’s the business case for AI & LLMs?
  • How do I select & fine-tune AI models?
  • How do I build an effective AI team?
  • How can I reduce hallucination and ground LLMs in truth?

Generative AI can transform your business

Talk to us and find out how.
