
πŸ‘©πŸ»β€πŸ³ Haystack Cookbook

[Haystack logo: a stylized white 'H' on green with the text "Haystack, by deepset. Haystack 2.0 is live 🎉", over abstract green and yellow diagrams.]

A collection of example notebooks using Haystack 👇

Use these examples as guides to working with different model providers, vector databases, retrieval techniques, and more in Haystack. Most of them showcase a specific, small demo.

To learn more about how to use Haystack, please visit our Docs and official Tutorials.

For more examples, you may also find our Blog useful.

Note: Unless '(Haystack 1.x)' appears in the title, all of these examples use Haystack 2.0 or later.
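The notebooks differ in providers and techniques, but most build on the same Haystack 2.x pattern: components wired into a Pipeline. Below is a minimal RAG sketch of that pattern, assuming `haystack-ai` is installed and `OPENAI_API_KEY` is set; the in-memory store, BM25 retriever, and OpenAI model here are illustrative choices, not what every notebook uses.

```python
# Minimal Haystack 2.x RAG sketch: retriever -> prompt builder -> generator.
# Assumes `pip install haystack-ai` and OPENAI_API_KEY in the environment.
from haystack import Document, Pipeline
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.components.builders import PromptBuilder
from haystack.components.generators import OpenAIGenerator

# Index a couple of toy documents in an in-memory store.
document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="Haystack is an open-source framework for building LLM applications."),
    Document(content="Haystack 2.x pipelines are built from small, composable components."),
])

prompt_template = """
Answer the question using the context below.

Context:
{% for doc in documents %}
{{ doc.content }}
{% endfor %}

Question: {{ question }}
Answer:
"""

# Wire the components into a pipeline.
rag = Pipeline()
rag.add_component("retriever", InMemoryBM25Retriever(document_store=document_store))
rag.add_component("prompt_builder", PromptBuilder(template=prompt_template))
rag.add_component("llm", OpenAIGenerator(model="gpt-4o-mini"))  # model choice is illustrative
rag.connect("retriever.documents", "prompt_builder.documents")
rag.connect("prompt_builder.prompt", "llm.prompt")

question = "What is Haystack?"
result = rag.run({"retriever": {"query": question},
                  "prompt_builder": {"question": question}})
print(result["llm"]["replies"][0])
```

Each cookbook notebook swaps parts of this skeleton, for example a different document store, retriever, or generator, and adds the provider-specific setup.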

| Name | Colab |
| --- | --- |
| Speaker Diarization with AssemblyAI | Open In Colab |
| Advanced Prompt Customization for Anthropic | Open In Colab |
| Techcrunch News Digest with Local LLMs using TitanML Takeoff | Open In Colab |
| Use Gemini Models with Vertex AI | Open In Colab |
| Gradient AI Embedders and Generators for RAG | Open In Colab |
| Mixtral 8x7B with Hugging Face TGI for Web QA | Open In Colab |
| Amazon Bedrock and OpenSearch for PDF QA | Open In Colab |
| Use Zephyr 7B Beta with Hugging Face for RAG | Open In Colab |
| Hacker News RAG with Custom Component | Open In Colab |
| Use Chroma for RAG and Indexing | Open In Colab |
| Using the Jina-embeddings-v2-base-en model in a Haystack RAG pipeline for legal document analysis | Open In Colab |
| Multilingual RAG from a podcast with Whisper, Qdrant and Mistral | Open In Colab |
| Improve retrieval by embedding meaningful metadata | Open In Colab |
| Information extraction via LLMs (Gorilla OpenFunctions) | Open In Colab |
| Information extraction via LLMs (NexusRaven) | Open In Colab |
| Using AstraDB as a data store in your Haystack pipelines | Open In Colab |
| Streaming model explorer: compare how different models handle the same prompt | Open In Colab |
| Function Calling with OpenAIChatGenerator | Open In Colab |
| Use the vLLM inference engine in Haystack 2.x | Open In Colab |
| Build with Google Gemma: chat and RAG | Open In Colab |
| Optimizing Retrieval with HyDE | Open In Colab |
| RAG pipeline using FastEmbed for embeddings generation | Open In Colab |
| Evaluate a RAG pipeline using the Haystack-UpTrain integration | Open In Colab |
| RAG on the Oscars using Llama 3 models | Open In Colab |
| Chatting with SQL Databases | Open In Colab |
| Evaluate a RAG pipeline using the DeepEval integration | Open In Colab |
| Evaluate a RAG pipeline using the Ragas integration | Open In Colab |
| Sparse Embedding Retrieval with Qdrant and FastEmbed | Open In Colab |
| Extract Metadata Filters from a Query | Open In Colab |
| Run tasks concurrently within a custom component | Open In Colab |
| Cohere for Multilingual QA (Haystack 1.x) | Open In Colab |
| GPT-4 and Weaviate for Custom Documentation QA (Haystack 1.x) | Open In Colab |
| Whisper Transcriber and Weaviate for YouTube video QA (Haystack 1.x) | Open In Colab |

How to Contribute to this repository

If you have an example that uses Haystack, you can add it to this repository by creating a PR. You can also create a PR directly from Colab: fork this repository, select "Save a Copy to GitHub", add your example to your fork, and then open a PR against this repository.

  1. Add your notebook.
  2. Give your file a descriptive name that includes (where applicable) the model providers, databases, and other technologies used in your example, and/or the task the example completes.
  3. Add a row for it in the table above 🎉
