Building Natural Language and LLM Pipelines: Build production-grade RAG, tool contracts, and context engineering with Haystack and LangGraph

Modern LLM applications often break in production due to brittle pipelines, loose tool definitions, and noisy context. This book shows you how to build production-ready, context-aware systems using Haystack and LangGraph. You’ll learn to design deterministic pipelines with strict tool contracts and deploy them as microservices. Through structured context engineering, you’ll orchestrate reliable agent workflows and move beyond simple prompt-based interactions.
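To make the idea of a "strict tool contract" concrete, here is a minimal stdlib-only sketch (not taken from the book, and not a specific library's API): the tool declares a typed input schema up front and rejects malformed calls, so free-form model output cannot leak invalid arguments into the pipeline. The names `SearchQuery` and `search_tool` are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SearchQuery:
    """Typed input contract for a hypothetical search tool."""
    query: str
    top_k: int = 5

    def __post_init__(self):
        # Enforce the contract at construction time, not deep in the pipeline.
        if not self.query.strip():
            raise ValueError("query must be non-empty")
        if not 1 <= self.top_k <= 50:
            raise ValueError("top_k must be between 1 and 50")

def search_tool(args: dict) -> list[str]:
    """Validate raw arguments against the contract before executing."""
    contract = SearchQuery(**args)  # raises ValueError on a contract violation
    # A real retriever call would go here; return placeholder hits instead.
    return [f"hit-{i}" for i in range(contract.top_k)]
```

A valid call returns exactly `top_k` results; an empty query or an out-of-range `top_k` fails fast with a `ValueError` instead of producing a silently wrong tool invocation.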

You'll start by understanding LLM behavior—tokens, embeddings, and transformer models—and see how prompt engineering has evolved into a full context engineering discipline. Then, you'll build retrieval-augmented generation (RAG) pipelines with retrievers, rankers, and custom components using Haystack’s graph-based architecture. You’ll also create knowledge graphs, synthesize unstructured data, and evaluate system behavior using Ragas and Weights & Biases. In LangGraph, you’ll orchestrate agents with supervisor-worker patterns, typed state machines, retries, fallbacks, and safety guardrails.

By the end of the book, you’ll have the skills to design scalable, testable LLM pipelines and multi-agent systems that remain robust as the AI ecosystem evolves.
