Building Natural Language and LLM Pipelines: Build production-grade RAG, tool contracts, and context engineering with Haystack and LangGraph

Modern LLM applications often break in production due to brittle pipelines, loose tool definitions, and noisy context. This book shows you how to build production-ready, context-aware systems using Haystack and LangGraph. You’ll learn to design deterministic pipelines with strict tool contracts and deploy them as microservices. Through structured context engineering, you’ll orchestrate reliable agent workflows and move beyond simple prompt-based interactions.

You'll start by understanding LLM behavior—tokens, embeddings, and transformer models—and see how prompt engineering has evolved into a full context engineering discipline. Then, you'll build retrieval-augmented generation (RAG) pipelines with retrievers, rankers, and custom components using Haystack’s graph-based architecture. You’ll also create knowledge graphs, synthesize unstructured data, and evaluate system behavior using Ragas and Weights & Biases. In LangGraph, you’ll orchestrate agents with supervisor-worker patterns, typed state machines, retries, fallbacks, and safety guardrails.

By the end of the book, you’ll have the skills to design scalable, testable LLM pipelines and multi-agent systems that remain robust as the AI ecosystem evolves.
