Building Natural Language and LLM Pipelines: Build production-grade RAG, tool contracts, and context engineering with Haystack and LangGraph

Modern LLM applications often break in production due to brittle pipelines, loose tool definitions, and noisy context. This book shows you how to build production-ready, context-aware systems using Haystack and LangGraph. You’ll learn to design deterministic pipelines with strict tool contracts and deploy them as microservices. Through structured context engineering, you’ll orchestrate reliable agent workflows and move beyond simple prompt-based interactions.

You’ll start by understanding LLM behavior—tokens, embeddings, and transformer models—and see how prompt engineering has evolved into a full context engineering discipline. Then, you’ll build retrieval-augmented generation (RAG) pipelines with retrievers, rankers, and custom components using Haystack’s graph-based architecture. You’ll also create knowledge graphs, synthesize unstructured data, and evaluate system behavior using Ragas and Weights & Biases. In LangGraph, you’ll orchestrate agents with supervisor-worker patterns, typed state machines, retries, fallbacks, and safety guardrails.

By the end of the book, you’ll have the skills to design scalable, testable LLM pipelines and multi-agent systems that remain robust as the AI ecosystem evolves.
