
Tutorials

New to LangChain or LLM app development in general? Read this material to get up and running quickly and build your first applications.

If you're looking to get up and running quickly with chat models, vector stores, or other LangChain components from a specific provider, check out our supported integrations.

Refer to the how-to guides for more detail on using common LangChain components.

See the conceptual documentation for high level explanations of all LangChain concepts.

Basics

  • LLM applications: Build and deploy a simple LLM application (see the sketch after this list).
  • Chatbots: Build a chatbot that incorporates memory.
  • Vector stores: Build vector stores and use them to retrieve data.
  • Agents: Build an agent that interacts with external tools.
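
To give a feel for what the first tutorial covers, here is a minimal sketch that calls a chat model directly. It assumes the langchain-openai package is installed and that OPENAI_API_KEY is set; the model name and prompt are placeholders, not part of the tutorials themselves.

```python
# A minimal chat model call, assuming the langchain-openai package is
# installed and OPENAI_API_KEY is set. Model name and prompt are placeholders.
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4o-mini")

messages = [
    SystemMessage(content="Translate the following from English into Italian."),
    HumanMessage(content="Hello, world!"),
]

response = model.invoke(messages)  # returns an AIMessage
print(response.content)
```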

Working with external knowledge

Specialized tasks

LangGraph

LangGraph is an extension of LangChain aimed at building robust and stateful multi-actor applications with LLMs by modeling steps as nodes and the transitions between them as edges in a graph.

LangGraph documentation is currently hosted on a separate site. You can peruse LangGraph tutorials here.
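
As a rough illustration of that nodes-and-edges model (not a substitute for the LangGraph tutorials), the sketch below wires a single node into a graph. It assumes the langgraph package is installed; the state schema and node name are illustrative.

```python
# A minimal single-node graph, assuming the langgraph package is installed.
# The State schema and the node name are illustrative.
from typing import TypedDict

from langgraph.graph import StateGraph, START, END

class State(TypedDict):
    text: str

def shout(state: State) -> dict:
    # Each node receives the current state and returns a partial update.
    return {"text": state["text"].upper()}

builder = StateGraph(State)
builder.add_node("shout", shout)
builder.add_edge(START, "shout")  # entry edge
builder.add_edge("shout", END)    # exit edge

graph = builder.compile()
print(graph.invoke({"text": "hello"}))  # {'text': 'HELLO'}
```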

LangSmith

LangSmith allows you to closely trace, monitor and evaluate your LLM application. It seamlessly integrates with LangChain, and you can use it to inspect and debug individual steps of your chains as you build.

LangSmith documentation is hosted on a separate site. You can peruse LangSmith tutorials here.
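
As a rough sketch of how tracing is typically switched on (check the LangSmith docs for the current setup), the snippet below sets the tracing environment variables before running any LangChain code; the project name and API key are placeholders.

```python
# A minimal sketch of enabling LangSmith tracing via environment variables,
# assuming you have a LangSmith API key. Project name and key are placeholders.
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "<your-langsmith-api-key>"
os.environ["LANGCHAIN_PROJECT"] = "my-first-project"  # optional: group runs by project

# Any LangChain code executed after this point is traced to LangSmith,
# so individual chain and model steps can be inspected in the UI.
```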

Evaluation

LangSmith helps you evaluate the performance of your LLM applications. The evaluation tutorial in the LangSmith documentation is a great way to get started.

More

For more tutorials, see our cookbook section.

