This page introduces how to build LLM-powered applications using LangChain. The overviews on this page link to procedure guides on GitHub.
What is LangChain?
LangChain is an LLM orchestration framework that helps developers build generative AI applications or retrieval-augmented generation (RAG) workflows. It provides the structure, tools, and components to streamline complex LLM workflows.
For more information about LangChain, see the Google LangChain page. For more information about the LangChain framework, see the LangChain product documentation.
LangChain components for Cloud SQL for PostgreSQL
Cloud SQL for PostgreSQL offers the following LangChain interfaces:
- Vector store
- Document loader
- Chat message history
Learn how to use LangChain with the LangChain Quickstart for Cloud SQL for PostgreSQL.
Vector store for Cloud SQL for PostgreSQL
Vector store retrieves and stores documents and metadata from a vector database. Vector store gives an application the ability to perform semantic searches that interpret the meaning of a user query. This type of search is called a vector search, and it can find topics that match the query conceptually. At query time, vector store retrieves the embedding vectors that are most similar to the embedding of the search request. In LangChain, a vector store takes care of storing embedded data and performing the vector search for you.
To work with vector store in Cloud SQL for PostgreSQL, use the PostgresVectorStore class.
For more information, see the LangChain Vector Stores product documentation.
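As a rough end-to-end sketch of how the PostgresVectorStore class is typically used with the langchain-google-cloud-sql-pg package: the project, instance, table, and embedding model names below are placeholders, and constructor details such as create_sync and embedding_service can vary by package version, so treat the linked guide as the authoritative reference.

```python
from langchain_google_cloud_sql_pg import PostgresEngine, PostgresVectorStore
from langchain_google_vertexai import VertexAIEmbeddings

# Connect to a Cloud SQL for PostgreSQL instance (placeholder values).
engine = PostgresEngine.from_instance(
    project_id="my-project",
    region="us-central1",
    instance="my-instance",
    database="my-database",
)

# Create a table sized for the embedding model's output dimension.
engine.init_vectorstore_table(table_name="my_vectors", vector_size=768)

embeddings = VertexAIEmbeddings(model_name="textembedding-gecko@003")

# Initialize the vector store, add some text, and run a vector search.
store = PostgresVectorStore.create_sync(
    engine=engine,
    table_name="my_vectors",
    embedding_service=embeddings,
)
store.add_texts(["Example text to embed."], metadatas=[{"source": "docs"}])
results = store.similarity_search("Find text about examples", k=3)
```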
Vector store procedure guide
The Cloud SQL for PostgreSQL guide for vector store shows you how to do the following:
- Install the integration package and LangChain
- Create a PostgresEngine object and configure a connection pool to your Cloud SQL for PostgreSQL database
- Initialize a table
- Create an embedding object using VertexAIEmbeddings
- Initialize a default PostgresVectorStore
- Add texts
- Delete texts
- Search for documents
- Search for documents by vector
- Add an index to accelerate vector search queries
- Re-index
- Remove an index
- Create a custom vector store
- Search for documents with a metadata filter
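Continuing from the earlier sketch, the indexing and filtered-search steps might look roughly like the following; the IVFFlatIndex class and the apply_vector_index, reindex, and drop_vector_index methods, as well as the string filter argument, are assumptions based on recent package versions, so confirm the exact names in the guide.

```python
from langchain_google_cloud_sql_pg.indexes import IVFFlatIndex

# Apply an approximate nearest-neighbor index to speed up vector searches.
store.apply_vector_index(IVFFlatIndex())

# Rebuild the index after adding or deleting many rows.
store.reindex()

# Filtered search: restrict results with a SQL predicate over metadata columns.
results = store.similarity_search("example query", k=3, filter="source = 'docs'")

# Remove the index when it's no longer needed.
store.drop_vector_index()
```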
Document loader for Cloud SQL for PostgreSQL
The document loader saves, loads, and deletes LangChain Document objects. For example, you can load data for processing into embeddings and either store it in vector store or use it as a tool to provide specific context to chains.
To load documents with the document loader in Cloud SQL for PostgreSQL, use the PostgresLoader class. PostgresLoader returns a list of documents from a table, using the first column for page content and all other columns for metadata. The default table has the first column as page content and the second column as JSON metadata. Each row becomes a document. Use the PostgresDocumentSaver class to save and delete documents.
For more information, see the LangChain Document loaders topic.
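A minimal sketch of loading rows as documents with PostgresLoader follows; the connection values and table name are placeholders, and the create_sync factory shown here is an assumption about the package's synchronous API.

```python
from langchain_google_cloud_sql_pg import PostgresEngine, PostgresLoader

# Connect to the Cloud SQL for PostgreSQL instance (placeholder values).
engine = PostgresEngine.from_instance(
    project_id="my-project",
    region="us-central1",
    instance="my-instance",
    database="my-database",
)

# Load every row of the table as a Document: the first column becomes
# page_content and the remaining columns become metadata.
loader = PostgresLoader.create_sync(engine=engine, table_name="my_documents")
documents = loader.load()

for doc in documents:
    print(doc.page_content, doc.metadata)
```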
Document loader procedure guide
The Cloud SQL for PostgreSQL guide for document loader shows you how to do the following:
- Install the integration package and LangChain
- Load documents from a table
- Add a filter to the loader
- Customize the connection and authentication
- Customize Document construction by specifying custom content and metadata
- Use and customize a PostgresDocumentSaver to store and delete documents
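For the saver side, here is a rough sketch that reuses the engine from the previous example; the init_document_table, create_sync, add_documents, and delete names are assumptions made to illustrate the flow, so check the guide for the exact API.

```python
from langchain_core.documents import Document
from langchain_google_cloud_sql_pg import PostgresDocumentSaver

# Create the backing table once, then initialize the saver against it.
engine.init_document_table(table_name="my_documents")
saver = PostgresDocumentSaver.create_sync(engine=engine, table_name="my_documents")

docs = [Document(page_content="Example content.", metadata={"source": "docs"})]

# Store the documents, then delete the same documents again.
saver.add_documents(docs)
saver.delete(docs)
```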
Chat message history for Cloud SQL for PostgreSQL
Question and answer applications require a history of the things said in the
conversation to give the application context for answering further questions
from the user. The LangChain ChatMessageHistory
class lets the application
save messages to a database and retrieve them when needed to formulate further
answers. A message can be a question, an answer, a statement, a greeting or any
other piece of text that the user or application gives during the conversation.
ChatMessageHistory
stores each message and chains messages together for each
conversation.
Cloud SQL for PostgreSQL extends this class with PostgresChatMessageHistory.
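A minimal sketch of recording a conversation with PostgresChatMessageHistory follows; the connection values, table name, and session ID are placeholders, and the init_chat_history_table and create_sync calls are assumptions about the package's synchronous API.

```python
from langchain_google_cloud_sql_pg import PostgresEngine, PostgresChatMessageHistory

# Connect to the Cloud SQL for PostgreSQL instance (placeholder values).
engine = PostgresEngine.from_instance(
    project_id="my-project",
    region="us-central1",
    instance="my-instance",
    database="my-database",
)

# Create the table that stores messages, then open a per-session history.
engine.init_chat_history_table(table_name="chat_history")
history = PostgresChatMessageHistory.create_sync(
    engine, session_id="session-123", table_name="chat_history"
)

# Every message in the conversation is appended to the same session.
history.add_user_message("Hello, what can you help me with?")
history.add_ai_message("I can answer questions about your data.")
print(history.messages)
```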
Chat message history procedure guide
The Cloud SQL for PostgreSQL guide for chat message history shows you how to do the following:
- Install LangChain and authenticate to Google Cloud
- Create a PostgresEngine object and configure a connection pool to your Cloud SQL for PostgreSQL database
- Initialize a table
- Initialize the PostgresChatMessageHistory class to add and delete messages
- Create a chain for message history using the LangChain Expression Language (LCEL) and Google's Vertex AI chat models
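To sketch the chain-building step, the example below wires a Vertex AI chat model into an LCEL chain with RunnableWithMessageHistory, reusing the engine and chat_history table from the previous sketch; the model name and prompt are placeholders, not a recommendation.

```python
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables.history import RunnableWithMessageHistory
from langchain_google_vertexai import ChatVertexAI
from langchain_google_cloud_sql_pg import PostgresChatMessageHistory

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{question}"),
])
chain = prompt | ChatVertexAI(model_name="gemini-1.5-flash")

# Return the Cloud SQL-backed history for a given session ID.
def get_session_history(session_id: str) -> PostgresChatMessageHistory:
    return PostgresChatMessageHistory.create_sync(
        engine, session_id=session_id, table_name="chat_history"
    )

chain_with_history = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="question",
    history_messages_key="history",
)

response = chain_with_history.invoke(
    {"question": "What did I just ask you?"},
    config={"configurable": {"session_id": "session-123"}},
)
```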