Source URL: https://cloud.google.com/blog/products/databases/google-cloud-database-and-langchain-integrations-support-go-java-and-javascript/
Source: Cloud Blog
Title: Google Cloud Database and LangChain integrations now support Go, Java, and JavaScript
Feedly Summary: Last year, Google Cloud and LangChain announced integrations that give generative AI developers access to a suite of LangChain Python packages. This allowed application developers to leverage Google Cloud’s database portfolio in their gen AI applications to drive the most value from their private data.
Today, we are expanding language support for our integrations to include Go, Java, and JavaScript.
Each package will have up to three LangChain integrations:
Vector stores to enable semantic search for our databases
Chat message history to enable chains to recall previous conversations
Document loader for loading documents from your enterprise data
Developers now have the flexibility to create intricate workflows and easily interchange underlying components (like a vector database) as needed to align with specific use cases. This technology unlocks a variety of applications, including personalized product recommendations, question answering, document search and synthesis, customer service automation, and more.
In this post, we’ll share more about the integrations, along with code snippets to get started.
New language support
LangChain is known for its popular Python package; however, your team’s expertise and services may not be in Python. Java and Go are commonly used for production-grade, enterprise-scale applications, while developers may prefer JavaScript and TypeScript for their asynchronous programming support and compatibility with front-end frameworks like React and Vue.
In addition to Python developers, the LangChain developer community encompasses developers proficient in Java, JavaScript, and Go. It is an active and supportive community centered around the LangChain framework, which facilitates the development of applications powered by large language models (LLMs).
Google Cloud is dedicated to providing secure and easy-to-use database integrations for your gen AI applications. Our integrations embed Google Cloud connectors that create secure connections, handle SSL certificates, and support IAM authentication and authorization. The integrations are optimized for PostgreSQL databases (AlloyDB for PostgreSQL, AlloyDB Omni, and Cloud SQL for PostgreSQL) to ensure proper connection management, flexible table schemas, and improved filtering.
JavaScript Support
JavaScript developers can utilize LangChain.js, which provides tools and building blocks for developing applications leveraging LLMs. LangChain simplifies the process of connecting LLMs to external data sources and enables reasoning capabilities in applications. Other Google Cloud integrations, such as Gemini models, are available within LangChain.js, allowing seamless interaction with GCP resources.
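For example, here is a minimal sketch of calling a Gemini model on Vertex AI through the @langchain/google-vertexai package (the model name and prompt are illustrative, and the top-level await assumes an ES module):

import { ChatVertexAI } from "@langchain/google-vertexai";

// Illustrative sketch: invoke a Gemini model on Vertex AI via LangChain.js.
const model = new ChatVertexAI({ model: "gemini-1.5-flash" });

const response = await model.invoke("In one sentence, what is a vector store?");
console.log(response.content);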
Resources (Cloud SQL for PostgreSQL support only):
Documentation: Link
How-to guides: Vector Store, Memory, Document Loader
Quick start guides: Vector Store, Memory, Document Loader
GitHub: Repository
Below are the integrations and their code snippets to get started.
Install the dependency:
npm install @langchain/google-cloud-sql-pg
Engine
import { PostgresEngine } from "@langchain/google-cloud-sql-pg";

const engine: PostgresEngine = await PostgresEngine.fromInstance(
  "project-id",
  "region",
  "instance-name",
  "database-name",
);
Use this package with AlloyDB for PostgreSQL and AlloyDB Omni by customizing your Engine to connect to your instance. You will need the AlloyDB Auth Proxy to make authorized, encrypted connections to AlloyDB instances.
import { PostgresEngine, PostgresEngineArgs } from "@langchain/google-cloud-sql-pg";

const engine: PostgresEngine = await PostgresEngine.fromEngineArgs(
  `postgresql+asyncpg://${OMNI_USER}:${OMNI_PASSWORD}@${OMNI_HOST}:5432/${OMNI_DATABASE_NAME}`
);
Vector store
import { PostgresVectorStore } from "@langchain/google-cloud-sql-pg";
import { VertexAIEmbeddings } from "@langchain/google-vertexai";

// Create a table to store the vectors and their metadata (768 is the embedding dimension).
await engine.initVectorstoreTable("my_vector_store_table", 768);

const embeddings = new VertexAIEmbeddings({
  model: "text-embedding-004",
});

const vectorStore = await PostgresVectorStore.create(
  engine,
  embeddings,
  "my_vector_store_table"
);
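Once created, the store follows the standard LangChain.js vector store interface; a minimal usage sketch (document contents are illustrative):

import { Document } from "@langchain/core/documents";

// Add a few documents; embeddings are computed with the Vertex AI model above.
await vectorStore.addDocuments([
  new Document({ pageContent: "AlloyDB is a PostgreSQL-compatible database.", metadata: { source: "docs" } }),
  new Document({ pageContent: "Cloud SQL offers managed MySQL, PostgreSQL, and SQL Server.", metadata: { source: "docs" } }),
]);

// Retrieve the 2 most similar documents for a query.
const results = await vectorStore.similaritySearch("What is AlloyDB?", 2);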
Chat message history
import { PostgresChatMessageHistory } from "@langchain/google-cloud-sql-pg";

// Create a table to store the conversation history.
await engine.initChatHistoryTable("my_chat_table");

const chatHistory = await PostgresChatMessageHistory.create(
  engine,
  "user-session-1",
  "my_chat_table"
);
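The returned object implements LangChain.js chat message history, so chains can append and read messages for the session; a minimal sketch (message text is illustrative):

import { HumanMessage, AIMessage } from "@langchain/core/messages";

// Append messages for the "user-session-1" session and read the history back.
await chatHistory.addMessage(new HumanMessage("What databases support vector search?"));
await chatHistory.addMessage(new AIMessage("AlloyDB and Cloud SQL for PostgreSQL, among others."));

const messages = await chatHistory.getMessages();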
Document loader
import { PostgresLoader } from "@langchain/google-cloud-sql-pg";

// Load rows from an existing table as LangChain documents.
const loader = await PostgresLoader.create(engine, {
  query: "SELECT * FROM my_table",
});

const data = await loader.load();
Java Support
For Java developers, there’s LangChain4j, a Java implementation of LangChain. This allows Java developers to build LLM-powered applications with a familiar ecosystem. In LangChain4j, you can also access the full array of VertexAI Gemini models.
Resources:
How-to guides: AlloyDB Embedding Store, AlloyDB Document Loader
Quick start guides: AlloyDB example
GitHub: Repository
*Note: Cloud SQL integrations will be released soon.
Below are the integrations and their code snippets to get started.
Install the dependency. For Maven, add to pom.xml:
<dependency>
  <groupId>dev.langchain4j</groupId>
  <artifactId>langchain4j-alloydb-pg</artifactId>
  <version>1.0.0-beta3</version>
</dependency>

<!-- New version, to be released -->
<dependency>
  <groupId>dev.langchain4j</groupId>
  <artifactId>langchain4j-cloud-sql-pg</artifactId>
  <version>1.0.0-beta4</version>
</dependency>
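If your build uses Gradle rather than Maven, the equivalent dependency coordinates are (a sketch mirroring the versions above):

implementation 'dev.langchain4j:langchain4j-alloydb-pg:1.0.0-beta3'

// New version, to be released
implementation 'dev.langchain4j:langchain4j-cloud-sql-pg:1.0.0-beta4'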
Engine
import dev.langchain4j.engine.AlloyDBEngine;

AlloyDBEngine engine = new AlloyDBEngine.Builder()
    .projectId("PROJECT_ID")
    .region("REGION")
    .cluster("CLUSTER")
    .instance("INSTANCE")
    .database("DATABASE")
    .build();
Embedding store
import dev.langchain4j.store.embedding.alloydb.AlloyDBEmbeddingStore;

// Create the table backing the embedding store (tableName and vectorSize are defined elsewhere).
engine.initVectorStoreTable(new EmbeddingStoreConfig.Builder(tableName, vectorSize).build());

AlloyDBEmbeddingStore store = new AlloyDBEmbeddingStore.Builder(engine, tableName).build();
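The store implements LangChain4j’s generic EmbeddingStore interface. A minimal usage sketch, assuming an embeddingModel variable (any LangChain4j EmbeddingModel, for example one from the Vertex AI module) and illustrative text:

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.store.embedding.EmbeddingMatch;
import dev.langchain4j.store.embedding.EmbeddingSearchRequest;

import java.util.List;

// Embed a text segment and add it to the AlloyDB-backed store.
TextSegment segment = TextSegment.from("AlloyDB is a PostgreSQL-compatible database.");
Embedding embedding = embeddingModel.embed(segment).content();
store.add(embedding, segment);

// Search for the segments most similar to a query embedding.
Embedding queryEmbedding = embeddingModel.embed("What is AlloyDB?").content();
List<EmbeddingMatch<TextSegment>> matches = store
        .search(EmbeddingSearchRequest.builder()
                .queryEmbedding(queryEmbedding)
                .maxResults(3)
                .build())
        .matches();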
Document loader
import dev.langchain4j.data.document.Document;
import dev.langchain4j.data.document.loader.alloydb.AlloyDBLoader;

import java.util.List;

AlloyDBLoader loader = new AlloyDBLoader.Builder(engine)
        .query("SELECT * FROM my_table")
        .build();

// Load each row of the query result as a Document.
List<Document> data = loader.load();
Go Support
LangChainGo is the Go programming language port of LangChain.
The LangChain framework was designed to support the development of sophisticated applications that connect language models to data sources and enable interaction with their environment. The most powerful and differentiated applications go beyond simply using a language model via an API; they are data-aware and agentic.
Last year, Google’s SDKs were added as providers for LangChainGo, making it possible to use the capabilities of the LangChain framework with Google’s Gemini models as LLM providers.
We now have AlloyDB and Cloud SQL for PostgreSQL support in LangChainGo.
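As a quick illustration of the Gemini provider mentioned above, here is a minimal LangChainGo sketch (the environment variable and prompt are illustrative):

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/llms/googleai"
)

func main() {
	ctx := context.Background()

	// Use Gemini through the googleai provider; the API key is read from an env var (illustrative).
	llm, err := googleai.New(ctx, googleai.WithAPIKey(os.Getenv("GOOGLE_API_KEY")))
	if err != nil {
		log.Fatal(err)
	}

	answer, err := llms.GenerateFromSinglePrompt(ctx, llm, "In one sentence, what is a vector store?")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(answer)
}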
Resources:
How-to guides: AlloyDB Vector Store, Cloud SQL Vector Store, AlloyDB Memory, Cloud SQL Memory
Quick start guides: AlloyDB Vector Store, Cloud SQL Vector Store, AlloyDB Memory, Cloud SQL Memory
GitHub: Repository
Below are the integrations and their code snippets to get started.
Install the dependency:
go get -u github.com/tmc/langchaingo
Engine
import (
	"context"
	"log"

	"github.com/tmc/langchaingo/internal/alloydbutil"
)

// Connect to an AlloyDB instance (database, projectID, region, cluster, and instance are defined elsewhere).
pgEngine, err := alloydbutil.NewPostgresEngine(ctx,
	alloydbutil.WithDatabase(database),
	alloydbutil.WithAlloyDBInstance(projectID, region, cluster, instance),
)
if err != nil {
	log.Fatal(err)
}
Vector Store
package main

import (
	"context"
	"log"

	"github.com/tmc/langchaingo/embeddings"
	"github.com/tmc/langchaingo/internal/alloydbutil"
	"github.com/tmc/langchaingo/llms/googleai/vertex"
	"github.com/tmc/langchaingo/vectorstores/alloydb"
)

func main() {
	ctx := context.Background()
	// pgEngine, projectID, and vertexLocation are assumed to be set up as in the Engine snippet above.

	// Initialize the table for the vector store to use. You only need to do this the first time you use this table.
	vectorstoreTableOptions := &alloydbutil.VectorstoreTableOptions{
		TableName:  "my_table",
		VectorSize: 768,
	}

	err := pgEngine.InitVectorstoreTable(ctx, *vectorstoreTableOptions)
	if err != nil {
		log.Fatal(err)
	}

	// Initialize the Vertex AI embedding model.
	llm, err := vertex.New(ctx,
		vertex.WithCloudProject(projectID),
		vertex.WithCloudLocation(vertexLocation),
		vertex.WithDefaultModel("text-embedding-005"),
	)
	if err != nil {
		log.Fatal(err)
	}

	e, err := embeddings.NewEmbedder(llm)
	if err != nil {
		log.Fatal(err)
	}

	// Create a new AlloyDB vector store backed by the table created above.
	vs, err := alloydb.NewVectorStore(ctx, pgEngine, e, "my_table")
	if err != nil {
		log.Fatal(err)
	}
	_ = vs // vs is used in the follow-up sketch below
}
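Continuing from the snippet above, documents can then be added and searched through LangChainGo’s generic vector store interface. This is a sketch: it additionally assumes the fmt and github.com/tmc/langchaingo/schema imports, and the document contents are illustrative.

// Add documents to the store; embeddings are computed with the Vertex AI embedder above.
_, err = vs.AddDocuments(ctx, []schema.Document{
	{PageContent: "AlloyDB is a PostgreSQL-compatible database.", Metadata: map[string]any{"source": "docs"}},
	{PageContent: "Cloud SQL offers managed MySQL, PostgreSQL, and SQL Server.", Metadata: map[string]any{"source": "docs"}},
})
if err != nil {
	log.Fatal(err)
}

// Retrieve the 2 most similar documents for a query.
docs, err := vs.SimilaritySearch(ctx, "What is AlloyDB?", 2)
if err != nil {
	log.Fatal(err)
}
fmt.Println(docs)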
Chat message history
import (
	"context"
	"log"

	"github.com/tmc/langchaingo/internal/alloydbutil"
	"github.com/tmc/langchaingo/llms"
	"github.com/tmc/langchaingo/memory/alloydb"
)

// Create a new table in the Postgres database, which will be used for storing the chat history.
err = pgEngine.InitChatHistoryTable(ctx, tableName)
if err != nil {
	log.Fatal(err)
}

// Create a new chat message history backed by that table.
cmh, err := alloydb.NewChatMessageHistory(ctx, *pgEngine, tableName, sessionID)
if err != nil {
	log.Fatal(err)
}
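The returned history implements LangChainGo’s chat message history interface; a minimal sketch of writing and reading messages (assumes the fmt import; message text is illustrative):

// Append messages for the session and read the full history back.
if err := cmh.AddUserMessage(ctx, "What databases support vector search?"); err != nil {
	log.Fatal(err)
}
if err := cmh.AddAIMessage(ctx, "AlloyDB and Cloud SQL for PostgreSQL, among others."); err != nil {
	log.Fatal(err)
}

msgs, err := cmh.Messages(ctx)
if err != nil {
	log.Fatal(err)
}
fmt.Println(msgs)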
*Note: Code is shown for AlloyDB. See the links above for Cloud SQL for PostgreSQL examples.
Get started
The LangChain Vector stores integration is available for Google Cloud databases with vector support, including AlloyDB, Cloud SQL for PostgreSQL, Firestore, Memorystore for Redis, and Spanner.
The Document loader and Memory integrations are available for all Google Cloud databases, including AlloyDB; Cloud SQL for MySQL, PostgreSQL, and SQL Server; Firestore; Datastore; Bigtable; Memorystore for Redis; El Carro for Oracle databases; and Spanner. Below are a few resources to get started.
Resources:
GitHub
Build LLM applications in Go
Codelabs
AI Summary and Description: Yes
**Summary:** The text discusses the recent expansion of language support in Google Cloud’s integrations with LangChain, allowing developers to use various programming languages (Go, Java, JavaScript) to connect generative AI applications with Google Cloud’s database offerings. This enables the creation of complex workflows and improves interaction with large language models (LLMs) through enhanced data management solutions, encouraging enterprises to leverage their private data effectively.
**Detailed Description:**
The content outlines enhancements in the integration between Google Cloud and LangChain, focusing on generative AI applications. Key points include:
– **Language Support Expansion**:
– New integrations now support Go, Java, and JavaScript, in addition to the existing Python package.
– This caters to developers familiar with enterprise-grade programming languages and facilitates wider adoption across teams.
– **Features of Integrations**:
– **Vector Stores**: Enables semantic search capabilities utilizing Google Cloud databases.
– **Chat Message History**: Allows applications to retain context from previous interactions, supporting more coherent and context-rich communication in AI-driven applications.
– **Document Loader**: Assists in loading enterprise documents into applications for analysis and interaction.
– **Flexibility for Developers**:
– Developers can create intricate workflows and interchange components based on specific use cases, enhancing customization and precision in application development.
– The technology supports a range of applications such as personalized recommendations, question answering, document search, and customer service automation.
– **Security and Authentication**:
– Integrations are designed to embed Google Cloud security features, including secure connection handling and IAM authorization, boosting trust and compliance in the database interactions.
– **Community and Resources**:
– LangChain promotes an active developer community, offering resources, documentation, and code snippets for getting started with their frameworks.
– Availability of GitHub repositories and how-to guides to facilitate application development.
In summary, these expansions not only enhance the functional capabilities of generative AI applications but also emphasize security, usability, and community support. The ability for enterprises to utilize their private data effectively while being compliant with security measures resonates well with security, compliance, and infrastructure professionals. This move also indicates market responsiveness to a diverse developer landscape and the evolving demand in cloud solutions.