# @langchain/weaviate

This package contains the LangChain.js integrations for Weaviate with the `weaviate-ts-client` SDK.

## Installation

```bash
npm install @langchain/weaviate @langchain/core
```

## Vectorstore

This package adds support for the Weaviate vectorstore.

To follow along with this example, install the `@langchain/openai` package for its embeddings model:

```bash
npm install @langchain/openai
```
Now set the necessary environment variables (or pass them in via the client object):

```bash
export WEAVIATE_SCHEME=
export WEAVIATE_HOST=
export WEAVIATE_API_KEY=
```

```typescript
import weaviate, { ApiKey } from "weaviate-ts-client";
import { WeaviateStore } from "@langchain/weaviate";
import { OpenAIEmbeddings } from "@langchain/openai";

// Instantiate the Weaviate client using the environment variables set above.
const client = (weaviate as any).client({
  scheme: process.env.WEAVIATE_SCHEME || "https",
  host: process.env.WEAVIATE_HOST || "localhost",
  apiKey: new ApiKey(process.env.WEAVIATE_API_KEY || "default"),
});

// Embed and store a few sample texts along with their metadata.
await WeaviateStore.fromTexts(
  ["hello world", "hi there", "how are you", "bye now"],
  [{ foo: "bar" }, { foo: "baz" }, { foo: "qux" }, { foo: "bar" }],
  new OpenAIEmbeddings(),
  {
    client,
    indexName: "Test",
    textKey: "text",
    metadataKeys: ["foo"],
  }
);
```
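
Once documents are indexed, you can query the store. The snippet below is a minimal sketch that reuses the `client`, index name, and keys from the example above; it reconnects to the index via `fromExistingIndex` and runs a similarity search (the query string and `k` value are purely illustrative):

```typescript
// Reconnect to the existing "Test" index (sketch; reuses the client and
// OpenAIEmbeddings from the example above).
const store = await WeaviateStore.fromExistingIndex(new OpenAIEmbeddings(), {
  client,
  indexName: "Test",
  textKey: "text",
  metadataKeys: ["foo"],
});

// Return the single closest document to the query.
const results = await store.similaritySearch("hello world", 1);
console.log(results);
```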

## Development

To develop the `@langchain/weaviate` package, you'll need to follow these instructions:

### Install dependencies

```bash
yarn install
```

### Build the package

```bash
yarn build
```

Or from the repo root:

```bash
yarn build --filter=@langchain/weaviate
```

### Run tests

Test files should live within a `tests/` folder in the `src/` folder. Unit tests should end in `.test.ts` and integration tests should end in `.int.test.ts`:

```bash
$ yarn test
$ yarn test:int
```
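
For example, a minimal Jest-style unit test might look like the sketch below (the file name and the relative import path are assumptions about the package layout):

```typescript
// src/tests/weaviate.test.ts (hypothetical file name)
import { test, expect } from "@jest/globals";
// Assumed relative path; adjust to wherever WeaviateStore is exported from.
import { WeaviateStore } from "../index.js";

test("WeaviateStore is exported", () => {
  expect(WeaviateStore).toBeDefined();
});
```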

### Lint & Format

Run the linter & formatter to ensure your code is up to standard:

```bash
yarn lint && yarn format
```

### Adding new entrypoints

If you add a new file to be exported, either import & re-export it from `src/index.ts`, or add it to the `entrypoints` field in the `config` variable located inside `langchain.config.js` and run `yarn build` to generate the new entrypoint.
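
As a rough sketch only (the exact shape of the existing `config` object may differ), registering a new entrypoint might look like this, where `my_new_file` is a hypothetical module under `src/`:

```js
// langchain.config.js (sketch; only the entrypoints field is shown)
export const config = {
  // ...existing settings...
  entrypoints: {
    index: "index",
    // Hypothetical entrypoint: exposes src/my_new_file.ts as
    // "@langchain/weaviate/my_new_file" after running `yarn build`.
    my_new_file: "my_new_file",
  },
};
```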