
feat(community): Prepare implementation IBM WatsonxAI to langchain community package. #6916

Merged
merged 9 commits on Oct 21, 2024
362 changes: 362 additions & 0 deletions docs/core_docs/docs/integrations/llms/ibm.ipynb
@@ -0,0 +1,362 @@
{
"cells": [
{
"cell_type": "raw",
"id": "67db2992",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: IBM watsonx.ai\n",
"---"
]
},
{
"cell_type": "markdown",
"id": "9597802c",
"metadata": {},
"source": [
"# IBM watsonx.ai\n",
"\n",
"\n",
"This will help you get started with IBM [text completion models (LLMs)](/docs/concepts#llms) using LangChain. For detailed documentation on `IBM watsonx.ai` features and configuration options, please refer to the [IBM watsonx.ai](https://api.js.langchain.com/classes/_langchain_community.llms_ibm.html).\n",
"\n",
"## Overview\n",
"### Integration details\n",
"\n",
"\n",
"| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/llms/ibm_watsonx/) | Package downloads | Package latest |\n",
"| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n",
"| [`IBM watsonx.ai`](https://api.js.langchain.com/modules/_langchain_community.llms_ibm.html) | [@langchain/community](https://api.js.langchain.com/modules/langchain_community_llms_ibm.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/community?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/community?style=flat-square&label=%20&) |\n",
"\n",
"## Setup\n",
"\n",
"\n",
"To access IBM WatsonxAI models you'll need to create an IBM watsonx.ai account, get an API key or any other type of credentials, and install the `@langchain/community` integration package.\n",
"\n",
"### Credentials\n",
"\n",
"\n",
"Head to [IBM Cloud](https://cloud.ibm.com/login) to sign up to IBM watsonx.ai and generate an API key or provide any other authentication form as presented below.\n",
"\n",
"#### IAM authentication\n",
"\n",
"```bash\n",
"export WATSONX_AI_AUTH_TYPE=iam\n",
"export WATSONX_AI_APIKEY=<YOUR-APIKEY>\n",
"```\n",
"\n",
"#### Bearer token authentication\n",
"\n",
"```bash\n",
"export WATSONX_AI_AUTH_TYPE=bearertoken\n",
"export WATSONX_AI_BEARER_TOKEN=<YOUR-BEARER-TOKEN>\n",
"```\n",
"\n",
"#### CP4D authentication\n",
"\n",
"```bash\n",
"export WATSONX_AI_AUTH_TYPE=cp4d\n",
"export WATSONX_AI_USERNAME=<YOUR_USERNAME>\n",
"export WATSONX_AI_PASSWORD=<YOUR_PASSWORD>\n",
"export WATSONX_AI_URL=<URL>\n",
"```\n",
"\n",
"Once these are places in your enviromental variables and object is initialized authentication will proceed automatically.\n",
"\n",
"Authentication can also be accomplished by passing these values as parameters to a new instance.\n",
"\n",
"## IAM authentication\n",
"\n",
"```typescript\n",
"import { WatsonxLLM } from \"@langchain/community/llms/ibm\";\n",
"\n",
"const props = {\n",
" version: \"YYYY-MM-DD\",\n",
" serviceUrl: \"<SERVICE_URL>\",\n",
" projectId: \"<PROJECT_ID>\",\n",
" watsonxAIAuthType: \"iam\",\n",
" watsonxAIApikey: \"<YOUR-APIKEY>\",\n",
"};\n",
"const instance = new WatsonxLLM(props);\n",
"```\n",
"\n",
"## Bearer token authentication\n",
"\n",
"```typescript\n",
"import { WatsonxLLM } from \"@langchain/community/llms/ibm\";\n",
"\n",
"const props = {\n",
" version: \"YYYY-MM-DD\",\n",
" serviceUrl: \"<SERVICE_URL>\",\n",
" projectId: \"<PROJECT_ID>\",\n",
" watsonxAIAuthType: \"bearertoken\",\n",
" watsonxAIBearerToken: \"<YOUR-BEARERTOKEN>\",\n",
"};\n",
"const instance = new WatsonxLLM(props);\n",
"```\n",
"\n",
"### CP4D authentication\n",
"\n",
"```typescript\n",
"import { WatsonxLLM } from \"@langchain/community/llms/ibm\";\n",
"\n",
"const props = {\n",
" version: \"YYYY-MM-DD\",\n",
" serviceUrl: \"<SERVICE_URL>\",\n",
" projectId: \"<PROJECT_ID>\",\n",
" watsonxAIAuthType: \"cp4d\",\n",
" watsonxAIUsername: \"<YOUR-USERNAME>\",\n",
" watsonxAIPassword: \"<YOUR-PASSWORD>\",\n",
" watsonxAIUrl: \"<url>\",\n",
"};\n",
"const instance = new WatsonxLLM(props);\n",
"```\n",
"\n",
"If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
"\n",
"```bash\n",
"# export LANGCHAIN_TRACING_V2=\"true\"\n",
"# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
"```\n",
"\n",
"### Installation\n",
"\n",
"The LangChain IBM watsonx.ai integration lives in the `@langchain/community` package:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
"import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
"\n",
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/community @langchain/core\n",
"</Npm2Yarn>\n",
"\n",
"```"
]
},
{
"cell_type": "markdown",
"id": "0a760037",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a0562a13",
"metadata": {},
"outputs": [],
"source": [
"import { WatsonxLLM } from \"@langchain/community/llms/ibm\";\n",
"\n",
"const props = {\n",
" decoding_method: \"sample\",\n",
" max_new_tokens: 100,\n",
" min_new_tokens: 1,\n",
" temperature: 0.5,\n",
" top_k: 50,\n",
" top_p: 1,\n",
"};\n",
"const instance = new WatsonxLLM({\n",
" version: \"YYYY-MM-DD\",\n",
" serviceUrl: process.env.API_URL,\n",
" projectId: \"<PROJECT_ID>\",\n",
" spaceId: \"<SPACE_ID>\",\n",
" idOrName: \"<DEPLOYMENT_ID>\",\n",
" modelId: \"<MODEL_ID>\",\n",
" ...props,\n",
"});"
]
},
{
"cell_type": "markdown",
"id": "f7498103",
"metadata": {},
"source": [
"Note:\n",
"\n",
"- You must provide spaceId, projectId or idOrName(deployment id) in order to proceed.\n",
"- Depending on the region of your provisioned service instance, use correct serviceUrl.\n",
"- You need to specify the model you want to use for inferencing through model_id."
]
},
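{
"cell_type": "markdown",
"id": "3c19b2a4",
"metadata": {},
"source": [
"As a minimal sketch, a project-scoped setup could look like the following. The model id and the Dallas `serviceUrl` below are illustrative; substitute the model and regional endpoint you actually use.\n",
"\n",
"```typescript\n",
"import { WatsonxLLM } from \"@langchain/community/llms/ibm\";\n",
"\n",
"// Minimal project-scoped instantiation (illustrative values).\n",
"// Use the endpoint for your region, e.g. https://us-south.ml.cloud.ibm.com for Dallas.\n",
"const projectScopedLLM = new WatsonxLLM({\n",
"  version: \"YYYY-MM-DD\",\n",
"  serviceUrl: \"https://us-south.ml.cloud.ibm.com\",\n",
"  projectId: \"<PROJECT_ID>\",\n",
"  modelId: \"ibm/granite-13b-instruct-v2\",\n",
"});\n",
"```"
]
},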
{
"cell_type": "markdown",
"id": "0ee90032",
"metadata": {},
"source": [
"## Invocation and generation\n"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "035dea0f",
"metadata": {
"tags": []
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"print('Hello world.')<|endoftext|>\n",
"{\n",
" generations: [ [ [Object] ], [ [Object] ] ],\n",
" llmOutput: { tokenUsage: { generated_token_count: 28, input_token_count: 10 } }\n",
"}\n"
]
}
],
"source": [
"const result = await instance.invoke(\"Print hello world.\");\n",
"console.log(result);\n",
"\n",
"const results = await instance.generate([\n",
" \"Print hello world.\",\n",
" \"Print bye, bye world!\",\n",
"]);\n",
"console.log(results);"
]
},
{
"cell_type": "markdown",
"id": "add38532",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can chain our completion model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "078e9db2",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Ich liebe Programmieren.\n",
"\n",
"To express that you are passionate about programming in German,\n"
]
}
],
"source": [
"import { PromptTemplate } from \"@langchain/core/prompts\"\n",
"\n",
"const prompt = PromptTemplate.fromTemplate(\"How to say {input} in {output_language}:\\n\")\n",
"\n",
"const chain = prompt.pipe(instance);\n",
"await chain.invoke(\n",
" {\n",
" output_language: \"German\",\n",
" input: \"I love programming.\",\n",
" }\n",
")"
]
},
{
"cell_type": "markdown",
"id": "0c305670",
"metadata": {},
"source": [
"## Props overwrittion\n",
"\n",
"Passed props at initialization will last for the whole life cycle of the object, however you may overwrite them for a single method's call by passing second argument as below\n"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "bb53235c",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"print('Hello world.')<|endoftext|>\n"
]
}
],
"source": [
"const result = await instance.invoke(\"Print hello world.\", {\n",
" modelId: \"<NEW_MODEL_ID>\",\n",
" parameters: {\n",
" max_new_tokens: 20,\n",
" },\n",
" });\n",
" console.log(result);"
]
},
{
"cell_type": "markdown",
"id": "577a0583",
"metadata": {},
"source": [
"## Tokenization\n",
"This package has it's custom getNumTokens implementation which returns exact amount of tokens that would be used.\n"
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "339e237c",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"4\n"
]
}
],
"source": [
"const tokens = await instance.getNumTokens(\"Print hello world.\");\n",
"console.log(tokens);"
]
},
{
"cell_type": "markdown",
"id": "e9bdfcef",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of all `IBM watsonx.ai` features and configurations head to the API reference: [API docs](https://api.js.langchain.com/modules/_langchain_community.embeddings_ibm.html)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "JavaScript (Node.js)",
"language": "javascript",
"name": "javascript"
},
"language_info": {
"file_extension": ".js",
"mimetype": "application/javascript",
"name": "javascript",
"version": "20.17.0"
}
},
"nbformat": 4,
"nbformat_minor": 5
}