diff --git a/.github/FUNDING.yml b/.github/FUNDING.yml new file mode 100644 index 000000000..fa9d527b8 --- /dev/null +++ b/.github/FUNDING.yml @@ -0,0 +1,13 @@ +# These are supported funding model platforms + +github: [FlowiseAI] # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2] +patreon: # Replace with a single Patreon username +open_collective: # Replace with a single Open Collective username +ko_fi: # Replace with a single Ko-fi username +tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel +community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry +liberapay: # Replace with a single Liberapay username +issuehunt: # Replace with a single IssueHunt username +otechie: # Replace with a single Otechie username +lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry +custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2'] diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md index b73185075..b8e2e8a5a 100644 --- a/.github/ISSUE_TEMPLATE/bug_report.md +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -23,9 +23,14 @@ A clear and concise description of what you expected to happen. **Screenshots** If applicable, add screenshots to help explain your problem. +**Flow** +If applicable, add the exported flow to help replicate the problem. + **Setup** -- OS: [e.g. iOS, Windows, Linux] +- Installation [e.g. docker, `npx flowise start`, `yarn start`] +- Flowise Version [e.g. 1.2.11] +- OS: [e.g. macOS, Windows, Linux] - Browser [e.g. 
chrome, safari] **Additional context** diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml index a89846a62..759f195f7 100644 --- a/.github/workflows/main.yml +++ b/.github/workflows/main.yml @@ -3,7 +3,7 @@ name: Node CI on: push: branches: - - master + - main pull_request: branches: @@ -19,7 +19,8 @@ jobs: platform: [ubuntu-latest] node-version: [18.15.0] runs-on: ${{ matrix.platform }} - + env: + PUPPETEER_SKIP_DOWNLOAD: true steps: - uses: actions/checkout@v3 - name: Use Node.js ${{ matrix.node-version }} diff --git a/.github/workflows/test_docker_build.yml b/.github/workflows/test_docker_build.yml new file mode 100644 index 000000000..a27cf22dd --- /dev/null +++ b/.github/workflows/test_docker_build.yml @@ -0,0 +1,20 @@ +name: Test Docker Build + +on: + push: + branches: + - main + + pull_request: + branches: + - '*' + +jobs: + build: + runs-on: ubuntu-latest + env: + PUPPETEER_SKIP_DOWNLOAD: true + steps: + - uses: actions/checkout@v3 + + - run: docker build --no-cache -t flowise . 
diff --git a/.gitignore b/.gitignore index 9f5ef2e56..533f68a52 100644 --- a/.gitignore +++ b/.gitignore @@ -8,6 +8,7 @@ **/yarn.lock ## logs +**/logs **/*.log ## build @@ -42,4 +43,4 @@ **/uploads ## compressed -**/*.tgz \ No newline at end of file +**/*.tgz diff --git a/CODE_OF_CONDUCT-ZH.md b/CODE_OF_CONDUCT-ZH.md new file mode 100644 index 000000000..be6332ddb --- /dev/null +++ b/CODE_OF_CONDUCT-ZH.md @@ -0,0 +1,49 @@ + + +# 贡献者公约行为准则 + +[English](./CODE_OF_CONDUCT.md) | 中文 + +## 我们的承诺 + +为了促进一个开放和友好的环境,我们作为贡献者和维护者承诺,使参与我们的项目和社区的体验对每个人来说都是无骚扰的,无论年龄、体型、残疾、种族、性别认同和表达、经验水平、国籍、个人形象、种族、宗教或性取向如何。 + +## 我们的标准 + +有助于创建积极环境的行为示例包括: + +- 使用友好和包容性的语言 +- 尊重不同的观点和经验 +- 优雅地接受建设性的批评 +- 关注社区最有利的事情 +- 向其他社区成员表达同理心 + +参与者不可接受的行为示例包括: + +- 使用性暗示的语言或图像和不受欢迎的性关注或进展 +- 恶作剧、侮辱/贬低的评论和个人或政治攻击 +- 公开或私下骚扰 +- 未经明确许可发布他人的私人信息,如实际或电子地址 +- 在专业环境中可能被合理认为不适当的其他行为 + +## 我们的责任 + +项目维护者有责任明确可接受行为的标准,并预期对任何不可接受行为的情况采取适当和公正的纠正措施。 + +项目维护者有权和责任删除、编辑或拒绝不符合本行为准则的评论、提交、代码、维基编辑、问题和其他贡献,或者临时或永久禁止任何贡献者,如果他们认为其行为不适当、威胁、冒犯或有害。 + +## 适用范围 + +本行为准则适用于项目空间和公共空间,当个人代表项目或其社区时。代表项目或社区的示例包括使用官方项目电子邮件地址、通过官方社交媒体账号发布或在线或离线活动中担任指定代表。项目的代表可以由项目维护者进一步定义和澄清。 + +## 执法 + +可以通过联系项目团队 hello@flowiseai.com 来报告滥用、骚扰或其他不可接受的行为。所有投诉将经过审核和调查,并将得出视情况认为必要和适当的回应。项目团队有义务对事件举报人保持机密。具体执行政策的更多细节可能会单独发布。 + +如果项目维护者不诚信地遵守或执行行为准则,可能会面临其他项目领导成员决定的临时或永久的后果。 + +## 归属 + +该行为准则的内容来自于[贡献者公约](http://contributor-covenant.org/)1.4 版,可在[http://contributor-covenant.org/version/1/4](http://contributor-covenant.org/version/1/4)上获取。 + +[主页]: http://contributor-covenant.org diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index 7865b84e0..da7a51c66 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -1,5 +1,7 @@ # Contributor Covenant Code of Conduct +English | [中文](./CODE_OF_CONDUCT-ZH.md) + ## Our Pledge In the interest of fostering an open and welcoming environment, we as diff --git a/CONTRIBUTING-ZH.md b/CONTRIBUTING-ZH.md new file mode 100644 index 000000000..bec081f4d --- /dev/null +++ b/CONTRIBUTING-ZH.md @@ -0,0 +1,155 @@ + + +# 贡献给 
Flowise + +[English](./CONTRIBUTING.md) | 中文 + +我们欢迎任何形式的贡献。 + +## ⭐ 点赞 + +点赞并分享[Github 仓库](https://github.com/FlowiseAI/Flowise)。 + +## 🙋 问题和回答 + +在[问题和回答](https://github.com/FlowiseAI/Flowise/discussions/categories/q-a)部分搜索任何问题,如果找不到,可以毫不犹豫地创建一个。这可能会帮助到其他有类似问题的人。 + +## 🙌 分享 Chatflow + +是的!分享你如何使用 Flowise 是一种贡献方式。将你的 Chatflow 导出为 JSON,附上截图并在[展示和分享](https://github.com/FlowiseAI/Flowise/discussions/categories/show-and-tell)部分分享。 + +## 💡 想法 + +欢迎各种想法,如新功能、应用集成和区块链网络。在[想法](https://github.com/FlowiseAI/Flowise/discussions/categories/ideas)部分提交。 + +## 🐞 报告错误 + +发现问题了吗?[报告它](https://github.com/FlowiseAI/Flowise/issues/new/choose)。 + +## 👨‍💻 贡献代码 + +不确定要贡献什么?一些想法: + +- 从 Langchain 创建新组件 +- 更新现有组件,如扩展功能、修复错误 +- 添加新的 Chatflow 想法 + +### 开发人员 + +Flowise 在一个单一的单体存储库中有 3 个不同的模块。 + +- `server`:用于提供 API 逻辑的 Node 后端 +- `ui`:React 前端 +- `components`:Langchain 组件 + +#### 先决条件 + +- 安装 [Yarn v1](https://classic.yarnpkg.com/en/docs/install) + ```bash + npm i -g yarn + ``` + +#### 逐步指南 + +1. Fork 官方的[Flowise Github 仓库](https://github.com/FlowiseAI/Flowise)。 + +2. 克隆你 fork 的存储库。 + +3. 创建一个新的分支,参考[指南](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-and-deleting-branches-within-your-repository)。命名约定: + + - 对于功能分支:`feature/<你的新功能>` + - 对于 bug 修复分支:`bugfix/<你的新bug修复>`。 + +4. 切换到新创建的分支。 + +5. 进入存储库文件夹 + + ```bash + cd Flowise + ``` + +6. 安装所有模块的依赖项: + + ```bash + yarn install + ``` + +7. 构建所有代码: + + ```bash + yarn build + ``` + +8. 在[http://localhost:3000](http://localhost:3000)上启动应用程序 + + ```bash + yarn start + ``` + +9. 开发时: + + - 在`packages/ui`中创建`.env`文件并指定`PORT`(参考`.env.example`) + - 在`packages/server`中创建`.env`文件并指定`PORT`(参考`.env.example`) + - 运行 + + ```bash + yarn dev + ``` + + 对`packages/ui`或`packages/server`进行的任何更改都将反映在[http://localhost:8080](http://localhost:8080)上 + + 对于`packages/components`中进行的更改,再次运行`yarn build`以应用更改。 + +10. 
做完所有的更改后,运行以下命令来确保在生产环境中一切正常: + + ```bash + yarn build + ``` + + 和 + + ```bash + yarn start + ``` + +11. 提交代码并从指向 [Flowise 主分支](https://github.com/FlowiseAI/Flowise/tree/master) 的分叉分支上提交 Pull Request。 + +## 🌱 环境变量 + +Flowise 支持不同的环境变量来配置您的实例。您可以在 `packages/server` 文件夹中的 `.env` 文件中指定以下变量。阅读[更多信息](https://docs.flowiseai.com/environment-variables) + +| 变量名 | 描述 | 类型 | 默认值 | +| -------------------------- | ------------------------------------------------------ | ----------------------------------------------- | ----------------------------------- | +| PORT | Flowise 运行的 HTTP 端口 | 数字 | 3000 | +| FLOWISE_USERNAME | 登录用户名 | 字符串 | | +| FLOWISE_PASSWORD | 登录密码 | 字符串 | | +| DEBUG | 打印组件的日志 | 布尔值 | | +| LOG_PATH | 存储日志文件的位置 | 字符串 | `your-path/Flowise/logs` | +| LOG_LEVEL | 日志的不同级别 | 枚举字符串: `error`, `info`, `verbose`, `debug` | `info` | +| APIKEY_PATH | 存储 API 密钥的位置 | 字符串 | `your-path/Flowise/packages/server` | +| TOOL_FUNCTION_BUILTIN_DEP | 用于工具函数的 NodeJS 内置模块 | 字符串 | | +| TOOL_FUNCTION_EXTERNAL_DEP | 用于工具函数的外部模块 | 字符串 | | +| OVERRIDE_DATABASE | 是否使用默认值覆盖当前数据库 | 枚举字符串: `true`, `false` | `true` | +| DATABASE_TYPE | 存储 flowise 数据的数据库类型 | 枚举字符串: `sqlite`, `mysql`, `postgres` | `sqlite` | +| DATABASE_PATH | 数据库保存的位置(当 DATABASE_TYPE 是 sqlite 时) | 字符串 | `your-home-dir/.flowise` | +| DATABASE_HOST | 主机 URL 或 IP 地址(当 DATABASE_TYPE 不是 sqlite 时) | 字符串 | | +| DATABASE_PORT | 数据库端口(当 DATABASE_TYPE 不是 sqlite 时) | 字符串 | | +| DATABASE_USER | 数据库用户名(当 DATABASE_TYPE 不是 sqlite 时) | 字符串 | | +| DATABASE_PASSWORD | 数据库密码(当 DATABASE_TYPE 不是 sqlite 时) | 字符串 | | +| DATABASE_NAME | 数据库名称(当 DATABASE_TYPE 不是 sqlite 时) | 字符串 | | + +您也可以在使用 `npx` 时指定环境变量。例如: + +``` +npx flowise start --PORT=3000 --DEBUG=true +``` + +## 📖 贡献文档 + +[Flowise 文档](https://github.com/FlowiseAI/FlowiseDocs) + +## 🏷️ Pull Request 流程 + +当您打开一个 Pull Request 时,FlowiseAI 团队的成员将自动收到通知/指派。您也可以在 [Discord](https://discord.gg/jbaHfsRVBW) 上联系我们。 + +## diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index a09051f32..90ba5498d 100644 --- 
a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -2,6 +2,8 @@ # Contributing to Flowise +English | [中文](./CONTRIBUTING-ZH.md) + We appreciate any form of contributions. ## ⭐ Star @@ -42,7 +44,7 @@ Flowise has 3 different modules in a single mono repository. #### Prerequisite -- Install Yarn +- Install [Yarn v1](https://classic.yarnpkg.com/en/docs/install) ```bash npm i -g yarn ``` @@ -84,7 +86,11 @@ Flowise has 3 different modules in a single mono repository. yarn start ``` -9. For development, run +9. For development: + + - Create `.env` file and specify the `PORT` (refer to `.env.example`) in `packages/ui` + - Create `.env` file and specify the `PORT` (refer to `.env.example`) in `packages/server` + - Run ```bash yarn dev ``` @@ -110,9 +116,41 @@ Flowise has 3 different modules in a single mono repository. 11. Commit code and submit Pull Request from forked branch pointing to [Flowise master](https://github.com/FlowiseAI/Flowise/tree/master). +## 🌱 Env Variables + +Flowise supports different environment variables to configure your instance. You can specify the following variables in the `.env` file inside the `packages/server` folder. 
Read [more](https://docs.flowiseai.com/environment-variables) + +| Variable | Description | Type | Default | +| -------------------------- | ---------------------------------------------------------------------------- | ------------------------------------------------ | ----------------------------------- | +| PORT | The HTTP port Flowise runs on | Number | 3000 | +| FLOWISE_USERNAME | Username to login | String | | +| FLOWISE_PASSWORD | Password to login | String | | +| DEBUG | Print logs from components | Boolean | | +| LOG_PATH | Location where log files are stored | String | `your-path/Flowise/logs` | +| LOG_LEVEL | Different levels of logs | Enum String: `error`, `info`, `verbose`, `debug` | `info` | +| APIKEY_PATH | Location where api keys are saved | String | `your-path/Flowise/packages/server` | +| TOOL_FUNCTION_BUILTIN_DEP | NodeJS built-in modules to be used for Tool Function | String | | +| TOOL_FUNCTION_EXTERNAL_DEP | External modules to be used for Tool Function | String | | +| OVERRIDE_DATABASE | Override current database with default | Enum String: `true`, `false` | `true` | +| DATABASE_TYPE | Type of database to store the flowise data | Enum String: `sqlite`, `mysql`, `postgres` | `sqlite` | +| DATABASE_PATH | Location where database is saved (When DATABASE_TYPE is sqlite) | String | `your-home-dir/.flowise` | +| DATABASE_HOST | Host URL or IP address (When DATABASE_TYPE is not sqlite) | String | | +| DATABASE_PORT | Database port (When DATABASE_TYPE is not sqlite) | String | | +| DATABASE_USER | Database username (When DATABASE_TYPE is not sqlite) | String | | +| DATABASE_PASSWORD | Database password (When DATABASE_TYPE is not sqlite) | String | | +| DATABASE_NAME | Database name (When DATABASE_TYPE is not sqlite) | String | | +| PASSPHRASE | Passphrase used to create encryption key | String | `MYPASSPHRASE` | +| SECRETKEY_PATH | Location where encryption key (used to encrypt/decrypt credentials) is saved | String | 
`your-path/Flowise/packages/server` | + +You can also specify the env variables when using `npx`. For example: + +``` +npx flowise start --PORT=3000 --DEBUG=true +``` + ## 📖 Contribute to Docs -In-Progress +[Flowise Docs](https://github.com/FlowiseAI/FlowiseDocs) ## 🏷️ Pull Request process diff --git a/Dockerfile b/Dockerfile index 315b4739f..e485cd3ef 100644 --- a/Dockerfile +++ b/Dockerfile @@ -1,14 +1,24 @@ # Build local monorepo image # docker build --no-cache -t flowise . + # Run image # docker run -d -p 3000:3000 flowise + FROM node:18-alpine +RUN apk add --update libc6-compat python3 make g++ +# needed for pdfjs-dist +RUN apk add --no-cache build-base cairo-dev pango-dev + +# Install Chromium +RUN apk add --no-cache chromium + +ENV PUPPETEER_SKIP_DOWNLOAD=true +ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser WORKDIR /usr/src/packages # Copy root package.json and lockfile -COPY package.json ./ -COPY yarn.lock ./ +COPY package.json yarn.loc[k] ./ # Copy components package.json COPY packages/components/package.json ./packages/components/package.json diff --git a/README-ZH.md b/README-ZH.md new file mode 100644 index 000000000..e0eb9de26 --- /dev/null +++ b/README-ZH.md @@ -0,0 +1,188 @@ + + + + +# Flowise - 轻松构建 LLM 应用程序 + +[![发布说明](https://img.shields.io/github/release/FlowiseAI/Flowise)](https://github.com/FlowiseAI/Flowise/releases) +[![Discord](https://img.shields.io/discord/1087698854775881778?label=Discord&logo=discord)](https://discord.gg/jbaHfsRVBW) +[![Twitter关注](https://img.shields.io/twitter/follow/FlowiseAI?style=social)](https://twitter.com/FlowiseAI) +[![GitHub星图](https://img.shields.io/github/stars/FlowiseAI/Flowise?style=social)](https://star-history.com/#FlowiseAI/Flowise) +[![GitHub分支](https://img.shields.io/github/forks/FlowiseAI/Flowise?style=social)](https://github.com/FlowiseAI/Flowise/fork) + +[English](./README.md) | 中文 + +

拖放界面构建定制化的LLM流程

+ + + +## ⚡ 快速入门 + +下载并安装 [NodeJS](https://nodejs.org/en/download) >= 18.15.0 + +1. 安装 Flowise + ```bash + npm install -g flowise + ``` +2. 启动 Flowise + + ```bash + npx flowise start + ``` + + 使用用户名和密码 + + ```bash + npx flowise start --FLOWISE_USERNAME=user --FLOWISE_PASSWORD=1234 + ``` + +3. 打开 [http://localhost:3000](http://localhost:3000) + +## 🐳 Docker + +### Docker Compose + +1. 进入项目根目录下的 `docker` 文件夹 +2. 创建 `.env` 文件并指定 `PORT`(参考 `.env.example`) +3. 运行 `docker-compose up -d` +4. 打开 [http://localhost:3000](http://localhost:3000) +5. 可以通过 `docker-compose stop` 停止容器 + +### Docker 镜像 + +1. 本地构建镜像: + ```bash + docker build --no-cache -t flowise . + ``` +2. 运行镜像: + + ```bash + docker run -d --name flowise -p 3000:3000 flowise + ``` + +3. 停止镜像: + ```bash + docker stop flowise + ``` + +## 👨‍💻 开发者 + +Flowise 在一个单一的代码库中有 3 个不同的模块。 + +- `server`:用于提供 API 逻辑的 Node 后端 +- `ui`:React 前端 +- `components`:Langchain 组件 + +### 先决条件 + +- 安装 [Yarn v1](https://classic.yarnpkg.com/en/docs/install) + ```bash + npm i -g yarn + ``` + +### 设置 + +1. 克隆仓库 + + ```bash + git clone https://github.com/FlowiseAI/Flowise.git + ``` + +2. 进入仓库文件夹 + + ```bash + cd Flowise + ``` + +3. 安装所有模块的依赖: + + ```bash + yarn install + ``` + +4. 构建所有代码: + + ```bash + yarn build + ``` + +5. 启动应用: + + ```bash + yarn start + ``` + + 现在可以在 [http://localhost:3000](http://localhost:3000) 访问应用 + +6. 
用于开发构建: + + - 在 `packages/ui` 中创建 `.env` 文件并指定 `PORT`(参考 `.env.example`) + - 在 `packages/server` 中创建 `.env` 文件并指定 `PORT`(参考 `.env.example`) + - 运行 + + ```bash + yarn dev + ``` + + 任何代码更改都会自动重新加载应用程序,访问 [http://localhost:8080](http://localhost:8080) + +## 🔒 认证 + +要启用应用程序级身份验证,在 `packages/server` 的 `.env` 文件中添加 `FLOWISE_USERNAME` 和 `FLOWISE_PASSWORD`: + +``` +FLOWISE_USERNAME=user +FLOWISE_PASSWORD=1234 +``` + +## 🌱 环境变量 + +Flowise 支持不同的环境变量来配置您的实例。您可以在 `packages/server` 文件夹中的 `.env` 文件中指定以下变量。了解更多信息,请阅读[文档](https://github.com/FlowiseAI/Flowise/blob/main/CONTRIBUTING.md#-env-variables) + +## 📖 文档 + +[Flowise 文档](https://docs.flowiseai.com/) + +## 🌐 自托管 + +### [Railway](https://docs.flowiseai.com/deployment/railway) + +[![在 Railway 上部署](https://railway.app/button.svg)](https://railway.app/template/pn4G8S?referralCode=WVNPD9) + +### [Render](https://docs.flowiseai.com/deployment/render) + +[![部署到 Render](https://render.com/images/deploy-to-render-button.svg)](https://docs.flowiseai.com/deployment/render) + +### [HuggingFace Spaces](https://docs.flowiseai.com/deployment/hugging-face) + +HuggingFace Spaces + +### [AWS](https://docs.flowiseai.com/deployment/aws) + +### [Azure](https://docs.flowiseai.com/deployment/azure) + +### [DigitalOcean](https://docs.flowiseai.com/deployment/digital-ocean) + +### [GCP](https://docs.flowiseai.com/deployment/gcp) + +## 💻 云托管 + +即将推出 + +## 🙋 支持 + +在[讨论区](https://github.com/FlowiseAI/Flowise/discussions)中随时提问、提出问题和请求新功能 + +## 🙌 贡献 + +感谢这些了不起的贡献者 + + + + + +参见[贡献指南](CONTRIBUTING.md)。如果您有任何问题或问题,请在[Discord](https://discord.gg/jbaHfsRVBW)上与我们联系。 + +## 📄 许可证 + +此代码库中的源代码在[MIT 许可证](LICENSE.md)下提供。 diff --git a/README.md b/README.md index 545b36ba8..b98a223a6 100644 --- a/README.md +++ b/README.md @@ -1,14 +1,25 @@ -# Flowise - LangchainJS UI + +# Flowise - Build LLM Apps Easily + +[![Release Notes](https://img.shields.io/github/release/FlowiseAI/Flowise)](https://github.com/FlowiseAI/Flowise/releases) 
+[![Discord](https://img.shields.io/discord/1087698854775881778?label=Discord&logo=discord)](https://discord.gg/jbaHfsRVBW) +[![Twitter Follow](https://img.shields.io/twitter/follow/FlowiseAI?style=social)](https://twitter.com/FlowiseAI) +[![GitHub star chart](https://img.shields.io/github/stars/FlowiseAI/Flowise?style=social)](https://star-history.com/#FlowiseAI/Flowise) +[![GitHub fork](https://img.shields.io/github/forks/FlowiseAI/Flowise?style=social)](https://github.com/FlowiseAI/Flowise/fork) + +English | [中文](./README-ZH.md) + +

Drag & drop UI to build your customized LLM flow

-Drag & drop UI to build your customized LLM flow using [LangchainJS](https://github.com/hwchase17/langchainjs) - ## ⚡Quick Start +Download and Install [NodeJS](https://nodejs.org/en/download) >= 18.15.0 + 1. Install Flowise ```bash npm install -g flowise @@ -19,16 +30,41 @@ Drag & drop UI to build your customized LLM flow using [LangchainJS](https://git npx flowise start ``` + With username & password + + ```bash + npx flowise start --FLOWISE_USERNAME=user --FLOWISE_PASSWORD=1234 + ``` + 3. Open [http://localhost:3000](http://localhost:3000) ## 🐳 Docker +### Docker Compose + 1. Go to `docker` folder at the root of the project -2. Create `.env` file and specify the `PORT` (refer to `.env.example`) +2. Copy `.env.example` file, paste it into the same location, and rename to `.env` 3. `docker-compose up -d` 4. Open [http://localhost:3000](http://localhost:3000) 5. You can bring the containers down by `docker-compose stop` +### Docker Image + +1. Build the image locally: + ```bash + docker build --no-cache -t flowise . + ``` +2. Run image: + + ```bash + docker run -d --name flowise -p 3000:3000 flowise + ``` + +3. Stop image: + ```bash + docker stop flowise + ``` + ## 👨‍💻 Developers Flowise has 3 different modules in a single mono repository. @@ -39,7 +75,7 @@ Flowise has 3 different modules in a single mono repository. ### Prerequisite -- Install Yarn +- Install [Yarn v1](https://classic.yarnpkg.com/en/docs/install) ```bash npm i -g yarn ``` @@ -80,31 +116,57 @@ Flowise has 3 different modules in a single mono repository. 6. 
For development build: - ```bash - yarn dev - ``` + - Create `.env` file and specify the `PORT` (refer to `.env.example`) in `packages/ui` + - Create `.env` file and specify the `PORT` (refer to `.env.example`) in `packages/server` + - Run + + ```bash + yarn dev + ``` Any code changes will reload the app automatically on [http://localhost:8080](http://localhost:8080) ## 🔒 Authentication -To enable app level authentication, add `USERNAME` and `PASSWORD` to the `.env` file in `packages/server`: +To enable app level authentication, add `FLOWISE_USERNAME` and `FLOWISE_PASSWORD` to the `.env` file in `packages/server`: ``` -USERNAME=user -PASSWORD=1234 +FLOWISE_USERNAME=user +FLOWISE_PASSWORD=1234 ``` +## 🌱 Env Variables + +Flowise supports different environment variables to configure your instance. You can specify the following variables in the `.env` file inside the `packages/server` folder. Read [more](https://github.com/FlowiseAI/Flowise/blob/main/CONTRIBUTING.md#-env-variables) + ## 📖 Documentation -Coming soon - -## 💻 Cloud Hosted - -Coming soon +[Flowise Docs](https://docs.flowiseai.com/) ## 🌐 Self Host +### [Railway](https://docs.flowiseai.com/deployment/railway) + +[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/template/pn4G8S?referralCode=WVNPD9) + +### [Render](https://docs.flowiseai.com/deployment/render) + +[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://docs.flowiseai.com/deployment/render) + +### [HuggingFace Spaces](https://docs.flowiseai.com/deployment/hugging-face) + +HuggingFace Spaces + +### [AWS](https://docs.flowiseai.com/deployment/aws) + +### [Azure](https://docs.flowiseai.com/deployment/azure) + +### [DigitalOcean](https://docs.flowiseai.com/deployment/digital-ocean) + +### [GCP](https://docs.flowiseai.com/deployment/gcp) + +## 💻 Cloud Hosted + Coming soon ## 🙋 Support @@ -113,7 +175,14 @@ Feel free to ask any questions, raise problems, and request new features in [dis ## 🙌 Contributing 
+Thanks go to these awesome contributors + + + + + See [contributing guide](CONTRIBUTING.md). Reach out to us at [Discord](https://discord.gg/jbaHfsRVBW) if you have any questions or issues. +[![Star History Chart](https://api.star-history.com/svg?repos=FlowiseAI/Flowise&type=Timeline)](https://star-history.com/#FlowiseAI/Flowise&Date) ## 📄 License diff --git a/artillery-load-test.yml b/artillery-load-test.yml new file mode 100644 index 000000000..6b1c81401 --- /dev/null +++ b/artillery-load-test.yml @@ -0,0 +1,36 @@ +# npm install -g artillery@latest +# artillery run artillery-load-test.yml +# Refer to https://www.artillery.io/docs + +config: + target: http://128.128.128.128:3000 # replace with your URL + phases: + - duration: 1 + arrivalRate: 1 + rampTo: 2 + name: Warm up phase + - duration: 1 + arrivalRate: 2 + rampTo: 3 + name: Ramp up load + - duration: 1 + arrivalRate: 3 + name: Sustained peak load +scenarios: + - flow: + - loop: + - post: + url: '/api/v1/prediction/chatflow-id' # replace with your chatflowid + json: + question: 'hello' # replace with your question + count: 1 # how many requests each user makes + +# User __ +# 3 / +# 2 / +# 1 _/ +# 1 2 3 +# Seconds +# Total Users = 2 + 3 + 3 = 8 +# Each making 1 HTTP call +# Over a duration of 3 seconds diff --git a/docker/.env.example b/docker/.env.example index c0c68b1ca..16b19cdcf 100644 --- a/docker/.env.example +++ b/docker/.env.example @@ -1 +1,26 @@ -PORT=3000 \ No newline at end of file +PORT=3000 +PASSPHRASE=MYPASSPHRASE # Passphrase used to create encryption key +DATABASE_PATH=/root/.flowise +APIKEY_PATH=/root/.flowise +SECRETKEY_PATH=/root/.flowise +LOG_PATH=/root/.flowise/logs + +# DATABASE_TYPE=postgres +# DATABASE_PORT="" +# DATABASE_HOST="" +# DATABASE_NAME="flowise" +# DATABASE_USER="" +# DATABASE_PASSWORD="" +# OVERRIDE_DATABASE=true + +# FLOWISE_USERNAME=user +# FLOWISE_PASSWORD=1234 +# DEBUG=true +# LOG_LEVEL=debug (error | warn | info | verbose | debug) +# TOOL_FUNCTION_BUILTIN_DEP=crypto,fs +# 
TOOL_FUNCTION_EXTERNAL_DEP=moment,lodash + +# LANGCHAIN_TRACING_V2=true +# LANGCHAIN_ENDPOINT=https://api.smith.langchain.com +# LANGCHAIN_API_KEY=your_api_key +# LANGCHAIN_PROJECT=your_project \ No newline at end of file diff --git a/docker/Dockerfile b/docker/Dockerfile index e4bf704a0..1ad1bf5ee 100644 --- a/docker/Dockerfile +++ b/docker/Dockerfile @@ -4,6 +4,14 @@ USER root RUN apk add --no-cache git RUN apk add --no-cache python3 py3-pip make g++ +# needed for pdfjs-dist +RUN apk add --no-cache build-base cairo-dev pango-dev + +# Install Chromium +RUN apk add --no-cache chromium + +ENV PUPPETEER_SKIP_DOWNLOAD=true +ENV PUPPETEER_EXECUTABLE_PATH=/usr/bin/chromium-browser # You can install a specific version like: flowise@1.0.0 RUN npm install -g flowise diff --git a/docker/README.md b/docker/README.md new file mode 100644 index 000000000..d3ad1c197 --- /dev/null +++ b/docker/README.md @@ -0,0 +1,35 @@ +# Flowise Docker Hub Image + +Starts Flowise from [DockerHub Image](https://hub.docker.com/repository/docker/flowiseai/flowise/general) + +## Usage + +1. Create `.env` file and specify the `PORT` (refer to `.env.example`) +2. `docker-compose up -d` +3. Open [http://localhost:3000](http://localhost:3000) +4. You can bring the containers down by `docker-compose stop` + +## 🔒 Authentication + +1. Create `.env` file and specify the `PORT`, `FLOWISE_USERNAME`, and `FLOWISE_PASSWORD` (refer to `.env.example`) +2. Pass `FLOWISE_USERNAME` and `FLOWISE_PASSWORD` to the `docker-compose.yml` file: + ``` + environment: + - PORT=${PORT} + - FLOWISE_USERNAME=${FLOWISE_USERNAME} + - FLOWISE_PASSWORD=${FLOWISE_PASSWORD} + ``` +3. `docker-compose up -d` +4. Open [http://localhost:3000](http://localhost:3000) +5. 
You can bring the containers down by `docker-compose stop` + +## 🌱 Env Variables + +If you would like to persist your data (flows, logs, apikeys, credentials), set these variables in the `.env` file inside the `docker` folder: + +- DATABASE_PATH=/root/.flowise +- APIKEY_PATH=/root/.flowise +- LOG_PATH=/root/.flowise/logs +- SECRETKEY_PATH=/root/.flowise + +Flowise also supports different environment variables to configure your instance. Read [more](https://docs.flowiseai.com/environment-variables) diff --git a/docker/docker-compose.yml b/docker/docker-compose.yml index 7d142cb8a..4a03bcf33 100644 --- a/docker/docker-compose.yml +++ b/docker/docker-compose.yml @@ -6,6 +6,15 @@ services: restart: always environment: - PORT=${PORT} + - PASSPHRASE=${PASSPHRASE} + - FLOWISE_USERNAME=${FLOWISE_USERNAME} + - FLOWISE_PASSWORD=${FLOWISE_PASSWORD} + - DEBUG=${DEBUG} + - DATABASE_PATH=${DATABASE_PATH} + - APIKEY_PATH=${APIKEY_PATH} + - SECRETKEY_PATH=${SECRETKEY_PATH} + - LOG_LEVEL=${LOG_LEVEL} + - LOG_PATH=${LOG_PATH} ports: - '${PORT}:${PORT}' volumes: diff --git a/images/flowise.png b/images/flowise.png new file mode 100644 index 000000000..09c71fde2 Binary files /dev/null and b/images/flowise.png differ diff --git a/package.json b/package.json index 50c59a308..d5af440d2 100644 --- a/package.json +++ b/package.json @@ -1,6 +1,6 @@ { "name": "flowise", - "version": "1.2.6", + "version": "1.3.3", "private": true, "homepage": "https://flowiseai.com", "workspaces": [ @@ -28,7 +28,6 @@ "*.{js,jsx,ts,tsx,json,md}": "eslint --fix" }, "devDependencies": { - "turbo": "1.7.4", "@babel/preset-env": "^7.19.4", "@babel/preset-typescript": "7.18.6", "@types/express": "^4.17.13", @@ -48,6 +47,7 @@ "pretty-quick": "^3.1.3", "rimraf": "^3.0.2", "run-script-os": "^1.1.6", + "turbo": "1.7.4", "typescript": "^4.8.4" }, "engines": { diff --git a/packages/components/.env.example b/packages/components/.env.example deleted file mode 100644 index 352bc6cb0..000000000 --- a/packages/components/.env.example 
+++ /dev/null @@ -1 +0,0 @@ -DEBUG=true \ No newline at end of file diff --git a/packages/components/README-ZH.md b/packages/components/README-ZH.md new file mode 100644 index 000000000..2a8ba4ac5 --- /dev/null +++ b/packages/components/README-ZH.md @@ -0,0 +1,19 @@ + + +# 流式组件 + +[English](./README.md) | 中文 + +Flowise 的应用集成。包含节点和凭据。 + +![Flowise](https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true) + +安装: + +```bash +npm i flowise-components +``` + +## 许可证 + +此存储库中的源代码在[MIT 许可证](https://github.com/FlowiseAI/Flowise/blob/master/LICENSE.md)下提供。 diff --git a/packages/components/README.md b/packages/components/README.md index 8014661e1..848071882 100644 --- a/packages/components/README.md +++ b/packages/components/README.md @@ -2,6 +2,8 @@ # Flowise Components +English | [中文](./README-ZH.md) + Apps integration for Flowise. Contain Nodes and Credentials. ![Flowise](https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true) @@ -12,14 +14,6 @@ Install: npm i flowise-components ``` -## Debug - -To view all the logs, create an `.env` file and add: - -``` -DEBUG=true -``` - ## License Source code in this repository is made available under the [MIT License](https://github.com/FlowiseAI/Flowise/blob/master/LICENSE.md). 
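Every `*.credential.ts` file added later in this diff follows the same small class pattern: a constructor fills in `label`, `name`, `version`, and an `inputs` array of parameter descriptors, and the class is exported as `credClass`. A condensed, self-contained sketch of that pattern — note the `INodeParams`/`INodeCredential` interfaces here are simplified stand-ins for the real ones in `packages/components/src/Interface`, and `ExampleApi` is a hypothetical credential, not one added by this PR:

```typescript
// Simplified stand-ins for the interfaces in packages/components/src/Interface
interface INodeParams {
    label: string
    name: string
    type: string
    placeholder?: string
}

interface INodeCredential {
    label: string
    name: string
    version: number
    inputs: INodeParams[]
}

// Hypothetical credential following the shape of the new *.credential.ts files
class ExampleApi implements INodeCredential {
    label: string
    name: string
    version: number
    inputs: INodeParams[]

    constructor() {
        this.label = 'Example API'
        this.name = 'exampleApi'
        this.version = 1.0
        this.inputs = [
            {
                label: 'Example Api Key',
                name: 'exampleApiKey',
                type: 'password' // the 'password' type marks the field as a secret
            }
        ]
    }
}

// The real files additionally export the class, e.g.:
// module.exports = { credClass: ExampleApi }
```

The class only *describes* the fields a user fills in (label, field name, input type); it never holds secret values itself.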
diff --git a/packages/components/credentials/AirtableApi.credential.ts b/packages/components/credentials/AirtableApi.credential.ts new file mode 100644 index 000000000..323b308f3 --- /dev/null +++ b/packages/components/credentials/AirtableApi.credential.ts @@ -0,0 +1,27 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class AirtableApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Airtable API' + this.name = 'airtableApi' + this.version = 1.0 + this.description = + 'Refer to official guide on how to get accessToken on Airtable' + this.inputs = [ + { + label: 'Access Token', + name: 'accessToken', + type: 'password', + placeholder: '' + } + ] + } +} + +module.exports = { credClass: AirtableApi } diff --git a/packages/components/credentials/AnthropicApi.credential.ts b/packages/components/credentials/AnthropicApi.credential.ts new file mode 100644 index 000000000..955196c9b --- /dev/null +++ b/packages/components/credentials/AnthropicApi.credential.ts @@ -0,0 +1,23 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class AnthropicApi implements INodeCredential { + label: string + name: string + version: number + inputs: INodeParams[] + + constructor() { + this.label = 'Anthropic API' + this.name = 'anthropicApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Anthropic Api Key', + name: 'anthropicApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: AnthropicApi } diff --git a/packages/components/credentials/ApifyApi.credential.ts b/packages/components/credentials/ApifyApi.credential.ts new file mode 100644 index 000000000..c961fd385 --- /dev/null +++ b/packages/components/credentials/ApifyApi.credential.ts @@ -0,0 +1,26 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class ApifyApiCredential implements INodeCredential { + label: string + name: string + version: 
number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Apify API' + this.name = 'apifyApi' + this.version = 1.0 + this.description = + 'You can find the Apify API token on your Apify account page.' + this.inputs = [ + { + label: 'Apify API', + name: 'apifyApiToken', + type: 'password' + } + ] + } +} + +module.exports = { credClass: ApifyApiCredential } diff --git a/packages/components/credentials/AzureOpenAIApi.credential.ts b/packages/components/credentials/AzureOpenAIApi.credential.ts new file mode 100644 index 000000000..65f63f379 --- /dev/null +++ b/packages/components/credentials/AzureOpenAIApi.credential.ts @@ -0,0 +1,47 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class AzureOpenAIApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Azure OpenAI API' + this.name = 'azureOpenAIApi' + this.version = 1.0 + this.description = + 'Refer to official guide of how to use Azure OpenAI service' + this.inputs = [ + { + label: 'Azure OpenAI Api Key', + name: 'azureOpenAIApiKey', + type: 'password', + description: `Refer to official guide on how to create API key on Azure OpenAI` + }, + { + label: 'Azure OpenAI Api Instance Name', + name: 'azureOpenAIApiInstanceName', + type: 'string', + placeholder: 'YOUR-INSTANCE-NAME' + }, + { + label: 'Azure OpenAI Api Deployment Name', + name: 'azureOpenAIApiDeploymentName', + type: 'string', + placeholder: 'YOUR-DEPLOYMENT-NAME' + }, + { + label: 'Azure OpenAI Api Version', + name: 'azureOpenAIApiVersion', + type: 'string', + placeholder: '2023-06-01-preview', + description: + 'Description of Supported API Versions. 
Please refer examples' + } + ] + } +} + +module.exports = { credClass: AzureOpenAIApi } diff --git a/packages/components/credentials/BraveSearchApi.credential.ts b/packages/components/credentials/BraveSearchApi.credential.ts new file mode 100644 index 000000000..fdacf82c1 --- /dev/null +++ b/packages/components/credentials/BraveSearchApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class BraveSearchApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Brave Search API' + this.name = 'braveSearchApi' + this.version = 1.0 + this.inputs = [ + { + label: 'BraveSearch Api Key', + name: 'braveApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: BraveSearchApi } diff --git a/packages/components/credentials/ChromaApi.credential.ts b/packages/components/credentials/ChromaApi.credential.ts new file mode 100644 index 000000000..759c113cd --- /dev/null +++ b/packages/components/credentials/ChromaApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class ChromaApi implements INodeCredential { + label: string + name: string + description: string + version: number + inputs: INodeParams[] + + constructor() { + this.label = 'Chroma API' + this.name = 'chromaApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Chroma Api Key', + name: 'chromaApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: ChromaApi } diff --git a/packages/components/credentials/CohereApi.credential.ts b/packages/components/credentials/CohereApi.credential.ts new file mode 100644 index 000000000..b171090e5 --- /dev/null +++ b/packages/components/credentials/CohereApi.credential.ts @@ -0,0 +1,23 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class CohereApi implements INodeCredential { + label: string + name: string + version: 
number + inputs: INodeParams[] + + constructor() { + this.label = 'Cohere API' + this.name = 'cohereApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Cohere Api Key', + name: 'cohereApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: CohereApi } diff --git a/packages/components/credentials/ConfluenceApi.credential.ts b/packages/components/credentials/ConfluenceApi.credential.ts new file mode 100644 index 000000000..a1d32e9ca --- /dev/null +++ b/packages/components/credentials/ConfluenceApi.credential.ts @@ -0,0 +1,33 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class ConfluenceApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Confluence API' + this.name = 'confluenceApi' + this.version = 1.0 + this.description = + 'Refer to official guide on how to get accessToken on Confluence' + this.inputs = [ + { + label: 'Access Token', + name: 'accessToken', + type: 'password', + placeholder: '' + }, + { + label: 'Username', + name: 'username', + type: 'string', + placeholder: '' + } + ] + } +} + +module.exports = { credClass: ConfluenceApi } diff --git a/packages/components/credentials/DynamodbMemoryApi.credential.ts b/packages/components/credentials/DynamodbMemoryApi.credential.ts new file mode 100644 index 000000000..2f5ffa64c --- /dev/null +++ b/packages/components/credentials/DynamodbMemoryApi.credential.ts @@ -0,0 +1,29 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class DynamodbMemoryApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'DynamodbMemory API' + this.name = 'dynamodbMemoryApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Access Key', + name: 'accessKey', + type: 'password' + }, + { + label: 'Secret Access Key', + name: 'secretAccessKey', + type: 
'password' + } + ] + } +} + +module.exports = { credClass: DynamodbMemoryApi } diff --git a/packages/components/credentials/FigmaApi.credential.ts b/packages/components/credentials/FigmaApi.credential.ts new file mode 100644 index 000000000..aed49359e --- /dev/null +++ b/packages/components/credentials/FigmaApi.credential.ts @@ -0,0 +1,27 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class FigmaApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Figma API' + this.name = 'figmaApi' + this.version = 1.0 + this.description = + 'Refer to official guide on how to get accessToken on Figma' + this.inputs = [ + { + label: 'Access Token', + name: 'accessToken', + type: 'password', + placeholder: '' + } + ] + } +} + +module.exports = { credClass: FigmaApi } diff --git a/packages/components/credentials/GithubApi.credential.ts b/packages/components/credentials/GithubApi.credential.ts new file mode 100644 index 000000000..34c5074ec --- /dev/null +++ b/packages/components/credentials/GithubApi.credential.ts @@ -0,0 +1,27 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class GithubApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Github API' + this.name = 'githubApi' + this.version = 1.0 + this.description = + 'Refer to official guide on how to get accessToken on Github' + this.inputs = [ + { + label: 'Access Token', + name: 'accessToken', + type: 'password', + placeholder: '' + } + ] + } +} + +module.exports = { credClass: GithubApi } diff --git a/packages/components/credentials/GoogleAuth.credential.ts b/packages/components/credentials/GoogleAuth.credential.ts new file mode 100644 index 000000000..16b8e3b3f --- /dev/null +++ b/packages/components/credentials/GoogleAuth.credential.ts @@ -0,0 +1,55 @@ +import { 
INodeParams, INodeCredential } from '../src/Interface' + +class GoogleVertexAuth implements INodeCredential { + label: string + name: string + version: number + inputs: INodeParams[] + + constructor() { + this.label = 'Google Vertex Auth' + this.name = 'googleVertexAuth' + this.version = 1.0 + this.inputs = [ + { + label: 'Google Application Credential File Path', + name: 'googleApplicationCredentialFilePath', + description: + 'Path to your google application credential json file. You can also use the credential JSON object (either one)', + placeholder: 'your-path/application_default_credentials.json', + type: 'string', + optional: true + }, + { + label: 'Google Credential JSON Object', + name: 'googleApplicationCredential', + description: 'JSON object of your google application credential. You can also use the file path (either one)', + placeholder: `{ + "type": ..., + "project_id": ..., + "private_key_id": ..., + "private_key": ..., + "client_email": ..., + "client_id": ..., + "auth_uri": ..., + "token_uri": ..., + "auth_provider_x509_cert_url": ..., + "client_x509_cert_url": ... +}`, + type: 'string', + rows: 4, + optional: true + }, + { + label: 'Project ID', + name: 'projectID', + description: 'Project ID of GCP. 
If not provided, it will be read from the credential file', + type: 'string', + optional: true, + additionalParams: true + } + ] + } +} + +module.exports = { credClass: GoogleVertexAuth } diff --git a/packages/components/credentials/GoogleSearchApi.credential.ts b/packages/components/credentials/GoogleSearchApi.credential.ts new file mode 100644 index 000000000..cb82b25ae --- /dev/null +++ b/packages/components/credentials/GoogleSearchApi.credential.ts @@ -0,0 +1,31 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class GoogleSearchApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Google Custom Search API' + this.name = 'googleCustomSearchApi' + this.version = 1.0 + this.description = + 'Please refer to the Google Cloud Console for instructions on how to create an API key, and visit the Search Engine Creation page to learn how to generate your Search Engine ID.' 
+ this.inputs = [ + { + label: 'Google Custom Search Api Key', + name: 'googleCustomSearchApiKey', + type: 'password' + }, + { + label: 'Programmable Search Engine ID', + name: 'googleCustomSearchApiId', + type: 'string' + } + ] + } +} + +module.exports = { credClass: GoogleSearchApi } diff --git a/packages/components/credentials/HuggingFaceApi.credential.ts b/packages/components/credentials/HuggingFaceApi.credential.ts new file mode 100644 index 000000000..1b9221941 --- /dev/null +++ b/packages/components/credentials/HuggingFaceApi.credential.ts @@ -0,0 +1,23 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class HuggingFaceApi implements INodeCredential { + label: string + name: string + version: number + inputs: INodeParams[] + + constructor() { + this.label = 'HuggingFace API' + this.name = 'huggingFaceApi' + this.version = 1.0 + this.inputs = [ + { + label: 'HuggingFace Api Key', + name: 'huggingFaceApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: HuggingFaceApi } diff --git a/packages/components/credentials/MotorheadMemoryApi.credential.ts b/packages/components/credentials/MotorheadMemoryApi.credential.ts new file mode 100644 index 000000000..68a18ec1c --- /dev/null +++ b/packages/components/credentials/MotorheadMemoryApi.credential.ts @@ -0,0 +1,31 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class MotorheadMemoryApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Motorhead Memory API' + this.name = 'motorheadMemoryApi' + this.version = 1.0 + this.description = + 'Refer to official guide on how to create API key and Client ID on Motorhead Memory' + this.inputs = [ + { + label: 'Client ID', + name: 'clientId', + type: 'string' + }, + { + label: 'API Key', + name: 'apiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: MotorheadMemoryApi } diff --git 
a/packages/components/credentials/NotionApi.credential.ts b/packages/components/credentials/NotionApi.credential.ts new file mode 100644 index 000000000..ebe4bf99d --- /dev/null +++ b/packages/components/credentials/NotionApi.credential.ts @@ -0,0 +1,26 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class NotionApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Notion API' + this.name = 'notionApi' + this.version = 1.0 + this.description = + 'You can find integration token here' + this.inputs = [ + { + label: 'Notion Integration Token', + name: 'notionIntegrationToken', + type: 'password' + } + ] + } +} + +module.exports = { credClass: NotionApi } diff --git a/packages/components/credentials/OpenAIApi.credential.ts b/packages/components/credentials/OpenAIApi.credential.ts new file mode 100644 index 000000000..836da7e9b --- /dev/null +++ b/packages/components/credentials/OpenAIApi.credential.ts @@ -0,0 +1,23 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class OpenAIApi implements INodeCredential { + label: string + name: string + version: number + inputs: INodeParams[] + + constructor() { + this.label = 'OpenAI API' + this.name = 'openAIApi' + this.version = 1.0 + this.inputs = [ + { + label: 'OpenAI Api Key', + name: 'openAIApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: OpenAIApi } diff --git a/packages/components/credentials/OpenAPIAuth.credential.ts b/packages/components/credentials/OpenAPIAuth.credential.ts new file mode 100644 index 000000000..3f0ef9075 --- /dev/null +++ b/packages/components/credentials/OpenAPIAuth.credential.ts @@ -0,0 +1,25 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class OpenAPIAuth implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + 
this.label = 'OpenAPI Auth Token' + this.name = 'openAPIAuth' + this.version = 1.0 + this.inputs = [ + { + label: 'OpenAPI Token', + name: 'openAPIToken', + type: 'password', + description: 'Auth Token. For example: Bearer ' + } + ] + } +} + +module.exports = { credClass: OpenAPIAuth } diff --git a/packages/components/credentials/PineconeApi.credential.ts b/packages/components/credentials/PineconeApi.credential.ts new file mode 100644 index 000000000..4c5f62fe8 --- /dev/null +++ b/packages/components/credentials/PineconeApi.credential.ts @@ -0,0 +1,29 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class PineconeApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Pinecone API' + this.name = 'pineconeApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Pinecone Api Key', + name: 'pineconeApiKey', + type: 'password' + }, + { + label: 'Pinecone Environment', + name: 'pineconeEnv', + type: 'string' + } + ] + } +} + +module.exports = { credClass: PineconeApi } diff --git a/packages/components/credentials/QdrantApi.credential.ts b/packages/components/credentials/QdrantApi.credential.ts new file mode 100644 index 000000000..fffebccc2 --- /dev/null +++ b/packages/components/credentials/QdrantApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class QdrantApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Qdrant API' + this.name = 'qdrantApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Qdrant API Key', + name: 'qdrantApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: QdrantApi } diff --git a/packages/components/credentials/ReplicateApi.credential.ts b/packages/components/credentials/ReplicateApi.credential.ts new file mode 100644 index 
000000000..e638826be --- /dev/null +++ b/packages/components/credentials/ReplicateApi.credential.ts @@ -0,0 +1,23 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class ReplicateApi implements INodeCredential { + label: string + name: string + version: number + inputs: INodeParams[] + + constructor() { + this.label = 'Replicate API' + this.name = 'replicateApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Replicate Api Key', + name: 'replicateApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: ReplicateApi } diff --git a/packages/components/credentials/SerpApi.credential.ts b/packages/components/credentials/SerpApi.credential.ts new file mode 100644 index 000000000..20cf6ab5a --- /dev/null +++ b/packages/components/credentials/SerpApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class SerpApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Serp API' + this.name = 'serpApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Serp Api Key', + name: 'serpApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: SerpApi } diff --git a/packages/components/credentials/SerperApi.credential.ts b/packages/components/credentials/SerperApi.credential.ts new file mode 100644 index 000000000..9a8fee1e3 --- /dev/null +++ b/packages/components/credentials/SerperApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class SerperApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Serper API' + this.name = 'serperApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Serper Api Key', + name: 'serperApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: SerperApi } 
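All of the credential files added above follow the same `INodeCredential` shape: metadata assigned in the constructor plus an `inputs` array describing the form fields the UI renders. A minimal standalone sketch of the pattern — the two interfaces are simplified stand-ins for illustration; the real definitions live in `packages/components/src/Interface`:

```typescript
// Simplified stand-ins for the interfaces in packages/components/src/Interface
interface INodeParams {
    label: string
    name: string
    type: string
    placeholder?: string
    description?: string
    optional?: boolean
}

interface INodeCredential {
    label: string
    name: string
    version: number
    inputs: INodeParams[]
}

// Hypothetical credential following the same pattern as the files in this PR
class ExampleApi implements INodeCredential {
    label: string
    name: string
    version: number
    inputs: INodeParams[]

    constructor() {
        this.label = 'Example API'
        this.name = 'exampleApi'
        this.version = 1.0
        this.inputs = [
            {
                label: 'Example Api Key',
                name: 'exampleApiKey',
                type: 'password' // rendered as a masked input in the UI
            }
        ]
    }
}

const cred = new ExampleApi()
console.log(cred.name, cred.inputs.length) // exampleApi 1
```

Each file then exports the class as `module.exports = { credClass: ... }` so the server can discover it by filename.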
diff --git a/packages/components/credentials/SingleStoreApi.credential.ts b/packages/components/credentials/SingleStoreApi.credential.ts new file mode 100644 index 000000000..fee9853bb --- /dev/null +++ b/packages/components/credentials/SingleStoreApi.credential.ts @@ -0,0 +1,31 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class SingleStoreApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'SingleStore API' + this.name = 'singleStoreApi' + this.version = 1.0 + this.inputs = [ + { + label: 'User', + name: 'user', + type: 'string', + placeholder: '' + }, + { + label: 'Password', + name: 'password', + type: 'password', + placeholder: '' + } + ] + } +} + +module.exports = { credClass: SingleStoreApi } diff --git a/packages/components/credentials/SupabaseApi.credential.ts b/packages/components/credentials/SupabaseApi.credential.ts new file mode 100644 index 000000000..beb2a4223 --- /dev/null +++ b/packages/components/credentials/SupabaseApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class SupabaseApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Supabase API' + this.name = 'supabaseApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Supabase API Key', + name: 'supabaseApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: SupabaseApi } diff --git a/packages/components/credentials/VectaraApi.credential.ts b/packages/components/credentials/VectaraApi.credential.ts new file mode 100644 index 000000000..96ad29a66 --- /dev/null +++ b/packages/components/credentials/VectaraApi.credential.ts @@ -0,0 +1,34 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class VectaraAPI implements INodeCredential { + label: string + name: 
string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Vectara API' + this.name = 'vectaraApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Vectara Customer ID', + name: 'customerID', + type: 'string' + }, + { + label: 'Vectara Corpus ID', + name: 'corpusID', + type: 'string' + }, + { + label: 'Vectara API Key', + name: 'apiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: VectaraAPI } diff --git a/packages/components/credentials/WeaviateApi.credential.ts b/packages/components/credentials/WeaviateApi.credential.ts new file mode 100644 index 000000000..041b41eac --- /dev/null +++ b/packages/components/credentials/WeaviateApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class WeaviateApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Weaviate API' + this.name = 'weaviateApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Weaviate API Key', + name: 'weaviateApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: WeaviateApi } diff --git a/packages/components/credentials/ZapierNLAApi.credential.ts b/packages/components/credentials/ZapierNLAApi.credential.ts new file mode 100644 index 000000000..72035660e --- /dev/null +++ b/packages/components/credentials/ZapierNLAApi.credential.ts @@ -0,0 +1,24 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class ZapierNLAApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Zapier NLA API' + this.name = 'zapierNLAApi' + this.version = 1.0 + this.inputs = [ + { + label: 'Zapier NLA Api Key', + name: 'zapierNLAApiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: ZapierNLAApi } diff --git 
a/packages/components/credentials/ZepMemoryApi.credential.ts b/packages/components/credentials/ZepMemoryApi.credential.ts new file mode 100644 index 000000000..a78ad6d60 --- /dev/null +++ b/packages/components/credentials/ZepMemoryApi.credential.ts @@ -0,0 +1,26 @@ +import { INodeParams, INodeCredential } from '../src/Interface' + +class ZepMemoryApi implements INodeCredential { + label: string + name: string + version: number + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Zep Memory API' + this.name = 'zepMemoryApi' + this.version = 1.0 + this.description = + 'Refer to official guide on how to create API key on Zep' + this.inputs = [ + { + label: 'API Key', + name: 'apiKey', + type: 'password' + } + ] + } +} + +module.exports = { credClass: ZepMemoryApi } diff --git a/packages/components/nodes/agents/AirtableAgent/AirtableAgent.ts b/packages/components/nodes/agents/AirtableAgent/AirtableAgent.ts new file mode 100644 index 000000000..074f39c1b --- /dev/null +++ b/packages/components/nodes/agents/AirtableAgent/AirtableAgent.ts @@ -0,0 +1,232 @@ +import { ICommonObject, INode, INodeData, INodeParams, PromptTemplate } from '../../../src/Interface' +import { AgentExecutor } from 'langchain/agents' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { LoadPyodide, finalSystemPrompt, systemPrompt } from './core' +import { LLMChain } from 'langchain/chains' +import { BaseLanguageModel } from 'langchain/base_language' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' +import axios from 'axios' + +class Airtable_Agents implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Airtable Agent' + this.name = 'airtableAgent' + this.version = 1.0 + this.type = 'AgentExecutor' + 
this.category = 'Agents' + this.icon = 'airtable.svg' + this.description = 'Agent used to answer queries on an Airtable table' + this.baseClasses = [this.type, ...getBaseClasses(AgentExecutor)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['airtableApi'] + } + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'Base Id', + name: 'baseId', + type: 'string', + placeholder: 'app11RobdGoX0YNsC', + description: + 'If your table URL looks like: https://airtable.com/app11RobdGoX0YNsC/tblJdmvbrgizbYICO/viw9UrP77Id0CE4ee, app11RobdGoX0YNsC is the base id' + }, + { + label: 'Table Id', + name: 'tableId', + type: 'string', + placeholder: 'tblJdmvbrgizbYICO', + description: + 'If your table URL looks like: https://airtable.com/app11RobdGoX0YNsC/tblJdmvbrgizbYICO/viw9UrP77Id0CE4ee, tblJdmvbrgizbYICO is the table id' + }, + { + label: 'Return All', + name: 'returnAll', + type: 'boolean', + default: true, + additionalParams: true, + description: 'If all results should be returned or only up to a given limit' + }, + { + label: 'Limit', + name: 'limit', + type: 'number', + default: 100, + additionalParams: true, + description: 'Number of results to return' + } + ] + } + + async init(): Promise { + // Not used + return undefined + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const baseId = nodeData.inputs?.baseId as string + const tableId = nodeData.inputs?.tableId as string + const returnAll = nodeData.inputs?.returnAll as boolean + const limit = nodeData.inputs?.limit as string + + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const accessToken = getCredentialParam('accessToken', credentialData, nodeData) + + let airtableData: ICommonObject[] = [] + + if (returnAll) { + airtableData = await loadAll(baseId, tableId, accessToken) + } else { + airtableData = await loadLimit(limit ? parseInt(limit, 10) : 100, baseId, tableId, accessToken) + } + + let base64String = Buffer.from(JSON.stringify(airtableData)).toString('base64') + + const loggerHandler = new ConsoleCallbackHandler(options.logger) + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + + const pyodide = await LoadPyodide() + + // First load the Airtable records as JSON and get the dataframe dictionary of column types + // For example using titanic.csv: {'PassengerId': 'int64', 'Survived': 'int64', 'Pclass': 'int64', 'Name': 'object', 'Sex': 'object', 'Age': 'float64', 'SibSp': 'int64', 'Parch': 'int64', 'Ticket': 'object', 'Fare': 'float64', 'Cabin': 'object', 'Embarked': 'object'} + let dataframeColDict = '' + try { + const code = `import pandas as pd +import base64 +import json + +base64_string = "${base64String}" + +decoded_data = base64.b64decode(base64_string) + +json_data = json.loads(decoded_data) + +df = pd.DataFrame(json_data) +my_dict = df.dtypes.astype(str).to_dict() +print(my_dict) +json.dumps(my_dict)` + dataframeColDict = await pyodide.runPythonAsync(code) + } catch (error) { + throw new Error(error) + } + + // Then tell GPT to come out with ONLY python code + // For example: len(df), df[df['SibSp'] > 3]['PassengerId'].count() + let pythonCode = '' + if (dataframeColDict) { + const chain = new LLMChain({ + llm: model, + prompt: PromptTemplate.fromTemplate(systemPrompt), + verbose: process.env.DEBUG === 'true' ? 
true : false + }) + const inputs = { + dict: dataframeColDict, + question: input + } + const res = await chain.call(inputs, [loggerHandler]) + pythonCode = res?.text + + // Then run the code using Pyodide + let finalResult = '' + if (pythonCode) { + try { + const code = `import pandas as pd\n${pythonCode}` + finalResult = await pyodide.runPythonAsync(code) + } catch (error) { + throw new Error(`Sorry, I'm unable to find an answer for the question: "${input}" using the following code: "${pythonCode}"`) + } + } + + // Finally, return a complete answer + if (finalResult) { + const chain = new LLMChain({ + llm: model, + prompt: PromptTemplate.fromTemplate(finalSystemPrompt), + verbose: process.env.DEBUG === 'true' ? true : false + }) + const inputs = { + question: input, + answer: finalResult + } + + if (options.socketIO && options.socketIOClientId) { + const result = await chain.call(inputs, [loggerHandler, handler]) + return result?.text + } else { + const result = await chain.call(inputs, [loggerHandler]) + return result?.text + } + } + + return pythonCode + } +} + +interface AirtableLoaderResponse { + records: AirtableLoaderPage[] + offset?: string +} + +interface AirtableLoaderPage { + id: string + createdTime: string + fields: ICommonObject +} + +const fetchAirtableData = async (url: string, params: ICommonObject, accessToken: string): Promise => { + try { + const headers = { + Authorization: `Bearer ${accessToken}`, + 'Content-Type': 'application/json', + Accept: 'application/json' + } + const response = await axios.get(url, { params, headers }) + return response.data + } catch (error) { + throw new Error(`Failed to fetch ${url} from Airtable: ${error}`) + } +} + +const loadAll = async (baseId: string, tableId: string, accessToken: string): Promise => { + const params: ICommonObject = { pageSize: 100 } + let data: AirtableLoaderResponse + let returnPages: AirtableLoaderPage[] = [] + + do { + data = await 
fetchAirtableData(`https://api.airtable.com/v0/${baseId}/${tableId}`, params, accessToken) + returnPages.push.apply(returnPages, data.records) + params.offset = data.offset + } while (data.offset !== undefined) + + // Return the accumulated pages, not just the records of the final page + return returnPages.map((page) => page.fields) +} + +const loadLimit = async (limit: number, baseId: string, tableId: string, accessToken: string): Promise => { + const params = { maxRecords: limit } + const data = await fetchAirtableData(`https://api.airtable.com/v0/${baseId}/${tableId}`, params, accessToken) + if (data.records.length === 0) { + return [] + } + return data.records.map((page) => page.fields) +} + +module.exports = { nodeClass: Airtable_Agents } diff --git a/packages/components/nodes/agents/AirtableAgent/airtable.svg b/packages/components/nodes/agents/AirtableAgent/airtable.svg new file mode 100644 index 000000000..867c3b5ae --- /dev/null +++ b/packages/components/nodes/agents/AirtableAgent/airtable.svg @@ -0,0 +1,9 @@ + + + + + + + + + diff --git a/packages/components/nodes/agents/AirtableAgent/core.ts b/packages/components/nodes/agents/AirtableAgent/core.ts new file mode 100644 index 000000000..450bf5ea6 --- /dev/null +++ b/packages/components/nodes/agents/AirtableAgent/core.ts @@ -0,0 +1,29 @@ +import type { PyodideInterface } from 'pyodide' +import * as path from 'path' +import { getUserHome } from '../../../src/utils' + +let pyodideInstance: PyodideInterface | undefined + +export async function LoadPyodide(): Promise { + if (pyodideInstance === undefined) { + const { loadPyodide } = await import('pyodide') + const obj: any = { packageCacheDir: path.join(getUserHome(), '.flowise', 'pyodideCacheDir') } + pyodideInstance = await loadPyodide(obj) + await pyodideInstance.loadPackage(['pandas', 'numpy']) + } + + return pyodideInstance +} + +export const systemPrompt = `You are working with a pandas dataframe in Python. The name of the dataframe is df. 
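The `loadAll` helper pages through Airtable's list-records endpoint by echoing back the `offset` cursor until the API stops returning one, accumulating every page before mapping records to their `fields`. The accumulation pattern can be sketched in isolation; `fetchPage` here is a hypothetical stand-in for the real `fetchAirtableData` HTTP call, serving three fake pages keyed by cursor:

```typescript
interface Page {
    records: { id: string; fields: Record<string, unknown> }[]
    offset?: string
}

// Stand-in for fetchAirtableData: three fake pages keyed by cursor
const pages: Record<string, Page> = {
    start: { records: [{ id: 'r1', fields: { n: 1 } }], offset: 'p2' },
    p2: { records: [{ id: 'r2', fields: { n: 2 } }], offset: 'p3' },
    p3: { records: [{ id: 'r3', fields: { n: 3 } }] } // no offset => last page
}
const fetchPage = async (offset?: string): Promise<Page> => pages[offset ?? 'start']

// Accumulate every page (not just the last one) before mapping to fields
const loadAll = async (): Promise<Record<string, unknown>[]> => {
    const all: Page['records'] = []
    let offset: string | undefined
    let data: Page
    do {
        data = await fetchPage(offset)
        all.push(...data.records)
        offset = data.offset
    } while (offset !== undefined)
    return all.map((r) => r.fields)
}

loadAll().then((rows) => console.log(rows.length)) // 3
```

Returning the accumulated array rather than the final response's `records` is what makes "Return All" actually return every record, not just the last page of up to `pageSize` rows.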
+ +The columns and data types of a dataframe are given below as a Python dictionary with keys showing column names and values showing the data types. +{dict} + +I will ask a question, and you will output the Python code using the pandas dataframe to answer my question. Do not provide any explanations. Do not respond with anything except the output of the code. + +Question: {question} +Output Code:` + +export const finalSystemPrompt = `You are given the question: {question}. You have an answer to the question: {answer}. Rephrase the answer into a standalone answer. +Standalone Answer:` diff --git a/packages/components/nodes/agents/AutoGPT/AutoGPT.ts b/packages/components/nodes/agents/AutoGPT/AutoGPT.ts index 4775507b2..69e9b9ed5 100644 --- a/packages/components/nodes/agents/AutoGPT/AutoGPT.ts +++ b/packages/components/nodes/agents/AutoGPT/AutoGPT.ts @@ -3,10 +3,12 @@ import { BaseChatModel } from 'langchain/chat_models/base' import { AutoGPT } from 'langchain/experimental/autogpt' import { Tool } from 'langchain/tools' import { VectorStoreRetriever } from 'langchain/vectorstores/base' +import { flatten } from 'lodash' class AutoGPT_Agents implements INode { label: string name: string + version: number description: string type: string icon: string @@ -17,6 +19,7 @@ class AutoGPT_Agents implements INode { constructor() { this.label = 'AutoGPT' this.name = 'autoGPT' + this.version = 1.0 this.type = 'AutoGPT' this.category = 'Agents' this.icon = 'autogpt.png' @@ -67,7 +70,7 @@ class AutoGPT_Agents implements INode { const model = nodeData.inputs?.model as BaseChatModel const vectorStoreRetriever = nodeData.inputs?.vectorStoreRetriever as VectorStoreRetriever let tools = nodeData.inputs?.tools as Tool[] - tools = tools.flat() + tools = flatten(tools) const aiName = (nodeData.inputs?.aiName as string) || 'AutoGPT' const aiRole = (nodeData.inputs?.aiRole as string) || 'Assistant' const maxLoop = nodeData.inputs?.maxLoop as string @@ -89,7 +92,6 @@ class AutoGPT_Agents implements 
INode { const res = await executor.run([input]) return res || 'I have completed all my tasks.' } catch (e) { - console.error(e) throw new Error(e) } } diff --git a/packages/components/nodes/agents/BabyAGI/BabyAGI.ts b/packages/components/nodes/agents/BabyAGI/BabyAGI.ts index 5112be0ea..303c231ec 100644 --- a/packages/components/nodes/agents/BabyAGI/BabyAGI.ts +++ b/packages/components/nodes/agents/BabyAGI/BabyAGI.ts @@ -6,6 +6,7 @@ import { VectorStore } from 'langchain/vectorstores' class BabyAGI_Agents implements INode { label: string name: string + version: number description: string type: string icon: string @@ -16,6 +17,7 @@ class BabyAGI_Agents implements INode { constructor() { this.label = 'BabyAGI' this.name = 'babyAGI' + this.version = 1.0 this.type = 'BabyAGI' this.category = 'Agents' this.icon = 'babyagi.jpg' @@ -45,8 +47,9 @@ class BabyAGI_Agents implements INode { const model = nodeData.inputs?.model as BaseChatModel const vectorStore = nodeData.inputs?.vectorStore as VectorStore const taskLoop = nodeData.inputs?.taskLoop as string + const k = (vectorStore as any)?.k ?? 
4 - const babyAgi = BabyAGI.fromLLM(model, vectorStore, parseInt(taskLoop, 10)) + const babyAgi = BabyAGI.fromLLM(model, vectorStore, parseInt(taskLoop, 10), k) return babyAgi } diff --git a/packages/components/nodes/agents/BabyAGI/core.ts b/packages/components/nodes/agents/BabyAGI/core.ts index 76889b527..444aa3eb5 100644 --- a/packages/components/nodes/agents/BabyAGI/core.ts +++ b/packages/components/nodes/agents/BabyAGI/core.ts @@ -154,18 +154,22 @@ export class BabyAGI { maxIterations = 3 + topK = 4 + constructor( taskCreationChain: TaskCreationChain, taskPrioritizationChain: TaskPrioritizationChain, executionChain: ExecutionChain, vectorStore: VectorStore, - maxIterations: number + maxIterations: number, + topK: number ) { this.taskCreationChain = taskCreationChain this.taskPrioritizationChain = taskPrioritizationChain this.executionChain = executionChain this.vectorStore = vectorStore this.maxIterations = maxIterations + this.topK = topK } addTask(task: Task) { @@ -219,7 +223,7 @@ export class BabyAGI { this.printNextTask(task) // Step 2: Execute the task - const result = await executeTask(this.vectorStore, this.executionChain, objective, task.task_name) + const result = await executeTask(this.vectorStore, this.executionChain, objective, task.task_name, this.topK) const thisTaskId = task.task_id finalResult = result this.printTaskResult(result) @@ -257,10 +261,10 @@ export class BabyAGI { return finalResult } - static fromLLM(llm: BaseChatModel, vectorstore: VectorStore, maxIterations = 3): BabyAGI { + static fromLLM(llm: BaseChatModel, vectorstore: VectorStore, maxIterations = 3, topK = 4): BabyAGI { const taskCreationChain = TaskCreationChain.from_llm(llm) const taskPrioritizationChain = TaskPrioritizationChain.from_llm(llm) const executionChain = ExecutionChain.from_llm(llm) - return new BabyAGI(taskCreationChain, taskPrioritizationChain, executionChain, vectorstore, maxIterations) + return new BabyAGI(taskCreationChain, taskPrioritizationChain, 
executionChain, vectorstore, maxIterations, topK) } } diff --git a/packages/components/nodes/agents/CSVAgent/CSVAgent.ts b/packages/components/nodes/agents/CSVAgent/CSVAgent.ts new file mode 100644 index 000000000..4a42592ff --- /dev/null +++ b/packages/components/nodes/agents/CSVAgent/CSVAgent.ts @@ -0,0 +1,164 @@ +import { ICommonObject, INode, INodeData, INodeParams, PromptTemplate } from '../../../src/Interface' +import { AgentExecutor } from 'langchain/agents' +import { getBaseClasses } from '../../../src/utils' +import { LoadPyodide, finalSystemPrompt, systemPrompt } from './core' +import { LLMChain } from 'langchain/chains' +import { BaseLanguageModel } from 'langchain/base_language' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +class CSV_Agents implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'CSV Agent' + this.name = 'csvAgent' + this.version = 1.0 + this.type = 'AgentExecutor' + this.category = 'Agents' + this.icon = 'csvagent.png' + this.description = 'Agent used to answer queries on CSV data' + this.baseClasses = [this.type, ...getBaseClasses(AgentExecutor)] + this.inputs = [ + { + label: 'CSV File', + name: 'csvFile', + type: 'file', + fileType: '.csv' + }, + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'System Message', + name: 'systemMessagePrompt', + type: 'string', + rows: 4, + additionalParams: true, + optional: true, + placeholder: + 'I want you to act as a document that I am having a conversation with. Your name is "AI Assistant". You will provide me with answers from the given info. If the answer is not included, say exactly "Hmm, I am not sure." and stop after that. Refuse to answer any question not about the info. Never break character.'
+ } + ] + } + + async init(): Promise { + // Not used + return undefined + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const csvFileBase64 = nodeData.inputs?.csvFile as string + const model = nodeData.inputs?.model as BaseLanguageModel + const systemMessagePrompt = nodeData.inputs?.systemMessagePrompt as string + + const loggerHandler = new ConsoleCallbackHandler(options.logger) + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + + let files: string[] = [] + + if (csvFileBase64.startsWith('[') && csvFileBase64.endsWith(']')) { + files = JSON.parse(csvFileBase64) + } else { + files = [csvFileBase64] + } + + let base64String = '' + + for (const file of files) { + const splitDataURI = file.split(',') + splitDataURI.pop() + base64String = splitDataURI.pop() ?? '' + } + + const pyodide = await LoadPyodide() + + // First load the csv file and get the dataframe dictionary of column types + // For example using titanic.csv: {'PassengerId': 'int64', 'Survived': 'int64', 'Pclass': 'int64', 'Name': 'object', 'Sex': 'object', 'Age': 'float64', 'SibSp': 'int64', 'Parch': 'int64', 'Ticket': 'object', 'Fare': 'float64', 'Cabin': 'object', 'Embarked': 'object'} + let dataframeColDict = '' + try { + const code = `import pandas as pd +import base64 +from io import StringIO +import json + +base64_string = "${base64String}" + +decoded_data = base64.b64decode(base64_string) + +csv_data = StringIO(decoded_data.decode('utf-8')) + +df = pd.read_csv(csv_data) +my_dict = df.dtypes.astype(str).to_dict() +print(my_dict) +json.dumps(my_dict)` + dataframeColDict = await pyodide.runPythonAsync(code) + } catch (error) { + throw new Error(error) + } + + // Then tell GPT to come out with ONLY python code + // For example: len(df), df[df['SibSp'] > 3]['PassengerId'].count() + let pythonCode = '' + if (dataframeColDict) { + const chain = new LLMChain({ + llm: model, + prompt: PromptTemplate.fromTemplate(systemPrompt), + 
verbose: process.env.DEBUG === 'true' ? true : false + }) + const inputs = { + dict: dataframeColDict, + question: input + } + const res = await chain.call(inputs, [loggerHandler]) + pythonCode = res?.text + } + + // Then run the code using Pyodide + let finalResult = '' + if (pythonCode) { + try { + const code = `import pandas as pd\n${pythonCode}` + finalResult = await pyodide.runPythonAsync(code) + } catch (error) { + throw new Error(`Sorry, I'm unable to find an answer for question: "${input}" using the following code: "${pythonCode}"`) + } + } + + // Finally, return a complete answer + if (finalResult) { + const chain = new LLMChain({ + llm: model, + prompt: PromptTemplate.fromTemplate( + systemMessagePrompt ? `${systemMessagePrompt}\n${finalSystemPrompt}` : finalSystemPrompt + ), + verbose: process.env.DEBUG === 'true' ? true : false + }) + const inputs = { + question: input, + answer: finalResult + } + + if (options.socketIO && options.socketIOClientId) { + const result = await chain.call(inputs, [loggerHandler, handler]) + return result?.text + } else { + const result = await chain.call(inputs, [loggerHandler]) + return result?.text + } + } + + return pythonCode + } +} + +module.exports = { nodeClass: CSV_Agents } diff --git a/packages/components/nodes/agents/CSVAgent/core.ts b/packages/components/nodes/agents/CSVAgent/core.ts new file mode 100644 index 000000000..450bf5ea6 --- /dev/null +++ b/packages/components/nodes/agents/CSVAgent/core.ts @@ -0,0 +1,29 @@ +import type { PyodideInterface } from 'pyodide' +import * as path from 'path' +import { getUserHome } from '../../../src/utils' + +let pyodideInstance: PyodideInterface | undefined + +export async function LoadPyodide(): Promise { + if (pyodideInstance === undefined) { + const { loadPyodide } = await import('pyodide') + const obj: any = { packageCacheDir: path.join(getUserHome(), '.flowise', 'pyodideCacheDir') } + pyodideInstance = await loadPyodide(obj) + await pyodideInstance.loadPackage(['pandas',
'numpy']) + } + + return pyodideInstance +} + +export const systemPrompt = `You are working with a pandas dataframe in Python. The name of the dataframe is df. + +The columns and data types of the dataframe are given below as a Python dictionary with keys showing column names and values showing the data types. +{dict} + +I will ask a question, and you will output the Python code using the pandas dataframe to answer my question. Do not provide any explanations. Do not respond with anything except the output of the code. + +Question: {question} +Output Code:` + +export const finalSystemPrompt = `You are given the question: {question}. You have an answer to the question: {answer}. Rephrase the answer into a standalone answer. +Standalone Answer:` diff --git a/packages/components/nodes/agents/CSVAgent/csvagent.png b/packages/components/nodes/agents/CSVAgent/csvagent.png new file mode 100644 index 000000000..3ed16bb2c Binary files /dev/null and b/packages/components/nodes/agents/CSVAgent/csvagent.png differ diff --git a/packages/components/nodes/agents/ConversationalAgent/ConversationalAgent.ts b/packages/components/nodes/agents/ConversationalAgent/ConversationalAgent.ts index d2106e185..d8d8506c2 100644 --- a/packages/components/nodes/agents/ConversationalAgent/ConversationalAgent.ts +++ b/packages/components/nodes/agents/ConversationalAgent/ConversationalAgent.ts @@ -1,14 +1,23 @@ -import { ICommonObject, IMessage, INode, INodeData, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' import { initializeAgentExecutorWithOptions, AgentExecutor, InitializeAgentExecutorOptions } from 'langchain/agents' import { Tool } from 'langchain/tools' -import { BaseChatMemory, ChatMessageHistory } from 'langchain/memory' -import { getBaseClasses } from '../../../src/utils' -import { AIChatMessage, HumanChatMessage } from 'langchain/schema' +import { BaseChatMemory } from 'langchain/memory' +import {
getBaseClasses, mapChatHistory } from '../../../src/utils' import { BaseLanguageModel } from 'langchain/base_language' +import { flatten } from 'lodash' + +const DEFAULT_PREFIX = `Assistant is a large language model trained by OpenAI. + +Assistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand. + +Assistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics. + +Overall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist.` class ConversationalAgent_Agents implements INode { label: string name: string + version: number description: string type: string icon: string @@ -19,6 +28,7 @@ class ConversationalAgent_Agents implements INode { constructor() { this.label = 'Conversational Agent' this.name = 'conversationalAgent' + this.version = 1.0 this.type = 'AgentExecutor' this.category = 'Agents' this.icon = 'agent.svg' @@ -46,14 +56,7 @@ class ConversationalAgent_Agents implements INode { name: 'systemMessage', type: 'string', rows: 4, - optional: true, - additionalParams: true - }, - { - label: 'Human Message', - name: 'humanMessage', - type: 'string', - rows: 4, + default: DEFAULT_PREFIX, optional: true, additionalParams: true } @@ -63,9 +66,8 @@ class ConversationalAgent_Agents implements INode { async init(nodeData: INodeData): Promise { const model = nodeData.inputs?.model as BaseLanguageModel let tools = nodeData.inputs?.tools as Tool[] - tools = tools.flat() + tools = flatten(tools) const memory = nodeData.inputs?.memory as BaseChatMemory - const humanMessage = nodeData.inputs?.humanMessage as string const systemMessage = nodeData.inputs?.systemMessage as string const obj: InitializeAgentExecutorOptions = { @@ -74,9 +76,6 @@ class ConversationalAgent_Agents implements INode { } const agentArgs: any = {} - if (humanMessage) { - agentArgs.humanMessage = humanMessage - } if (systemMessage) { agentArgs.systemMessage = systemMessage } @@ -93,19 +92,10 @@ class ConversationalAgent_Agents implements INode { const memory = nodeData.inputs?.memory as BaseChatMemory if (options && options.chatHistory) { - const chatHistory = [] - const histories: IMessage[] = options.chatHistory - - for (const message of histories) { - if (message.type === 'apiMessage') { - chatHistory.push(new AIChatMessage(message.message)) - } else if (message.type === 'userMessage') { - 
chatHistory.push(new HumanChatMessage(message.message)) - } - } - memory.chatHistory = new ChatMessageHistory(chatHistory) + memory.chatHistory = mapChatHistory(options) executor.memory = memory } + const result = await executor.call({ input }) return result?.output diff --git a/packages/components/nodes/agents/ConversationalRetrievalAgent/ConversationalRetrievalAgent.ts b/packages/components/nodes/agents/ConversationalRetrievalAgent/ConversationalRetrievalAgent.ts new file mode 100644 index 000000000..c0cef0526 --- /dev/null +++ b/packages/components/nodes/agents/ConversationalRetrievalAgent/ConversationalRetrievalAgent.ts @@ -0,0 +1,101 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { initializeAgentExecutorWithOptions, AgentExecutor } from 'langchain/agents' +import { getBaseClasses, mapChatHistory } from '../../../src/utils' +import { flatten } from 'lodash' +import { BaseChatMemory } from 'langchain/memory' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +const defaultMessage = `Do your best to answer the questions. 
Feel free to use any tools available to look up relevant information, only if necessary.` + +class ConversationalRetrievalAgent_Agents implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Conversational Retrieval Agent' + this.name = 'conversationalRetrievalAgent' + this.version = 1.0 + this.type = 'AgentExecutor' + this.category = 'Agents' + this.icon = 'agent.svg' + this.description = `An agent optimized for retrieval during conversation, answering questions based on past dialogue, all using OpenAI's Function Calling` + this.baseClasses = [this.type, ...getBaseClasses(AgentExecutor)] + this.inputs = [ + { + label: 'Allowed Tools', + name: 'tools', + type: 'Tool', + list: true + }, + { + label: 'Memory', + name: 'memory', + type: 'BaseChatMemory' + }, + { + label: 'OpenAI Chat Model', + name: 'model', + type: 'ChatOpenAI' + }, + { + label: 'System Message', + name: 'systemMessage', + type: 'string', + default: defaultMessage, + rows: 4, + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const model = nodeData.inputs?.model + const memory = nodeData.inputs?.memory as BaseChatMemory + const systemMessage = nodeData.inputs?.systemMessage as string + + let tools = nodeData.inputs?.tools + tools = flatten(tools) + + const executor = await initializeAgentExecutorWithOptions(tools, model, { + agentType: 'openai-functions', + verbose: process.env.DEBUG === 'true' ? true : false, + agentArgs: { + prefix: systemMessage ?? 
defaultMessage + }, + returnIntermediateSteps: true + }) + executor.memory = memory + return executor + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const executor = nodeData.instance as AgentExecutor + + if (executor.memory) { + ;(executor.memory as any).memoryKey = 'chat_history' + ;(executor.memory as any).outputKey = 'output' + ;(executor.memory as any).chatHistory = mapChatHistory(options) + } + + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + const result = await executor.call({ input }, [loggerHandler, handler]) + return result?.output + } else { + const result = await executor.call({ input }, [loggerHandler]) + return result?.output + } + } +} + +module.exports = { nodeClass: ConversationalRetrievalAgent_Agents } diff --git a/packages/components/nodes/agents/ConversationalRetrievalAgent/agent.svg b/packages/components/nodes/agents/ConversationalRetrievalAgent/agent.svg new file mode 100644 index 000000000..c87861e5c --- /dev/null +++ b/packages/components/nodes/agents/ConversationalRetrievalAgent/agent.svg @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/agents/MRKLAgentChat/MRKLAgentChat.ts b/packages/components/nodes/agents/MRKLAgentChat/MRKLAgentChat.ts index 34b36fc1e..0a9e744c9 100644 --- a/packages/components/nodes/agents/MRKLAgentChat/MRKLAgentChat.ts +++ b/packages/components/nodes/agents/MRKLAgentChat/MRKLAgentChat.ts @@ -3,10 +3,12 @@ import { initializeAgentExecutorWithOptions, AgentExecutor } from 'langchain/age import { getBaseClasses } from '../../../src/utils' import { Tool } from 'langchain/tools' import { BaseLanguageModel } from 'langchain/base_language' +import { flatten } from 'lodash' class MRKLAgentChat_Agents implements INode { label: string name: string + version: number 
description: string type: string icon: string @@ -17,6 +19,7 @@ class MRKLAgentChat_Agents implements INode { constructor() { this.label = 'MRKL Agent for Chat Models' this.name = 'mrklAgentChat' + this.version = 1.0 this.type = 'AgentExecutor' this.category = 'Agents' this.icon = 'agent.svg' @@ -40,7 +43,7 @@ class MRKLAgentChat_Agents implements INode { async init(nodeData: INodeData): Promise { const model = nodeData.inputs?.model as BaseLanguageModel let tools = nodeData.inputs?.tools as Tool[] - tools = tools.flat() + tools = flatten(tools) const executor = await initializeAgentExecutorWithOptions(tools, model, { agentType: 'chat-zero-shot-react-description', verbose: process.env.DEBUG === 'true' ? true : false diff --git a/packages/components/nodes/agents/MRKLAgentLLM/MRKLAgentLLM.ts b/packages/components/nodes/agents/MRKLAgentLLM/MRKLAgentLLM.ts index 20246ffa1..d7af586b4 100644 --- a/packages/components/nodes/agents/MRKLAgentLLM/MRKLAgentLLM.ts +++ b/packages/components/nodes/agents/MRKLAgentLLM/MRKLAgentLLM.ts @@ -3,10 +3,12 @@ import { initializeAgentExecutorWithOptions, AgentExecutor } from 'langchain/age import { Tool } from 'langchain/tools' import { getBaseClasses } from '../../../src/utils' import { BaseLanguageModel } from 'langchain/base_language' +import { flatten } from 'lodash' class MRKLAgentLLM_Agents implements INode { label: string name: string + version: number description: string type: string icon: string @@ -17,6 +19,7 @@ class MRKLAgentLLM_Agents implements INode { constructor() { this.label = 'MRKL Agent for LLMs' this.name = 'mrklAgentLLM' + this.version = 1.0 this.type = 'AgentExecutor' this.category = 'Agents' this.icon = 'agent.svg' @@ -40,7 +43,7 @@ class MRKLAgentLLM_Agents implements INode { async init(nodeData: INodeData): Promise { const model = nodeData.inputs?.model as BaseLanguageModel let tools = nodeData.inputs?.tools as Tool[] - tools = tools.flat() + tools = flatten(tools) const executor = await 
initializeAgentExecutorWithOptions(tools, model, { agentType: 'zero-shot-react-description', diff --git a/packages/components/nodes/agents/OpenAIFunctionAgent/OpenAIFunctionAgent.ts b/packages/components/nodes/agents/OpenAIFunctionAgent/OpenAIFunctionAgent.ts new file mode 100644 index 000000000..8c182d1ac --- /dev/null +++ b/packages/components/nodes/agents/OpenAIFunctionAgent/OpenAIFunctionAgent.ts @@ -0,0 +1,101 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { initializeAgentExecutorWithOptions, AgentExecutor } from 'langchain/agents' +import { getBaseClasses, mapChatHistory } from '../../../src/utils' +import { BaseLanguageModel } from 'langchain/base_language' +import { flatten } from 'lodash' +import { BaseChatMemory } from 'langchain/memory' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +class OpenAIFunctionAgent_Agents implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'OpenAI Function Agent' + this.name = 'openAIFunctionAgent' + this.version = 1.0 + this.type = 'AgentExecutor' + this.category = 'Agents' + this.icon = 'openai.png' + this.description = `An agent that uses OpenAI's Function Calling functionality to pick the tool and args to call` + this.baseClasses = [this.type, ...getBaseClasses(AgentExecutor)] + this.inputs = [ + { + label: 'Allowed Tools', + name: 'tools', + type: 'Tool', + list: true + }, + { + label: 'Memory', + name: 'memory', + type: 'BaseChatMemory' + }, + { + label: 'OpenAI Chat Model', + name: 'model', + description: + 'Only works with gpt-3.5-turbo-0613 and gpt-4-0613. 
Refer docs for more info', + type: 'BaseChatModel' + }, + { + label: 'System Message', + name: 'systemMessage', + type: 'string', + rows: 4, + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const memory = nodeData.inputs?.memory as BaseChatMemory + const systemMessage = nodeData.inputs?.systemMessage as string + + let tools = nodeData.inputs?.tools + tools = flatten(tools) + + const executor = await initializeAgentExecutorWithOptions(tools, model, { + agentType: 'openai-functions', + verbose: process.env.DEBUG === 'true' ? true : false, + agentArgs: { + prefix: systemMessage ?? `You are a helpful AI assistant.` + } + }) + if (memory) executor.memory = memory + + return executor + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const executor = nodeData.instance as AgentExecutor + const memory = nodeData.inputs?.memory as BaseChatMemory + + if (options && options.chatHistory) { + memory.chatHistory = mapChatHistory(options) + executor.memory = memory + } + + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + const result = await executor.run(input, [loggerHandler, handler]) + return result + } else { + const result = await executor.run(input, [loggerHandler]) + return result + } + } +} + +module.exports = { nodeClass: OpenAIFunctionAgent_Agents } diff --git a/packages/components/nodes/agents/OpenAIFunctionAgent/openai.png b/packages/components/nodes/agents/OpenAIFunctionAgent/openai.png new file mode 100644 index 000000000..de08a05b2 Binary files /dev/null and b/packages/components/nodes/agents/OpenAIFunctionAgent/openai.png differ diff --git a/packages/components/nodes/chains/ApiChain/GETApiChain.ts b/packages/components/nodes/chains/ApiChain/GETApiChain.ts 
new file mode 100644 index 000000000..bd4f3bc0d --- /dev/null +++ b/packages/components/nodes/chains/ApiChain/GETApiChain.ts @@ -0,0 +1,134 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { APIChain } from 'langchain/chains' +import { getBaseClasses } from '../../../src/utils' +import { BaseLanguageModel } from 'langchain/base_language' +import { PromptTemplate } from 'langchain/prompts' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +export const API_URL_RAW_PROMPT_TEMPLATE = `You are given the below API Documentation: +{api_docs} +Using this documentation, generate the full API url to call for answering the user question. +You should build the API url in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call. + +Question:{question} +API url:` + +export const API_RESPONSE_RAW_PROMPT_TEMPLATE = + 'Given this {api_response} response for {api_url}. use the given response to answer this {question}' + +class GETApiChain_Chains implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + baseClasses: string[] + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'GET API Chain' + this.name = 'getApiChain' + this.version = 1.0 + this.type = 'GETApiChain' + this.icon = 'apichain.svg' + this.category = 'Chains' + this.description = 'Chain to run queries against GET API' + this.baseClasses = [this.type, ...getBaseClasses(APIChain)] + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'API Documentation', + name: 'apiDocs', + type: 'string', + description: + 'Description of how API works. 
Please refer to more examples', + rows: 4 + }, + { + label: 'Headers', + name: 'headers', + type: 'json', + additionalParams: true, + optional: true + }, + { + label: 'URL Prompt', + name: 'urlPrompt', + type: 'string', + description: 'Prompt used to tell LLMs how to construct the URL. Must contain {api_docs} and {question}', + default: API_URL_RAW_PROMPT_TEMPLATE, + rows: 4, + additionalParams: true + }, + { + label: 'Answer Prompt', + name: 'ansPrompt', + type: 'string', + description: + 'Prompt used to tell LLMs how to return the API response. Must contain {api_response}, {api_url}, and {question}', + default: API_RESPONSE_RAW_PROMPT_TEMPLATE, + rows: 4, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const apiDocs = nodeData.inputs?.apiDocs as string + const headers = nodeData.inputs?.headers as string + const urlPrompt = nodeData.inputs?.urlPrompt as string + const ansPrompt = nodeData.inputs?.ansPrompt as string + + const chain = await getAPIChain(apiDocs, model, headers, urlPrompt, ansPrompt) + return chain + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const apiDocs = nodeData.inputs?.apiDocs as string + const headers = nodeData.inputs?.headers as string + const urlPrompt = nodeData.inputs?.urlPrompt as string + const ansPrompt = nodeData.inputs?.ansPrompt as string + + const chain = await getAPIChain(apiDocs, model, headers, urlPrompt, ansPrompt) + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId, 2) + const res = await chain.run(input, [loggerHandler, handler]) + return res + } else { + const res = await chain.run(input, [loggerHandler]) + return res + } + } +} + +const getAPIChain = async (documents:
string, llm: BaseLanguageModel, headers: string, urlPrompt: string, ansPrompt: string) => { + const apiUrlPrompt = new PromptTemplate({ + inputVariables: ['api_docs', 'question'], + template: urlPrompt ? urlPrompt : API_URL_RAW_PROMPT_TEMPLATE + }) + + const apiResponsePrompt = new PromptTemplate({ + inputVariables: ['api_docs', 'question', 'api_url', 'api_response'], + template: ansPrompt ? ansPrompt : API_RESPONSE_RAW_PROMPT_TEMPLATE + }) + + const chain = APIChain.fromLLMAndAPIDocs(llm, documents, { + apiUrlPrompt, + apiResponsePrompt, + verbose: process.env.DEBUG === 'true' ? true : false, + headers: typeof headers === 'object' ? headers : headers ? JSON.parse(headers) : {} + }) + return chain +} + +module.exports = { nodeClass: GETApiChain_Chains } diff --git a/packages/components/nodes/chains/ApiChain/OpenAPIChain.ts b/packages/components/nodes/chains/ApiChain/OpenAPIChain.ts new file mode 100644 index 000000000..9f6c79e4e --- /dev/null +++ b/packages/components/nodes/chains/ApiChain/OpenAPIChain.ts @@ -0,0 +1,100 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { APIChain, createOpenAPIChain } from 'langchain/chains' +import { getBaseClasses } from '../../../src/utils' +import { ChatOpenAI } from 'langchain/chat_models/openai' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +class OpenApiChain_Chains implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + baseClasses: string[] + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'OpenAPI Chain' + this.name = 'openApiChain' + this.version = 1.0 + this.type = 'OpenAPIChain' + this.icon = 'openapi.png' + this.category = 'Chains' + this.description = 'Chain that automatically selects and calls APIs based only on an OpenAPI spec' + this.baseClasses = [this.type, ...getBaseClasses(APIChain)] + this.inputs = [ + { + label: 'ChatOpenAI Model',
+ name: 'model', + type: 'ChatOpenAI' + }, + { + label: 'YAML Link', + name: 'yamlLink', + type: 'string', + placeholder: 'https://api.speak.com/openapi.yaml', + description: 'If YAML link is provided, uploaded YAML File will be ignored and YAML link will be used instead' + }, + { + label: 'YAML File', + name: 'yamlFile', + type: 'file', + fileType: '.yaml', + description: 'If YAML link is provided, uploaded YAML File will be ignored and YAML link will be used instead' + }, + { + label: 'Headers', + name: 'headers', + type: 'json', + additionalParams: true, + optional: true + } + ] + } + + async init(nodeData: INodeData): Promise { + return await initChain(nodeData) + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const chain = await initChain(nodeData) + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + const res = await chain.run(input, [loggerHandler, handler]) + return res + } else { + const res = await chain.run(input, [loggerHandler]) + return res + } + } +} + +const initChain = async (nodeData: INodeData) => { + const model = nodeData.inputs?.model as ChatOpenAI + const headers = nodeData.inputs?.headers as string + const yamlLink = nodeData.inputs?.yamlLink as string + const yamlFileBase64 = nodeData.inputs?.yamlFile as string + + let yamlString = '' + + if (yamlLink) { + yamlString = yamlLink + } else { + const splitDataURI = yamlFileBase64.split(',') + splitDataURI.pop() + const bf = Buffer.from(splitDataURI.pop() || '', 'base64') + yamlString = bf.toString('utf-8') + } + + return await createOpenAPIChain(yamlString, { + llm: model, + headers: typeof headers === 'object' ? headers : headers ? JSON.parse(headers) : {}, + verbose: process.env.DEBUG === 'true' ? 
true : false + }) +} + +module.exports = { nodeClass: OpenApiChain_Chains } diff --git a/packages/components/nodes/chains/ApiChain/POSTApiChain.ts b/packages/components/nodes/chains/ApiChain/POSTApiChain.ts new file mode 100644 index 000000000..cba4a2970 --- /dev/null +++ b/packages/components/nodes/chains/ApiChain/POSTApiChain.ts @@ -0,0 +1,123 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { BaseLanguageModel } from 'langchain/base_language' +import { PromptTemplate } from 'langchain/prompts' +import { API_RESPONSE_RAW_PROMPT_TEMPLATE, API_URL_RAW_PROMPT_TEMPLATE, APIChain } from './postCore' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +class POSTApiChain_Chains implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + baseClasses: string[] + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'POST API Chain' + this.name = 'postApiChain' + this.version = 1.0 + this.type = 'POSTApiChain' + this.icon = 'apichain.svg' + this.category = 'Chains' + this.description = 'Chain to run queries against POST API' + this.baseClasses = [this.type, ...getBaseClasses(APIChain)] + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'API Documentation', + name: 'apiDocs', + type: 'string', + description: + 'Description of how API works. Please refer to more examples', + rows: 4 + }, + { + label: 'Headers', + name: 'headers', + type: 'json', + additionalParams: true, + optional: true + }, + { + label: 'URL Prompt', + name: 'urlPrompt', + type: 'string', + description: 'Prompt used to tell LLMs how to construct the URL. 
Must contain {api_docs} and {question}', + default: API_URL_RAW_PROMPT_TEMPLATE, + rows: 4, + additionalParams: true + }, + { + label: 'Answer Prompt', + name: 'ansPrompt', + type: 'string', + description: + 'Prompt used to tell LLMs how to return the API response. Must contain {api_response}, {api_url}, and {question}', + default: API_RESPONSE_RAW_PROMPT_TEMPLATE, + rows: 4, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const apiDocs = nodeData.inputs?.apiDocs as string + const headers = nodeData.inputs?.headers as string + const urlPrompt = nodeData.inputs?.urlPrompt as string + const ansPrompt = nodeData.inputs?.ansPrompt as string + + const chain = await getAPIChain(apiDocs, model, headers, urlPrompt, ansPrompt) + return chain + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const apiDocs = nodeData.inputs?.apiDocs as string + const headers = nodeData.inputs?.headers as string + const urlPrompt = nodeData.inputs?.urlPrompt as string + const ansPrompt = nodeData.inputs?.ansPrompt as string + + const chain = await getAPIChain(apiDocs, model, headers, urlPrompt, ansPrompt) + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId, 2) + const res = await chain.run(input, [loggerHandler, handler]) + return res + } else { + const res = await chain.run(input, [loggerHandler]) + return res + } + } +} + +const getAPIChain = async (documents: string, llm: BaseLanguageModel, headers: string, urlPrompt: string, ansPrompt: string) => { + const apiUrlPrompt = new PromptTemplate({ + inputVariables: ['api_docs', 'question'], + template: urlPrompt ?
urlPrompt : API_URL_RAW_PROMPT_TEMPLATE + }) + + const apiResponsePrompt = new PromptTemplate({ + inputVariables: ['api_docs', 'question', 'api_url_body', 'api_response'], + template: ansPrompt ? ansPrompt : API_RESPONSE_RAW_PROMPT_TEMPLATE + }) + + const chain = APIChain.fromLLMAndAPIDocs(llm, documents, { + apiUrlPrompt, + apiResponsePrompt, + verbose: process.env.DEBUG === 'true' ? true : false, + headers: typeof headers === 'object' ? headers : headers ? JSON.parse(headers) : {} + }) + return chain +} + +module.exports = { nodeClass: POSTApiChain_Chains } diff --git a/packages/components/nodes/chains/ApiChain/apichain.svg b/packages/components/nodes/chains/ApiChain/apichain.svg new file mode 100644 index 000000000..3b86b9051 --- /dev/null +++ b/packages/components/nodes/chains/ApiChain/apichain.svg @@ -0,0 +1,3 @@ + \ No newline at end of file diff --git a/packages/components/nodes/chains/ApiChain/openapi.png b/packages/components/nodes/chains/ApiChain/openapi.png new file mode 100644 index 000000000..457c2e405 Binary files /dev/null and b/packages/components/nodes/chains/ApiChain/openapi.png differ diff --git a/packages/components/nodes/chains/ApiChain/postCore.ts b/packages/components/nodes/chains/ApiChain/postCore.ts new file mode 100644 index 000000000..de7215d92 --- /dev/null +++ b/packages/components/nodes/chains/ApiChain/postCore.ts @@ -0,0 +1,162 @@ +import { BaseLanguageModel } from 'langchain/base_language' +import { CallbackManagerForChainRun } from 'langchain/callbacks' +import { BaseChain, ChainInputs, LLMChain, SerializedAPIChain } from 'langchain/chains' +import { BasePromptTemplate, PromptTemplate } from 'langchain/prompts' +import { ChainValues } from 'langchain/schema' +import fetch from 'node-fetch' + +export const API_URL_RAW_PROMPT_TEMPLATE = `You are given the below API Documentation: +{api_docs} +Using this documentation, generate a json string with two keys: "url" and "data". 
+The value of "url" should be a string, which is the API url to call for answering the user question. +The value of "data" should be a dictionary of key-value pairs you want to POST to the url as a JSON body. +Be careful to always use double quotes for strings in the json string. +You should build the json string in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call. + +Question:{question} +json string:` + +export const API_RESPONSE_RAW_PROMPT_TEMPLATE = `${API_URL_RAW_PROMPT_TEMPLATE} {api_url_body} + +Here is the response from the API: + +{api_response} + +Summarize this response to answer the original question. + +Summary:` + +const defaultApiUrlPrompt = new PromptTemplate({ + inputVariables: ['api_docs', 'question'], + template: API_URL_RAW_PROMPT_TEMPLATE +}) + +const defaultApiResponsePrompt = new PromptTemplate({ + inputVariables: ['api_docs', 'question', 'api_url_body', 'api_response'], + template: API_RESPONSE_RAW_PROMPT_TEMPLATE +}) + +export interface APIChainInput extends Omit { + apiAnswerChain: LLMChain + apiRequestChain: LLMChain + apiDocs: string + inputKey?: string + headers?: Record + /** Key to use for output, defaults to `output` */ + outputKey?: string +} + +export type APIChainOptions = { + headers?: Record + apiUrlPrompt?: BasePromptTemplate + apiResponsePrompt?: BasePromptTemplate +} + +export class APIChain extends BaseChain implements APIChainInput { + apiAnswerChain: LLMChain + + apiRequestChain: LLMChain + + apiDocs: string + + headers = {} + + inputKey = 'question' + + outputKey = 'output' + + get inputKeys() { + return [this.inputKey] + } + + get outputKeys() { + return [this.outputKey] + } + + constructor(fields: APIChainInput) { + super(fields) + this.apiRequestChain = fields.apiRequestChain + this.apiAnswerChain = fields.apiAnswerChain + this.apiDocs = fields.apiDocs + 
this.inputKey = fields.inputKey ?? this.inputKey + this.outputKey = fields.outputKey ?? this.outputKey + this.headers = fields.headers ?? this.headers + } + + /** @ignore */ + async _call(values: ChainValues, runManager?: CallbackManagerForChainRun): Promise { + try { + const question: string = values[this.inputKey] + + const api_url_body = await this.apiRequestChain.predict({ question, api_docs: this.apiDocs }, runManager?.getChild()) + + const { url, data } = JSON.parse(api_url_body) + + const res = await fetch(url, { + method: 'POST', + headers: this.headers, + body: JSON.stringify(data) + }) + + const api_response = await res.text() + + const answer = await this.apiAnswerChain.predict( + { question, api_docs: this.apiDocs, api_url_body, api_response }, + runManager?.getChild() + ) + + return { [this.outputKey]: answer } + } catch (error) { + return { [this.outputKey]: error } + } + } + + _chainType() { + return 'api_chain' as const + } + + static async deserialize(data: SerializedAPIChain) { + const { api_request_chain, api_answer_chain, api_docs } = data + + if (!api_request_chain) { + throw new Error('LLMChain must have api_request_chain') + } + if (!api_answer_chain) { + throw new Error('LLMChain must have api_answer_chain') + } + if (!api_docs) { + throw new Error('LLMChain must have api_docs') + } + + return new APIChain({ + apiAnswerChain: await LLMChain.deserialize(api_answer_chain), + apiRequestChain: await LLMChain.deserialize(api_request_chain), + apiDocs: api_docs + }) + } + + serialize(): SerializedAPIChain { + return { + _type: this._chainType(), + api_answer_chain: this.apiAnswerChain.serialize(), + api_request_chain: this.apiRequestChain.serialize(), + api_docs: this.apiDocs + } + } + + static fromLLMAndAPIDocs( + llm: BaseLanguageModel, + apiDocs: string, + options: APIChainOptions & Omit = {} + ): APIChain { + const { apiUrlPrompt = defaultApiUrlPrompt, apiResponsePrompt = defaultApiResponsePrompt } = options + const apiRequestChain = new 
LLMChain({ prompt: apiUrlPrompt, llm }) + const apiAnswerChain = new LLMChain({ prompt: apiResponsePrompt, llm }) + return new this({ + apiAnswerChain, + apiRequestChain, + apiDocs, + ...options + }) + } +} diff --git a/packages/components/nodes/chains/ConversationChain/ConversationChain.ts b/packages/components/nodes/chains/ConversationChain/ConversationChain.ts index 19e28752e..08663395e 100644 --- a/packages/components/nodes/chains/ConversationChain/ConversationChain.ts +++ b/packages/components/nodes/chains/ConversationChain/ConversationChain.ts @@ -1,16 +1,19 @@ -import { ICommonObject, IMessage, INode, INodeData, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' import { ConversationChain } from 'langchain/chains' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, mapChatHistory } from '../../../src/utils' import { ChatPromptTemplate, HumanMessagePromptTemplate, MessagesPlaceholder, SystemMessagePromptTemplate } from 'langchain/prompts' -import { BufferMemory, ChatMessageHistory } from 'langchain/memory' +import { BufferMemory } from 'langchain/memory' import { BaseChatModel } from 'langchain/chat_models/base' -import { AIChatMessage, HumanChatMessage } from 'langchain/schema' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' +import { flatten } from 'lodash' +import { Document } from 'langchain/document' -const systemMessage = `The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.` +let systemMessage = `The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. 
If the AI does not know the answer to a question, it truthfully says it does not know.` class ConversationChain_Chains implements INode { label: string name: string + version: number type: string icon: string category: string @@ -21,6 +24,7 @@ class ConversationChain_Chains implements INode { constructor() { this.label = 'Conversation Chain' this.name = 'conversationChain' + this.version = 1.0 this.type = 'ConversationChain' this.icon = 'chain.svg' this.category = 'Chains' @@ -37,6 +41,15 @@ class ConversationChain_Chains implements INode { name: 'memory', type: 'BaseMemory' }, + { + label: 'Document', + name: 'document', + type: 'Document', + description: + 'Include the whole document in the context window. If you get a maximum context length error, use a model with a larger context window, such as Claude 100k or GPT-4 32k', + optional: true, + list: true + }, { label: 'System Message', name: 'systemMessagePrompt', @@ -53,10 +66,28 @@ class ConversationChain_Chains implements INode { const model = nodeData.inputs?.model as BaseChatModel const memory = nodeData.inputs?.memory as BufferMemory const prompt = nodeData.inputs?.systemMessagePrompt as string + const docs = nodeData.inputs?.document as Document[] + + const flattenDocs = docs && docs.length ? flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + let finalText = '' + for (let i = 0; i < finalDocs.length; i += 1) { + finalText += finalDocs[i].pageContent + } + + const replaceChar: string[] = ['{', '}'] + for (const char of replaceChar) finalText = finalText.replaceAll(char, '') + + if (finalText) systemMessage = `${systemMessage}\nThe AI has the following context:\n${finalText}` const obj: any = { llm: model, - memory + memory, + verbose: process.env.DEBUG === 'true' ? 
true : false } const chatPrompt = ChatPromptTemplate.fromPromptMessages([ @@ -75,22 +106,20 @@ class ConversationChain_Chains implements INode { const memory = nodeData.inputs?.memory as BufferMemory if (options && options.chatHistory) { - const chatHistory = [] - const histories: IMessage[] = options.chatHistory - - for (const message of histories) { - if (message.type === 'apiMessage') { - chatHistory.push(new AIChatMessage(message.message)) - } else if (message.type === 'userMessage') { - chatHistory.push(new HumanChatMessage(message.message)) - } - } - memory.chatHistory = new ChatMessageHistory(chatHistory) + memory.chatHistory = mapChatHistory(options) chain.memory = memory } - const res = await chain.call({ input }) - return res?.response + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + const res = await chain.call({ input }, [loggerHandler, handler]) + return res?.response + } else { + const res = await chain.call({ input }, [loggerHandler]) + return res?.response + } } } diff --git a/packages/components/nodes/chains/ConversationalRetrievalQAChain/ConversationalRetrievalQAChain.ts b/packages/components/nodes/chains/ConversationalRetrievalQAChain/ConversationalRetrievalQAChain.ts index 7e37f9131..c14b292d6 100644 --- a/packages/components/nodes/chains/ConversationalRetrievalQAChain/ConversationalRetrievalQAChain.ts +++ b/packages/components/nodes/chains/ConversationalRetrievalQAChain/ConversationalRetrievalQAChain.ts @@ -1,26 +1,25 @@ import { BaseLanguageModel } from 'langchain/base_language' -import { ICommonObject, IMessage, INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' -import { ConversationalRetrievalQAChain } from 'langchain/chains' -import { BaseRetriever } from 'langchain/schema' - -const default_qa_template = `Use the 
following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer. - -{context} - -Question: {question} -Helpful Answer:` - -const qa_template = `Use the following pieces of context to answer the question at the end. - -{context} - -Question: {question} -Helpful Answer:` +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, mapChatHistory } from '../../../src/utils' +import { ConversationalRetrievalQAChain, QAChainParams } from 'langchain/chains' +import { BaseRetriever } from 'langchain/schema/retriever' +import { BufferMemory, BufferMemoryInput } from 'langchain/memory' +import { PromptTemplate } from 'langchain/prompts' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' +import { + default_map_reduce_template, + default_qa_template, + qa_template, + map_reduce_template, + CUSTOM_QUESTION_GENERATOR_CHAIN_PROMPT, + refine_question_template, + refine_template +} from './prompts' class ConversationalRetrievalQAChain_Chains implements INode { label: string name: string + version: number type: string icon: string category: string @@ -31,6 +30,7 @@ class ConversationalRetrievalQAChain_Chains implements INode { constructor() { this.label = 'Conversational Retrieval QA Chain' this.name = 'conversationalRetrievalQAChain' + this.version = 1.0 this.type = 'ConversationalRetrievalQAChain' this.icon = 'chain.svg' this.category = 'Chains' @@ -47,6 +47,19 @@ class ConversationalRetrievalQAChain_Chains implements INode { name: 'vectorStoreRetriever', type: 'BaseRetriever' }, + { + label: 'Memory', + name: 'memory', + type: 'BaseMemory', + optional: true, + description: 'If left empty, a default BufferMemory will be used' + }, + { + label: 'Return Source Documents', + name: 'returnSourceDocuments', + type: 'boolean', + optional: true + }, { label: 'System Message', name: 'systemMessagePrompt', @@ -56,6 
+69,31 @@ class ConversationalRetrievalQAChain_Chains implements INode { optional: true, placeholder: 'I want you to act as a document that I am having a conversation with. Your name is "AI Assistant". You will provide me with answers from the given info. If the answer is not included, say exactly "Hmm, I am not sure." and stop after that. Refuse to answer any question not about the info. Never break character.' + }, + { + label: 'Chain Option', + name: 'chainOption', + type: 'options', + options: [ + { + label: 'MapReduceDocumentsChain', + name: 'map_reduce', + description: + 'Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time' + }, + { + label: 'RefineDocumentsChain', + name: 'refine', + description: 'Suitable for QA tasks over a large number of documents.' + }, + { + label: 'StuffDocumentsChain', + name: 'stuff', + description: 'Suitable for QA tasks over a small number of documents.' + } + ], + additionalParams: true, + optional: true } ] } @@ -64,35 +102,112 @@ class ConversationalRetrievalQAChain_Chains implements INode { const model = nodeData.inputs?.model as BaseLanguageModel const vectorStoreRetriever = nodeData.inputs?.vectorStoreRetriever as BaseRetriever const systemMessagePrompt = nodeData.inputs?.systemMessagePrompt as string + const returnSourceDocuments = nodeData.inputs?.returnSourceDocuments as boolean + const chainOption = nodeData.inputs?.chainOption as string + const externalMemory = nodeData.inputs?.memory - const chain = ConversationalRetrievalQAChain.fromLLM(model, vectorStoreRetriever, { + const obj: any = { verbose: process.env.DEBUG === 'true' ? true : false, - qaTemplate: systemMessagePrompt ? 
`${systemMessagePrompt}\n${qa_template}` : default_qa_template - }) + questionGeneratorChainOptions: { + template: CUSTOM_QUESTION_GENERATOR_CHAIN_PROMPT + } + } + + if (returnSourceDocuments) obj.returnSourceDocuments = returnSourceDocuments + + if (chainOption === 'map_reduce') { + obj.qaChainOptions = { + type: 'map_reduce', + combinePrompt: PromptTemplate.fromTemplate( + systemMessagePrompt ? `${systemMessagePrompt}\n${map_reduce_template}` : default_map_reduce_template + ) + } as QAChainParams + } else if (chainOption === 'refine') { + const qprompt = new PromptTemplate({ + inputVariables: ['context', 'question'], + template: refine_question_template(systemMessagePrompt) + }) + const rprompt = new PromptTemplate({ + inputVariables: ['context', 'question', 'existing_answer'], + template: refine_template + }) + obj.qaChainOptions = { + type: 'refine', + questionPrompt: qprompt, + refinePrompt: rprompt + } as QAChainParams + } else { + obj.qaChainOptions = { + type: 'stuff', + prompt: PromptTemplate.fromTemplate(systemMessagePrompt ? 
`${systemMessagePrompt}\n${qa_template}` : default_qa_template) + } as QAChainParams + } + + if (externalMemory) { + externalMemory.memoryKey = 'chat_history' + externalMemory.inputKey = 'question' + externalMemory.outputKey = 'text' + externalMemory.returnMessages = true + if (chainOption === 'refine') externalMemory.outputKey = 'output_text' + obj.memory = externalMemory + } else { + const fields: BufferMemoryInput = { + memoryKey: 'chat_history', + inputKey: 'question', + outputKey: 'text', + returnMessages: true + } + if (chainOption === 'refine') fields.outputKey = 'output_text' + obj.memory = new BufferMemory(fields) + } + + const chain = ConversationalRetrievalQAChain.fromLLM(model, vectorStoreRetriever, obj) return chain } - async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { const chain = nodeData.instance as ConversationalRetrievalQAChain - let chatHistory = '' + const returnSourceDocuments = nodeData.inputs?.returnSourceDocuments as boolean + const chainOption = nodeData.inputs?.chainOption as string - if (options && options.chatHistory) { - const histories: IMessage[] = options.chatHistory - chatHistory = histories - .map((item) => { - return item.message - }) - .join('') + let model = nodeData.inputs?.model + + // Temporary fix: https://github.com/hwchase17/langchainjs/issues/754 + model.streaming = false + chain.questionGeneratorChain.llm = model + + const obj = { question: input } + + if (options && options.chatHistory && chain.memory) { + ;(chain.memory as any).chatHistory = mapChatHistory(options) } - const obj = { - question: input, - chat_history: chatHistory ? chatHistory : [] + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler( + options.socketIO, + options.socketIOClientId, + chainOption === 'refine' ? 
4 : undefined, + returnSourceDocuments + ) + const res = await chain.call(obj, [loggerHandler, handler]) + if (chainOption === 'refine') { + if (res.output_text && res.sourceDocuments) { + return { + text: res.output_text, + sourceDocuments: res.sourceDocuments + } + } + return res?.output_text + } + if (res.text && res.sourceDocuments) return res + return res?.text + } else { + const res = await chain.call(obj, [loggerHandler]) + if (res.text && res.sourceDocuments) return res + return res?.text } - - const res = await chain.call(obj) - - return res?.text } } diff --git a/packages/components/nodes/chains/ConversationalRetrievalQAChain/prompts.ts b/packages/components/nodes/chains/ConversationalRetrievalQAChain/prompts.ts new file mode 100644 index 000000000..132e3a97e --- /dev/null +++ b/packages/components/nodes/chains/ConversationalRetrievalQAChain/prompts.ts @@ -0,0 +1,64 @@ +export const default_qa_template = `Use the following pieces of context to answer the question at the end. If you don't know the answer, just say that you don't know, don't try to make up an answer. + +{context} + +Question: {question} +Helpful Answer:` + +export const qa_template = `Use the following pieces of context to answer the question at the end. + +{context} + +Question: {question} +Helpful Answer:` + +export const default_map_reduce_template = `Given the following extracted parts of a long document and a question, create a final answer. +If you don't know the answer, just say that you don't know. Don't try to make up an answer. + +{summaries} + +Question: {question} +Helpful Answer:` + +export const map_reduce_template = `Given the following extracted parts of a long document and a question, create a final answer. + +{summaries} + +Question: {question} +Helpful Answer:` + +export const refine_question_template = (sysPrompt?: string) => { + let returnPrompt = '' + if (sysPrompt) + returnPrompt = `Context information is below. 
+--------------------- +{context} +--------------------- +Given the context information and not prior knowledge, ${sysPrompt} +Answer the question: {question}. +Answer:` + if (!sysPrompt) + returnPrompt = `Context information is below. +--------------------- +{context} +--------------------- +Given the context information and not prior knowledge, answer the question: {question}. +Answer:` + return returnPrompt +} + +export const refine_template = `The original question is as follows: {question} +We have provided an existing answer: {existing_answer} +We have the opportunity to refine the existing answer (only if needed) with some more context below. +------------ +{context} +------------ +Given the new context, refine the original answer to better answer the question. +If you can't find an answer in the context, return the original answer.` + +export const CUSTOM_QUESTION_GENERATOR_CHAIN_PROMPT = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question, in the same language as the follow up question. Include any relevant context from the conversation in the standalone question. 
+ +Chat History: +{chat_history} +Follow Up Input: {question} +Standalone question:` diff --git a/packages/components/nodes/chains/LLMChain/LLMChain.ts b/packages/components/nodes/chains/LLMChain/LLMChain.ts index b178e28df..5088b34d4 100644 --- a/packages/components/nodes/chains/LLMChain/LLMChain.ts +++ b/packages/components/nodes/chains/LLMChain/LLMChain.ts @@ -1,11 +1,13 @@ import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, handleEscapeCharacters } from '../../../src/utils' import { LLMChain } from 'langchain/chains' import { BaseLanguageModel } from 'langchain/base_language' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' class LLMChain_Chains implements INode { label: string name: string + version: number type: string icon: string category: string @@ -17,6 +19,7 @@ class LLMChain_Chains implements INode { constructor() { this.label = 'LLM Chain' this.name = 'llmChain' + this.version = 1.0 this.type = 'LLMChain' this.icon = 'chain.svg' this.category = 'Chains' @@ -50,12 +53,12 @@ class LLMChain_Chains implements INode { { label: 'Output Prediction', name: 'outputPrediction', - baseClasses: ['string'] + baseClasses: ['string', 'json'] } ] } - async init(nodeData: INodeData, input: string): Promise { + async init(nodeData: INodeData, input: string, options: ICommonObject): Promise { const model = nodeData.inputs?.model as BaseLanguageModel const prompt = nodeData.inputs?.prompt const output = nodeData.outputs?.output as string @@ -67,21 +70,25 @@ class LLMChain_Chains implements INode { } else if (output === 'outputPrediction') { const chain = new LLMChain({ llm: model, prompt, verbose: process.env.DEBUG === 'true' ? 
true : false }) const inputVariables = chain.prompt.inputVariables as string[] // ["product"] - const res = await runPrediction(inputVariables, chain, input, promptValues) + const res = await runPrediction(inputVariables, chain, input, promptValues, options) // eslint-disable-next-line no-console console.log('\x1b[92m\x1b[1m\n*****OUTPUT PREDICTION*****\n\x1b[0m\x1b[0m') // eslint-disable-next-line no-console console.log(res) - return res + /** + * Apply string transformation to convert special chars: + * FROM: hello i am ben\n\n\thow are you? + * TO: hello i am benFLOWISE_NEWLINEFLOWISE_NEWLINEFLOWISE_TABhow are you? + */ + return handleEscapeCharacters(res, false) } } - async run(nodeData: INodeData, input: string): Promise { + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { const inputVariables = nodeData.instance.prompt.inputVariables as string[] // ["product"] const chain = nodeData.instance as LLMChain const promptValues = nodeData.inputs?.prompt.promptValues as ICommonObject - - const res = await runPrediction(inputVariables, chain, input, promptValues) + const res = await runPrediction(inputVariables, chain, input, promptValues, options) // eslint-disable-next-line no-console console.log('\x1b[93m\x1b[1m\n*****FINAL RESULT*****\n\x1b[0m\x1b[0m') // eslint-disable-next-line no-console @@ -90,11 +97,26 @@ class LLMChain_Chains implements INode { } } -const runPrediction = async (inputVariables: string[], chain: LLMChain, input: string, promptValues: ICommonObject) => { - if (inputVariables.length === 1) { - const res = await chain.run(input) - return res - } else if (inputVariables.length > 1) { +const runPrediction = async ( + inputVariables: string[], + chain: LLMChain, + input: string, + promptValuesRaw: ICommonObject, + options: ICommonObject +) => { + const loggerHandler = new ConsoleCallbackHandler(options.logger) + const isStreaming = options.socketIO && options.socketIOClientId + const socketIO = isStreaming ? 
options.socketIO : undefined + const socketIOClientId = isStreaming ? options.socketIOClientId : '' + + /** + * Apply string transformation to reverse converted special chars: + * FROM: { "value": "hello i am benFLOWISE_NEWLINEFLOWISE_NEWLINEFLOWISE_TABhow are you?" } + * TO: { "value": "hello i am ben\n\n\thow are you?" } + */ + const promptValues = handleEscapeCharacters(promptValuesRaw, true) + + if (promptValues && inputVariables.length > 0) { let seen: string[] = [] for (const variable of inputVariables) { @@ -106,11 +128,15 @@ const runPrediction = async (inputVariables: string[], chain: LLMChain, input: s if (seen.length === 0) { // All inputVariables have fixed values specified - const options = { - ...promptValues + const options = { ...promptValues } + if (isStreaming) { + const handler = new CustomChainHandler(socketIO, socketIOClientId) + const res = await chain.call(options, [loggerHandler, handler]) + return res?.text + } else { + const res = await chain.call(options, [loggerHandler]) + return res?.text } - const res = await chain.call(options) - return res?.text } else if (seen.length === 1) { // If one inputVariable is not specify, use input (user's question) as value const lastValue = seen.pop() @@ -119,14 +145,26 @@ const runPrediction = async (inputVariables: string[], chain: LLMChain, input: s ...promptValues, [lastValue]: input } - const res = await chain.call(options) - return res?.text + if (isStreaming) { + const handler = new CustomChainHandler(socketIO, socketIOClientId) + const res = await chain.call(options, [loggerHandler, handler]) + return res?.text + } else { + const res = await chain.call(options, [loggerHandler]) + return res?.text + } } else { throw new Error(`Please provide Prompt Values for: ${seen.join(', ')}`) } } else { - const res = await chain.run(input) - return res + if (isStreaming) { + const handler = new CustomChainHandler(socketIO, socketIOClientId) + const res = await chain.run(input, [loggerHandler, handler]) + 
return res + } else { + const res = await chain.run(input, [loggerHandler]) + return res + } } } diff --git a/packages/components/nodes/chains/MultiPromptChain/MultiPromptChain.ts b/packages/components/nodes/chains/MultiPromptChain/MultiPromptChain.ts new file mode 100644 index 000000000..0d1377143 --- /dev/null +++ b/packages/components/nodes/chains/MultiPromptChain/MultiPromptChain.ts @@ -0,0 +1,82 @@ +import { BaseLanguageModel } from 'langchain/base_language' +import { ICommonObject, INode, INodeData, INodeParams, PromptRetriever } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { MultiPromptChain } from 'langchain/chains' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +class MultiPromptChain_Chains implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + baseClasses: string[] + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Multi Prompt Chain' + this.name = 'multiPromptChain' + this.version = 1.0 + this.type = 'MultiPromptChain' + this.icon = 'chain.svg' + this.category = 'Chains' + this.description = 'Chain that automatically picks an appropriate prompt from multiple prompt templates' + this.baseClasses = [this.type, ...getBaseClasses(MultiPromptChain)] + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'Prompt Retriever', + name: 'promptRetriever', + type: 'PromptRetriever', + list: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const promptRetriever = nodeData.inputs?.promptRetriever as PromptRetriever[] + const promptNames = [] + const promptDescriptions = [] + const promptTemplates = [] + + for (const prompt of promptRetriever) { + promptNames.push(prompt.name) + promptDescriptions.push(prompt.description) + promptTemplates.push(prompt.systemMessage) 
+ } + + const chain = MultiPromptChain.fromLLMAndPrompts(model, { + promptNames, + promptDescriptions, + promptTemplates, + llmChainOpts: { verbose: process.env.DEBUG === 'true' ? true : false } + }) + + return chain + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const chain = nodeData.instance as MultiPromptChain + const obj = { input } + + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId, 2) + const res = await chain.call(obj, [loggerHandler, handler]) + return res?.text + } else { + const res = await chain.call(obj, [loggerHandler]) + return res?.text + } + } +} + +module.exports = { nodeClass: MultiPromptChain_Chains } diff --git a/packages/components/nodes/chains/MultiPromptChain/chain.svg b/packages/components/nodes/chains/MultiPromptChain/chain.svg new file mode 100644 index 000000000..a5b32f90a --- /dev/null +++ b/packages/components/nodes/chains/MultiPromptChain/chain.svg @@ -0,0 +1,6 @@ + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/chains/MultiRetrievalQAChain/MultiRetrievalQAChain.ts b/packages/components/nodes/chains/MultiRetrievalQAChain/MultiRetrievalQAChain.ts new file mode 100644 index 000000000..6d1506475 --- /dev/null +++ b/packages/components/nodes/chains/MultiRetrievalQAChain/MultiRetrievalQAChain.ts @@ -0,0 +1,92 @@ +import { BaseLanguageModel } from 'langchain/base_language' +import { ICommonObject, INode, INodeData, INodeParams, VectorStoreRetriever } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { MultiRetrievalQAChain } from 'langchain/chains' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' + +class MultiRetrievalQAChain_Chains implements INode { + label: string + name: string + version: number + type: string + icon: string + 
category: string + baseClasses: string[] + description: string + inputs: INodeParams[] + + constructor() { + this.label = 'Multi Retrieval QA Chain' + this.name = 'multiRetrievalQAChain' + this.version = 1.0 + this.type = 'MultiRetrievalQAChain' + this.icon = 'chain.svg' + this.category = 'Chains' + this.description = 'QA Chain that automatically picks an appropriate vector store from multiple retrievers' + this.baseClasses = [this.type, ...getBaseClasses(MultiRetrievalQAChain)] + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'Vector Store Retriever', + name: 'vectorStoreRetriever', + type: 'VectorStoreRetriever', + list: true + }, + { + label: 'Return Source Documents', + name: 'returnSourceDocuments', + type: 'boolean', + optional: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const vectorStoreRetriever = nodeData.inputs?.vectorStoreRetriever as VectorStoreRetriever[] + const returnSourceDocuments = nodeData.inputs?.returnSourceDocuments as boolean + + const retrieverNames = [] + const retrieverDescriptions = [] + const retrievers = [] + + for (const vs of vectorStoreRetriever) { + retrieverNames.push(vs.name) + retrieverDescriptions.push(vs.description) + retrievers.push(vs.vectorStore.asRetriever((vs.vectorStore as any).k ?? 4)) + } + + const chain = MultiRetrievalQAChain.fromLLMAndRetrievers(model, { + retrieverNames, + retrieverDescriptions, + retrievers, + retrievalQAChainOpts: { verbose: process.env.DEBUG === 'true' ? 
true : false, returnSourceDocuments } + }) + return chain + } + + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { + const chain = nodeData.instance as MultiRetrievalQAChain + const returnSourceDocuments = nodeData.inputs?.returnSourceDocuments as boolean + + const obj = { input } + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId, 2, returnSourceDocuments) + const res = await chain.call(obj, [loggerHandler, handler]) + if (res.text && res.sourceDocuments) return res + return res?.text + } else { + const res = await chain.call(obj, [loggerHandler]) + if (res.text && res.sourceDocuments) return res + return res?.text + } + } +} + +module.exports = { nodeClass: MultiRetrievalQAChain_Chains } diff --git a/packages/components/nodes/chains/MultiRetrievalQAChain/chain.svg b/packages/components/nodes/chains/MultiRetrievalQAChain/chain.svg new file mode 100644 index 000000000..a5b32f90a --- /dev/null +++ b/packages/components/nodes/chains/MultiRetrievalQAChain/chain.svg @@ -0,0 +1,6 @@ + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/chains/RetrievalQAChain/RetrievalQAChain.ts b/packages/components/nodes/chains/RetrievalQAChain/RetrievalQAChain.ts index c002b6848..935866ca2 100644 --- a/packages/components/nodes/chains/RetrievalQAChain/RetrievalQAChain.ts +++ b/packages/components/nodes/chains/RetrievalQAChain/RetrievalQAChain.ts @@ -1,12 +1,14 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' import { RetrievalQAChain } from 'langchain/chains' -import { BaseRetriever } from 'langchain/schema' +import { BaseRetriever } from 'langchain/schema/retriever' import { getBaseClasses } from '../../../src/utils' import { BaseLanguageModel } from 
'langchain/base_language' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' class RetrievalQAChain_Chains implements INode { label: string name: string + version: number type: string icon: string category: string @@ -17,6 +19,7 @@ class RetrievalQAChain_Chains implements INode { constructor() { this.label = 'Retrieval QA Chain' this.name = 'retrievalQAChain' + this.version = 1.0 this.type = 'RetrievalQAChain' this.icon = 'chain.svg' this.category = 'Chains' @@ -44,13 +47,21 @@ class RetrievalQAChain_Chains implements INode { return chain } - async run(nodeData: INodeData, input: string): Promise { + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { const chain = nodeData.instance as RetrievalQAChain const obj = { query: input } - const res = await chain.call(obj) - return res?.text + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + const res = await chain.call(obj, [loggerHandler, handler]) + return res?.text + } else { + const res = await chain.call(obj, [loggerHandler]) + return res?.text + } } } diff --git a/packages/components/nodes/chains/SqlDatabaseChain/SqlDatabaseChain.ts b/packages/components/nodes/chains/SqlDatabaseChain/SqlDatabaseChain.ts index 7ea10d941..2a0c71cf2 100644 --- a/packages/components/nodes/chains/SqlDatabaseChain/SqlDatabaseChain.ts +++ b/packages/components/nodes/chains/SqlDatabaseChain/SqlDatabaseChain.ts @@ -1,13 +1,18 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { SqlDatabaseChain, SqlDatabaseChainInput } from 'langchain/chains' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { SqlDatabaseChain, SqlDatabaseChainInput } from 'langchain/chains/sql_db' import { getBaseClasses } from '../../../src/utils' import { DataSource } from 
'typeorm' import { SqlDatabase } from 'langchain/sql_db' import { BaseLanguageModel } from 'langchain/base_language' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' +import { DataSourceOptions } from 'typeorm/data-source' + +type DatabaseType = 'sqlite' | 'postgres' | 'mssql' | 'mysql' class SqlDatabaseChain_Chains implements INode { label: string name: string + version: number type: string icon: string category: string @@ -18,6 +23,7 @@ class SqlDatabaseChain_Chains implements INode { constructor() { this.label = 'Sql Database Chain' this.name = 'sqlDatabaseChain' + this.version = 1.0 this.type = 'SqlDatabaseChain' this.icon = 'sqlchain.svg' this.category = 'Chains' @@ -35,46 +41,73 @@ class SqlDatabaseChain_Chains implements INode { type: 'options', options: [ { - label: 'SQlite', + label: 'SQLite', name: 'sqlite' + }, + { + label: 'PostgreSQL', + name: 'postgres' + }, + { + label: 'MSSQL', + name: 'mssql' + }, + { + label: 'MySQL', + name: 'mysql' } ], default: 'sqlite' }, { - label: 'Database File Path', - name: 'dbFilePath', + label: 'Connection string or file path (sqlite only)', + name: 'url', type: 'string', - placeholder: 'C:/Users/chinook.db' + placeholder: '127.0.0.1:5432/chinook' } ] } async init(nodeData: INodeData): Promise { - const databaseType = nodeData.inputs?.database as 'sqlite' + const databaseType = nodeData.inputs?.database as DatabaseType const model = nodeData.inputs?.model as BaseLanguageModel - const dbFilePath = nodeData.inputs?.dbFilePath + const url = nodeData.inputs?.url - const chain = await getSQLDBChain(databaseType, dbFilePath, model) + const chain = await getSQLDBChain(databaseType, url, model) return chain } - async run(nodeData: INodeData, input: string): Promise { - const databaseType = nodeData.inputs?.database as 'sqlite' + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { const databaseType = nodeData.inputs?.database as DatabaseType const model =
nodeData.inputs?.model as BaseLanguageModel - const dbFilePath = nodeData.inputs?.dbFilePath + const url = nodeData.inputs?.url - const chain = await getSQLDBChain(databaseType, dbFilePath, model) - const res = await chain.run(input) - return res + const chain = await getSQLDBChain(databaseType, url, model) + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId, 2) + const res = await chain.run(input, [loggerHandler, handler]) + return res + } else { + const res = await chain.run(input, [loggerHandler]) + return res + } } } -const getSQLDBChain = async (databaseType: 'sqlite', dbFilePath: string, llm: BaseLanguageModel) => { - const datasource = new DataSource({ - type: databaseType, - database: dbFilePath - }) +const getSQLDBChain = async (databaseType: DatabaseType, url: string, llm: BaseLanguageModel) => { + const datasource = new DataSource( + databaseType === 'sqlite' + ? 
{ + type: databaseType, + database: url + } + : ({ + type: databaseType, + url: url + } as DataSourceOptions) + ) const db = await SqlDatabase.fromDataSourceParams({ appDataSource: datasource diff --git a/packages/components/nodes/chains/VectorDBQAChain/VectorDBQAChain.ts b/packages/components/nodes/chains/VectorDBQAChain/VectorDBQAChain.ts index 37d388b4a..038116827 100644 --- a/packages/components/nodes/chains/VectorDBQAChain/VectorDBQAChain.ts +++ b/packages/components/nodes/chains/VectorDBQAChain/VectorDBQAChain.ts @@ -1,12 +1,14 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' import { getBaseClasses } from '../../../src/utils' import { VectorDBQAChain } from 'langchain/chains' import { BaseLanguageModel } from 'langchain/base_language' import { VectorStore } from 'langchain/vectorstores' +import { ConsoleCallbackHandler, CustomChainHandler } from '../../../src/handler' class VectorDBQAChain_Chains implements INode { label: string name: string + version: number type: string icon: string category: string @@ -17,6 +19,7 @@ class VectorDBQAChain_Chains implements INode { constructor() { this.label = 'VectorDB QA Chain' this.name = 'vectorDBQAChain' + this.version = 1.0 this.type = 'VectorDBQAChain' this.icon = 'chain.svg' this.category = 'Chains' @@ -40,17 +43,29 @@ class VectorDBQAChain_Chains implements INode { const model = nodeData.inputs?.model as BaseLanguageModel const vectorStore = nodeData.inputs?.vectorStore as VectorStore - const chain = VectorDBQAChain.fromLLM(model, vectorStore, { verbose: process.env.DEBUG === 'true' ? true : false }) + const chain = VectorDBQAChain.fromLLM(model, vectorStore, { + k: (vectorStore as any)?.k ?? 4, + verbose: process.env.DEBUG === 'true' ? 
true : false + }) return chain } - async run(nodeData: INodeData, input: string): Promise { + async run(nodeData: INodeData, input: string, options: ICommonObject): Promise { const chain = nodeData.instance as VectorDBQAChain const obj = { query: input } - const res = await chain.call(obj) - return res?.text + + const loggerHandler = new ConsoleCallbackHandler(options.logger) + + if (options.socketIO && options.socketIOClientId) { + const handler = new CustomChainHandler(options.socketIO, options.socketIOClientId) + const res = await chain.call(obj, [loggerHandler, handler]) + return res?.text + } else { + const res = await chain.call(obj, [loggerHandler]) + return res?.text + } } } diff --git a/packages/components/nodes/chatmodels/AzureChatOpenAI/Azure.svg b/packages/components/nodes/chatmodels/AzureChatOpenAI/Azure.svg index 51eb62535..47ad8c440 100644 --- a/packages/components/nodes/chatmodels/AzureChatOpenAI/Azure.svg +++ b/packages/components/nodes/chatmodels/AzureChatOpenAI/Azure.svg @@ -1,5 +1 @@ - - - - - \ No newline at end of file + \ No newline at end of file diff --git a/packages/components/nodes/chatmodels/AzureChatOpenAI/AzureChatOpenAI.ts b/packages/components/nodes/chatmodels/AzureChatOpenAI/AzureChatOpenAI.ts index 1d2fabc76..90f430f04 100644 --- a/packages/components/nodes/chatmodels/AzureChatOpenAI/AzureChatOpenAI.ts +++ b/packages/components/nodes/chatmodels/AzureChatOpenAI/AzureChatOpenAI.ts @@ -1,32 +1,36 @@ import { OpenAIBaseInput } from 'langchain/dist/types/openai-types' -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { AzureOpenAIInput, ChatOpenAI } from 'langchain/chat_models/openai' class AzureChatOpenAI_ChatModels implements INode { label: string name: string + version: number 
type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Azure ChatOpenAI' this.name = 'azureChatOpenAI' + this.version = 1.0 this.type = 'AzureChatOpenAI' this.icon = 'Azure.svg' this.category = 'Chat Models' this.description = 'Wrapper around Azure OpenAI large language models that use the Chat endpoint' this.baseClasses = [this.type, ...getBaseClasses(ChatOpenAI)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['azureOpenAIApi'] + } this.inputs = [ - { - label: 'Azure OpenAI Api Key', - name: 'azureOpenAIApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', @@ -43,6 +47,10 @@ class AzureChatOpenAI_ChatModels implements INode { { label: 'gpt-35-turbo', name: 'gpt-35-turbo' + }, + { + label: 'gpt-35-turbo-16k', + name: 'gpt-35-turbo-16k' } ], default: 'gpt-35-turbo', @@ -52,37 +60,15 @@ class AzureChatOpenAI_ChatModels implements INode { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.9, optional: true }, - { - label: 'Azure OpenAI Api Instance Name', - name: 'azureOpenAIApiInstanceName', - type: 'string', - placeholder: 'YOUR-INSTANCE-NAME' - }, - { - label: 'Azure OpenAI Api Deployment Name', - name: 'azureOpenAIApiDeploymentName', - type: 'string', - placeholder: 'YOUR-DEPLOYMENT-NAME' - }, - { - label: 'Azure OpenAI Api Version', - name: 'azureOpenAIApiVersion', - type: 'options', - options: [ - { - label: '2023-03-15-preview', - name: '2023-03-15-preview' - } - ], - default: '2023-03-15-preview' - }, { label: 'Max Tokens', name: 'maxTokens', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -90,6 +76,7 @@ class AzureChatOpenAI_ChatModels implements INode { label: 'Frequency Penalty', name: 'frequencyPenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -97,6 +84,7 @@ class 
AzureChatOpenAI_ChatModels implements INode { label: 'Presence Penalty', name: 'presencePenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -104,36 +92,41 @@ class AzureChatOpenAI_ChatModels implements INode { label: 'Timeout', name: 'timeout', type: 'number', + step: 1, optional: true, additionalParams: true } ] } - async init(nodeData: INodeData): Promise { - const azureOpenAIApiKey = nodeData.inputs?.azureOpenAIApiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const modelName = nodeData.inputs?.modelName as string const temperature = nodeData.inputs?.temperature as string - const azureOpenAIApiInstanceName = nodeData.inputs?.azureOpenAIApiInstanceName as string - const azureOpenAIApiDeploymentName = nodeData.inputs?.azureOpenAIApiDeploymentName as string - const azureOpenAIApiVersion = nodeData.inputs?.azureOpenAIApiVersion as string const maxTokens = nodeData.inputs?.maxTokens as string const frequencyPenalty = nodeData.inputs?.frequencyPenalty as string const presencePenalty = nodeData.inputs?.presencePenalty as string const timeout = nodeData.inputs?.timeout as string + const streaming = nodeData.inputs?.streaming as boolean + + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const azureOpenAIApiKey = getCredentialParam('azureOpenAIApiKey', credentialData, nodeData) + const azureOpenAIApiInstanceName = getCredentialParam('azureOpenAIApiInstanceName', credentialData, nodeData) + const azureOpenAIApiDeploymentName = getCredentialParam('azureOpenAIApiDeploymentName', credentialData, nodeData) + const azureOpenAIApiVersion = getCredentialParam('azureOpenAIApiVersion', credentialData, nodeData) const obj: Partial & Partial = { - temperature: parseInt(temperature, 10), + temperature: parseFloat(temperature), modelName, azureOpenAIApiKey, azureOpenAIApiInstanceName, azureOpenAIApiDeploymentName, - azureOpenAIApiVersion + azureOpenAIApiVersion, + streaming: streaming ?? true } if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) - if (frequencyPenalty) obj.frequencyPenalty = parseInt(frequencyPenalty, 10) - if (presencePenalty) obj.presencePenalty = parseInt(presencePenalty, 10) + if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty) + if (presencePenalty) obj.presencePenalty = parseFloat(presencePenalty) if (timeout) obj.timeout = parseInt(timeout, 10) const model = new ChatOpenAI(obj) diff --git a/packages/components/nodes/chatmodels/ChatAnthropic/ChatAnthropic.ts b/packages/components/nodes/chatmodels/ChatAnthropic/ChatAnthropic.ts index b13339ad4..12a33d994 100644 --- a/packages/components/nodes/chatmodels/ChatAnthropic/ChatAnthropic.ts +++ b/packages/components/nodes/chatmodels/ChatAnthropic/ChatAnthropic.ts @@ -1,36 +1,50 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { AnthropicInput, ChatAnthropic } from 'langchain/chat_models/anthropic' class ChatAnthropic_ChatModels implements INode { label: string name: string + version: number type: 
string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'ChatAnthropic' this.name = 'chatAnthropic' + this.version = 1.0 this.type = 'ChatAnthropic' this.icon = 'chatAnthropic.png' this.category = 'Chat Models' this.description = 'Wrapper around ChatAnthropic large language models that use the Chat endpoint' this.baseClasses = [this.type, ...getBaseClasses(ChatAnthropic)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['anthropicApi'] + } this.inputs = [ - { - label: 'ChatAnthropic Api Key', - name: 'anthropicApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', type: 'options', options: [ + { + label: 'claude-2', + name: 'claude-2', + description: 'Claude 2 latest major version, automatically get updates to the model as they are released' + }, + { + label: 'claude-instant-1', + name: 'claude-instant-1', + description: 'Claude Instant latest major version, automatically get updates to the model as they are released' + }, { label: 'claude-v1', name: 'claude-v1' @@ -76,13 +90,14 @@ class ChatAnthropic_ChatModels implements INode { name: 'claude-instant-v1.1-100k' } ], - default: 'claude-v1', + default: 'claude-2', optional: true }, { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.9, optional: true }, @@ -90,6 +105,7 @@ class ChatAnthropic_ChatModels implements INode { label: 'Max Tokens', name: 'maxTokensToSample', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -97,6 +113,7 @@ class ChatAnthropic_ChatModels implements INode { label: 'Top P', name: 'topP', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -104,29 +121,34 @@ class ChatAnthropic_ChatModels implements INode { label: 'Top K', name: 'topK', type: 'number', + step: 0.1, optional: true, additionalParams: true } ] } - async init(nodeData: 
INodeData): Promise { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const temperature = nodeData.inputs?.temperature as string const modelName = nodeData.inputs?.modelName as string - const anthropicApiKey = nodeData.inputs?.anthropicApiKey as string const maxTokensToSample = nodeData.inputs?.maxTokensToSample as string const topP = nodeData.inputs?.topP as string const topK = nodeData.inputs?.topK as string + const streaming = nodeData.inputs?.streaming as boolean + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const anthropicApiKey = getCredentialParam('anthropicApiKey', credentialData, nodeData) const obj: Partial & { anthropicApiKey?: string } = { - temperature: parseInt(temperature, 10), + temperature: parseFloat(temperature), modelName, - anthropicApiKey + anthropicApiKey, + streaming: streaming ?? true } if (maxTokensToSample) obj.maxTokensToSample = parseInt(maxTokensToSample, 10) - if (topP) obj.topP = parseInt(topP, 10) - if (topK) obj.topK = parseInt(topK, 10) + if (topP) obj.topP = parseFloat(topP) + if (topK) obj.topK = parseFloat(topK) const model = new ChatAnthropic(obj) return model diff --git a/packages/components/nodes/chatmodels/ChatHuggingFace/ChatHuggingFace.ts b/packages/components/nodes/chatmodels/ChatHuggingFace/ChatHuggingFace.ts new file mode 100644 index 000000000..ee55c7bb9 --- /dev/null +++ b/packages/components/nodes/chatmodels/ChatHuggingFace/ChatHuggingFace.ts @@ -0,0 +1,126 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { HFInput, HuggingFaceInference } from './core' + +class ChatHuggingFace_ChatModels implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + description: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + 
constructor() { + this.label = 'ChatHuggingFace' + this.name = 'chatHuggingFace' + this.version = 1.0 + this.type = 'ChatHuggingFace' + this.icon = 'huggingface.png' + this.category = 'Chat Models' + this.description = 'Wrapper around HuggingFace large language models' + this.baseClasses = [this.type, 'BaseChatModel', ...getBaseClasses(HuggingFaceInference)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['huggingFaceApi'] + } + this.inputs = [ + { + label: 'Model', + name: 'model', + type: 'string', + description: 'If using your own inference endpoint, leave this blank', + placeholder: 'gpt2', + optional: true + }, + { + label: 'Endpoint', + name: 'endpoint', + type: 'string', + placeholder: 'https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2', + description: 'Using your own inference endpoint', + optional: true + }, + { + label: 'Temperature', + name: 'temperature', + type: 'number', + step: 0.1, + description: 'Temperature parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Max Tokens', + name: 'maxTokens', + type: 'number', + step: 1, + description: 'Max Tokens parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Top Probability', + name: 'topP', + type: 'number', + step: 0.1, + description: 'Top Probability parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Top K', + name: 'hfTopK', + type: 'number', + step: 0.1, + description: 'Top K parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Frequency Penalty', + name: 'frequencyPenalty', + type: 'number', + step: 0.1, + description: 'Frequency Penalty parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const model = nodeData.inputs?.model as string + const temperature = nodeData.inputs?.temperature as string + const maxTokens = nodeData.inputs?.maxTokens as string + const topP = nodeData.inputs?.topP as string + const hfTopK = nodeData.inputs?.hfTopK as string + const frequencyPenalty = nodeData.inputs?.frequencyPenalty as string + const endpoint = nodeData.inputs?.endpoint as string + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const huggingFaceApiKey = getCredentialParam('huggingFaceApiKey', credentialData, nodeData) + + const obj: Partial = { + model, + apiKey: huggingFaceApiKey + } + + if (temperature) obj.temperature = parseFloat(temperature) + if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) + if (topP) obj.topP = parseFloat(topP) + if (hfTopK) obj.topK = parseFloat(hfTopK) + if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty) + if (endpoint) obj.endpoint = endpoint + + const huggingFace = new HuggingFaceInference(obj) + return huggingFace + } +} + +module.exports = { nodeClass: ChatHuggingFace_ChatModels } diff --git a/packages/components/nodes/chatmodels/ChatHuggingFace/core.ts b/packages/components/nodes/chatmodels/ChatHuggingFace/core.ts new file mode 100644 index 000000000..416567f0d --- /dev/null +++ b/packages/components/nodes/chatmodels/ChatHuggingFace/core.ts @@ -0,0 +1,113 @@ +import { getEnvironmentVariable } from '../../../src/utils' +import { LLM, BaseLLMParams } from 'langchain/llms/base' + +export interface HFInput { + /** Model to use */ +
model: string + + /** Sampling temperature to use */ + temperature?: number + + /** + * Maximum number of tokens to generate in the completion. + */ + maxTokens?: number + + /** Total probability mass of tokens to consider at each step */ + topP?: number + + /** Integer to define the top tokens considered within the sample operation to create new text. */ + topK?: number + + /** Penalizes repeated tokens according to frequency */ + frequencyPenalty?: number + + /** API key to use. */ + apiKey?: string + + /** Private endpoint to use. */ + endpoint?: string +} + +export class HuggingFaceInference extends LLM implements HFInput { + get lc_secrets(): { [key: string]: string } | undefined { + return { + apiKey: 'HUGGINGFACEHUB_API_KEY' + } + } + + model = 'gpt2' + + temperature: number | undefined = undefined + + maxTokens: number | undefined = undefined + + topP: number | undefined = undefined + + topK: number | undefined = undefined + + frequencyPenalty: number | undefined = undefined + + apiKey: string | undefined = undefined + + endpoint: string | undefined = undefined + + constructor(fields?: Partial & BaseLLMParams) { + super(fields ?? {}) + + this.model = fields?.model ?? this.model + this.temperature = fields?.temperature ?? this.temperature + this.maxTokens = fields?.maxTokens ?? this.maxTokens + this.topP = fields?.topP ?? this.topP + this.topK = fields?.topK ?? this.topK + this.frequencyPenalty = fields?.frequencyPenalty ?? this.frequencyPenalty + this.endpoint = fields?.endpoint ?? '' + this.apiKey = fields?.apiKey ?? getEnvironmentVariable('HUGGINGFACEHUB_API_KEY') + if (!this.apiKey) { + throw new Error( + 'Please set an API key for HuggingFace Hub in the environment variable HUGGINGFACEHUB_API_KEY or in the apiKey field of the HuggingFaceInference constructor.' 
+ ) + } + } + + _llmType() { + return 'hf' + } + + /** @ignore */ + async _call(prompt: string, options: this['ParsedCallOptions']): Promise { + const { HfInference } = await HuggingFaceInference.imports() + const hf = new HfInference(this.apiKey) + const obj: any = { + parameters: { + // make it behave similar to openai, returning only the generated text + return_full_text: false, + temperature: this.temperature, + max_new_tokens: this.maxTokens, + top_p: this.topP, + top_k: this.topK, + repetition_penalty: this.frequencyPenalty + }, + inputs: prompt + } + if (this.endpoint) { + hf.endpoint(this.endpoint) + } else { + obj.model = this.model + } + const res = await this.caller.callWithOptions({ signal: options.signal }, hf.textGeneration.bind(hf), obj) + return res.generated_text + } + + /** @ignore */ + static async imports(): Promise<{ + HfInference: typeof import('@huggingface/inference').HfInference + }> { + try { + const { HfInference } = await import('@huggingface/inference') + return { HfInference } + } catch (e) { + throw new Error('Please install huggingface as a dependency with, e.g. 
`yarn add @huggingface/inference`') + } + } +} diff --git a/packages/components/nodes/chatmodels/ChatHuggingFace/huggingface.png b/packages/components/nodes/chatmodels/ChatHuggingFace/huggingface.png new file mode 100644 index 000000000..f8f202a46 Binary files /dev/null and b/packages/components/nodes/chatmodels/ChatHuggingFace/huggingface.png differ diff --git a/packages/components/nodes/chatmodels/ChatLocalAI/ChatLocalAI.ts b/packages/components/nodes/chatmodels/ChatLocalAI/ChatLocalAI.ts index bd25a9fa6..a6ddfae42 100644 --- a/packages/components/nodes/chatmodels/ChatLocalAI/ChatLocalAI.ts +++ b/packages/components/nodes/chatmodels/ChatLocalAI/ChatLocalAI.ts @@ -6,6 +6,7 @@ import { OpenAIChatInput } from 'langchain/chat_models/openai' class ChatLocalAI_ChatModels implements INode { label: string name: string + version: number type: string icon: string category: string @@ -16,6 +17,7 @@ class ChatLocalAI_ChatModels implements INode { constructor() { this.label = 'ChatLocalAI' this.name = 'chatLocalAI' + this.version = 1.0 this.type = 'ChatLocalAI' this.icon = 'localai.png' this.category = 'Chat Models' @@ -38,6 +40,7 @@ class ChatLocalAI_ChatModels implements INode { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.9, optional: true }, @@ -45,6 +48,7 @@ class ChatLocalAI_ChatModels implements INode { label: 'Max Tokens', name: 'maxTokens', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -52,6 +56,7 @@ class ChatLocalAI_ChatModels implements INode { label: 'Top Probability', name: 'topP', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -59,6 +64,7 @@ class ChatLocalAI_ChatModels implements INode { label: 'Timeout', name: 'timeout', type: 'number', + step: 1, optional: true, additionalParams: true } @@ -74,13 +80,13 @@ class ChatLocalAI_ChatModels implements INode { const basePath = nodeData.inputs?.basePath as string const obj: Partial & { openAIApiKey?: string } = { - 
temperature: parseInt(temperature, 10), + temperature: parseFloat(temperature), modelName, openAIApiKey: 'sk-' } if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) - if (topP) obj.topP = parseInt(topP, 10) + if (topP) obj.topP = parseFloat(topP) if (timeout) obj.timeout = parseInt(timeout, 10) const model = new OpenAIChat(obj, { basePath }) diff --git a/packages/components/nodes/chatmodels/ChatOpenAI/ChatOpenAI.ts b/packages/components/nodes/chatmodels/ChatOpenAI/ChatOpenAI.ts index 5d608c5e2..ca081ff43 100644 --- a/packages/components/nodes/chatmodels/ChatOpenAI/ChatOpenAI.ts +++ b/packages/components/nodes/chatmodels/ChatOpenAI/ChatOpenAI.ts @@ -1,31 +1,35 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { ChatOpenAI, OpenAIChatInput } from 'langchain/chat_models/openai' class ChatOpenAI_ChatModels implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'ChatOpenAI' this.name = 'chatOpenAI' + this.version = 1.0 this.type = 'ChatOpenAI' this.icon = 'openai.png' this.category = 'Chat Models' this.description = 'Wrapper around OpenAI large language models that use the Chat endpoint' this.baseClasses = [this.type, ...getBaseClasses(ChatOpenAI)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['openAIApi'] + } this.inputs = [ - { - label: 'OpenAI Api Key', - name: 'openAIApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', @@ -36,20 +40,32 @@ class ChatOpenAI_ChatModels implements INode { name: 'gpt-4' }, { - label: 'gpt-4-0314', - name: 
'gpt-4-0314' + label: 'gpt-4-0613', + name: 'gpt-4-0613' }, { - label: 'gpt-4-32k-0314', - name: 'gpt-4-32k-0314' + label: 'gpt-4-32k', + name: 'gpt-4-32k' + }, + { + label: 'gpt-4-32k-0613', + name: 'gpt-4-32k-0613' }, { label: 'gpt-3.5-turbo', name: 'gpt-3.5-turbo' }, { - label: 'gpt-3.5-turbo-0301', - name: 'gpt-3.5-turbo-0301' + label: 'gpt-3.5-turbo-0613', + name: 'gpt-3.5-turbo-0613' + }, + { + label: 'gpt-3.5-turbo-16k', + name: 'gpt-3.5-turbo-16k' + }, + { + label: 'gpt-3.5-turbo-16k-0613', + name: 'gpt-3.5-turbo-16k-0613' } ], default: 'gpt-3.5-turbo', @@ -59,6 +75,7 @@ class ChatOpenAI_ChatModels implements INode { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.9, optional: true }, @@ -66,6 +83,7 @@ class ChatOpenAI_ChatModels implements INode { label: 'Max Tokens', name: 'maxTokens', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -73,6 +91,7 @@ class ChatOpenAI_ChatModels implements INode { label: 'Top Probability', name: 'topP', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -80,6 +99,7 @@ class ChatOpenAI_ChatModels implements INode { label: 'Frequency Penalty', name: 'frequencyPenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -87,6 +107,7 @@ class ChatOpenAI_ChatModels implements INode { label: 'Presence Penalty', name: 'presencePenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -94,35 +115,68 @@ class ChatOpenAI_ChatModels implements INode { label: 'Timeout', name: 'timeout', type: 'number', + step: 1, + optional: true, + additionalParams: true + }, + { + label: 'BasePath', + name: 'basepath', + type: 'string', + optional: true, + additionalParams: true + }, + { + label: 'BaseOptions', + name: 'baseOptions', + type: 'json', optional: true, additionalParams: true } ] } - async init(nodeData: INodeData): Promise { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const 
temperature = nodeData.inputs?.temperature as string const modelName = nodeData.inputs?.modelName as string - const openAIApiKey = nodeData.inputs?.openAIApiKey as string const maxTokens = nodeData.inputs?.maxTokens as string const topP = nodeData.inputs?.topP as string const frequencyPenalty = nodeData.inputs?.frequencyPenalty as string const presencePenalty = nodeData.inputs?.presencePenalty as string const timeout = nodeData.inputs?.timeout as string + const streaming = nodeData.inputs?.streaming as boolean + const basePath = nodeData.inputs?.basepath as string + const baseOptions = nodeData.inputs?.baseOptions + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const openAIApiKey = getCredentialParam('openAIApiKey', credentialData, nodeData) const obj: Partial & { openAIApiKey?: string } = { - temperature: parseInt(temperature, 10), + temperature: parseFloat(temperature), modelName, - openAIApiKey + openAIApiKey, + streaming: streaming ?? true } if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) - if (topP) obj.topP = parseInt(topP, 10) - if (frequencyPenalty) obj.frequencyPenalty = parseInt(frequencyPenalty, 10) - if (presencePenalty) obj.presencePenalty = parseInt(presencePenalty, 10) + if (topP) obj.topP = parseFloat(topP) + if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty) + if (presencePenalty) obj.presencePenalty = parseFloat(presencePenalty) if (timeout) obj.timeout = parseInt(timeout, 10) - const model = new ChatOpenAI(obj) + let parsedBaseOptions: any | undefined = undefined + + if (baseOptions) { + try { + parsedBaseOptions = typeof baseOptions === 'object' ? 
baseOptions : JSON.parse(baseOptions) + } catch (exception) { + throw new Error("Invalid JSON in the ChatOpenAI's BaseOptions: " + exception) + } + } + const model = new ChatOpenAI(obj, { + basePath, + baseOptions: parsedBaseOptions + }) return model } } diff --git a/packages/components/nodes/chatmodels/GoogleVertexAI/ChatGoogleVertexAI.ts b/packages/components/nodes/chatmodels/GoogleVertexAI/ChatGoogleVertexAI.ts new file mode 100644 index 000000000..a06ce0c95 --- /dev/null +++ b/packages/components/nodes/chatmodels/GoogleVertexAI/ChatGoogleVertexAI.ts @@ -0,0 +1,115 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { ChatGoogleVertexAI, GoogleVertexAIChatInput } from 'langchain/chat_models/googlevertexai' +import { GoogleAuthOptions } from 'google-auth-library' + +class GoogleVertexAI_ChatModels implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + description: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'ChatGoogleVertexAI' + this.name = 'chatGoogleVertexAI' + this.version = 1.0 + this.type = 'ChatGoogleVertexAI' + this.icon = 'vertexai.svg' + this.category = 'Chat Models' + this.description = 'Wrapper around VertexAI large language models that use the Chat endpoint' + this.baseClasses = [this.type, ...getBaseClasses(ChatGoogleVertexAI)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['googleVertexAuth'] + } + this.inputs = [ + { + label: 'Model Name', + name: 'modelName', + type: 'options', + options: [ + { + label: 'chat-bison', + name: 'chat-bison' + }, + { + label: 'codechat-bison', + name: 'codechat-bison' + } + ], + default: 'chat-bison', + optional: true + }, + { + label: 'Temperature', + name: 'temperature', + 
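The `parseInt` to `parseFloat` changes above are the substantive fix in these hunks: UI number inputs arrive as strings, and `parseInt('0.9', 10)` truncates at the decimal point, silently zeroing fractional fields like temperature and topP. A minimal sketch of the difference (helper names are illustrative, not from this PR):

```typescript
// UI number inputs arrive as strings; parseInt truncates at the decimal
// point, so fractional parameters were silently zeroed before this fix.
function toFloatParam(value: string): number {
    return parseFloat(value)
}

function toIntParam(value: string): number {
    return parseInt(value, 10)
}

const temperature = toFloatParam('0.9') // keeps the fraction
const truncated = toIntParam('0.9') // 0 — why parseInt was wrong here
const maxTokens = toIntParam('256') // integer fields still use parseInt
```

Integer-valued fields such as maxTokens and timeout correctly keep `parseInt`, which is why the diff only swaps the fractional ones.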
type: 'number', + step: 0.1, + default: 0.9, + optional: true + }, + { + label: 'Max Output Tokens', + name: 'maxOutputTokens', + type: 'number', + step: 1, + optional: true, + additionalParams: true + }, + { + label: 'Top Probability', + name: 'topP', + type: 'number', + step: 0.1, + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const googleApplicationCredentialFilePath = getCredentialParam('googleApplicationCredentialFilePath', credentialData, nodeData) + const googleApplicationCredential = getCredentialParam('googleApplicationCredential', credentialData, nodeData) + const projectID = getCredentialParam('projectID', credentialData, nodeData) + + if (!googleApplicationCredentialFilePath && !googleApplicationCredential) + throw new Error('Please specify your Google Application Credential') + if (googleApplicationCredentialFilePath && googleApplicationCredential) + throw new Error('Please use either Google Application Credential File Path or Google Credential JSON Object') + + const authOptions: GoogleAuthOptions = {} + if (googleApplicationCredentialFilePath && !googleApplicationCredential) authOptions.keyFile = googleApplicationCredentialFilePath + else if (!googleApplicationCredentialFilePath && googleApplicationCredential) + authOptions.credentials = JSON.parse(googleApplicationCredential) + + if (projectID) authOptions.projectId = projectID + + const temperature = nodeData.inputs?.temperature as string + const modelName = nodeData.inputs?.modelName as string + const maxOutputTokens = nodeData.inputs?.maxOutputTokens as string + const topP = nodeData.inputs?.topP as string + + const obj: Partial = { + temperature: parseFloat(temperature), + model: modelName, + authOptions + } + + if (maxOutputTokens) obj.maxOutputTokens = parseInt(maxOutputTokens, 10) + if (topP) obj.topP = 
parseFloat(topP) + + const model = new ChatGoogleVertexAI(obj) + return model + } +} + +module.exports = { nodeClass: GoogleVertexAI_ChatModels } diff --git a/packages/components/nodes/chatmodels/GoogleVertexAI/vertexai.svg b/packages/components/nodes/chatmodels/GoogleVertexAI/vertexai.svg new file mode 100644 index 000000000..31244412a --- /dev/null +++ b/packages/components/nodes/chatmodels/GoogleVertexAI/vertexai.svg @@ -0,0 +1,2 @@ + + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/API/APILoader.ts b/packages/components/nodes/documentloaders/API/APILoader.ts new file mode 100644 index 000000000..3de6d6366 --- /dev/null +++ b/packages/components/nodes/documentloaders/API/APILoader.ts @@ -0,0 +1,200 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { BaseDocumentLoader } from 'langchain/document_loaders/base' +import { Document } from 'langchain/document' +import axios, { AxiosRequestConfig } from 'axios' + +class API_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs?: INodeParams[] + + constructor() { + this.label = 'API Loader' + this.name = 'apiLoader' + this.version = 1.0 + this.type = 'Document' + this.icon = 'api-loader.png' + this.category = 'Document Loaders' + this.description = `Load data from an API` + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Method', + name: 'method', + type: 'options', + options: [ + { + label: 'GET', + name: 'GET' + }, + { + label: 'POST', + name: 'POST' + } + ] + }, + { + label: 'URL', + name: 'url', + type: 'string' + }, + { + label: 'Headers', + name: 'headers', + type: 'json', + additionalParams: true, + optional: true + }, + { + label: 
'Body', + name: 'body', + type: 'json', + description: + 'JSON body for the POST request. If not specified, the agent will try to infer it from the AIPlugin if provided', + additionalParams: true, + optional: true + } + ] + } + async init(nodeData: INodeData): Promise<any> { + const headers = nodeData.inputs?.headers as string + const url = nodeData.inputs?.url as string + const body = nodeData.inputs?.body as string + const method = nodeData.inputs?.method as string + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const metadata = nodeData.inputs?.metadata + + const options: ApiLoaderParams = { + url, + method + } + + if (headers) { + const parsedHeaders = typeof headers === 'object' ? headers : JSON.parse(headers) + options.headers = parsedHeaders + } + + if (body) { + const parsedBody = typeof body === 'object' ? body : JSON.parse(body) + options.body = parsedBody + } + + const loader = new ApiLoader(options) + + let docs = [] + + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ?
metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +interface ApiLoaderParams { + url: string + method: string + headers?: ICommonObject + body?: ICommonObject +} + +class ApiLoader extends BaseDocumentLoader { + public readonly url: string + + public readonly headers?: ICommonObject + + public readonly body?: ICommonObject + + public readonly method: string + + constructor({ url, headers, body, method }: ApiLoaderParams) { + super() + this.url = url + this.headers = headers + this.body = body + this.method = method + } + + public async load(): Promise { + if (this.method === 'POST') { + return this.executePostRequest(this.url, this.headers, this.body) + } else { + return this.executeGetRequest(this.url, this.headers) + } + } + + protected async executeGetRequest(url: string, headers?: ICommonObject): Promise { + try { + const config: AxiosRequestConfig = {} + if (headers) { + config.headers = headers + } + const response = await axios.get(url, config) + const responseJsonString = JSON.stringify(response.data, null, 2) + const doc = new Document({ + pageContent: responseJsonString, + metadata: { + url + } + }) + return [doc] + } catch (error) { + throw new Error(`Failed to fetch ${url}: ${error}`) + } + } + + protected async executePostRequest(url: string, headers?: ICommonObject, body?: ICommonObject): Promise { + try { + const config: AxiosRequestConfig = {} + if (headers) { + config.headers = headers + } + const response = await axios.post(url, body ?? 
{}, config) + const responseJsonString = JSON.stringify(response.data, null, 2) + const doc = new Document({ + pageContent: responseJsonString, + metadata: { + url + } + }) + return [doc] + } catch (error) { + throw new Error(`Failed to post ${url}: ${error}`) + } + } +} + +module.exports = { + nodeClass: API_DocumentLoaders +} diff --git a/packages/components/nodes/documentloaders/API/api-loader.png b/packages/components/nodes/documentloaders/API/api-loader.png new file mode 100644 index 000000000..93668c4cf Binary files /dev/null and b/packages/components/nodes/documentloaders/API/api-loader.png differ diff --git a/packages/components/nodes/documentloaders/Airtable/Airtable.ts b/packages/components/nodes/documentloaders/Airtable/Airtable.ts new file mode 100644 index 000000000..70d0c674a --- /dev/null +++ b/packages/components/nodes/documentloaders/Airtable/Airtable.ts @@ -0,0 +1,230 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { BaseDocumentLoader } from 'langchain/document_loaders/base' +import { Document } from 'langchain/document' +import axios from 'axios' +import { getCredentialData, getCredentialParam } from '../../../src/utils' + +class Airtable_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs?: INodeParams[] + + constructor() { + this.label = 'Airtable' + this.name = 'airtable' + this.version = 1.0 + this.type = 'Document' + this.icon = 'airtable.svg' + this.category = 'Document Loaders' + this.description = `Load data from Airtable table` + this.baseClasses = [this.type] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['airtableApi'] + } + this.inputs = [ + { + label: 'Text Splitter', + name: 'textSplitter', + type: 
'TextSplitter', + optional: true + }, + { + label: 'Base Id', + name: 'baseId', + type: 'string', + placeholder: 'app11RobdGoX0YNsC', + description: + 'If your table URL looks like: https://airtable.com/app11RobdGoX0YNsC/tblJdmvbrgizbYICO/viw9UrP77Id0CE4ee, app11RobdGoX0YNsC is the base id' + }, + { + label: 'Table Id', + name: 'tableId', + type: 'string', + placeholder: 'tblJdmvbrgizbYICO', + description: + 'If your table URL looks like: https://airtable.com/app11RobdGoX0YNsC/tblJdmvbrgizbYICO/viw9UrP77Id0CE4ee, tblJdmvbrgizbYICO is the table id' + }, + { + label: 'Return All', + name: 'returnAll', + type: 'boolean', + default: true, + additionalParams: true, + description: 'If all results should be returned or only up to a given limit' + }, + { + label: 'Limit', + name: 'limit', + type: 'number', + default: 100, + additionalParams: true, + description: 'Number of results to return' + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const baseId = nodeData.inputs?.baseId as string + const tableId = nodeData.inputs?.tableId as string + const returnAll = nodeData.inputs?.returnAll as boolean + const limit = nodeData.inputs?.limit as string + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const metadata = nodeData.inputs?.metadata + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const accessToken = getCredentialParam('accessToken', credentialData, nodeData) + + const airtableOptions: AirtableLoaderParams = { + baseId, + tableId, + returnAll, + accessToken, + limit: limit ?
parseInt(limit, 10) : 100 + } + + const loader = new AirtableLoader(airtableOptions) + + let docs = [] + + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +interface AirtableLoaderParams { + baseId: string + tableId: string + accessToken: string + limit?: number + returnAll?: boolean +} + +interface AirtableLoaderResponse { + records: AirtableLoaderPage[] + offset?: string +} + +interface AirtableLoaderPage { + id: string + createdTime: string + fields: ICommonObject +} + +class AirtableLoader extends BaseDocumentLoader { + public readonly baseId: string + + public readonly tableId: string + + public readonly accessToken: string + + public readonly limit: number + + public readonly returnAll: boolean + + constructor({ baseId, tableId, accessToken, limit = 100, returnAll = false }: AirtableLoaderParams) { + super() + this.baseId = baseId + this.tableId = tableId + this.accessToken = accessToken + this.limit = limit + this.returnAll = returnAll + } + + public async load(): Promise { + if (this.returnAll) { + return this.loadAll() + } + return this.loadLimit() + } + + protected async fetchAirtableData(url: string, params: ICommonObject): Promise { + try { + const headers = { + Authorization: `Bearer ${this.accessToken}`, + 'Content-Type': 'application/json', + Accept: 'application/json' + } + const response = await axios.get(url, { params, headers }) + return response.data + } catch (error) { + throw new Error(`Failed to fetch ${url} from Airtable: ${error}`) + } + } + + private createDocumentFromPage(page: AirtableLoaderPage): Document { + // Generate the URL + const pageUrl = 
`https://api.airtable.com/v0/${this.baseId}/${this.tableId}/${page.id}` + + // Return a langchain document + return new Document({ + pageContent: JSON.stringify(page.fields, null, 2), + metadata: { + url: pageUrl + } + }) + } + + private async loadLimit(): Promise { + const params = { maxRecords: this.limit } + const data = await this.fetchAirtableData(`https://api.airtable.com/v0/${this.baseId}/${this.tableId}`, params) + if (data.records.length === 0) { + return [] + } + return data.records.map((page) => this.createDocumentFromPage(page)) + } + + private async loadAll(): Promise { + const params: ICommonObject = { pageSize: 100 } + let data: AirtableLoaderResponse + let returnPages: AirtableLoaderPage[] = [] + + do { + data = await this.fetchAirtableData(`https://api.airtable.com/v0/${this.baseId}/${this.tableId}`, params) + returnPages.push.apply(returnPages, data.records) + params.offset = data.offset + } while (data.offset !== undefined) + return returnPages.map((page) => this.createDocumentFromPage(page)) + } +} + +module.exports = { + nodeClass: Airtable_DocumentLoaders +} diff --git a/packages/components/nodes/documentloaders/Airtable/airtable.svg b/packages/components/nodes/documentloaders/Airtable/airtable.svg new file mode 100644 index 000000000..867c3b5ae --- /dev/null +++ b/packages/components/nodes/documentloaders/Airtable/airtable.svg @@ -0,0 +1,9 @@ + + + + + + + + + diff --git a/packages/components/nodes/documentloaders/ApifyWebsiteContentCrawler/ApifyWebsiteContentCrawler.ts b/packages/components/nodes/documentloaders/ApifyWebsiteContentCrawler/ApifyWebsiteContentCrawler.ts new file mode 100644 index 000000000..a5e6a6e03 --- /dev/null +++ b/packages/components/nodes/documentloaders/ApifyWebsiteContentCrawler/ApifyWebsiteContentCrawler.ts @@ -0,0 +1,139 @@ +import { INode, INodeData, INodeParams, ICommonObject } from '../../../src/Interface' +import { getCredentialData, getCredentialParam } from '../../../src/utils' +import { TextSplitter } from 
'langchain/text_splitter' +import { ApifyDatasetLoader } from 'langchain/document_loaders/web/apify_dataset' +import { Document } from 'langchain/document' + +class ApifyWebsiteContentCrawler_DocumentLoaders implements INode { + label: string + name: string + description: string + type: string + icon: string + version: number + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + + constructor() { + this.label = 'Apify Website Content Crawler' + this.name = 'apifyWebsiteContentCrawler' + this.type = 'Document' + this.icon = 'apify-symbol-transparent.svg' + this.version = 1.0 + this.category = 'Document Loaders' + this.description = 'Load data from Apify Website Content Crawler' + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Start URLs', + name: 'urls', + type: 'string', + description: 'One or more URLs of pages where the crawler will start, separated by commas.', + placeholder: 'https://js.langchain.com/docs/' + }, + { + label: 'Crawler type', + type: 'options', + name: 'crawlerType', + options: [ + { + label: 'Headless web browser (Chrome+Playwright)', + name: 'playwright:chrome' + }, + { + label: 'Stealthy web browser (Firefox+Playwright)', + name: 'playwright:firefox' + }, + { + label: 'Raw HTTP client (Cheerio)', + name: 'cheerio' + }, + { + label: 'Raw HTTP client with JavaScript execution (JSDOM) [experimental]', + name: 'jsdom' + } + ], + description: + 'Select the crawling engine, see documentation for additional information.', + default: 'playwright:firefox' + }, + { + label: 'Max crawling depth', + name: 'maxCrawlDepth', + type: 'number', + optional: true, + default: 1 + }, + { + label: 'Max crawl pages', + name: 'maxCrawlPages', + type: 'number', + optional: true, + default: 3 + }, + { + label: 'Additional input', + name: 'additionalInput', + type: 'json', + default: JSON.stringify({}), + description: + 'For additional input options for the crawler see documentation.', + optional: true + }, + { + 
label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + } + ] + this.credential = { + label: 'Connect Apify API', + name: 'credential', + type: 'credential', + credentialNames: ['apifyApi'] + } + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + + // Get input options and merge with additional input + const urls = nodeData.inputs?.urls as string + const crawlerType = nodeData.inputs?.crawlerType as string + const maxCrawlDepth = nodeData.inputs?.maxCrawlDepth as string + const maxCrawlPages = nodeData.inputs?.maxCrawlPages as string + const additionalInput = + typeof nodeData.inputs?.additionalInput === 'object' + ? nodeData.inputs?.additionalInput + : JSON.parse(nodeData.inputs?.additionalInput as string) + const input = { + startUrls: urls.split(',').map((url) => ({ url: url.trim() })), + crawlerType, + maxCrawlDepth: parseInt(maxCrawlDepth, 10), + maxCrawlPages: parseInt(maxCrawlPages, 10), + ...additionalInput + } + + // Get Apify API token from credential data + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const apifyApiToken = getCredentialParam('apifyApiToken', credentialData, nodeData) + + const loader = await ApifyDatasetLoader.fromActorCall('apify/website-content-crawler', input, { + datasetMappingFunction: (item) => + new Document({ + pageContent: (item.text || '') as string, + metadata: { source: item.url } + }), + clientOptions: { + token: apifyApiToken + } + }) + + return textSplitter ? 
loader.loadAndSplit(textSplitter) : loader.load() + } +} + +module.exports = { nodeClass: ApifyWebsiteContentCrawler_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/ApifyWebsiteContentCrawler/apify-symbol-transparent.svg b/packages/components/nodes/documentloaders/ApifyWebsiteContentCrawler/apify-symbol-transparent.svg new file mode 100644 index 000000000..423a3328d --- /dev/null +++ b/packages/components/nodes/documentloaders/ApifyWebsiteContentCrawler/apify-symbol-transparent.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Cheerio/Cheerio.ts b/packages/components/nodes/documentloaders/Cheerio/Cheerio.ts index 9e1135059..1c21c1ea8 100644 --- a/packages/components/nodes/documentloaders/Cheerio/Cheerio.ts +++ b/packages/components/nodes/documentloaders/Cheerio/Cheerio.ts @@ -2,11 +2,12 @@ import { INode, INodeData, INodeParams } from '../../../src/Interface' import { TextSplitter } from 'langchain/text_splitter' import { CheerioWebBaseLoader } from 'langchain/document_loaders/web/cheerio' import { test } from 'linkifyjs' -import { getAvailableURLs } from '../../../src' +import { webCrawl, xmlScrape } from '../../../src' class Cheerio_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -17,6 +18,7 @@ class Cheerio_DocumentLoaders implements INode { constructor() { this.label = 'Cheerio Web Scraper' this.name = 'cheerioWebScraper' + this.version = 1.0 this.type = 'Document' this.icon = 'cheerio.svg' this.category = 'Document Loaders' @@ -35,19 +37,34 @@ class Cheerio_DocumentLoaders implements INode { optional: true }, { - label: 'Web Scrap for Relative Links', - name: 'webScrap', - type: 'boolean', + label: 'Get Relative Links Method', + name: 'relativeLinksMethod', + type: 'options', + description: 'Select a method to retrieve relative links', + options: [ + { + label: 'Web Crawl', + name: 'webCrawl', + 
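The Apify node above assembles the actor input from typed fields first, then spreads the user's "Additional input" JSON last so its keys override. A standalone sketch of that merge order (`buildActorInput` is an illustrative helper, not part of the PR):

```typescript
// Typed fields first, then the user's JSON blob spread last so it wins.
function buildActorInput(urls: string, crawlerType: string, additionalInput: string | object): Record<string, unknown> {
    // Accept either a parsed object or a JSON string, as the node does.
    const extra = typeof additionalInput === 'object' ? additionalInput : JSON.parse(additionalInput)
    return {
        startUrls: urls.split(',').map((url) => ({ url: url.trim() })),
        crawlerType,
        ...extra
    }
}

const input = buildActorInput('https://a.com, https://b.com', 'cheerio', '{"maxCrawlPages": 5}')
```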
description: 'Crawl relative links from HTML URL' + }, + { + label: 'Scrape XML Sitemap', + name: 'scrapeXMLSitemap', + description: 'Scrape relative links from XML sitemap URL' + } + ], optional: true, additionalParams: true }, { - label: 'Web Scrap Links Limit', + label: 'Get Relative Links Limit', name: 'limit', type: 'number', - default: 10, optional: true, - additionalParams: true + additionalParams: true, + description: + 'Only used when "Get Relative Links Method" is selected. Set 0 to retrieve all relative links, default limit is 10.', + warning: `Retrieving all links might take a long time, and all links will be upserted again if the flow's state changed (e.g. different URL, chunk size, etc)` }, { label: 'Metadata', @@ -62,7 +79,7 @@ class Cheerio_DocumentLoaders implements INode { async init(nodeData: INodeData): Promise<any> { const textSplitter = nodeData.inputs?.textSplitter as TextSplitter const metadata = nodeData.inputs?.metadata - const webScrap = nodeData.inputs?.webScrap as boolean + const relativeLinksMethod = nodeData.inputs?.relativeLinksMethod as string let limit = nodeData.inputs?.limit as string let url = nodeData.inputs?.url as string @@ -71,25 +88,34 @@ class Cheerio_DocumentLoaders implements INode { throw new Error('Invalid URL') } - const cheerioLoader = async (url: string): Promise<any> => { - let docs = [] - const loader = new CheerioWebBaseLoader(url) - if (textSplitter) { - docs = await loader.loadAndSplit(textSplitter) - } else { - docs = await loader.load() + async function cheerioLoader(url: string): Promise<any> { + try { + let docs = [] + const loader = new CheerioWebBaseLoader(url) + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + return docs + } catch (err) { + if (process.env.DEBUG === 'true') console.error(`error in CheerioWebBaseLoader: ${err.message}, on page: ${url}`) } - return docs } - let availableUrls: string[] let docs = [] - if (webScrap) { + if
(relativeLinksMethod) { + if (process.env.DEBUG === 'true') console.info(`Start ${relativeLinksMethod}`) if (!limit) limit = '10' - availableUrls = await getAvailableURLs(url, parseInt(limit)) - for (let i = 0; i < availableUrls.length; i++) { - docs.push(...(await cheerioLoader(availableUrls[i]))) + else if (parseInt(limit) < 0) throw new Error('Limit cannot be less than 0') + const pages: string[] = + relativeLinksMethod === 'webCrawl' ? await webCrawl(url, parseInt(limit)) : await xmlScrape(url, parseInt(limit)) + if (process.env.DEBUG === 'true') console.info(`pages: ${JSON.stringify(pages)}, length: ${pages.length}`) + if (!pages || pages.length === 0) throw new Error('No relative links found') + for (const page of pages) { + docs.push(...(await cheerioLoader(page))) } + if (process.env.DEBUG === 'true') console.info(`Finish ${relativeLinksMethod}`) } else { docs = await cheerioLoader(url) } diff --git a/packages/components/nodes/documentloaders/Confluence/Confluence.ts b/packages/components/nodes/documentloaders/Confluence/Confluence.ts new file mode 100644 index 000000000..a17c41b9e --- /dev/null +++ b/packages/components/nodes/documentloaders/Confluence/Confluence.ts @@ -0,0 +1,120 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { ConfluencePagesLoader, ConfluencePagesLoaderParams } from 'langchain/document_loaders/web/confluence' +import { getCredentialData, getCredentialParam } from '../../../src' + +class Confluence_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Confluence' + this.name = 'confluence' + this.version = 1.0 + this.type = 'Document' + this.icon = 'confluence.png' + this.category = 'Document Loaders' + this.description = 
`Load data from a Confluence Document` + this.baseClasses = [this.type] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['confluenceApi'] + } + this.inputs = [ + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Base URL', + name: 'baseUrl', + type: 'string', + placeholder: 'https://example.atlassian.net/wiki' + }, + { + label: 'Space Key', + name: 'spaceKey', + type: 'string', + placeholder: '~EXAMPLE362906de5d343d49dcdbae5dEXAMPLE', + description: + 'Refer to official guide on how to get Confluence Space Key' + }, + { + label: 'Limit', + name: 'limit', + type: 'number', + default: 0, + optional: true + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const spaceKey = nodeData.inputs?.spaceKey as string + const baseUrl = nodeData.inputs?.baseUrl as string + const limit = nodeData.inputs?.limit as number + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const metadata = nodeData.inputs?.metadata + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const accessToken = getCredentialParam('accessToken', credentialData, nodeData) + const username = getCredentialParam('username', credentialData, nodeData) + + const confluenceOptions: ConfluencePagesLoaderParams = { + username, + accessToken, + baseUrl, + spaceKey, + limit + } + + const loader = new ConfluencePagesLoader(confluenceOptions) + + let docs = [] + + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? 
metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +module.exports = { nodeClass: Confluence_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Confluence/confluence.png b/packages/components/nodes/documentloaders/Confluence/confluence.png new file mode 100644 index 000000000..3cbb7b3dc Binary files /dev/null and b/packages/components/nodes/documentloaders/Confluence/confluence.png differ diff --git a/packages/components/nodes/documentloaders/Csv/Csv.ts b/packages/components/nodes/documentloaders/Csv/Csv.ts index f4b36ad03..750490b79 100644 --- a/packages/components/nodes/documentloaders/Csv/Csv.ts +++ b/packages/components/nodes/documentloaders/Csv/Csv.ts @@ -5,6 +5,7 @@ import { CSVLoader } from 'langchain/document_loaders/fs/csv' class Csv_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class Csv_DocumentLoaders implements INode { constructor() { this.label = 'Csv File' this.name = 'csvFile' + this.version = 1.0 this.type = 'Document' this.icon = 'Csv.png' this.category = 'Document Loaders' diff --git a/packages/components/nodes/documentloaders/Docx/Docx.ts b/packages/components/nodes/documentloaders/Docx/Docx.ts index e27991a51..419227755 100644 --- a/packages/components/nodes/documentloaders/Docx/Docx.ts +++ b/packages/components/nodes/documentloaders/Docx/Docx.ts @@ -5,6 +5,7 @@ import { DocxLoader } from 'langchain/document_loaders/fs/docx' class Docx_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class Docx_DocumentLoaders implements INode { constructor() { this.label = 'Docx File' this.name = 'docxFile' + this.version = 1.0 this.type = 'Document' 
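The same metadata-merging block recurs in the API, Airtable, and Confluence loaders above: user-supplied metadata (object or JSON string) is spread over each document's own metadata, with user keys winning on collision. A self-contained sketch of the pattern (`Doc` stands in for langchain's `Document`):

```typescript
// Stand-in for langchain's Document shape used by the loaders.
interface Doc {
    pageContent: string
    metadata: Record<string, unknown>
}

// Accept metadata as a parsed object or a JSON string, as the loaders do,
// and spread it over each document's existing metadata.
function applyMetadata(docs: Doc[], metadata: string | Record<string, unknown>): Doc[] {
    const parsed = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
    return docs.map((doc) => ({
        ...doc,
        metadata: { ...doc.metadata, ...parsed }
    }))
}

const docs = applyMetadata([{ pageContent: 'hello', metadata: { url: 'https://example.com' } }], '{"source": "confluence"}')
```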
this.icon = 'Docx.png' this.category = 'Document Loaders' diff --git a/packages/components/nodes/documentloaders/Figma/Figma.ts b/packages/components/nodes/documentloaders/Figma/Figma.ts new file mode 100644 index 000000000..3d3130445 --- /dev/null +++ b/packages/components/nodes/documentloaders/Figma/Figma.ts @@ -0,0 +1,91 @@ +import { getCredentialData, getCredentialParam } from '../../../src' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { FigmaFileLoader, FigmaLoaderParams } from 'langchain/document_loaders/web/figma' + +class Figma_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Figma' + this.name = 'figma' + this.version = 1.0 + this.type = 'Document' + this.icon = 'figma.svg' + this.category = 'Document Loaders' + this.description = 'Load data from a Figma file' + this.baseClasses = [this.type] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['figmaApi'] + } + this.inputs = [ + { + label: 'File Key', + name: 'fileKey', + type: 'string', + placeholder: 'key', + description: + 'The file key can be read from any Figma file URL: https://www.figma.com/file/:key/:title. For example, in https://www.figma.com/file/12345/Website, the file key is 12345' + }, + { + label: 'Node IDs', + name: 'nodeIds', + type: 'string', + placeholder: '0, 1, 2', + description: + 'A list of Node IDs, separated by commas.
Refer to official guide on how to get Node IDs' + }, + { + label: 'Recursive', + name: 'recursive', + type: 'boolean', + optional: true + }, + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const nodeIds = (nodeData.inputs?.nodeIds as string)?.trim().split(',') || [] + const fileKey = nodeData.inputs?.fileKey as string + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const accessToken = getCredentialParam('accessToken', credentialData, nodeData) + + const figmaOptions: FigmaLoaderParams = { + accessToken, + nodeIds, + fileKey + } + + const loader = new FigmaFileLoader(figmaOptions) + const docs = await loader.load() + + return docs + } +} + +module.exports = { nodeClass: Figma_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Figma/figma.svg b/packages/components/nodes/documentloaders/Figma/figma.svg new file mode 100644 index 000000000..c4f85674f --- /dev/null +++ b/packages/components/nodes/documentloaders/Figma/figma.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Folder/Folder.ts b/packages/components/nodes/documentloaders/Folder/Folder.ts index 2290133e4..f5d0c6402 100644 --- a/packages/components/nodes/documentloaders/Folder/Folder.ts +++ b/packages/components/nodes/documentloaders/Folder/Folder.ts @@ -10,6 +10,7 @@ import { DocxLoader } from 'langchain/document_loaders/fs/docx' class Folder_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -20,6 +21,7 @@ class Folder_DocumentLoaders implements INode { constructor() { this.label = 'Folder with Files' this.name = 'folderFiles' + this.version = 1.0 this.type = 'Document' 
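The Figma `init` above splits `nodeIds` with a bare `.trim().split(',')`, so the placeholder input `'0, 1, 2'` would yield entries with leading spaces after the first. A stricter parse is sketched below; this is our suggestion, not the node's current behaviour, and whether the Figma API tolerates the embedded spaces is not verified here:

```typescript
// Hypothetical defensive variant of the node's `.trim().split(',')` parse:
// trims each ID individually and drops empty entries from stray commas.
function parseNodeIds(raw?: string): string[] {
    if (!raw) return []
    return raw
        .split(',')
        .map((id) => id.trim())
        .filter((id) => id.length > 0)
}
```

With this variant, `'0, 1, 2'` parses to `['0', '1', '2']` rather than `['0', ' 1', ' 2']`.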
this.icon = 'folder.svg' this.category = 'Document Loaders' @@ -59,7 +61,40 @@ class Folder_DocumentLoaders implements INode { '.csv': (path) => new CSVLoader(path), '.docx': (path) => new DocxLoader(path), // @ts-ignore - '.pdf': (path) => new PDFLoader(path, { pdfjs: () => import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js') }) + '.pdf': (path) => new PDFLoader(path, { pdfjs: () => import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js') }), + '.aspx': (path) => new TextLoader(path), + '.asp': (path) => new TextLoader(path), + '.cpp': (path) => new TextLoader(path), // C++ + '.c': (path) => new TextLoader(path), + '.cs': (path) => new TextLoader(path), + '.css': (path) => new TextLoader(path), + '.go': (path) => new TextLoader(path), // Go + '.h': (path) => new TextLoader(path), // C++ Header files + '.java': (path) => new TextLoader(path), // Java + '.js': (path) => new TextLoader(path), // JavaScript + '.less': (path) => new TextLoader(path), // Less files + '.ts': (path) => new TextLoader(path), // TypeScript + '.php': (path) => new TextLoader(path), // PHP + '.proto': (path) => new TextLoader(path), // Protocol Buffers + '.python': (path) => new TextLoader(path), // Python + '.py': (path) => new TextLoader(path), // Python + '.rst': (path) => new TextLoader(path), // reStructuredText + '.ruby': (path) => new TextLoader(path), // Ruby + '.rb': (path) => new TextLoader(path), // Ruby + '.rs': (path) => new TextLoader(path), // Rust + '.scala': (path) => new TextLoader(path), // Scala + '.sc': (path) => new TextLoader(path), // Scala + '.scss': (path) => new TextLoader(path), // Sass + '.sol': (path) => new TextLoader(path), // Solidity + '.sql': (path) => new TextLoader(path), //SQL + '.swift': (path) => new TextLoader(path), // Swift + '.markdown': (path) => new TextLoader(path), // Markdown + '.md': (path) => new TextLoader(path), // Markdown + '.tex': (path) => new TextLoader(path), // LaTeX + '.ltx': (path) => new TextLoader(path), // LaTeX + '.html': (path) 
=> new TextLoader(path), // HTML + '.vb': (path) => new TextLoader(path), // Visual Basic + '.xml': (path) => new TextLoader(path) // XML }) let docs = [] diff --git a/packages/components/nodes/documentloaders/Gitbook/Gitbook.ts b/packages/components/nodes/documentloaders/Gitbook/Gitbook.ts new file mode 100644 index 000000000..181fa48d4 --- /dev/null +++ b/packages/components/nodes/documentloaders/Gitbook/Gitbook.ts @@ -0,0 +1,84 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { GitbookLoader } from 'langchain/document_loaders/web/gitbook' + +class Gitbook_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs?: INodeParams[] + + constructor() { + this.label = 'GitBook' + this.name = 'gitbook' + this.version = 1.0 + this.type = 'Document' + this.icon = 'gitbook.svg' + this.category = 'Document Loaders' + this.description = `Load data from GitBook` + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Web Path', + name: 'webPath', + type: 'string', + placeholder: 'https://docs.gitbook.com/product-tour/navigation', + description: 'To load all paths from the GitBook, provide only the root path, e.g. https://docs.gitbook.com/' + }, + { + label: 'Should Load All Paths', + name: 'shouldLoadAllPaths', + type: 'boolean', + description: 'Load from all paths in a given GitBook', + optional: true + }, + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + async init(nodeData: INodeData): Promise<any> { + const webPath = nodeData.inputs?.webPath as string + const shouldLoadAllPaths = nodeData.inputs?.shouldLoadAllPaths as boolean + const textSplitter = nodeData.inputs?.textSplitter as
TextSplitter + const metadata = nodeData.inputs?.metadata + + const loader = shouldLoadAllPaths ? new GitbookLoader(webPath, { shouldLoadAllPaths }) : new GitbookLoader(webPath) + + const docs = textSplitter ? await loader.loadAndSplit(textSplitter) : await loader.load() + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata) + return docs.map((doc) => { + return { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + }) + } + + return docs + } +} + +module.exports = { + nodeClass: Gitbook_DocumentLoaders +} diff --git a/packages/components/nodes/documentloaders/Gitbook/gitbook.svg b/packages/components/nodes/documentloaders/Gitbook/gitbook.svg new file mode 100644 index 000000000..df16237a5 --- /dev/null +++ b/packages/components/nodes/documentloaders/Gitbook/gitbook.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Github/Github.ts b/packages/components/nodes/documentloaders/Github/Github.ts index 552790abf..079bffb07 100644 --- a/packages/components/nodes/documentloaders/Github/Github.ts +++ b/packages/components/nodes/documentloaders/Github/Github.ts @@ -1,25 +1,37 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' import { TextSplitter } from 'langchain/text_splitter' import { GithubRepoLoader, GithubRepoLoaderParams } from 'langchain/document_loaders/web/github' +import { getCredentialData, getCredentialParam } from '../../../src' class Github_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Github' this.name = 'github' + this.version = 1.0 this.type = 'Document' this.icon = 'github.png' this.category = 'Document Loaders' this.description = `Load
data from a GitHub repository` this.baseClasses = [this.type] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed when accessing private repo', + optional: true, + credentialNames: ['githubApi'] + } this.inputs = [ { label: 'Repo Link', @@ -34,10 +46,9 @@ class Github_DocumentLoaders implements INode { default: 'main' }, { - label: 'Access Token', - name: 'accessToken', - type: 'password', - placeholder: '', + label: 'Recursive', + name: 'recursive', + type: 'boolean', optional: true }, { @@ -56,44 +67,38 @@ class Github_DocumentLoaders implements INode { ] } - async init(nodeData: INodeData): Promise { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const repoLink = nodeData.inputs?.repoLink as string const branch = nodeData.inputs?.branch as string - const accessToken = nodeData.inputs?.accessToken as string + const recursive = nodeData.inputs?.recursive as boolean const textSplitter = nodeData.inputs?.textSplitter as TextSplitter const metadata = nodeData.inputs?.metadata - const options: GithubRepoLoaderParams = { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const accessToken = getCredentialParam('accessToken', credentialData, nodeData) + + const githubOptions: GithubRepoLoaderParams = { branch, - recursive: false, + recursive, unknown: 'warn' } - if (accessToken) options.accessToken = accessToken + if (accessToken) githubOptions.accessToken = accessToken - const loader = new GithubRepoLoader(repoLink, options) - let docs = [] - - if (textSplitter) { - docs = await loader.loadAndSplit(textSplitter) - } else { - docs = await loader.load() - } + const loader = new GithubRepoLoader(repoLink, githubOptions) + const docs = textSplitter ? await loader.loadAndSplit(textSplitter) : await loader.load() if (metadata) { const parsedMetadata = typeof metadata === 'object' ? 
metadata : JSON.parse(metadata) - let finaldocs = [] - for (const doc of docs) { - const newdoc = { + return docs.map((doc) => { + return { ...doc, metadata: { ...doc.metadata, ...parsedMetadata } } - finaldocs.push(newdoc) - } - return finaldocs + }) } return docs diff --git a/packages/components/nodes/documentloaders/Json/Json.ts b/packages/components/nodes/documentloaders/Json/Json.ts index 9177df5cb..43051251b 100644 --- a/packages/components/nodes/documentloaders/Json/Json.ts +++ b/packages/components/nodes/documentloaders/Json/Json.ts @@ -5,6 +5,7 @@ import { JSONLoader } from 'langchain/document_loaders/fs/json' class Json_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class Json_DocumentLoaders implements INode { constructor() { this.label = 'Json File' this.name = 'jsonFile' + this.version = 1.0 this.type = 'Document' this.icon = 'json.svg' this.category = 'Document Loaders' diff --git a/packages/components/nodes/documentloaders/Jsonlines/Jsonlines.ts b/packages/components/nodes/documentloaders/Jsonlines/Jsonlines.ts new file mode 100644 index 000000000..fcc2fae99 --- /dev/null +++ b/packages/components/nodes/documentloaders/Jsonlines/Jsonlines.ts @@ -0,0 +1,108 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { JSONLinesLoader } from 'langchain/document_loaders/fs/json' + +class Jsonlines_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Json Lines File' + this.name = 'jsonlinesFile' + this.version = 1.0 + this.type = 'Document' + this.icon = 'jsonlines.svg' + this.category = 'Document Loaders' + this.description = `Load data from JSON Lines files` + this.baseClasses = [this.type] + 
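The Github refactor above moves the token out of a plain password input and into the credential store, attaching it to the loader options only when present (the credential is declared optional, for public repos). The option-building step can be sketched in isolation; the `RepoOptions` shape below re-declares the fields of `GithubRepoLoaderParams` actually used in the diff so the sketch stands alone:

```typescript
// Re-declared subset of GithubRepoLoaderParams as used in the Github node above.
interface RepoOptions {
    branch: string
    recursive: boolean
    unknown: 'warn'
    accessToken?: string
}

function buildGithubOptions(branch: string, recursive: boolean, accessToken?: string): RepoOptions {
    const options: RepoOptions = { branch, recursive, unknown: 'warn' }
    // Only attach the token when one was resolved from the credential,
    // so public repositories work with no credential configured.
    if (accessToken) options.accessToken = accessToken
    return options
}
```

Leaving `accessToken` off entirely (rather than setting it to `undefined`) keeps the options object clean for the loader.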
this.inputs = [ + { + label: 'Jsonlines File', + name: 'jsonlinesFile', + type: 'file', + fileType: '.jsonl' + }, + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Pointer Extraction', + name: 'pointerName', + type: 'string', + placeholder: 'Enter pointer name', + optional: false + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const jsonLinesFileBase64 = nodeData.inputs?.jsonlinesFile as string + const pointerName = nodeData.inputs?.pointerName as string + const metadata = nodeData.inputs?.metadata + + let alldocs = [] + let files: string[] = [] + + let pointer = '/' + pointerName.trim() + + if (jsonLinesFileBase64.startsWith('[') && jsonLinesFileBase64.endsWith(']')) { + files = JSON.parse(jsonLinesFileBase64) + } else { + files = [jsonLinesFileBase64] + } + + for (const file of files) { + const splitDataURI = file.split(',') + splitDataURI.pop() + const bf = Buffer.from(splitDataURI.pop() || '', 'base64') + const blob = new Blob([bf]) + const loader = new JSONLinesLoader(blob, pointer) + + if (textSplitter) { + const docs = await loader.loadAndSplit(textSplitter) + alldocs.push(...docs) + } else { + const docs = await loader.load() + alldocs.push(...docs) + } + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? 
metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of alldocs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return alldocs + } +} + +module.exports = { nodeClass: Jsonlines_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Jsonlines/jsonlines.svg b/packages/components/nodes/documentloaders/Jsonlines/jsonlines.svg new file mode 100644 index 000000000..f3686f0c9 --- /dev/null +++ b/packages/components/nodes/documentloaders/Jsonlines/jsonlines.svg @@ -0,0 +1,16 @@ + + + + + background + + + + + + + Layer 1 + JSON + Lines + + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Notion/NotionDB.ts b/packages/components/nodes/documentloaders/Notion/NotionDB.ts new file mode 100644 index 000000000..74879dd2f --- /dev/null +++ b/packages/components/nodes/documentloaders/Notion/NotionDB.ts @@ -0,0 +1,100 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { NotionAPILoader, NotionAPILoaderOptions } from 'langchain/document_loaders/web/notionapi' +import { getCredentialData, getCredentialParam } from '../../../src' + +class NotionDB_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Notion Database' + this.name = 'notionDB' + this.version = 1.0 + this.type = 'Document' + this.icon = 'notion.png' + this.category = 'Document Loaders' + this.description = 'Load data from Notion Database (each row is a separate document with all properties as metadata)' + this.baseClasses = [this.type] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + 
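The Jsonlines `init` above accepts its file input either as a single data URI or a JSON-encoded array of them, and decodes each with two consecutive `pop()` calls. Read literally, that implies the stored string has the shape `data:<mime>;base64,<payload>,<trailing tag>`; we infer the trailing segment is a `filename:` tag Flowise appends, but treat that as an assumption, not documented fact. A standalone sketch of both conventions:

```typescript
// Sketch of the upload-decoding convention inferred from the Jsonlines init above.
function toBuffers(fileInput: string): Buffer[] {
    // A multi-file upload arrives as a JSON-encoded array of data URIs.
    const files: string[] =
        fileInput.startsWith('[') && fileInput.endsWith(']') ? JSON.parse(fileInput) : [fileInput]
    return files.map((file) => {
        // Mirrors the double pop(): drop the last comma-separated segment
        // (assumed filename tag), then take the base64 payload before it.
        const splitDataURI = file.split(',')
        splitDataURI.pop()
        return Buffer.from(splitDataURI.pop() || '', 'base64')
    })
}

// JSON pointer for JSONLinesLoader: '/' + key, so 'html' extracts obj.html per line.
function toPointer(pointerName: string): string {
    return '/' + pointerName.trim()
}
```

The pointer string follows RFC 6901, which is what langchain's `JSONLinesLoader` expects as its second argument.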
credentialNames: ['notionApi'] + } + this.inputs = [ + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Notion Database Id', + name: 'databaseId', + type: 'string', + description: 'If your URL looks like - https://www.notion.so/abcdefh?v=long_hash_2, then abcdefh is the database ID' + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const databaseId = nodeData.inputs?.databaseId as string + const metadata = nodeData.inputs?.metadata + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const notionIntegrationToken = getCredentialParam('notionIntegrationToken', credentialData, nodeData) + + const obj: NotionAPILoaderOptions = { + clientOptions: { + auth: notionIntegrationToken + }, + id: databaseId, + type: 'database' + } + const loader = new NotionAPILoader(obj) + + let docs = [] + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? 
metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +module.exports = { nodeClass: NotionDB_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Notion/Notion.ts b/packages/components/nodes/documentloaders/Notion/NotionFolder.ts similarity index 90% rename from packages/components/nodes/documentloaders/Notion/Notion.ts rename to packages/components/nodes/documentloaders/Notion/NotionFolder.ts index f5bfcb2ad..8b8254a4f 100644 --- a/packages/components/nodes/documentloaders/Notion/Notion.ts +++ b/packages/components/nodes/documentloaders/Notion/NotionFolder.ts @@ -2,9 +2,10 @@ import { INode, INodeData, INodeParams } from '../../../src/Interface' import { TextSplitter } from 'langchain/text_splitter' import { NotionLoader } from 'langchain/document_loaders/fs/notion' -class Notion_DocumentLoaders implements INode { +class NotionFolder_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,10 +16,11 @@ class Notion_DocumentLoaders implements INode { constructor() { this.label = 'Notion Folder' this.name = 'notionFolder' + this.version = 1.0 this.type = 'Document' this.icon = 'notion.png' this.category = 'Document Loaders' - this.description = `Load data from Notion folder` + this.description = 'Load data from the exported and unzipped Notion folder' this.baseClasses = [this.type] this.inputs = [ { @@ -78,4 +80,4 @@ class Notion_DocumentLoaders implements INode { } } -module.exports = { nodeClass: Notion_DocumentLoaders } +module.exports = { nodeClass: NotionFolder_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Notion/NotionPage.ts b/packages/components/nodes/documentloaders/Notion/NotionPage.ts new file mode 100644 index 000000000..b45067ab1 --- 
/dev/null +++ b/packages/components/nodes/documentloaders/Notion/NotionPage.ts @@ -0,0 +1,101 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { NotionAPILoader, NotionAPILoaderOptions } from 'langchain/document_loaders/web/notionapi' +import { getCredentialData, getCredentialParam } from '../../../src' + +class NotionPage_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Notion Page' + this.name = 'notionPage' + this.version = 1.0 + this.type = 'Document' + this.icon = 'notion.png' + this.category = 'Document Loaders' + this.description = 'Load data from Notion Page (including child pages all as separate documents)' + this.baseClasses = [this.type] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['notionApi'] + } + this.inputs = [ + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Notion Page Id', + name: 'pageId', + type: 'string', + description: + 'The last 32 char hex in the URL path. For example: https://www.notion.so/skarard/LangChain-Notion-API-b34ca03f219c4420a6046fc4bdfdf7b4, b34ca03f219c4420a6046fc4bdfdf7b4 is the Page ID' + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const pageId = nodeData.inputs?.pageId as string + const metadata = nodeData.inputs?.metadata + + const credentialData = await getCredentialData(nodeData.credential ??
'', options) + const notionIntegrationToken = getCredentialParam('notionIntegrationToken', credentialData, nodeData) + + const obj: NotionAPILoaderOptions = { + clientOptions: { + auth: notionIntegrationToken + }, + id: pageId, + type: 'page' + } + const loader = new NotionAPILoader(obj) + + let docs = [] + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +module.exports = { nodeClass: NotionPage_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Pdf/Pdf.ts b/packages/components/nodes/documentloaders/Pdf/Pdf.ts index bc36f8cb5..a9f6ab23b 100644 --- a/packages/components/nodes/documentloaders/Pdf/Pdf.ts +++ b/packages/components/nodes/documentloaders/Pdf/Pdf.ts @@ -5,6 +5,7 @@ import { PDFLoader } from 'langchain/document_loaders/fs/pdf' class Pdf_DocumentLoaders implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class Pdf_DocumentLoaders implements INode { constructor() { this.label = 'Pdf File' this.name = 'pdfFile' + this.version = 1.0 this.type = 'Document' this.icon = 'pdf.svg' this.category = 'Document Loaders' @@ -49,6 +51,13 @@ class Pdf_DocumentLoaders implements INode { ], default: 'perPage' }, + { + label: 'Use Legacy Build', + name: 'legacyBuild', + type: 'boolean', + optional: true, + additionalParams: true + }, { label: 'Metadata', name: 'metadata', @@ -64,6 +73,7 @@ class Pdf_DocumentLoaders implements INode { const pdfFileBase64 = nodeData.inputs?.pdfFile as string const usage = nodeData.inputs?.usage as string const metadata = nodeData.inputs?.metadata + const legacyBuild 
= nodeData.inputs?.legacyBuild as boolean let alldocs = [] let files: string[] = [] @@ -81,8 +91,9 @@ class Pdf_DocumentLoaders implements INode { if (usage === 'perFile') { const loader = new PDFLoader(new Blob([bf]), { splitPages: false, - // @ts-ignore - pdfjs: () => import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js') + pdfjs: () => + // @ts-ignore + legacyBuild ? import('pdfjs-dist/legacy/build/pdf.js') : import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js') }) if (textSplitter) { const docs = await loader.loadAndSplit(textSplitter) @@ -92,8 +103,11 @@ class Pdf_DocumentLoaders implements INode { alldocs.push(...docs) } } else { - // @ts-ignore - const loader = new PDFLoader(new Blob([bf]), { pdfjs: () => import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js') }) + const loader = new PDFLoader(new Blob([bf]), { + pdfjs: () => + // @ts-ignore + legacyBuild ? import('pdfjs-dist/legacy/build/pdf.js') : import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js') + }) if (textSplitter) { const docs = await loader.loadAndSplit(textSplitter) alldocs.push(...docs) diff --git a/packages/components/nodes/documentloaders/Playwright/Playwright.ts b/packages/components/nodes/documentloaders/Playwright/Playwright.ts new file mode 100644 index 000000000..eb246045c --- /dev/null +++ b/packages/components/nodes/documentloaders/Playwright/Playwright.ts @@ -0,0 +1,202 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { Browser, Page, PlaywrightWebBaseLoader, PlaywrightWebBaseLoaderOptions } from 'langchain/document_loaders/web/playwright' +import { test } from 'linkifyjs' +import { webCrawl, xmlScrape } from '../../../src' + +class Playwright_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Playwright Web 
Scraper' + this.name = 'playwrightWebScraper' + this.version = 1.0 + this.type = 'Document' + this.icon = 'playwright.svg' + this.category = 'Document Loaders' + this.description = `Load data from webpages` + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'URL', + name: 'url', + type: 'string' + }, + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Get Relative Links Method', + name: 'relativeLinksMethod', + type: 'options', + description: 'Select a method to retrieve relative links', + options: [ + { + label: 'Web Crawl', + name: 'webCrawl', + description: 'Crawl relative links from HTML URL' + }, + { + label: 'Scrape XML Sitemap', + name: 'scrapeXMLSitemap', + description: 'Scrape relative links from XML sitemap URL' + } + ], + optional: true, + additionalParams: true + }, + { + label: 'Get Relative Links Limit', + name: 'limit', + type: 'number', + optional: true, + additionalParams: true, + description: + 'Only used when "Get Relative Links Method" is selected. Set 0 to retrieve all relative links, default limit is 10.', + warning: `Retrieving all links might take a long time, and all links will be upserted again if the flow's state changed (e.g. different URL, chunk size, etc)` + }, + { + label: 'Wait Until', + name: 'waitUntilGoToOption', + type: 'options', + description: 'Select a waitUntil option for page navigation', + options: [ + { + label: 'Load', + name: 'load', + description: 'Consider operation to be finished when the load event is fired.' + }, + { + label: 'DOM Content Loaded', + name: 'domcontentloaded', + description: 'Consider operation to be finished when the DOMContentLoaded event is fired.' + }, + { + label: 'Network Idle', + name: 'networkidle', + description: 'Navigation is finished when there are no more connections for at least 500 ms.'
+ }, + { + label: 'Commit', + name: 'commit', + description: 'Consider operation to be finished when network response is received and the document started loading.' + } + ], + optional: true, + additionalParams: true + }, + { + label: 'Wait for selector to load', + name: 'waitForSelector', + type: 'string', + optional: true, + additionalParams: true, + description: 'CSS selectors like .div or #div' + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const metadata = nodeData.inputs?.metadata + const relativeLinksMethod = nodeData.inputs?.relativeLinksMethod as string + let limit = nodeData.inputs?.limit as string + let waitUntilGoToOption = nodeData.inputs?.waitUntilGoToOption as 'load' | 'domcontentloaded' | 'networkidle' | 'commit' | undefined + let waitForSelector = nodeData.inputs?.waitForSelector as string + + let url = nodeData.inputs?.url as string + url = url.trim() + if (!test(url)) { + throw new Error('Invalid URL') + } + + async function playwrightLoader(url: string): Promise { + try { + let docs = [] + const config: PlaywrightWebBaseLoaderOptions = { + launchOptions: { + args: ['--no-sandbox'], + headless: true + } + } + if (waitUntilGoToOption) { + config['gotoOptions'] = { + waitUntil: waitUntilGoToOption + } + } + if (waitForSelector) { + config['evaluate'] = async (page: Page, _: Browser): Promise => { + await page.waitForSelector(waitForSelector) + + const result = await page.evaluate(() => document.body.innerHTML) + return result + } + } + const loader = new PlaywrightWebBaseLoader(url, config) + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + return docs + } catch (err) { + if (process.env.DEBUG === 'true') console.error(`error in PlaywrightWebBaseLoader: ${err.message}, on page: ${url}`) + } + } 
+ + let docs = [] + if (relativeLinksMethod) { + if (process.env.DEBUG === 'true') console.info(`Start ${relativeLinksMethod}`) + if (!limit) limit = '10' + else if (parseInt(limit) < 0) throw new Error('Limit cannot be less than 0') + const pages: string[] = + relativeLinksMethod === 'webCrawl' ? await webCrawl(url, parseInt(limit)) : await xmlScrape(url, parseInt(limit)) + if (process.env.DEBUG === 'true') console.info(`pages: ${JSON.stringify(pages)}, length: ${pages.length}`) + if (!pages || pages.length === 0) throw new Error('No relative links found') + for (const page of pages) { + docs.push(...(await playwrightLoader(page))) + } + if (process.env.DEBUG === 'true') console.info(`Finish ${relativeLinksMethod}`) + } else { + docs = await playwrightLoader(url) + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +module.exports = { nodeClass: Playwright_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Playwright/playwright.svg b/packages/components/nodes/documentloaders/Playwright/playwright.svg new file mode 100644 index 000000000..0992832dc --- /dev/null +++ b/packages/components/nodes/documentloaders/Playwright/playwright.svg @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Puppeteer/Puppeteer.ts b/packages/components/nodes/documentloaders/Puppeteer/Puppeteer.ts new file mode 100644 index 000000000..4691eb948 --- /dev/null +++ b/packages/components/nodes/documentloaders/Puppeteer/Puppeteer.ts @@ -0,0 +1,203 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { Browser, Page, PuppeteerWebBaseLoader, 
PuppeteerWebBaseLoaderOptions } from 'langchain/document_loaders/web/puppeteer' +import { test } from 'linkifyjs' +import { webCrawl, xmlScrape } from '../../../src' +import { PuppeteerLifeCycleEvent } from 'puppeteer' + +class Puppeteer_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Puppeteer Web Scraper' + this.name = 'puppeteerWebScraper' + this.version = 1.0 + this.type = 'Document' + this.icon = 'puppeteer.svg' + this.category = 'Document Loaders' + this.description = `Load data from webpages` + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'URL', + name: 'url', + type: 'string' + }, + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Get Relative Links Method', + name: 'relativeLinksMethod', + type: 'options', + description: 'Select a method to retrieve relative links', + options: [ + { + label: 'Web Crawl', + name: 'webCrawl', + description: 'Crawl relative links from HTML URL' + }, + { + label: 'Scrape XML Sitemap', + name: 'scrapeXMLSitemap', + description: 'Scrape relative links from XML sitemap URL' + } + ], + optional: true, + additionalParams: true + }, + { + label: 'Get Relative Links Limit', + name: 'limit', + type: 'number', + optional: true, + additionalParams: true, + description: + 'Only used when "Get Relative Links Method" is selected. 
Set 0 to retrieve all relative links, default limit is 10.', + warning: `Retrieving all links might take a long time, and all links will be upserted again if the flow's state changes (e.g. different URL, chunk size, etc.)` + }, + { + label: 'Wait Until', + name: 'waitUntilGoToOption', + type: 'options', + description: 'Select a "wait until" option for page navigation', + options: [ + { + label: 'Load', + name: 'load', + description: `When the whole page, including all dependent resources, has loaded` + }, + { + label: 'DOM Content Loaded', + name: 'domcontentloaded', + description: `When the initial HTML document has been loaded and parsed, without waiting for stylesheets or images` + }, + { + label: 'Network Idle 0', + name: 'networkidle0', + description: 'Navigation is finished when there are no more than 0 network connections for at least 500 ms' + }, + { + label: 'Network Idle 2', + name: 'networkidle2', + description: 'Navigation is finished when there are no more than 2 network connections for at least 500 ms' + } + ], + optional: true, + additionalParams: true + }, + { + label: 'Wait for selector to load', + name: 'waitForSelector', + type: 'string', + optional: true, + additionalParams: true, + description: 'CSS selectors like .div or #div' + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const metadata = nodeData.inputs?.metadata + const relativeLinksMethod = nodeData.inputs?.relativeLinksMethod as string + let limit = nodeData.inputs?.limit as string + let waitUntilGoToOption = nodeData.inputs?.waitUntilGoToOption as PuppeteerLifeCycleEvent + let waitForSelector = nodeData.inputs?.waitForSelector as string + + let url = nodeData.inputs?.url as string + url = url.trim() + if (!test(url)) { + throw new Error('Invalid URL') + } + + async function puppeteerLoader(url: string): Promise { + try { + let docs = [] + const config:
PuppeteerWebBaseLoaderOptions = { + launchOptions: { + args: ['--no-sandbox'], + headless: 'new' + } + } + if (waitUntilGoToOption) { + config['gotoOptions'] = { + waitUntil: waitUntilGoToOption + } + } + if (waitForSelector) { + config['evaluate'] = async (page: Page, _: Browser): Promise => { + await page.waitForSelector(waitForSelector) + + const result = await page.evaluate(() => document.body.innerHTML) + return result + } + } + const loader = new PuppeteerWebBaseLoader(url, config) + if (textSplitter) { + docs = await loader.loadAndSplit(textSplitter) + } else { + docs = await loader.load() + } + return docs + } catch (err) { + if (process.env.DEBUG === 'true') console.error(`error in PuppeteerWebBaseLoader: ${err.message}, on page: ${url}`) + } + } + + let docs = [] + if (relativeLinksMethod) { + if (process.env.DEBUG === 'true') console.info(`Start ${relativeLinksMethod}`) + if (!limit) limit = '10' + else if (parseInt(limit) < 0) throw new Error('Limit cannot be less than 0') + const pages: string[] = + relativeLinksMethod === 'webCrawl' ? await webCrawl(url, parseInt(limit)) : await xmlScrape(url, parseInt(limit)) + if (process.env.DEBUG === 'true') console.info(`pages: ${JSON.stringify(pages)}, length: ${pages.length}`) + if (!pages || pages.length === 0) throw new Error('No relative links found') + for (const page of pages) { + docs.push(...(await puppeteerLoader(page))) + } + if (process.env.DEBUG === 'true') console.info(`Finish ${relativeLinksMethod}`) + } else { + docs = await puppeteerLoader(url) + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? 
metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of docs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + + return docs + } +} + +module.exports = { nodeClass: Puppeteer_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Puppeteer/puppeteer.svg b/packages/components/nodes/documentloaders/Puppeteer/puppeteer.svg new file mode 100644 index 000000000..8477fc52d --- /dev/null +++ b/packages/components/nodes/documentloaders/Puppeteer/puppeteer.svg @@ -0,0 +1,14 @@ + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Subtitles/Subtitles.ts b/packages/components/nodes/documentloaders/Subtitles/Subtitles.ts new file mode 100644 index 000000000..f85898b3e --- /dev/null +++ b/packages/components/nodes/documentloaders/Subtitles/Subtitles.ts @@ -0,0 +1,97 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { TextSplitter } from 'langchain/text_splitter' +import { SRTLoader } from 'langchain/document_loaders/fs/srt' + +class Subtitles_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Subtitles File' + this.name = 'subtitlesFile' + this.version = 1.0 + this.type = 'Document' + this.icon = 'subtitlesFile.svg' + this.category = 'Document Loaders' + this.description = `Load data from subtitles files` + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Subtitles File', + name: 'subtitlesFile', + type: 'file', + fileType: '.srt' + }, + { + label: 'Text Splitter', + name: 'textSplitter', + type: 'TextSplitter', + optional: true + }, + { + label: 'Metadata', + name: 'metadata', + type: 'json', + optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: 
INodeData): Promise { + const textSplitter = nodeData.inputs?.textSplitter as TextSplitter + const subtitlesFileBase64 = nodeData.inputs?.subtitlesFile as string + const metadata = nodeData.inputs?.metadata + + let alldocs = [] + let files: string[] = [] + + if (subtitlesFileBase64.startsWith('[') && subtitlesFileBase64.endsWith(']')) { + files = JSON.parse(subtitlesFileBase64) + } else { + files = [subtitlesFileBase64] + } + + for (const file of files) { + const splitDataURI = file.split(',') + splitDataURI.pop() + const bf = Buffer.from(splitDataURI.pop() || '', 'base64') + const blob = new Blob([bf]) + const loader = new SRTLoader(blob) + + if (textSplitter) { + const docs = await loader.loadAndSplit(textSplitter) + alldocs.push(...docs) + } else { + const docs = await loader.load() + alldocs.push(...docs) + } + } + + if (metadata) { + const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata) + let finaldocs = [] + for (const doc of alldocs) { + const newdoc = { + ...doc, + metadata: { + ...doc.metadata, + ...parsedMetadata + } + } + finaldocs.push(newdoc) + } + return finaldocs + } + return alldocs + } +} + +module.exports = { nodeClass: Subtitles_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/Subtitles/subtitlesFile.svg b/packages/components/nodes/documentloaders/Subtitles/subtitlesFile.svg new file mode 100644 index 000000000..a6ee925bc --- /dev/null +++ b/packages/components/nodes/documentloaders/Subtitles/subtitlesFile.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/components/nodes/documentloaders/Text/Text.ts b/packages/components/nodes/documentloaders/Text/Text.ts index 63e7e0e26..dacf087c9 100644 --- a/packages/components/nodes/documentloaders/Text/Text.ts +++ b/packages/components/nodes/documentloaders/Text/Text.ts @@ -5,6 +5,7 @@ import { TextLoader } from 'langchain/document_loaders/fs/text' class Text_DocumentLoaders implements INode { label: string name: string + version: 
number description: string type: string icon: string @@ -15,6 +16,7 @@ class Text_DocumentLoaders implements INode { constructor() { this.label = 'Text File' this.name = 'textFile' + this.version = 1.0 this.type = 'Document' this.icon = 'textFile.svg' this.category = 'Document Loaders' diff --git a/packages/components/nodes/documentloaders/VectorStoreToDocument/VectorStoreToDocument.ts b/packages/components/nodes/documentloaders/VectorStoreToDocument/VectorStoreToDocument.ts new file mode 100644 index 000000000..b3f320ce4 --- /dev/null +++ b/packages/components/nodes/documentloaders/VectorStoreToDocument/VectorStoreToDocument.ts @@ -0,0 +1,87 @@ +import { VectorStore } from 'langchain/vectorstores/base' +import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { handleEscapeCharacters } from '../../../src/utils' + +class VectorStoreToDocument_DocumentLoaders implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'VectorStore To Document' + this.name = 'vectorStoreToDocument' + this.version = 1.0 + this.type = 'Document' + this.icon = 'vectorretriever.svg' + this.category = 'Document Loaders' + this.description = 'Search documents with scores from vector store' + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Vector Store', + name: 'vectorStore', + type: 'VectorStore' + }, + { + label: 'Minimum Score (%)', + name: 'minScore', + type: 'number', + optional: true, + placeholder: '75', + step: 1, + description: 'Minimum score for embedded documents to be included' + } + ] + this.outputs = [ + { + label: 'Document', + name: 'document', + baseClasses: this.baseClasses + }, + { + label: 'Text', + name: 'text', + baseClasses: ['string', 'json'] + } + ] + } + + async init(nodeData: INodeData, input: string): Promise {
const vectorStore = nodeData.inputs?.vectorStore as VectorStore + const minScore = nodeData.inputs?.minScore as number + const output = nodeData.outputs?.output as string + + const topK = (vectorStore as any)?.k ?? 4 + + const docs = await vectorStore.similaritySearchWithScore(input, topK) + // eslint-disable-next-line no-console + console.log('\x1b[94m\x1b[1m\n*****VectorStore Documents*****\n\x1b[0m\x1b[0m') + // eslint-disable-next-line no-console + console.log(docs) + + if (output === 'document') { + let finaldocs = [] + for (const doc of docs) { + if (minScore && doc[1] < minScore / 100) continue + finaldocs.push(doc[0]) + } + return finaldocs + } else { + let finaltext = '' + for (const doc of docs) { + if (minScore && doc[1] < minScore / 100) continue + finaltext += `${doc[0].pageContent}\n` + } + return handleEscapeCharacters(finaltext, false) + } + } +} + +module.exports = { nodeClass: VectorStoreToDocument_DocumentLoaders } diff --git a/packages/components/nodes/documentloaders/VectorStoreToDocument/vectorretriever.svg b/packages/components/nodes/documentloaders/VectorStoreToDocument/vectorretriever.svg new file mode 100644 index 000000000..208a59f14 --- /dev/null +++ b/packages/components/nodes/documentloaders/VectorStoreToDocument/vectorretriever.svg @@ -0,0 +1,7 @@ + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/embeddings/AzureOpenAIEmbedding/Azure.svg b/packages/components/nodes/embeddings/AzureOpenAIEmbedding/Azure.svg index 51eb62535..47ad8c440 100644 --- a/packages/components/nodes/embeddings/AzureOpenAIEmbedding/Azure.svg +++ b/packages/components/nodes/embeddings/AzureOpenAIEmbedding/Azure.svg @@ -1,5 +1 @@ - - - - - \ No newline at end of file + \ No newline at end of file diff --git a/packages/components/nodes/embeddings/AzureOpenAIEmbedding/AzureOpenAIEmbedding.ts b/packages/components/nodes/embeddings/AzureOpenAIEmbedding/AzureOpenAIEmbedding.ts index 355877e55..b70caa4c2 100644 --- 
a/packages/components/nodes/embeddings/AzureOpenAIEmbedding/AzureOpenAIEmbedding.ts +++ b/packages/components/nodes/embeddings/AzureOpenAIEmbedding/AzureOpenAIEmbedding.ts @@ -1,59 +1,43 @@ import { AzureOpenAIInput } from 'langchain/chat_models/openai' -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { OpenAIEmbeddings, OpenAIEmbeddingsParams } from 'langchain/embeddings/openai' class AzureOpenAIEmbedding_Embeddings implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Azure OpenAI Embeddings' this.name = 'azureOpenAIEmbeddings' + this.version = 1.0 this.type = 'AzureOpenAIEmbeddings' this.icon = 'Azure.svg' this.category = 'Embeddings' this.description = 'Azure OpenAI API to generate embeddings for a given text' this.baseClasses = [this.type, ...getBaseClasses(OpenAIEmbeddings)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['azureOpenAIApi'] + } this.inputs = [ { - label: 'Azure OpenAI Api Key', - name: 'azureOpenAIApiKey', - type: 'password' - }, - { - label: 'Azure OpenAI Api Instance Name', - name: 'azureOpenAIApiInstanceName', - type: 'string', - placeholder: 'YOUR-INSTANCE-NAME' - }, - { - label: 'Azure OpenAI Api Deployment Name', - name: 'azureOpenAIApiDeploymentName', - type: 'string', - placeholder: 'YOUR-DEPLOYMENT-NAME' - }, - { - label: 'Azure OpenAI Api Version', - name: 'azureOpenAIApiVersion', - type: 'options', - options: [ - { - label: '2023-03-15-preview', - name: '2023-03-15-preview' - }, - { - label: '2022-12-01', - name: '2022-12-01' - } - ], - 
default: '2023-03-15-preview' + label: 'Batch Size', + name: 'batchSize', + type: 'number', + default: '1', + optional: true, + additionalParams: true }, { label: 'Timeout', @@ -65,13 +49,16 @@ class AzureOpenAIEmbedding_Embeddings implements INode { ] } - async init(nodeData: INodeData): Promise { - const azureOpenAIApiKey = nodeData.inputs?.azureOpenAIApiKey as string - const azureOpenAIApiInstanceName = nodeData.inputs?.azureOpenAIApiInstanceName as string - const azureOpenAIApiDeploymentName = nodeData.inputs?.azureOpenAIApiDeploymentName as string - const azureOpenAIApiVersion = nodeData.inputs?.azureOpenAIApiVersion as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const batchSize = nodeData.inputs?.batchSize as string const timeout = nodeData.inputs?.timeout as string + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const azureOpenAIApiKey = getCredentialParam('azureOpenAIApiKey', credentialData, nodeData) + const azureOpenAIApiInstanceName = getCredentialParam('azureOpenAIApiInstanceName', credentialData, nodeData) + const azureOpenAIApiDeploymentName = getCredentialParam('azureOpenAIApiDeploymentName', credentialData, nodeData) + const azureOpenAIApiVersion = getCredentialParam('azureOpenAIApiVersion', credentialData, nodeData) + const obj: Partial & Partial = { azureOpenAIApiKey, azureOpenAIApiInstanceName, @@ -79,6 +66,7 @@ class AzureOpenAIEmbedding_Embeddings implements INode { azureOpenAIApiVersion } + if (batchSize) obj.batchSize = parseInt(batchSize, 10) if (timeout) obj.timeout = parseInt(timeout, 10) const model = new OpenAIEmbeddings(obj) diff --git a/packages/components/nodes/embeddings/CohereEmbedding/CohereEmbedding.ts b/packages/components/nodes/embeddings/CohereEmbedding/CohereEmbedding.ts index 344713a48..b42a0357e 100644 --- a/packages/components/nodes/embeddings/CohereEmbedding/CohereEmbedding.ts +++ 
b/packages/components/nodes/embeddings/CohereEmbedding/CohereEmbedding.ts @@ -1,31 +1,35 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { CohereEmbeddings, CohereEmbeddingsParams } from 'langchain/embeddings/cohere' class CohereEmbedding_Embeddings implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Cohere Embeddings' this.name = 'cohereEmbeddings' + this.version = 1.0 this.type = 'CohereEmbeddings' this.icon = 'cohere.png' this.category = 'Embeddings' this.description = 'Cohere API to generate embeddings for a given text' this.baseClasses = [this.type, ...getBaseClasses(CohereEmbeddings)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['cohereApi'] + } this.inputs = [ - { - label: 'Cohere API Key', - name: 'cohereApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', @@ -50,12 +54,14 @@ class CohereEmbedding_Embeddings implements INode { ] } - async init(nodeData: INodeData): Promise { - const apiKey = nodeData.inputs?.cohereApiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const modelName = nodeData.inputs?.modelName as string + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const cohereApiKey = getCredentialParam('cohereApiKey', credentialData, nodeData) + const obj: Partial & { apiKey?: string } = { - apiKey + apiKey: cohereApiKey } if (modelName) obj.modelName = modelName diff --git a/packages/components/nodes/embeddings/GoogleVertexAIEmbedding/GoogleVertexAIEmbedding.ts b/packages/components/nodes/embeddings/GoogleVertexAIEmbedding/GoogleVertexAIEmbedding.ts new file mode 100644 index 000000000..23bd3565e --- /dev/null +++ b/packages/components/nodes/embeddings/GoogleVertexAIEmbedding/GoogleVertexAIEmbedding.ts @@ -0,0 +1,63 @@ +import { GoogleVertexAIEmbeddings, GoogleVertexAIEmbeddingsParams } from 'langchain/embeddings/googlevertexai' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { GoogleAuthOptions } from 'google-auth-library' + +class GoogleVertexAIEmbedding_Embeddings implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + description: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'GoogleVertexAI Embeddings' + this.name = 'googlevertexaiEmbeddings' + this.version = 1.0 + this.type = 'GoogleVertexAIEmbeddings' + this.icon = 'vertexai.svg' + this.category = 'Embeddings' + this.description = 'Google vertexAI API to generate embeddings for a given text' + this.baseClasses = [this.type, ...getBaseClasses(GoogleVertexAIEmbeddings)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['googleVertexAuth'] + } + this.inputs = [] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const googleApplicationCredentialFilePath = getCredentialParam('googleApplicationCredentialFilePath', credentialData, nodeData) + const googleApplicationCredential = getCredentialParam('googleApplicationCredential', credentialData, nodeData) + const projectID = getCredentialParam('projectID', credentialData, nodeData) + + if (!googleApplicationCredentialFilePath && !googleApplicationCredential) + throw new Error('Please specify your Google Application Credential') + if (googleApplicationCredentialFilePath && googleApplicationCredential) + throw new Error('Please use either Google Application Credential File Path or Google Credential JSON Object') + + const authOptions: GoogleAuthOptions = {} + if (googleApplicationCredentialFilePath && !googleApplicationCredential) authOptions.keyFile = googleApplicationCredentialFilePath + else if (!googleApplicationCredentialFilePath && googleApplicationCredential) + authOptions.credentials = JSON.parse(googleApplicationCredential) + + if (projectID) authOptions.projectId = projectID + + const obj: GoogleVertexAIEmbeddingsParams = { + authOptions + } + + const model = new GoogleVertexAIEmbeddings(obj) + return model + } +} + +module.exports = { nodeClass: GoogleVertexAIEmbedding_Embeddings } diff --git a/packages/components/nodes/embeddings/GoogleVertexAIEmbedding/vertexai.svg b/packages/components/nodes/embeddings/GoogleVertexAIEmbedding/vertexai.svg new file mode 100644 index 000000000..31244412a --- /dev/null +++ b/packages/components/nodes/embeddings/GoogleVertexAIEmbedding/vertexai.svg @@ -0,0 +1,2 @@ + + \ No newline at end of file diff --git a/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/HuggingFaceInferenceEmbedding.ts b/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/HuggingFaceInferenceEmbedding.ts index 6f14325a6..6d75b9559 100644 --- a/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/HuggingFaceInferenceEmbedding.ts +++ 
b/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/HuggingFaceInferenceEmbedding.ts @@ -1,49 +1,67 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' -import { HuggingFaceInferenceEmbeddings, HuggingFaceInferenceEmbeddingsParams } from 'langchain/embeddings/hf' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { HuggingFaceInferenceEmbeddings, HuggingFaceInferenceEmbeddingsParams } from './core' class HuggingFaceInferenceEmbedding_Embeddings implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'HuggingFace Inference Embeddings' this.name = 'huggingFaceInferenceEmbeddings' + this.version = 1.0 this.type = 'HuggingFaceInferenceEmbeddings' this.icon = 'huggingface.png' this.category = 'Embeddings' this.description = 'HuggingFace Inference API to generate embeddings for a given text' this.baseClasses = [this.type, ...getBaseClasses(HuggingFaceInferenceEmbeddings)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['huggingFaceApi'] + } this.inputs = [ - { - label: 'HuggingFace Api Key', - name: 'apiKey', - type: 'password' - }, { label: 'Model', name: 'modelName', type: 'string', + description: 'If using own inference endpoint, leave this blank', + placeholder: 'sentence-transformers/distilbert-base-nli-mean-tokens', + optional: true + }, + { + label: 'Endpoint', + name: 'endpoint', + type: 'string', + placeholder: 'https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/sentence-transformers/all-MiniLM-L6-v2', + description: 'Using your own inference endpoint', optional: true } ] } - async init(nodeData: 
INodeData): Promise { - const apiKey = nodeData.inputs?.apiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const modelName = nodeData.inputs?.modelName as string + const endpoint = nodeData.inputs?.endpoint as string + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const huggingFaceApiKey = getCredentialParam('huggingFaceApiKey', credentialData, nodeData) const obj: Partial = { - apiKey + apiKey: huggingFaceApiKey } if (modelName) obj.model = modelName + if (endpoint) obj.endpoint = endpoint const model = new HuggingFaceInferenceEmbeddings(obj) return model diff --git a/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/core.ts b/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/core.ts new file mode 100644 index 000000000..c75658d45 --- /dev/null +++ b/packages/components/nodes/embeddings/HuggingFaceInferenceEmbedding/core.ts @@ -0,0 +1,55 @@ +import { HfInference } from '@huggingface/inference' +import { Embeddings, EmbeddingsParams } from 'langchain/embeddings/base' +import { getEnvironmentVariable } from '../../../src/utils' + +export interface HuggingFaceInferenceEmbeddingsParams extends EmbeddingsParams { + apiKey?: string + model?: string + endpoint?: string +} + +export class HuggingFaceInferenceEmbeddings extends Embeddings implements HuggingFaceInferenceEmbeddingsParams { + apiKey?: string + + endpoint?: string + + model: string + + client: HfInference + + constructor(fields?: HuggingFaceInferenceEmbeddingsParams) { + super(fields ?? {}) + + this.model = fields?.model ?? 'sentence-transformers/distilbert-base-nli-mean-tokens' + this.apiKey = fields?.apiKey ?? getEnvironmentVariable('HUGGINGFACEHUB_API_KEY') + this.endpoint = fields?.endpoint ?? 
'' + this.client = new HfInference(this.apiKey) + if (this.endpoint) this.client.endpoint(this.endpoint) + } + + async _embed(texts: string[]): Promise { + // replace newlines, which can negatively affect performance. + const clean = texts.map((text) => text.replace(/\n/g, ' ')) + const hf = new HfInference(this.apiKey) + const obj: any = { + inputs: clean + } + if (this.endpoint) { + hf.endpoint(this.endpoint) + } else { + obj.model = this.model + } + + const res = await this.caller.callWithOptions({}, hf.featureExtraction.bind(hf), obj) + return res as number[][] + } + + async embedQuery(document: string): Promise { + const res = await this._embed([document]) + return res[0] + } + + async embedDocuments(documents: string[]): Promise { + return this._embed(documents) + } +} diff --git a/packages/components/nodes/embeddings/LocalAIEmbedding/LocalAIEmbedding.ts b/packages/components/nodes/embeddings/LocalAIEmbedding/LocalAIEmbedding.ts new file mode 100644 index 000000000..557e35d68 --- /dev/null +++ b/packages/components/nodes/embeddings/LocalAIEmbedding/LocalAIEmbedding.ts @@ -0,0 +1,55 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { OpenAIEmbeddings, OpenAIEmbeddingsParams } from 'langchain/embeddings/openai' + +class LocalAIEmbedding_Embeddings implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + description: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'LocalAI Embeddings' + this.name = 'localAIEmbeddings' + this.version = 1.0 + this.type = 'LocalAI Embeddings' + this.icon = 'localai.png' + this.category = 'Embeddings' + this.description = 'Use local embeddings models like llama.cpp' + this.baseClasses = [this.type, 'Embeddings'] + this.inputs = [ + { + label: 'Base Path', + name: 'basePath', + type: 'string', + placeholder: 'http://localhost:8080/v1' + }, + { + label: 'Model Name', + name: 'modelName', + type: 
'string', + placeholder: 'text-embedding-ada-002' + } + ] + } + + async init(nodeData: INodeData): Promise { + const modelName = nodeData.inputs?.modelName as string + const basePath = nodeData.inputs?.basePath as string + + const obj: Partial & { openAIApiKey?: string } = { + modelName, + openAIApiKey: 'sk-' + } + + const model = new OpenAIEmbeddings(obj, { basePath }) + + return model + } +} + +module.exports = { nodeClass: LocalAIEmbedding_Embeddings } diff --git a/packages/components/nodes/embeddings/LocalAIEmbedding/localai.png b/packages/components/nodes/embeddings/LocalAIEmbedding/localai.png new file mode 100644 index 000000000..321403973 Binary files /dev/null and b/packages/components/nodes/embeddings/LocalAIEmbedding/localai.png differ diff --git a/packages/components/nodes/embeddings/OpenAIEmbedding/OpenAIEmbedding.ts b/packages/components/nodes/embeddings/OpenAIEmbedding/OpenAIEmbedding.ts index 3ccfab820..d21b6dcaa 100644 --- a/packages/components/nodes/embeddings/OpenAIEmbedding/OpenAIEmbedding.ts +++ b/packages/components/nodes/embeddings/OpenAIEmbedding/OpenAIEmbedding.ts @@ -1,31 +1,35 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { OpenAIEmbeddings, OpenAIEmbeddingsParams } from 'langchain/embeddings/openai' class OpenAIEmbedding_Embeddings implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'OpenAI Embeddings' this.name = 'openAIEmbeddings' + this.version = 1.0 this.type = 'OpenAIEmbeddings' this.icon = 'openai.png' this.category = 'Embeddings' this.description = 'OpenAI API to generate embeddings for a given 
text' this.baseClasses = [this.type, ...getBaseClasses(OpenAIEmbeddings)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['openAIApi'] + } this.inputs = [ - { - label: 'OpenAI Api Key', - name: 'openAIApiKey', - type: 'password' - }, { label: 'Strip New Lines', name: 'stripNewLines', @@ -46,15 +50,25 @@ class OpenAIEmbedding_Embeddings implements INode { type: 'number', optional: true, additionalParams: true + }, + { + label: 'BasePath', + name: 'basepath', + type: 'string', + optional: true, + additionalParams: true } ] } - async init(nodeData: INodeData): Promise { - const openAIApiKey = nodeData.inputs?.openAIApiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const stripNewLines = nodeData.inputs?.stripNewLines as boolean const batchSize = nodeData.inputs?.batchSize as string const timeout = nodeData.inputs?.timeout as string + const basePath = nodeData.inputs?.basepath as string + + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const openAIApiKey = getCredentialParam('openAIApiKey', credentialData, nodeData) const obj: Partial & { openAIApiKey?: string } = { openAIApiKey @@ -64,7 +78,7 @@ class OpenAIEmbedding_Embeddings implements INode { if (batchSize) obj.batchSize = parseInt(batchSize, 10) if (timeout) obj.timeout = parseInt(timeout, 10) - const model = new OpenAIEmbeddings(obj) + const model = new OpenAIEmbeddings(obj, { basePath }) return model } } diff --git a/packages/components/nodes/llms/Azure OpenAI/Azure.svg b/packages/components/nodes/llms/Azure OpenAI/Azure.svg index 51eb62535..47ad8c440 100644 --- a/packages/components/nodes/llms/Azure OpenAI/Azure.svg +++ b/packages/components/nodes/llms/Azure OpenAI/Azure.svg @@ -1,5 +1 @@ - - - - - \ No newline at end of file + \ No newline at end of file diff --git a/packages/components/nodes/llms/Azure OpenAI/AzureOpenAI.ts b/packages/components/nodes/llms/Azure OpenAI/AzureOpenAI.ts index b5d7d1e03..f48c4642b 100644 --- a/packages/components/nodes/llms/Azure OpenAI/AzureOpenAI.ts +++ b/packages/components/nodes/llms/Azure OpenAI/AzureOpenAI.ts @@ -1,31 +1,35 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { AzureOpenAIInput, OpenAI, OpenAIInput } from 'langchain/llms/openai' class AzureOpenAI_LLMs implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Azure OpenAI' this.name = 'azureOpenAI' + this.version = 1.0 this.type = 'AzureOpenAI' this.icon = 'Azure.svg' this.category = 'LLMs' this.description = 'Wrapper around Azure OpenAI large language models' this.baseClasses = [this.type, 
...getBaseClasses(OpenAI)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['azureOpenAIApi'] + } this.inputs = [ - { - label: 'Azure OpenAI Api Key', - name: 'azureOpenAIApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', @@ -87,41 +91,15 @@ class AzureOpenAI_LLMs implements INode { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.9, optional: true }, - { - label: 'Azure OpenAI Api Instance Name', - name: 'azureOpenAIApiInstanceName', - type: 'string', - placeholder: 'YOUR-INSTANCE-NAME' - }, - { - label: 'Azure OpenAI Api Deployment Name', - name: 'azureOpenAIApiDeploymentName', - type: 'string', - placeholder: 'YOUR-DEPLOYMENT-NAME' - }, - { - label: 'Azure OpenAI Api Version', - name: 'azureOpenAIApiVersion', - type: 'options', - options: [ - { - label: '2023-03-15-preview', - name: '2023-03-15-preview' - }, - { - label: '2022-12-01', - name: '2022-12-01' - } - ], - default: '2023-03-15-preview' - }, { label: 'Max Tokens', name: 'maxTokens', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -129,6 +107,7 @@ class AzureOpenAI_LLMs implements INode { label: 'Top Probability', name: 'topP', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -136,6 +115,7 @@ class AzureOpenAI_LLMs implements INode { label: 'Best Of', name: 'bestOf', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -143,6 +123,7 @@ class AzureOpenAI_LLMs implements INode { label: 'Frequency Penalty', name: 'frequencyPenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -150,6 +131,7 @@ class AzureOpenAI_LLMs implements INode { label: 'Presence Penalty', name: 'presencePenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -157,39 +139,44 @@ class AzureOpenAI_LLMs implements INode { label: 'Timeout', name: 'timeout', type: 'number', + step: 1, optional: 
true, additionalParams: true } ] } - async init(nodeData: INodeData): Promise<any> { - const azureOpenAIApiKey = nodeData.inputs?.azureOpenAIApiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { const temperature = nodeData.inputs?.temperature as string const modelName = nodeData.inputs?.modelName as string - const azureOpenAIApiInstanceName = nodeData.inputs?.azureOpenAIApiInstanceName as string - const azureOpenAIApiDeploymentName = nodeData.inputs?.azureOpenAIApiDeploymentName as string - const azureOpenAIApiVersion = nodeData.inputs?.azureOpenAIApiVersion as string const maxTokens = nodeData.inputs?.maxTokens as string const topP = nodeData.inputs?.topP as string const frequencyPenalty = nodeData.inputs?.frequencyPenalty as string const presencePenalty = nodeData.inputs?.presencePenalty as string const timeout = nodeData.inputs?.timeout as string const bestOf = nodeData.inputs?.bestOf as string + const streaming = nodeData.inputs?.streaming as boolean + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const azureOpenAIApiKey = getCredentialParam('azureOpenAIApiKey', credentialData, nodeData) + const azureOpenAIApiInstanceName = getCredentialParam('azureOpenAIApiInstanceName', credentialData, nodeData) + const azureOpenAIApiDeploymentName = getCredentialParam('azureOpenAIApiDeploymentName', credentialData, nodeData) + const azureOpenAIApiVersion = getCredentialParam('azureOpenAIApiVersion', credentialData, nodeData) const obj: Partial<AzureOpenAIInput> & Partial<OpenAIInput> = { - temperature: parseInt(temperature, 10), + temperature: parseFloat(temperature), modelName, azureOpenAIApiKey, azureOpenAIApiInstanceName, azureOpenAIApiDeploymentName, - azureOpenAIApiVersion + azureOpenAIApiVersion, + streaming: streaming ?? 
true } if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) - if (topP) obj.topP = parseInt(topP, 10) - if (frequencyPenalty) obj.frequencyPenalty = parseInt(frequencyPenalty, 10) - if (presencePenalty) obj.presencePenalty = parseInt(presencePenalty, 10) + if (topP) obj.topP = parseFloat(topP) + if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty) + if (presencePenalty) obj.presencePenalty = parseFloat(presencePenalty) if (timeout) obj.timeout = parseInt(timeout, 10) if (bestOf) obj.bestOf = parseInt(bestOf, 10) diff --git a/packages/components/nodes/llms/Cohere/Cohere.ts b/packages/components/nodes/llms/Cohere/Cohere.ts index dc632ec31..4a3a8a807 100644 --- a/packages/components/nodes/llms/Cohere/Cohere.ts +++ b/packages/components/nodes/llms/Cohere/Cohere.ts @@ -1,31 +1,35 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' -import { Cohere, CohereInput } from 'langchain/llms/cohere' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { Cohere, CohereInput } from './core' class Cohere_LLMs implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Cohere' this.name = 'cohere' + this.version = 1.0 this.type = 'Cohere' this.icon = 'cohere.png' this.category = 'LLMs' this.description = 'Wrapper around Cohere large language models' this.baseClasses = [this.type, ...getBaseClasses(Cohere)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['cohereApi'] + } this.inputs = [ - { - label: 'Cohere Api Key', - name: 'cohereApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', @@ -63,6 +67,7 @@ 
class Cohere_LLMs implements INode { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.7, optional: true }, @@ -70,24 +75,27 @@ class Cohere_LLMs implements INode { label: 'Max Tokens', name: 'maxTokens', type: 'number', + step: 1, optional: true } ] } - async init(nodeData: INodeData): Promise<any> { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { const temperature = nodeData.inputs?.temperature as string const modelName = nodeData.inputs?.modelName as string - const apiKey = nodeData.inputs?.cohereApiKey as string const maxTokens = nodeData.inputs?.maxTokens as string + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const cohereApiKey = getCredentialParam('cohereApiKey', credentialData, nodeData) + const obj: CohereInput = { - apiKey + apiKey: cohereApiKey } if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) if (modelName) obj.model = modelName - if (temperature) obj.temperature = parseInt(temperature, 10) + if (temperature) obj.temperature = parseFloat(temperature) const model = new Cohere(obj) return model diff --git a/packages/components/nodes/llms/Cohere/core.ts b/packages/components/nodes/llms/Cohere/core.ts new file mode 100644 index 000000000..97c815710 --- /dev/null +++ b/packages/components/nodes/llms/Cohere/core.ts @@ -0,0 +1,78 @@ +import { LLM, BaseLLMParams } from 'langchain/llms/base' + +export interface CohereInput extends BaseLLMParams { + /** Sampling temperature to use */ + temperature?: number + + /** + * Maximum number of tokens to generate in the completion. + */ + maxTokens?: number + + /** Model to use */ + model?: string + + apiKey?: string +} + +export class Cohere extends LLM implements CohereInput { + temperature = 0 + + maxTokens = 250 + + model: string + + apiKey: string + + constructor(fields?: CohereInput) { + super(fields ?? {}) + + const apiKey = fields?.apiKey ?? 
undefined + + if (!apiKey) { + throw new Error('Please set the COHERE_API_KEY environment variable or pass it to the constructor as the apiKey field.') + } + + this.apiKey = apiKey + this.maxTokens = fields?.maxTokens ?? this.maxTokens + this.temperature = fields?.temperature ?? this.temperature + this.model = fields?.model ?? this.model + } + + _llmType() { + return 'cohere' + } + + /** @ignore */ + async _call(prompt: string, options: this['ParsedCallOptions']): Promise<string> { + const { cohere } = await Cohere.imports() + + cohere.init(this.apiKey) + + // Hit the `generate` endpoint on the `large` model + const generateResponse = await this.caller.callWithOptions({ signal: options.signal }, cohere.generate.bind(cohere), { + prompt, + model: this.model, + max_tokens: this.maxTokens, + temperature: this.temperature, + end_sequences: options.stop + }) + try { + return generateResponse.body.generations[0].text + } catch { + throw new Error('Could not parse response.') + } + } + + /** @ignore */ + static async imports(): Promise<{ + cohere: typeof import('cohere-ai') + }> { + try { + const { default: cohere } = await import('cohere-ai') + return { cohere } + } catch (e) { + throw new Error('Please install cohere-ai as a dependency with, e.g. 
`yarn add cohere-ai`') + } + } +} diff --git a/packages/components/nodes/llms/GoogleVertexAI/GoogleVertexAI.ts b/packages/components/nodes/llms/GoogleVertexAI/GoogleVertexAI.ts new file mode 100644 index 000000000..4d9b3aeda --- /dev/null +++ b/packages/components/nodes/llms/GoogleVertexAI/GoogleVertexAI.ts @@ -0,0 +1,117 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { GoogleVertexAI, GoogleVertexAITextInput } from 'langchain/llms/googlevertexai' +import { GoogleAuthOptions } from 'google-auth-library' + +class GoogleVertexAI_LLMs implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + description: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'GoogleVertexAI' + this.name = 'googlevertexai' + this.version = 1.0 + this.type = 'GoogleVertexAI' + this.icon = 'vertexai.svg' + this.category = 'LLMs' + this.description = 'Wrapper around GoogleVertexAI large language models' + this.baseClasses = [this.type, ...getBaseClasses(GoogleVertexAI)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['googleVertexAuth'] + } + this.inputs = [ + { + label: 'Model Name', + name: 'modelName', + type: 'options', + options: [ + { + label: 'text-bison', + name: 'text-bison' + }, + { + label: 'code-bison', + name: 'code-bison' + }, + { + label: 'code-gecko', + name: 'code-gecko' + } + ], + default: 'text-bison' + }, + { + label: 'Temperature', + name: 'temperature', + type: 'number', + step: 0.1, + default: 0.7, + optional: true + }, + { + label: 'Max Output Tokens', + name: 'maxOutputTokens', + type: 'number', + step: 1, + optional: true, + additionalParams: true + }, + { + label: 'Top Probability', + name: 'topP', + type: 'number', + step: 0.1, 
optional: true, + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const googleApplicationCredentialFilePath = getCredentialParam('googleApplicationCredentialFilePath', credentialData, nodeData) + const googleApplicationCredential = getCredentialParam('googleApplicationCredential', credentialData, nodeData) + const projectID = getCredentialParam('projectID', credentialData, nodeData) + + if (!googleApplicationCredentialFilePath && !googleApplicationCredential) + throw new Error('Please specify your Google Application Credential') + if (googleApplicationCredentialFilePath && googleApplicationCredential) + throw new Error('Please use either Google Application Credential File Path or Google Credential JSON Object') + + const authOptions: GoogleAuthOptions = {} + if (googleApplicationCredentialFilePath && !googleApplicationCredential) authOptions.keyFile = googleApplicationCredentialFilePath + else if (!googleApplicationCredentialFilePath && googleApplicationCredential) + authOptions.credentials = JSON.parse(googleApplicationCredential) + if (projectID) authOptions.projectId = projectID + + const temperature = nodeData.inputs?.temperature as string + const modelName = nodeData.inputs?.modelName as string + const maxOutputTokens = nodeData.inputs?.maxOutputTokens as string + const topP = nodeData.inputs?.topP as string + + const obj: Partial<GoogleVertexAITextInput> = { + temperature: parseFloat(temperature), + model: modelName, + authOptions + } + + if (maxOutputTokens) obj.maxOutputTokens = parseInt(maxOutputTokens, 10) + if (topP) obj.topP = parseFloat(topP) + + const model = new GoogleVertexAI(obj) + return model + } +} + +module.exports = { nodeClass: GoogleVertexAI_LLMs } diff --git a/packages/components/nodes/llms/GoogleVertexAI/vertexai.svg b/packages/components/nodes/llms/GoogleVertexAI/vertexai.svg new file mode 100644 index 
000000000..31244412a --- /dev/null +++ b/packages/components/nodes/llms/GoogleVertexAI/vertexai.svg @@ -0,0 +1,2 @@ + + \ No newline at end of file diff --git a/packages/components/nodes/llms/HuggingFaceInference/HuggingFaceInference.ts b/packages/components/nodes/llms/HuggingFaceInference/HuggingFaceInference.ts index 6aa3f4f4f..c7f6a37e8 100644 --- a/packages/components/nodes/llms/HuggingFaceInference/HuggingFaceInference.ts +++ b/packages/components/nodes/llms/HuggingFaceInference/HuggingFaceInference.ts @@ -1,48 +1,124 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' -import { HuggingFaceInference } from 'langchain/llms/hf' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { HFInput, HuggingFaceInference } from './core' class HuggingFaceInference_LLMs implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'HuggingFace Inference' this.name = 'huggingFaceInference_LLMs' + this.version = 1.0 this.type = 'HuggingFaceInference' this.icon = 'huggingface.png' this.category = 'LLMs' this.description = 'Wrapper around HuggingFace large language models' this.baseClasses = [this.type, ...getBaseClasses(HuggingFaceInference)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['huggingFaceApi'] + } this.inputs = [ { label: 'Model', name: 'model', type: 'string', - placeholder: 'gpt2' + description: 'If using own inference endpoint, leave this blank', + placeholder: 'gpt2', + optional: true }, { - label: 'HuggingFace Api Key', - name: 'apiKey', - type: 'password' + label: 'Endpoint', + name: 'endpoint', + type: 'string', + 
placeholder: 'https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2', + description: 'Using your own inference endpoint', + optional: true + }, + { + label: 'Temperature', + name: 'temperature', + type: 'number', + step: 0.1, + description: 'Temperature parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Max Tokens', + name: 'maxTokens', + type: 'number', + step: 1, + description: 'Max Tokens parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Top Probability', + name: 'topP', + type: 'number', + step: 0.1, + description: 'Top Probability parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Top K', + name: 'hfTopK', + type: 'number', + step: 0.1, + description: 'Top K parameter may not apply to certain models. Please check available model parameters', + optional: true, + additionalParams: true + }, + { + label: 'Frequency Penalty', + name: 'frequencyPenalty', + type: 'number', + step: 0.1, + description: 'Frequency Penalty parameter may not apply to certain models. 
Please check available model parameters', + optional: true, + additionalParams: true } ] } - async init(nodeData: INodeData): Promise<any> { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { const model = nodeData.inputs?.model as string - const apiKey = nodeData.inputs?.apiKey as string + const temperature = nodeData.inputs?.temperature as string + const maxTokens = nodeData.inputs?.maxTokens as string + const topP = nodeData.inputs?.topP as string + const hfTopK = nodeData.inputs?.hfTopK as string + const frequencyPenalty = nodeData.inputs?.frequencyPenalty as string + const endpoint = nodeData.inputs?.endpoint as string - const huggingFace = new HuggingFaceInference({ + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const huggingFaceApiKey = getCredentialParam('huggingFaceApiKey', credentialData, nodeData) + + const obj: Partial<HFInput> = { model, - apiKey - }) + apiKey: huggingFaceApiKey + } + + if (temperature) obj.temperature = parseFloat(temperature) + if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) + if (topP) obj.topP = parseFloat(topP) + if (hfTopK) obj.topK = parseFloat(hfTopK) + if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty) + if (endpoint) obj.endpoint = endpoint + + const huggingFace = new HuggingFaceInference(obj) return huggingFace } } diff --git a/packages/components/nodes/llms/HuggingFaceInference/core.ts b/packages/components/nodes/llms/HuggingFaceInference/core.ts new file mode 100644 index 000000000..416567f0d --- /dev/null +++ b/packages/components/nodes/llms/HuggingFaceInference/core.ts @@ -0,0 +1,113 @@ +import { getEnvironmentVariable } from '../../../src/utils' +import { LLM, BaseLLMParams } from 'langchain/llms/base' + +export interface HFInput { + /** Model to use */ + model: string + + /** Sampling temperature to use */ + temperature?: number + + /** + * Maximum number of tokens to generate in the completion. 
+ */ + maxTokens?: number + + /** Total probability mass of tokens to consider at each step */ + topP?: number + + /** Integer to define the top tokens considered within the sample operation to create new text. */ + topK?: number + + /** Penalizes repeated tokens according to frequency */ + frequencyPenalty?: number + + /** API key to use. */ + apiKey?: string + + /** Private endpoint to use. */ + endpoint?: string +} + +export class HuggingFaceInference extends LLM implements HFInput { + get lc_secrets(): { [key: string]: string } | undefined { + return { + apiKey: 'HUGGINGFACEHUB_API_KEY' + } + } + + model = 'gpt2' + + temperature: number | undefined = undefined + + maxTokens: number | undefined = undefined + + topP: number | undefined = undefined + + topK: number | undefined = undefined + + frequencyPenalty: number | undefined = undefined + + apiKey: string | undefined = undefined + + endpoint: string | undefined = undefined + + constructor(fields?: Partial<HFInput> & BaseLLMParams) { + super(fields ?? {}) + + this.model = fields?.model ?? this.model + this.temperature = fields?.temperature ?? this.temperature + this.maxTokens = fields?.maxTokens ?? this.maxTokens + this.topP = fields?.topP ?? this.topP + this.topK = fields?.topK ?? this.topK + this.frequencyPenalty = fields?.frequencyPenalty ?? this.frequencyPenalty + this.endpoint = fields?.endpoint ?? '' + this.apiKey = fields?.apiKey ?? getEnvironmentVariable('HUGGINGFACEHUB_API_KEY') + if (!this.apiKey) { + throw new Error( + 'Please set an API key for HuggingFace Hub in the environment variable HUGGINGFACEHUB_API_KEY or in the apiKey field of the HuggingFaceInference constructor.' 
+ ) + } + } + + _llmType() { + return 'hf' + } + + /** @ignore */ + async _call(prompt: string, options: this['ParsedCallOptions']): Promise<string> { + const { HfInference } = await HuggingFaceInference.imports() + const hf = new HfInference(this.apiKey) + const obj: any = { + parameters: { + // make it behave similar to openai, returning only the generated text + return_full_text: false, + temperature: this.temperature, + max_new_tokens: this.maxTokens, + top_p: this.topP, + top_k: this.topK, + repetition_penalty: this.frequencyPenalty + }, + inputs: prompt + } + // endpoint() returns a new HfInferenceEndpoint rather than mutating hf, so keep the returned instance + let hfInstance: any = hf + if (this.endpoint) { + hfInstance = hf.endpoint(this.endpoint) + } else { + obj.model = this.model + } + const res = await this.caller.callWithOptions({ signal: options.signal }, hfInstance.textGeneration.bind(hfInstance), obj) + return res.generated_text + } + + /** @ignore */ + static async imports(): Promise<{ + HfInference: typeof import('@huggingface/inference').HfInference + }> { + try { + const { HfInference } = await import('@huggingface/inference') + return { HfInference } + } catch (e) { + throw new Error('Please install huggingface as a dependency with, e.g. 
`yarn add @huggingface/inference`') + } + } +} diff --git a/packages/components/nodes/llms/OpenAI/OpenAI.ts b/packages/components/nodes/llms/OpenAI/OpenAI.ts index af44965e3..4e35d659f 100644 --- a/packages/components/nodes/llms/OpenAI/OpenAI.ts +++ b/packages/components/nodes/llms/OpenAI/OpenAI.ts @@ -1,31 +1,35 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { OpenAI, OpenAIInput } from 'langchain/llms/openai' class OpenAI_LLMs implements INode { label: string name: string + version: number type: string icon: string category: string description: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'OpenAI' this.name = 'openAI' + this.version = 1.0 this.type = 'OpenAI' this.icon = 'openai.png' this.category = 'LLMs' this.description = 'Wrapper around OpenAI large language models' this.baseClasses = [this.type, ...getBaseClasses(OpenAI)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['openAIApi'] + } this.inputs = [ - { - label: 'OpenAI Api Key', - name: 'openAIApiKey', - type: 'password' - }, { label: 'Model Name', name: 'modelName', @@ -55,6 +59,7 @@ class OpenAI_LLMs implements INode { label: 'Temperature', name: 'temperature', type: 'number', + step: 0.1, default: 0.7, optional: true }, @@ -62,6 +67,7 @@ class OpenAI_LLMs implements INode { label: 'Max Tokens', name: 'maxTokens', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -69,6 +75,7 @@ class OpenAI_LLMs implements INode { label: 'Top Probability', name: 'topP', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -76,6 +83,7 @@ class OpenAI_LLMs implements INode { label: 'Best 
Of', name: 'bestOf', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -83,6 +91,7 @@ class OpenAI_LLMs implements INode { label: 'Frequency Penalty', name: 'frequencyPenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -90,6 +99,7 @@ class OpenAI_LLMs implements INode { label: 'Presence Penalty', name: 'presencePenalty', type: 'number', + step: 0.1, optional: true, additionalParams: true }, @@ -97,6 +107,7 @@ class OpenAI_LLMs implements INode { label: 'Batch Size', name: 'batchSize', type: 'number', + step: 1, optional: true, additionalParams: true }, @@ -104,16 +115,23 @@ class OpenAI_LLMs implements INode { label: 'Timeout', name: 'timeout', type: 'number', + step: 1, + optional: true, + additionalParams: true + }, + { + label: 'BasePath', + name: 'basepath', + type: 'string', optional: true, additionalParams: true } ] } - async init(nodeData: INodeData): Promise<any> { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { const temperature = nodeData.inputs?.temperature as string const modelName = nodeData.inputs?.modelName as string - const openAIApiKey = nodeData.inputs?.openAIApiKey as string const maxTokens = nodeData.inputs?.maxTokens as string const topP = nodeData.inputs?.topP as string const frequencyPenalty = nodeData.inputs?.frequencyPenalty as string @@ -121,22 +139,30 @@ class OpenAI_LLMs implements INode { const timeout = nodeData.inputs?.timeout as string const batchSize = nodeData.inputs?.batchSize as string const bestOf = nodeData.inputs?.bestOf as string + const streaming = nodeData.inputs?.streaming as boolean + const basePath = nodeData.inputs?.basepath as string + + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const openAIApiKey = getCredentialParam('openAIApiKey', credentialData, nodeData) const obj: Partial<OpenAIInput> & { openAIApiKey?: string } = { - temperature: parseInt(temperature, 10), + temperature: parseFloat(temperature), modelName, - openAIApiKey + openAIApiKey, + streaming: streaming ?? true } if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10) - if (topP) obj.topP = parseInt(topP, 10) - if (frequencyPenalty) obj.frequencyPenalty = parseInt(frequencyPenalty, 10) - if (presencePenalty) obj.presencePenalty = parseInt(presencePenalty, 10) + if (topP) obj.topP = parseFloat(topP) + if (frequencyPenalty) obj.frequencyPenalty = parseFloat(frequencyPenalty) + if (presencePenalty) obj.presencePenalty = parseFloat(presencePenalty) if (timeout) obj.timeout = parseInt(timeout, 10) if (batchSize) obj.batchSize = parseInt(batchSize, 10) if (bestOf) obj.bestOf = parseInt(bestOf, 10) - const model = new OpenAI(obj) + const model = new OpenAI(obj, { + basePath + }) return model } } diff --git a/packages/components/nodes/llms/Replicate/Replicate.ts b/packages/components/nodes/llms/Replicate/Replicate.ts new file mode 100644 index 000000000..22c6e93aa --- /dev/null +++ b/packages/components/nodes/llms/Replicate/Replicate.ts @@ -0,0 +1,128 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { Replicate, ReplicateInput } from 'langchain/llms/replicate' + +class Replicate_LLMs implements INode { + label: string + name: string + version: number + type: string + icon: string + category: string + description: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Replicate' + this.name = 'replicate' + this.version = 1.0 + this.type = 'Replicate' + this.icon = 'replicate.svg' + this.category = 'LLMs' + this.description = 'Use Replicate to run open source models on cloud' + 
this.baseClasses = [this.type, 'BaseChatModel', ...getBaseClasses(Replicate)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['replicateApi'] + } + this.inputs = [ + { + label: 'Model', + name: 'model', + type: 'string', + placeholder: 'a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5', + optional: true + }, + { + label: 'Temperature', + name: 'temperature', + type: 'number', + step: 0.1, + description: + 'Adjusts randomness of outputs, greater than 1 is random and 0 is deterministic, 0.75 is a good starting value.', + default: 0.7, + optional: true + }, + { + label: 'Max Tokens', + name: 'maxTokens', + type: 'number', + step: 1, + description: 'Maximum number of tokens to generate. A word is generally 2-3 tokens', + optional: true, + additionalParams: true + }, + { + label: 'Top Probability', + name: 'topP', + type: 'number', + step: 0.1, + description: + 'When decoding text, samples from the top p percentage of most likely tokens; lower to ignore less likely tokens', + optional: true, + additionalParams: true + }, + { + label: 'Repetition Penalty', + name: 'repetitionPenalty', + type: 'number', + step: 0.1, + description: + 'Penalty for repeated words in generated text; 1 is no penalty, values greater than 1 discourage repetition, less than 1 encourage it. (minimum: 0.01; maximum: 5)', + optional: true, + additionalParams: true + }, + { + label: 'Additional Inputs', + name: 'additionalInputs', + type: 'json', + description: + 'Each model has different parameters, refer to the specific model accepted inputs. 
For example: llama13b-v2', + additionalParams: true, + optional: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const modelName = nodeData.inputs?.model as string + const temperature = nodeData.inputs?.temperature as string + const maxTokens = nodeData.inputs?.maxTokens as string + const topP = nodeData.inputs?.topP as string + const repetitionPenalty = nodeData.inputs?.repetitionPenalty as string + const additionalInputs = nodeData.inputs?.additionalInputs as string + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const apiKey = getCredentialParam('replicateApiKey', credentialData, nodeData) + + const version = modelName.split(':').pop() + const name = modelName.split(':')[0].split('/').pop() + const org = modelName.split(':')[0].split('/')[0] + + const obj: ReplicateInput = { + model: `${org}/${name}:${version}`, + apiKey + } + + let inputs: any = {} + if (maxTokens) inputs.max_length = parseInt(maxTokens, 10) + if (temperature) inputs.temperature = parseFloat(temperature) + if (topP) inputs.top_p = parseFloat(topP) + if (repetitionPenalty) inputs.repetition_penalty = parseFloat(repetitionPenalty) + if (additionalInputs) { + const parsedInputs = + typeof additionalInputs === 'object' ? additionalInputs : additionalInputs ? 
JSON.parse(additionalInputs) : {} + inputs = { ...inputs, ...parsedInputs } + } + if (Object.keys(inputs).length) obj.input = inputs + + const model = new Replicate(obj) + return model + } +} + +module.exports = { nodeClass: Replicate_LLMs } diff --git a/packages/components/nodes/llms/Replicate/replicate.svg b/packages/components/nodes/llms/Replicate/replicate.svg new file mode 100644 index 000000000..2e46453f8 --- /dev/null +++ b/packages/components/nodes/llms/Replicate/replicate.svg @@ -0,0 +1,7 @@ + \ No newline at end of file diff --git a/packages/components/nodes/memory/BufferMemory/BufferMemory.ts b/packages/components/nodes/memory/BufferMemory/BufferMemory.ts index fd635ff47..7793d96d4 100644 --- a/packages/components/nodes/memory/BufferMemory/BufferMemory.ts +++ b/packages/components/nodes/memory/BufferMemory/BufferMemory.ts @@ -5,6 +5,7 @@ import { BufferMemory } from 'langchain/memory' class BufferMemory_Memory implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class BufferMemory_Memory implements INode { constructor() { this.label = 'Buffer Memory' this.name = 'bufferMemory' + this.version = 1.0 this.type = 'BufferMemory' this.icon = 'memory.svg' this.category = 'Memory' diff --git a/packages/components/nodes/memory/BufferWindowMemory/BufferWindowMemory.ts b/packages/components/nodes/memory/BufferWindowMemory/BufferWindowMemory.ts new file mode 100644 index 000000000..cf8e7f1dc --- /dev/null +++ b/packages/components/nodes/memory/BufferWindowMemory/BufferWindowMemory.ts @@ -0,0 +1,64 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { BufferWindowMemory, BufferWindowMemoryInput } from 'langchain/memory' + +class BufferWindowMemory_Memory implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: 
string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Buffer Window Memory' + this.name = 'bufferWindowMemory' + this.version = 1.0 + this.type = 'BufferWindowMemory' + this.icon = 'memory.svg' + this.category = 'Memory' + this.description = 'Uses a window of size k to surface the last k back-and-forths to use as memory' + this.baseClasses = [this.type, ...getBaseClasses(BufferWindowMemory)] + this.inputs = [ + { + label: 'Memory Key', + name: 'memoryKey', + type: 'string', + default: 'chat_history' + }, + { + label: 'Input Key', + name: 'inputKey', + type: 'string', + default: 'input' + }, + { + label: 'Size', + name: 'k', + type: 'number', + default: '4', + description: 'Window of size k to surface the last k back-and-forths to use as memory.' + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const memoryKey = nodeData.inputs?.memoryKey as string + const inputKey = nodeData.inputs?.inputKey as string + const k = nodeData.inputs?.k as string + + const obj: Partial<BufferWindowMemoryInput> = { + returnMessages: true, + memoryKey: memoryKey, + inputKey: inputKey, + k: parseInt(k, 10) + } + + return new BufferWindowMemory(obj) + } +} + +module.exports = { nodeClass: BufferWindowMemory_Memory } diff --git a/packages/components/nodes/memory/BufferWindowMemory/memory.svg b/packages/components/nodes/memory/BufferWindowMemory/memory.svg new file mode 100644 index 000000000..ca8e17da1 --- /dev/null +++ b/packages/components/nodes/memory/BufferWindowMemory/memory.svg @@ -0,0 +1,8 @@ + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/memory/ConversationSummaryMemory/ConversationSummaryMemory.ts b/packages/components/nodes/memory/ConversationSummaryMemory/ConversationSummaryMemory.ts new file mode 100644 index 000000000..332d73aa9 --- /dev/null +++ b/packages/components/nodes/memory/ConversationSummaryMemory/ConversationSummaryMemory.ts @@ -0,0 +1,63 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { 
getBaseClasses } from '../../../src/utils' +import { ConversationSummaryMemory, ConversationSummaryMemoryInput } from 'langchain/memory' +import { BaseLanguageModel } from 'langchain/base_language' + +class ConversationSummaryMemory_Memory implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Conversation Summary Memory' + this.name = 'conversationSummaryMemory' + this.version = 1.0 + this.type = 'ConversationSummaryMemory' + this.icon = 'memory.svg' + this.category = 'Memory' + this.description = 'Summarizes the conversation and stores the current summary in memory' + this.baseClasses = [this.type, ...getBaseClasses(ConversationSummaryMemory)] + this.inputs = [ + { + label: 'Chat Model', + name: 'model', + type: 'BaseChatModel' + }, + { + label: 'Memory Key', + name: 'memoryKey', + type: 'string', + default: 'chat_history' + }, + { + label: 'Input Key', + name: 'inputKey', + type: 'string', + default: 'input' + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const model = nodeData.inputs?.model as BaseLanguageModel + const memoryKey = nodeData.inputs?.memoryKey as string + const inputKey = nodeData.inputs?.inputKey as string + + const obj: ConversationSummaryMemoryInput = { + llm: model, + returnMessages: true, + memoryKey, + inputKey + } + + return new ConversationSummaryMemory(obj) + } +} + +module.exports = { nodeClass: ConversationSummaryMemory_Memory } diff --git a/packages/components/nodes/memory/ConversationSummaryMemory/memory.svg b/packages/components/nodes/memory/ConversationSummaryMemory/memory.svg new file mode 100644 index 000000000..ca8e17da1 --- /dev/null +++ b/packages/components/nodes/memory/ConversationSummaryMemory/memory.svg @@ -0,0 +1,8 @@ + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/memory/DynamoDb/DynamoDb.ts 
b/packages/components/nodes/memory/DynamoDb/DynamoDb.ts new file mode 100644 index 000000000..68b09b7b2 --- /dev/null +++ b/packages/components/nodes/memory/DynamoDb/DynamoDb.ts @@ -0,0 +1,133 @@ +import { ICommonObject, INode, INodeData, INodeParams, getBaseClasses, getCredentialData, getCredentialParam } from '../../../src' +import { DynamoDBChatMessageHistory } from 'langchain/stores/message/dynamodb' +import { BufferMemory, BufferMemoryInput } from 'langchain/memory' + +class DynamoDb_Memory implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'DynamoDB Chat Memory' + this.name = 'DynamoDBChatMemory' + this.version = 1.0 + this.type = 'DynamoDBChatMemory' + this.icon = 'dynamodb.svg' + this.category = 'Memory' + this.description = 'Stores the conversation in a DynamoDB table' + this.baseClasses = [this.type, ...getBaseClasses(BufferMemory)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['dynamodbMemoryApi'] + } + this.inputs = [ + { + label: 'Table Name', + name: 'tableName', + type: 'string' + }, + { + label: 'Partition Key', + name: 'partitionKey', + type: 'string' + }, + { + label: 'Region', + name: 'region', + type: 'string', + description: 'The AWS region in which the table is located', + placeholder: 'us-east-1' + }, + { + label: 'Session ID', + name: 'sessionId', + type: 'string', + description: 'If not specified, the first CHAT_MESSAGE_ID will be used as sessionId', + default: '', + additionalParams: true, + optional: true + }, + { + label: 'Memory Key', + name: 'memoryKey', + type: 'string', + default: 'chat_history', + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + return initalizeDynamoDB(nodeData, options) + } + + async
clearSessionMemory(nodeData: INodeData, options: ICommonObject): Promise { + const dynamodbMemory = await initalizeDynamoDB(nodeData, options) + const sessionId = nodeData.inputs?.sessionId as string + const chatId = options?.chatId as string + options.logger.info(`Clearing DynamoDb memory session ${sessionId ? sessionId : chatId}`) + await dynamodbMemory.clear() + options.logger.info(`Successfully cleared DynamoDb memory session ${sessionId ? sessionId : chatId}`) + } +} + +const initalizeDynamoDB = async (nodeData: INodeData, options: ICommonObject): Promise => { + const tableName = nodeData.inputs?.tableName as string + const partitionKey = nodeData.inputs?.partitionKey as string + const sessionId = nodeData.inputs?.sessionId as string + const region = nodeData.inputs?.region as string + const memoryKey = nodeData.inputs?.memoryKey as string + const chatId = options.chatId + + let isSessionIdUsingChatMessageId = false + if (!sessionId && chatId) isSessionIdUsingChatMessageId = true + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const accessKeyId = getCredentialParam('accessKey', credentialData, nodeData) + const secretAccessKey = getCredentialParam('secretAccessKey', credentialData, nodeData) + + const dynamoDb = new DynamoDBChatMessageHistory({ + tableName, + partitionKey, + sessionId: sessionId ? sessionId : chatId, + config: { + region, + credentials: { + accessKeyId, + secretAccessKey + } + } + }) + + const memory = new BufferMemoryExtended({ + memoryKey, + chatHistory: dynamoDb, + returnMessages: true, + isSessionIdUsingChatMessageId + }) + return memory +} + +interface BufferMemoryExtendedInput { + isSessionIdUsingChatMessageId: boolean +} + +class BufferMemoryExtended extends BufferMemory { + isSessionIdUsingChatMessageId? 
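The memory nodes above share a session-id fallback: when no explicit sessionId is configured, the current chatId is used instead, and a flag records that fact. A sketch of that logic (the helper name is illustrative, not from the PR):

```typescript
// Sketch of the sessionId-vs-chatId fallback used by the memory nodes.
interface SessionResolution {
    sessionId: string
    isSessionIdUsingChatMessageId: boolean
}

function resolveSessionId(sessionId: string | undefined, chatId: string): SessionResolution {
    if (!sessionId && chatId) {
        // No sessionId given: fall back to the chat message id and flag it.
        return { sessionId: chatId, isSessionIdUsingChatMessageId: true }
    }
    return { sessionId: sessionId ?? '', isSessionIdUsingChatMessageId: false }
}
```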
= false + + constructor(fields: BufferMemoryInput & Partial) { + super(fields) + this.isSessionIdUsingChatMessageId = fields.isSessionIdUsingChatMessageId + } +} + +module.exports = { nodeClass: DynamoDb_Memory } diff --git a/packages/components/nodes/memory/DynamoDb/dynamodb.svg b/packages/components/nodes/memory/DynamoDb/dynamodb.svg new file mode 100644 index 000000000..f2798350a --- /dev/null +++ b/packages/components/nodes/memory/DynamoDb/dynamodb.svg @@ -0,0 +1,18 @@ + + + + Icon-Architecture/16/Arch_Amazon-DynamoDB_16 + Created with Sketch. + + + + + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/memory/MotorheadMemory/MotorheadMemory.ts b/packages/components/nodes/memory/MotorheadMemory/MotorheadMemory.ts new file mode 100644 index 000000000..0ec2f42ad --- /dev/null +++ b/packages/components/nodes/memory/MotorheadMemory/MotorheadMemory.ts @@ -0,0 +1,149 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { ICommonObject } from '../../../src' +import { MotorheadMemory, MotorheadMemoryInput } from 'langchain/memory' +import fetch from 'node-fetch' + +class MotorMemory_Memory implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Motorhead Memory' + this.name = 'motorheadMemory' + this.version = 1.0 + this.type = 'MotorheadMemory' + this.icon = 'motorhead.png' + this.category = 'Memory' + this.description = 'Use Motorhead Memory to store chat conversations' + this.baseClasses = [this.type, ...getBaseClasses(MotorheadMemory)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + optional: true, + description: 'Only needed when using hosted solution - 
https://getmetal.io', + credentialNames: ['motorheadMemoryApi'] + } + this.inputs = [ + { + label: 'Base URL', + name: 'baseURL', + type: 'string', + optional: true, + description: 'To use the online version, leave the URL blank. More details at https://getmetal.io.' + }, + { + label: 'Session Id', + name: 'sessionId', + type: 'string', + description: 'If not specified, the first CHAT_MESSAGE_ID will be used as sessionId', + default: '', + additionalParams: true, + optional: true + }, + { + label: 'Memory Key', + name: 'memoryKey', + type: 'string', + default: 'chat_history', + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + return initalizeMotorhead(nodeData, options) + } + + async clearSessionMemory(nodeData: INodeData, options: ICommonObject): Promise { + const motorhead = await initalizeMotorhead(nodeData, options) + const sessionId = nodeData.inputs?.sessionId as string + const chatId = options?.chatId as string + options.logger.info(`Clearing Motorhead memory session ${sessionId ? sessionId : chatId}`) + await motorhead.clear() + options.logger.info(`Successfully cleared Motorhead memory session ${sessionId ? sessionId : chatId}`) + } +} + +const initalizeMotorhead = async (nodeData: INodeData, options: ICommonObject): Promise => { + const memoryKey = nodeData.inputs?.memoryKey as string + const baseURL = nodeData.inputs?.baseURL as string + const sessionId = nodeData.inputs?.sessionId as string + const chatId = options?.chatId as string + + let isSessionIdUsingChatMessageId = false + if (!sessionId && chatId) isSessionIdUsingChatMessageId = true + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const apiKey = getCredentialParam('apiKey', credentialData, nodeData) + const clientId = getCredentialParam('clientId', credentialData, nodeData) + + let obj: MotorheadMemoryInput & Partial = { + returnMessages: true, + sessionId: sessionId ? 
sessionId : chatId, + memoryKey + } + + if (baseURL) { + obj = { + ...obj, + url: baseURL + } + } else { + obj = { + ...obj, + apiKey, + clientId + } + } + + if (isSessionIdUsingChatMessageId) obj.isSessionIdUsingChatMessageId = true + + const motorheadMemory = new MotorheadMemoryExtended(obj) + + // Get messages from sessionId + await motorheadMemory.init() + + return motorheadMemory +} + +interface MotorheadMemoryExtendedInput { + isSessionIdUsingChatMessageId: boolean +} + +class MotorheadMemoryExtended extends MotorheadMemory { + isSessionIdUsingChatMessageId? = false + + constructor(fields: MotorheadMemoryInput & Partial) { + super(fields) + this.isSessionIdUsingChatMessageId = fields.isSessionIdUsingChatMessageId + } + + async clear(): Promise { + try { + await this.caller.call(fetch, `${this.url}/sessions/${this.sessionId}/memory`, { + //@ts-ignore + signal: this.timeout ? AbortSignal.timeout(this.timeout) : undefined, + headers: this._getHeaders() as ICommonObject, + method: 'DELETE' + }) + } catch (error) { + console.error('Error deleting session: ', error) + } + + // Clear the superclass's chat history + await this.chatHistory.clear() + await super.clear() + } +} + +module.exports = { nodeClass: MotorMemory_Memory } diff --git a/packages/components/nodes/memory/MotorheadMemory/motorhead.png b/packages/components/nodes/memory/MotorheadMemory/motorhead.png new file mode 100644 index 000000000..e1dfbde08 Binary files /dev/null and b/packages/components/nodes/memory/MotorheadMemory/motorhead.png differ diff --git a/packages/components/nodes/memory/RedisBackedChatMemory/RedisBackedChatMemory.ts b/packages/components/nodes/memory/RedisBackedChatMemory/RedisBackedChatMemory.ts new file mode 100644 index 000000000..f10f25ce8 --- /dev/null +++ b/packages/components/nodes/memory/RedisBackedChatMemory/RedisBackedChatMemory.ts @@ -0,0 +1,123 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' 
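The Motorhead initializer above branches its config: a self-hosted base URL takes priority, otherwise the hosted service's apiKey/clientId are used. A minimal sketch of that branching (field names loosely follow `MotorheadMemoryInput`; the helper itself is illustrative):

```typescript
// Sketch of the Motorhead configuration branching: self-hosted URL wins,
// hosted credentials are the fallback.
interface MotorheadConfig {
    returnMessages: boolean
    memoryKey: string
    url?: string
    apiKey?: string
    clientId?: string
}

function buildMotorheadConfig(baseURL: string, apiKey: string, clientId: string): MotorheadConfig {
    const base: MotorheadConfig = { returnMessages: true, memoryKey: 'chat_history' }
    // Self-hosted instance: only the URL is needed. Hosted: credentials instead.
    return baseURL ? { ...base, url: baseURL } : { ...base, apiKey, clientId }
}
```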
+import { ICommonObject } from '../../../src' +import { BufferMemory, BufferMemoryInput } from 'langchain/memory' +import { RedisChatMessageHistory, RedisChatMessageHistoryInput } from 'langchain/stores/message/redis' +import { createClient } from 'redis' + +class RedisBackedChatMemory_Memory implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Redis-Backed Chat Memory' + this.name = 'RedisBackedChatMemory' + this.version = 1.0 + this.type = 'RedisBackedChatMemory' + this.icon = 'redis.svg' + this.category = 'Memory' + this.description = 'Stores the conversation in a Redis server' + this.baseClasses = [this.type, ...getBaseClasses(BufferMemory)] + this.inputs = [ + { + label: 'Base URL', + name: 'baseURL', + type: 'string', + default: 'redis://localhost:6379' + }, + { + label: 'Session Id', + name: 'sessionId', + type: 'string', + description: 'If not specified, the first CHAT_MESSAGE_ID will be used as sessionId', + default: '', + additionalParams: true, + optional: true + }, + { + label: 'Session Timeout', + name: 'sessionTTL', + type: 'number', + description: 'Omit this parameter to make sessions never expire', + additionalParams: true, + optional: true + }, + { + label: 'Memory Key', + name: 'memoryKey', + type: 'string', + default: 'chat_history', + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + return initalizeRedis(nodeData, options) + } + + async clearSessionMemory(nodeData: INodeData, options: ICommonObject): Promise { + const redis = initalizeRedis(nodeData, options) + const sessionId = nodeData.inputs?.sessionId as string + const chatId = options?.chatId as string + options.logger.info(`Clearing Redis memory session ${sessionId ?
sessionId : chatId}`) + await redis.clear() + options.logger.info(`Successfully cleared Redis memory session ${sessionId ? sessionId : chatId}`) + } +} + +const initalizeRedis = (nodeData: INodeData, options: ICommonObject): BufferMemory => { + const baseURL = nodeData.inputs?.baseURL as string + const sessionId = nodeData.inputs?.sessionId as string + const sessionTTL = nodeData.inputs?.sessionTTL as number + const memoryKey = nodeData.inputs?.memoryKey as string + const chatId = options?.chatId as string + + let isSessionIdUsingChatMessageId = false + if (!sessionId && chatId) isSessionIdUsingChatMessageId = true + + const redisClient = createClient({ url: baseURL }) + let obj: RedisChatMessageHistoryInput = { + sessionId: sessionId ? sessionId : chatId, + client: redisClient + } + + if (sessionTTL) { + obj = { + ...obj, + sessionTTL + } + } + + const redisChatMessageHistory = new RedisChatMessageHistory(obj) + + const memory = new BufferMemoryExtended({ + memoryKey, + chatHistory: redisChatMessageHistory, + returnMessages: true, + isSessionIdUsingChatMessageId + }) + return memory +} + +interface BufferMemoryExtendedInput { + isSessionIdUsingChatMessageId: boolean +} + +class BufferMemoryExtended extends BufferMemory { + isSessionIdUsingChatMessageId? 
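The Redis initializer above only adds `sessionTTL` to the message-history config when it is set, so sessions without a TTL never expire. A sketch of that optional-field pattern (types are simplified stand-ins for `RedisChatMessageHistoryInput`):

```typescript
// Sketch of conditionally spreading an optional field into a config object,
// keeping the key entirely absent when the value is not provided.
interface HistoryConfig {
    sessionId: string
    sessionTTL?: number
}

function buildHistoryConfig(sessionId: string, sessionTTL?: number): HistoryConfig {
    let obj: HistoryConfig = { sessionId }
    // Spread in the TTL only when present; omitted means "never expire".
    if (sessionTTL) obj = { ...obj, sessionTTL }
    return obj
}
```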
= false + + constructor(fields: BufferMemoryInput & Partial) { + super(fields) + this.isSessionIdUsingChatMessageId = fields.isSessionIdUsingChatMessageId + } +} + +module.exports = { nodeClass: RedisBackedChatMemory_Memory } diff --git a/packages/components/nodes/memory/RedisBackedChatMemory/redis.svg b/packages/components/nodes/memory/RedisBackedChatMemory/redis.svg new file mode 100644 index 000000000..903590697 --- /dev/null +++ b/packages/components/nodes/memory/RedisBackedChatMemory/redis.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/components/nodes/memory/ZepMemory/ZepMemory.ts b/packages/components/nodes/memory/ZepMemory/ZepMemory.ts new file mode 100644 index 000000000..0c05563a3 --- /dev/null +++ b/packages/components/nodes/memory/ZepMemory/ZepMemory.ts @@ -0,0 +1,196 @@ +import { SystemMessage } from 'langchain/schema' +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { ZepMemory, ZepMemoryInput } from 'langchain/memory/zep' +import { ICommonObject } from '../../../src' + +class ZepMemory_Memory implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Zep Memory' + this.name = 'ZepMemory' + this.version = 1.0 + this.type = 'ZepMemory' + this.icon = 'zep.png' + this.category = 'Memory' + this.description = 'Summarizes the conversation and stores the memory in zep server' + this.baseClasses = [this.type, ...getBaseClasses(ZepMemory)] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + optional: true, + description: 'Configure JWT authentication on your Zep instance (Optional)', + credentialNames: ['zepMemoryApi'] + } + this.inputs = [ + { + label: 'Base URL', + name: 'baseURL', 
+ type: 'string', + default: 'http://127.0.0.1:8000' + }, + { + label: 'Auto Summary', + name: 'autoSummary', + type: 'boolean', + default: true + }, + { + label: 'Session Id', + name: 'sessionId', + type: 'string', + description: 'If not specified, the first CHAT_MESSAGE_ID will be used as sessionId', + default: '', + additionalParams: true, + optional: true + }, + { + label: 'Size', + name: 'k', + type: 'number', + default: '10', + description: 'Window of size k to surface the last k back-and-forths to use as memory.' + }, + { + label: 'Auto Summary Template', + name: 'autoSummaryTemplate', + type: 'string', + default: 'This is the summary of the following conversation:\n{summary}', + additionalParams: true + }, + { + label: 'AI Prefix', + name: 'aiPrefix', + type: 'string', + default: 'ai', + additionalParams: true + }, + { + label: 'Human Prefix', + name: 'humanPrefix', + type: 'string', + default: 'human', + additionalParams: true + }, + { + label: 'Memory Key', + name: 'memoryKey', + type: 'string', + default: 'chat_history', + additionalParams: true + }, + { + label: 'Input Key', + name: 'inputKey', + type: 'string', + default: 'input', + additionalParams: true + }, + { + label: 'Output Key', + name: 'outputKey', + type: 'string', + default: 'text', + additionalParams: true + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const autoSummaryTemplate = nodeData.inputs?.autoSummaryTemplate as string + const autoSummary = nodeData.inputs?.autoSummary as boolean + + const k = nodeData.inputs?.k as string + + let zep = await initalizeZep(nodeData, options) + + // hack to support summary + let tmpFunc = zep.loadMemoryVariables + zep.loadMemoryVariables = async (values) => { + let data = await tmpFunc.bind(zep, values)() + if (autoSummary && zep.returnMessages && data[zep.memoryKey] && data[zep.memoryKey].length) { + const zepClient = await zep.zepClientPromise + const memory = await 
zepClient.memory.getMemory(zep.sessionId, k ? parseInt(k, 10) : 10) + if (memory?.summary) { + let summary = autoSummaryTemplate.replace(/{summary}/g, memory.summary.content) + // eslint-disable-next-line no-console + console.log('[ZepMemory] auto summary:', summary) + data[zep.memoryKey].unshift(new SystemMessage(summary)) + } + } + // for langchain zep memory compatibility, otherwise we will get "Missing value for input variable chat_history" + if (data instanceof Array) { + data = { + [zep.memoryKey]: data + } + } + return data + } + return zep + } + + async clearSessionMemory(nodeData: INodeData, options: ICommonObject): Promise { + const zep = await initalizeZep(nodeData, options) + const sessionId = nodeData.inputs?.sessionId as string + const chatId = options?.chatId as string + options.logger.info(`Clearing Zep memory session ${sessionId ? sessionId : chatId}`) + await zep.clear() + options.logger.info(`Successfully cleared Zep memory session ${sessionId ? sessionId : chatId}`) + } +} + +const initalizeZep = async (nodeData: INodeData, options: ICommonObject): Promise => { + const baseURL = nodeData.inputs?.baseURL as string + const aiPrefix = nodeData.inputs?.aiPrefix as string + const humanPrefix = nodeData.inputs?.humanPrefix as string + const memoryKey = nodeData.inputs?.memoryKey as string + const inputKey = nodeData.inputs?.inputKey as string + const sessionId = nodeData.inputs?.sessionId as string + const chatId = options?.chatId as string + + let isSessionIdUsingChatMessageId = false + if (!sessionId && chatId) isSessionIdUsingChatMessageId = true + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const apiKey = getCredentialParam('apiKey', credentialData, nodeData) + + const obj: ZepMemoryInput & Partial = { + baseURL, + sessionId: sessionId ?
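The Zep wrapper above does two small transformations worth isolating: it renders the auto-summary template by substituting `{summary}`, and it normalizes an array result into the keyed object LangChain's prompt variables expect. A sketch (helper names are illustrative):

```typescript
// Sketch of the Zep wrapper's summary rendering and result normalization.
type Message = { role: string; content: string }

function renderSummary(template: string, summary: string): string {
    // Mirrors autoSummaryTemplate.replace(/{summary}/g, ...) in the node.
    return template.replace(/{summary}/g, summary)
}

function normalizeMemoryData(
    data: Message[] | Record<string, Message[]>,
    memoryKey: string
): Record<string, Message[]> {
    // A bare array would trigger "Missing value for input variable chat_history",
    // so wrap it under the memory key.
    return Array.isArray(data) ? { [memoryKey]: data } : data
}
```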
sessionId : chatId, + aiPrefix, + humanPrefix, + returnMessages: true, + memoryKey, + inputKey + } + if (apiKey) obj.apiKey = apiKey + if (isSessionIdUsingChatMessageId) obj.isSessionIdUsingChatMessageId = true + + return new ZepMemoryExtended(obj) +} + +interface ZepMemoryExtendedInput { + isSessionIdUsingChatMessageId: boolean +} + +class ZepMemoryExtended extends ZepMemory { + isSessionIdUsingChatMessageId? = false + + constructor(fields: ZepMemoryInput & Partial) { + super(fields) + this.isSessionIdUsingChatMessageId = fields.isSessionIdUsingChatMessageId + } +} + +module.exports = { nodeClass: ZepMemory_Memory } diff --git a/packages/components/nodes/memory/ZepMemory/zep.png b/packages/components/nodes/memory/ZepMemory/zep.png new file mode 100644 index 000000000..2fdb23827 Binary files /dev/null and b/packages/components/nodes/memory/ZepMemory/zep.png differ diff --git a/packages/components/nodes/prompts/ChatPromptTemplate/ChatPromptTemplate.ts b/packages/components/nodes/prompts/ChatPromptTemplate/ChatPromptTemplate.ts index c3c4d77f6..c9ec751d8 100644 --- a/packages/components/nodes/prompts/ChatPromptTemplate/ChatPromptTemplate.ts +++ b/packages/components/nodes/prompts/ChatPromptTemplate/ChatPromptTemplate.ts @@ -5,6 +5,7 @@ import { ChatPromptTemplate, SystemMessagePromptTemplate, HumanMessagePromptTemp class ChatPromptTemplate_Prompts implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class ChatPromptTemplate_Prompts implements INode { constructor() { this.label = 'Chat Prompt Template' this.name = 'chatPromptTemplate' + this.version = 1.0 this.type = 'ChatPromptTemplate' this.icon = 'prompt.svg' this.category = 'Prompts' @@ -38,12 +40,7 @@ class ChatPromptTemplate_Prompts implements INode { { label: 'Format Prompt Values', name: 'promptValues', - type: 'string', - rows: 4, - placeholder: `{ - "input_language": "English", - "output_language": "French" -}`, + type: 'json', 
optional: true, acceptVariable: true, list: true @@ -63,7 +60,7 @@ class ChatPromptTemplate_Prompts implements INode { let promptValues: ICommonObject = {} if (promptValuesStr) { - promptValues = JSON.parse(promptValuesStr.replace(/\s/g, '')) + promptValues = JSON.parse(promptValuesStr) } // @ts-ignore prompt.promptValues = promptValues diff --git a/packages/components/nodes/prompts/FewShotPromptTemplate/FewShotPromptTemplate.ts b/packages/components/nodes/prompts/FewShotPromptTemplate/FewShotPromptTemplate.ts index a42a1d088..ed1d3cb21 100644 --- a/packages/components/nodes/prompts/FewShotPromptTemplate/FewShotPromptTemplate.ts +++ b/packages/components/nodes/prompts/FewShotPromptTemplate/FewShotPromptTemplate.ts @@ -7,6 +7,7 @@ import { TemplateFormat } from 'langchain/dist/prompts/template' class FewShotPromptTemplate_Prompts implements INode { label: string name: string + version: number description: string type: string icon: string @@ -17,6 +18,7 @@ class FewShotPromptTemplate_Prompts implements INode { constructor() { this.label = 'Few Shot Prompt Template' this.name = 'fewShotPromptTemplate' + this.version = 1.0 this.type = 'FewShotPromptTemplate' this.icon = 'prompt.svg' this.category = 'Prompts' @@ -86,7 +88,7 @@ class FewShotPromptTemplate_Prompts implements INode { const examplePrompt = nodeData.inputs?.examplePrompt as PromptTemplate const inputVariables = getInputVariables(suffix) - const examples: Example[] = JSON.parse(examplesStr.replace(/\s/g, '')) + const examples: Example[] = JSON.parse(examplesStr) try { const obj: FewShotPromptTemplateInput = { diff --git a/packages/components/nodes/prompts/PromptTemplate/PromptTemplate.ts b/packages/components/nodes/prompts/PromptTemplate/PromptTemplate.ts index f976d64c6..a401e2823 100644 --- a/packages/components/nodes/prompts/PromptTemplate/PromptTemplate.ts +++ b/packages/components/nodes/prompts/PromptTemplate/PromptTemplate.ts @@ -5,6 +5,7 @@ import { PromptTemplateInput } from 'langchain/prompts' class 
PromptTemplate_Prompts implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class PromptTemplate_Prompts implements INode { constructor() { this.label = 'Prompt Template' this.name = 'promptTemplate' + this.version = 1.0 this.type = 'PromptTemplate' this.icon = 'prompt.svg' this.category = 'Prompts' @@ -31,12 +33,7 @@ class PromptTemplate_Prompts implements INode { { label: 'Format Prompt Values', name: 'promptValues', - type: 'string', - rows: 4, - placeholder: `{ - "input_language": "English", - "output_language": "French" -}`, + type: 'json', optional: true, acceptVariable: true, list: true @@ -50,7 +47,7 @@ class PromptTemplate_Prompts implements INode { let promptValues: ICommonObject = {} if (promptValuesStr) { - promptValues = JSON.parse(promptValuesStr.replace(/\s/g, '')) + promptValues = JSON.parse(promptValuesStr) } const inputVariables = getInputVariables(template) diff --git a/packages/components/nodes/retrievers/HydeRetriever/HydeRetriever.ts b/packages/components/nodes/retrievers/HydeRetriever/HydeRetriever.ts new file mode 100644 index 000000000..2baf677eb --- /dev/null +++ b/packages/components/nodes/retrievers/HydeRetriever/HydeRetriever.ts @@ -0,0 +1,123 @@ +import { VectorStore } from 'langchain/vectorstores/base' +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { HydeRetriever, HydeRetrieverOptions, PromptKey } from 'langchain/retrievers/hyde' +import { BaseLanguageModel } from 'langchain/base_language' +import { PromptTemplate } from 'langchain/prompts' + +class HydeRetriever_Retrievers implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Hyde Retriever' + this.name = 'HydeRetriever' + this.version = 1.0 + this.type = 'HydeRetriever' + this.icon = 'hyderetriever.svg' + 
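The prompt-template changes above drop the `.replace(/\s/g, '')` that ran before `JSON.parse`. That strip was a bug: removing every whitespace character also destroys spaces inside string values, silently corrupting user-supplied prompt values. A small demonstration:

```typescript
// Why stripping whitespace before JSON.parse corrupts values:
const raw = '{ "input_language": "English", "output_language": "French Canadian" }'

// Old behavior: whitespace inside string values is lost.
const mangled = JSON.parse(raw.replace(/\s/g, '')) // output_language: 'FrenchCanadian'

// New behavior: JSON.parse already ignores insignificant whitespace.
const parsed = JSON.parse(raw) // output_language: 'French Canadian'
```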
this.category = 'Retrievers' + this.description = 'Use HyDE retriever to retrieve from a vector store' + this.baseClasses = [this.type, 'BaseRetriever'] + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'Vector Store', + name: 'vectorStore', + type: 'VectorStore' + }, + { + label: 'Prompt Key', + name: 'promptKey', + type: 'options', + options: [ + { + label: 'websearch', + name: 'websearch' + }, + { + label: 'scifact', + name: 'scifact' + }, + { + label: 'arguana', + name: 'arguana' + }, + { + label: 'trec-covid', + name: 'trec-covid' + }, + { + label: 'fiqa', + name: 'fiqa' + }, + { + label: 'dbpedia-entity', + name: 'dbpedia-entity' + }, + { + label: 'trec-news', + name: 'trec-news' + }, + { + label: 'mr-tydi', + name: 'mr-tydi' + } + ], + default: 'websearch' + }, + { + label: 'Custom Prompt', + name: 'customPrompt', + description: 'If custom prompt is used, this will override Prompt Key', + placeholder: 'Please write a passage to answer the question\nQuestion: {question}\nPassage:', + type: 'string', + rows: 4, + additionalParams: true, + optional: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + default: 4, + additionalParams: true, + optional: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const llm = nodeData.inputs?.model as BaseLanguageModel + const vectorStore = nodeData.inputs?.vectorStore as VectorStore + const promptKey = nodeData.inputs?.promptKey as PromptKey + const customPrompt = nodeData.inputs?.customPrompt as string + const topK = nodeData.inputs?.topK as string + const k = topK ? 
parseInt(topK, 10) : 4 + + const obj: HydeRetrieverOptions = { + llm, + vectorStore, + k + } + + if (customPrompt) obj.promptTemplate = PromptTemplate.fromTemplate(customPrompt) + else if (promptKey) obj.promptTemplate = promptKey + + const retriever = new HydeRetriever(obj) + return retriever + } +} + +module.exports = { nodeClass: HydeRetriever_Retrievers } diff --git a/packages/components/nodes/retrievers/HydeRetriever/hyderetriever.svg b/packages/components/nodes/retrievers/HydeRetriever/hyderetriever.svg new file mode 100644 index 000000000..da3a9f207 --- /dev/null +++ b/packages/components/nodes/retrievers/HydeRetriever/hyderetriever.svg @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/retrievers/PromptRetriever/PromptRetriever.ts b/packages/components/nodes/retrievers/PromptRetriever/PromptRetriever.ts new file mode 100644 index 000000000..7ffaa64fa --- /dev/null +++ b/packages/components/nodes/retrievers/PromptRetriever/PromptRetriever.ts @@ -0,0 +1,64 @@ +import { INode, INodeData, INodeParams, PromptRetriever, PromptRetrieverInput } from '../../../src/Interface' + +class PromptRetriever_Retrievers implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Prompt Retriever' + this.name = 'promptRetriever' + this.version = 1.0 + this.type = 'PromptRetriever' + this.icon = 'promptretriever.svg' + this.category = 'Retrievers' + this.description = 'Store prompt template with name & description to be later queried by MultiPromptChain' + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Prompt Name', + name: 'name', + type: 'string', + placeholder: 'physics-qa' + }, + { + label: 'Prompt Description', + name: 'description', + type: 'string', + rows: 3, + description: 'Description of what the prompt does and when it should be used', + 
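The Hyde retriever above parses its optional Top K input with an explicit fallback, since node inputs arrive as strings. A plain ternary is the safe shape here: something like `parseInt(x, 10) ?? fallback` would never fall back, because `parseInt` returns `NaN` (not `null`/`undefined`) for invalid input. A sketch (the helper name is illustrative):

```typescript
// Sketch of optional numeric-input handling with a default.
function parseTopK(topK: string | undefined, fallback = 4): number {
    // Empty string and undefined are both falsy, so both take the fallback.
    return topK ? parseInt(topK, 10) : fallback
}
```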
placeholder: 'Good for answering questions about physics' + }, + { + label: 'Prompt System Message', + name: 'systemMessage', + type: 'string', + rows: 4, + placeholder: `You are a very smart physics professor. You are great at answering questions about physics in a concise and easy to understand manner. When you don't know the answer to a question you admit that you don't know.` + } + ] + } + + async init(nodeData: INodeData): Promise { + const name = nodeData.inputs?.name as string + const description = nodeData.inputs?.description as string + const systemMessage = nodeData.inputs?.systemMessage as string + + const obj = { + name, + description, + systemMessage + } as PromptRetrieverInput + + const retriever = new PromptRetriever(obj) + return retriever + } +} + +module.exports = { nodeClass: PromptRetriever_Retrievers } diff --git a/packages/components/nodes/retrievers/PromptRetriever/promptretriever.svg b/packages/components/nodes/retrievers/PromptRetriever/promptretriever.svg new file mode 100644 index 000000000..db48e8a51 --- /dev/null +++ b/packages/components/nodes/retrievers/PromptRetriever/promptretriever.svg @@ -0,0 +1,8 @@ + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/retrievers/VectorStoreRetriever/VectorStoreRetriever.ts b/packages/components/nodes/retrievers/VectorStoreRetriever/VectorStoreRetriever.ts new file mode 100644 index 000000000..41f665719 --- /dev/null +++ b/packages/components/nodes/retrievers/VectorStoreRetriever/VectorStoreRetriever.ts @@ -0,0 +1,63 @@ +import { VectorStore } from 'langchain/vectorstores/base' +import { INode, INodeData, INodeParams, VectorStoreRetriever, VectorStoreRetrieverInput } from '../../../src/Interface' + +class VectorStoreRetriever_Retrievers implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Vector Store 
Retriever' + this.name = 'vectorStoreRetriever' + this.version = 1.0 + this.type = 'VectorStoreRetriever' + this.icon = 'vectorretriever.svg' + this.category = 'Retrievers' + this.description = 'Store vector store as retriever to be later queried by MultiRetrievalQAChain' + this.baseClasses = [this.type] + this.inputs = [ + { + label: 'Vector Store', + name: 'vectorStore', + type: 'VectorStore' + }, + { + label: 'Retriever Name', + name: 'name', + type: 'string', + placeholder: 'netflix movies' + }, + { + label: 'Retriever Description', + name: 'description', + type: 'string', + rows: 3, + description: 'Description of when to use the vector store retriever', + placeholder: 'Good for answering questions about netflix movies' + } + ] + } + + async init(nodeData: INodeData): Promise { + const name = nodeData.inputs?.name as string + const description = nodeData.inputs?.description as string + const vectorStore = nodeData.inputs?.vectorStore as VectorStore + + const obj = { + name, + description, + vectorStore + } as VectorStoreRetrieverInput + + const retriever = new VectorStoreRetriever(obj) + return retriever + } +} + +module.exports = { nodeClass: VectorStoreRetriever_Retrievers } diff --git a/packages/components/nodes/retrievers/VectorStoreRetriever/vectorretriever.svg b/packages/components/nodes/retrievers/VectorStoreRetriever/vectorretriever.svg new file mode 100644 index 000000000..da3a9f207 --- /dev/null +++ b/packages/components/nodes/retrievers/VectorStoreRetriever/vectorretriever.svg @@ -0,0 +1,9 @@ + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/textsplitters/CharacterTextSplitter/CharacterTextSplitter.ts b/packages/components/nodes/textsplitters/CharacterTextSplitter/CharacterTextSplitter.ts index 90387e8b6..f9427d10a 100644 --- a/packages/components/nodes/textsplitters/CharacterTextSplitter/CharacterTextSplitter.ts +++ b/packages/components/nodes/textsplitters/CharacterTextSplitter/CharacterTextSplitter.ts @@ -5,6 
+5,7 @@ import { CharacterTextSplitter, CharacterTextSplitterParams } from 'langchain/te class CharacterTextSplitter_TextSplitters implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class CharacterTextSplitter_TextSplitters implements INode { constructor() { this.label = 'Character Text Splitter' this.name = 'characterTextSplitter' + this.version = 1.0 this.type = 'CharacterTextSplitter' this.icon = 'textsplitter.svg' this.category = 'Text Splitters' diff --git a/packages/components/nodes/textsplitters/CodeTextSplitter/CodeTextSplitter.ts b/packages/components/nodes/textsplitters/CodeTextSplitter/CodeTextSplitter.ts new file mode 100644 index 000000000..ed643f330 --- /dev/null +++ b/packages/components/nodes/textsplitters/CodeTextSplitter/CodeTextSplitter.ts @@ -0,0 +1,130 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { + RecursiveCharacterTextSplitter, + RecursiveCharacterTextSplitterParams, + SupportedTextSplitterLanguage +} from 'langchain/text_splitter' + +class CodeTextSplitter_TextSplitters implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + constructor() { + this.label = 'Code Text Splitter' + this.name = 'codeTextSplitter' + this.version = 1.0 + this.type = 'CodeTextSplitter' + this.icon = 'codeTextSplitter.svg' + this.category = 'Text Splitters' + this.description = `Split documents based on language-specific syntax` + this.baseClasses = [this.type, ...getBaseClasses(RecursiveCharacterTextSplitter)] + this.inputs = [ + { + label: 'Language', + name: 'language', + type: 'options', + options: [ + { + label: 'cpp', + name: 'cpp' + }, + { + label: 'go', + name: 'go' + }, + { + label: 'java', + name: 'java' + }, + { + label: 'js', + name: 'js' + }, + { + label: 
'php', + name: 'php' + }, + { + label: 'proto', + name: 'proto' + }, + { + label: 'python', + name: 'python' + }, + { + label: 'rst', + name: 'rst' + }, + { + label: 'ruby', + name: 'ruby' + }, + { + label: 'rust', + name: 'rust' + }, + { + label: 'scala', + name: 'scala' + }, + { + label: 'swift', + name: 'swift' + }, + { + label: 'markdown', + name: 'markdown' + }, + { + label: 'latex', + name: 'latex' + }, + { + label: 'html', + name: 'html' + }, + { + label: 'sol', + name: 'sol' + } + ] + }, + { + label: 'Chunk Size', + name: 'chunkSize', + type: 'number', + default: 1000, + optional: true + }, + { + label: 'Chunk Overlap', + name: 'chunkOverlap', + type: 'number', + optional: true + } + ] + } + async init(nodeData: INodeData): Promise { + const chunkSize = nodeData.inputs?.chunkSize as string + const chunkOverlap = nodeData.inputs?.chunkOverlap as string + const language = nodeData.inputs?.language as SupportedTextSplitterLanguage + + const obj = {} as RecursiveCharacterTextSplitterParams + + if (chunkSize) obj.chunkSize = parseInt(chunkSize, 10) + if (chunkOverlap) obj.chunkOverlap = parseInt(chunkOverlap, 10) + + const splitter = RecursiveCharacterTextSplitter.fromLanguage(language, obj) + + return splitter + } +} +module.exports = { nodeClass: CodeTextSplitter_TextSplitters } diff --git a/packages/components/nodes/textsplitters/CodeTextSplitter/codeTextSplitter.svg b/packages/components/nodes/textsplitters/CodeTextSplitter/codeTextSplitter.svg new file mode 100644 index 000000000..d3b3d188a --- /dev/null +++ b/packages/components/nodes/textsplitters/CodeTextSplitter/codeTextSplitter.svg @@ -0,0 +1,8 @@ + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/textsplitters/HtmlToMarkdownTextSplitter/HtmlToMarkdownTextSplitter.ts b/packages/components/nodes/textsplitters/HtmlToMarkdownTextSplitter/HtmlToMarkdownTextSplitter.ts new file mode 100644 index 000000000..699764e54 --- /dev/null +++ 
b/packages/components/nodes/textsplitters/HtmlToMarkdownTextSplitter/HtmlToMarkdownTextSplitter.ts @@ -0,0 +1,70 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { MarkdownTextSplitter, MarkdownTextSplitterParams } from 'langchain/text_splitter' +import { NodeHtmlMarkdown } from 'node-html-markdown' + +class HtmlToMarkdownTextSplitter_TextSplitters implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'HtmlToMarkdown Text Splitter' + this.name = 'htmlToMarkdownTextSplitter' + this.version = 1.0 + this.type = 'HtmlToMarkdownTextSplitter' + this.icon = 'htmlToMarkdownTextSplitter.svg' + this.category = 'Text Splitters' + this.description = `Converts HTML to Markdown, then splits your content into documents based on the Markdown headers` + this.baseClasses = [this.type, ...getBaseClasses(HtmlToMarkdownTextSplitter)] + this.inputs = [ + { + label: 'Chunk Size', + name: 'chunkSize', + type: 'number', + default: 1000, + optional: true + }, + { + label: 'Chunk Overlap', + name: 'chunkOverlap', + type: 'number', + optional: true + } + ] + } + + async init(nodeData: INodeData): Promise { + const chunkSize = nodeData.inputs?.chunkSize as string + const chunkOverlap = nodeData.inputs?.chunkOverlap as string + + const obj = {} as MarkdownTextSplitterParams + + if (chunkSize) obj.chunkSize = parseInt(chunkSize, 10) + if (chunkOverlap) obj.chunkOverlap = parseInt(chunkOverlap, 10) + + const splitter = new HtmlToMarkdownTextSplitter(obj) + + return splitter + } +} +class HtmlToMarkdownTextSplitter extends MarkdownTextSplitter implements MarkdownTextSplitterParams { + constructor(fields?: Partial) { + super(fields) + } + splitText(text: string): Promise { + return new Promise((resolve) => { + const markdown =
NodeHtmlMarkdown.translate(text) + super.splitText(markdown).then((result) => { + resolve(result) + }) + }) + } +} +module.exports = { nodeClass: HtmlToMarkdownTextSplitter_TextSplitters } diff --git a/packages/components/nodes/textsplitters/HtmlToMarkdownTextSplitter/htmlToMarkdownTextSplitter.svg b/packages/components/nodes/textsplitters/HtmlToMarkdownTextSplitter/htmlToMarkdownTextSplitter.svg new file mode 100644 index 000000000..f7d45d603 --- /dev/null +++ b/packages/components/nodes/textsplitters/HtmlToMarkdownTextSplitter/htmlToMarkdownTextSplitter.svg @@ -0,0 +1,6 @@ + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/textsplitters/MarkdownTextSplitter/MarkdownTextSplitter.ts b/packages/components/nodes/textsplitters/MarkdownTextSplitter/MarkdownTextSplitter.ts index 02c37d8d5..0a12845ae 100644 --- a/packages/components/nodes/textsplitters/MarkdownTextSplitter/MarkdownTextSplitter.ts +++ b/packages/components/nodes/textsplitters/MarkdownTextSplitter/MarkdownTextSplitter.ts @@ -5,6 +5,7 @@ import { MarkdownTextSplitter, MarkdownTextSplitterParams } from 'langchain/text class MarkdownTextSplitter_TextSplitters implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class MarkdownTextSplitter_TextSplitters implements INode { constructor() { this.label = 'Markdown Text Splitter' this.name = 'markdownTextSplitter' + this.version = 1.0 this.type = 'MarkdownTextSplitter' this.icon = 'markdownTextSplitter.svg' this.category = 'Text Splitters' diff --git a/packages/components/nodes/textsplitters/RecursiveCharacterTextSplitter/RecursiveCharacterTextSplitter.ts b/packages/components/nodes/textsplitters/RecursiveCharacterTextSplitter/RecursiveCharacterTextSplitter.ts index 432b5ca90..dcca70ba2 100644 --- a/packages/components/nodes/textsplitters/RecursiveCharacterTextSplitter/RecursiveCharacterTextSplitter.ts +++ 
b/packages/components/nodes/textsplitters/RecursiveCharacterTextSplitter/RecursiveCharacterTextSplitter.ts @@ -5,6 +5,7 @@ import { RecursiveCharacterTextSplitter, RecursiveCharacterTextSplitterParams } class RecursiveCharacterTextSplitter_TextSplitters implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class RecursiveCharacterTextSplitter_TextSplitters implements INode { constructor() { this.label = 'Recursive Character Text Splitter' this.name = 'recursiveCharacterTextSplitter' + this.version = 1.0 this.type = 'RecursiveCharacterTextSplitter' this.icon = 'textsplitter.svg' this.category = 'Text Splitters' diff --git a/packages/components/nodes/textsplitters/TokenTextSplitter/TokenTextSplitter.ts b/packages/components/nodes/textsplitters/TokenTextSplitter/TokenTextSplitter.ts index 8c8d6abea..0b11eebc9 100644 --- a/packages/components/nodes/textsplitters/TokenTextSplitter/TokenTextSplitter.ts +++ b/packages/components/nodes/textsplitters/TokenTextSplitter/TokenTextSplitter.ts @@ -6,6 +6,7 @@ import { TiktokenEncoding } from '@dqbd/tiktoken' class TokenTextSplitter_TextSplitters implements INode { label: string name: string + version: number description: string type: string icon: string @@ -16,6 +17,7 @@ class TokenTextSplitter_TextSplitters implements INode { constructor() { this.label = 'Token Text Splitter' this.name = 'tokenTextSplitter' + this.version = 1.0 this.type = 'TokenTextSplitter' this.icon = 'tiktoken.svg' this.category = 'Text Splitters' diff --git a/packages/components/nodes/tools/AIPlugin/AIPlugin.ts b/packages/components/nodes/tools/AIPlugin/AIPlugin.ts index ad21f8dbc..e9c0fa3dc 100644 --- a/packages/components/nodes/tools/AIPlugin/AIPlugin.ts +++ b/packages/components/nodes/tools/AIPlugin/AIPlugin.ts @@ -5,6 +5,7 @@ import { getBaseClasses } from '../../../src/utils' class AIPlugin implements INode { label: string name: string + version: number description: string type: 
string icon: string @@ -15,6 +16,7 @@ class AIPlugin implements INode { constructor() { this.label = 'AI Plugin' this.name = 'aiPlugin' + this.version = 1.0 this.type = 'AIPlugin' this.icon = 'aiplugin.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/BraveSearchAPI/BraveSearchAPI.ts b/packages/components/nodes/tools/BraveSearchAPI/BraveSearchAPI.ts new file mode 100644 index 000000000..9e9c760d0 --- /dev/null +++ b/packages/components/nodes/tools/BraveSearchAPI/BraveSearchAPI.ts @@ -0,0 +1,42 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { BraveSearch } from 'langchain/tools' + +class BraveSearchAPI_Tools implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'BraveSearch API' + this.name = 'braveSearchAPI' + this.version = 1.0 + this.type = 'BraveSearchAPI' + this.icon = 'brave.svg' + this.category = 'Tools' + this.description = 'Wrapper around BraveSearch API - a real-time API to access Brave search results' + this.inputs = [] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['braveSearchApi'] + } + this.baseClasses = [this.type, ...getBaseClasses(BraveSearch)] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const braveApiKey = getCredentialParam('braveApiKey', credentialData, nodeData) + return new BraveSearch({ apiKey: braveApiKey }) + } +} + +module.exports = { nodeClass: BraveSearchAPI_Tools } diff --git a/packages/components/nodes/tools/BraveSearchAPI/brave.svg b/packages/components/nodes/tools/BraveSearchAPI/brave.svg new file mode 100644 index 000000000..0c0c0e86e --- /dev/null +++ b/packages/components/nodes/tools/BraveSearchAPI/brave.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/components/nodes/tools/Calculator/Calculator.ts b/packages/components/nodes/tools/Calculator/Calculator.ts index 85284f0fd..db1e0b2b7 100644 --- a/packages/components/nodes/tools/Calculator/Calculator.ts +++ b/packages/components/nodes/tools/Calculator/Calculator.ts @@ -5,6 +5,7 @@ import { Calculator } from 'langchain/tools/calculator' class Calculator_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -14,6 +15,7 @@ class Calculator_Tools implements INode { constructor() { this.label = 'Calculator' this.name = 'calculator' + this.version = 1.0 this.type = 'Calculator' this.icon = 'calculator.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/ChainTool/ChainTool.ts b/packages/components/nodes/tools/ChainTool/ChainTool.ts index 32e414af7..42b5e6e1e 100644 --- a/packages/components/nodes/tools/ChainTool/ChainTool.ts +++ b/packages/components/nodes/tools/ChainTool/ChainTool.ts @@ -1,11 +1,12 @@ import { INode, INodeData, INodeParams } from '../../../src/Interface' import { getBaseClasses } from '../../../src/utils' -import { ChainTool } from 'langchain/tools' import { BaseChain } from 'langchain/chains' +import { ChainTool } from './core' class ChainTool_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -16,6 +17,7 @@ class ChainTool_Tools implements INode { constructor() { this.label = 
'Chain Tool' this.name = 'chainTool' + this.version = 1.0 this.type = 'ChainTool' this.icon = 'chaintool.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/ChainTool/chaintool.svg b/packages/components/nodes/tools/ChainTool/chaintool.svg index c5bd0fbcc..ab76749b4 100644 --- a/packages/components/nodes/tools/ChainTool/chaintool.svg +++ b/packages/components/nodes/tools/ChainTool/chaintool.svg @@ -1,4 +1,8 @@ - + - + + + + + \ No newline at end of file diff --git a/packages/components/nodes/tools/ChainTool/core.ts b/packages/components/nodes/tools/ChainTool/core.ts new file mode 100644 index 000000000..6c3dba554 --- /dev/null +++ b/packages/components/nodes/tools/ChainTool/core.ts @@ -0,0 +1,25 @@ +import { DynamicTool, DynamicToolInput } from 'langchain/tools' +import { BaseChain } from 'langchain/chains' + +export interface ChainToolInput extends Omit { + chain: BaseChain +} + +export class ChainTool extends DynamicTool { + chain: BaseChain + + constructor({ chain, ...rest }: ChainToolInput) { + super({ + ...rest, + func: async (input, runManager) => { + // To enable LLM Chain which has promptValues + if ((chain as any).prompt && (chain as any).prompt.promptValues) { + const values = await chain.call((chain as any).prompt.promptValues, runManager?.getChild()) + return values?.text + } + return chain.run(input, runManager?.getChild()) + } + }) + this.chain = chain + } +} diff --git a/packages/components/nodes/tools/CustomTool/CustomTool.ts b/packages/components/nodes/tools/CustomTool/CustomTool.ts new file mode 100644 index 000000000..c070df317 --- /dev/null +++ b/packages/components/nodes/tools/CustomTool/CustomTool.ts @@ -0,0 +1,112 @@ +import { ICommonObject, IDatabaseEntity, INode, INodeData, INodeOptionsValue, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { DynamicStructuredTool } from './core' +import { z } from 'zod' +import { DataSource } from 'typeorm' + +class 
CustomTool_Tools implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + + constructor() { + this.label = 'Custom Tool' + this.name = 'customTool' + this.version = 1.0 + this.type = 'CustomTool' + this.icon = 'customtool.svg' + this.category = 'Tools' + this.description = `Use custom tool you've created in Flowise within chatflow` + this.inputs = [ + { + label: 'Select Tool', + name: 'selectedTool', + type: 'asyncOptions', + loadMethod: 'listTools' + } + ] + this.baseClasses = [this.type, 'Tool', ...getBaseClasses(DynamicStructuredTool)] + } + + //@ts-ignore + loadMethods = { + async listTools(_: INodeData, options: ICommonObject): Promise { + const returnData: INodeOptionsValue[] = [] + + const appDataSource = options.appDataSource as DataSource + const databaseEntities = options.databaseEntities as IDatabaseEntity + + if (appDataSource === undefined || !appDataSource) { + return returnData + } + + const tools = await appDataSource.getRepository(databaseEntities['Tool']).find() + + for (let i = 0; i < tools.length; i += 1) { + const data = { + label: tools[i].name, + name: tools[i].id, + description: tools[i].description + } as INodeOptionsValue + returnData.push(data) + } + return returnData + } + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const selectedToolId = nodeData.inputs?.selectedTool as string + const customToolFunc = nodeData.inputs?.customToolFunc as string + + const appDataSource = options.appDataSource as DataSource + const databaseEntities = options.databaseEntities as IDatabaseEntity + + try { + const tool = await appDataSource.getRepository(databaseEntities['Tool']).findOneBy({ + id: selectedToolId + }) + + if (!tool) throw new Error(`Tool ${selectedToolId} not found`) + const obj = { + name: tool.name, + description: tool.description, + schema: 
z.object(convertSchemaToZod(tool.schema)), + code: tool.func + } + if (customToolFunc) obj.code = customToolFunc + return new DynamicStructuredTool(obj) + } catch (e) { + throw new Error(e) + } + } +} + +const convertSchemaToZod = (schema: string) => { + try { + const parsedSchema = JSON.parse(schema) + const zodObj: any = {} + for (const sch of parsedSchema) { + if (sch.type === 'string') { + if (sch.required) zodObj[sch.property] = z.string({ required_error: `${sch.property} required` }).describe(sch.description) + else zodObj[sch.property] = z.string().describe(sch.description) + } else if (sch.type === 'number') { + if (sch.required) zodObj[sch.property] = z.number({ required_error: `${sch.property} required` }).describe(sch.description) + else zodObj[sch.property] = z.number().describe(sch.description) + } else if (sch.type === 'boolean') { + if (sch.required) zodObj[sch.property] = z.boolean({ required_error: `${sch.property} required` }).describe(sch.description) + else zodObj[sch.property] = z.boolean().describe(sch.description) + } + } + return zodObj + } catch (e) { + throw new Error(e) + } +} + +module.exports = { nodeClass: CustomTool_Tools } diff --git a/packages/components/nodes/tools/CustomTool/core.ts b/packages/components/nodes/tools/CustomTool/core.ts new file mode 100644 index 000000000..12dd72f19 --- /dev/null +++ b/packages/components/nodes/tools/CustomTool/core.ts @@ -0,0 +1,120 @@ +import { z } from 'zod' +import { CallbackManagerForToolRun } from 'langchain/callbacks' +import { StructuredTool, ToolParams } from 'langchain/tools' +import { NodeVM } from 'vm2' + +/* + * List of dependencies allowed to be imported in vm2 + */ +const availableDependencies = [ + '@dqbd/tiktoken', + '@getzep/zep-js', + '@huggingface/inference', + '@pinecone-database/pinecone', + '@supabase/supabase-js', + 'axios', + 'cheerio', + 'chromadb', + 'cohere-ai', + 'd3-dsv', + 'form-data', + 'graphql', + 'html-to-text', + 'langchain', + 'linkifyjs', + 'mammoth', + 'moment', + 'node-fetch', + 'pdf-parse', + 'pdfjs-dist', + 'playwright', + 'puppeteer', +
'srt-parser-2', + 'typeorm', + 'weaviate-ts-client' +] + +export interface BaseDynamicToolInput extends ToolParams { + name: string + description: string + code: string + returnDirect?: boolean +} + +export interface DynamicStructuredToolInput< + // eslint-disable-next-line + T extends z.ZodObject = z.ZodObject +> extends BaseDynamicToolInput { + func?: (input: z.infer, runManager?: CallbackManagerForToolRun) => Promise + schema: T +} + +export class DynamicStructuredTool< + // eslint-disable-next-line + T extends z.ZodObject = z.ZodObject +> extends StructuredTool { + name: string + + description: string + + code: string + + func: DynamicStructuredToolInput['func'] + + schema: T + + constructor(fields: DynamicStructuredToolInput) { + super(fields) + this.name = fields.name + this.description = fields.description + this.code = fields.code + this.func = fields.func + this.returnDirect = fields.returnDirect ?? this.returnDirect + this.schema = fields.schema + } + + protected async _call(arg: z.output): Promise { + let sandbox: any = {} + if (typeof arg === 'object' && Object.keys(arg).length) { + for (const item in arg) { + sandbox[`$${item}`] = arg[item] + } + } + + const defaultAllowBuiltInDep = [ + 'assert', + 'buffer', + 'crypto', + 'events', + 'http', + 'https', + 'net', + 'path', + 'querystring', + 'timers', + 'tls', + 'url', + 'zlib' + ] + + const builtinDeps = process.env.TOOL_FUNCTION_BUILTIN_DEP + ? defaultAllowBuiltInDep.concat(process.env.TOOL_FUNCTION_BUILTIN_DEP.split(',')) + : defaultAllowBuiltInDep + const externalDeps = process.env.TOOL_FUNCTION_EXTERNAL_DEP ? 
process.env.TOOL_FUNCTION_EXTERNAL_DEP.split(',') : [] + const deps = availableDependencies.concat(externalDeps) + + const options = { + console: 'inherit', + sandbox, + require: { + external: { modules: deps }, + builtin: builtinDeps + } + } as any + + const vm = new NodeVM(options) + const response = await vm.run(`module.exports = async function() {${this.code}}()`, __dirname) + + return response + } +} diff --git a/packages/components/nodes/tools/CustomTool/customtool.svg b/packages/components/nodes/tools/CustomTool/customtool.svg new file mode 100644 index 000000000..c5bd0fbcc --- /dev/null +++ b/packages/components/nodes/tools/CustomTool/customtool.svg @@ -0,0 +1,4 @@ + + + + \ No newline at end of file diff --git a/packages/components/nodes/tools/GoogleSearchAPI/GoogleSearchAPI.ts b/packages/components/nodes/tools/GoogleSearchAPI/GoogleSearchAPI.ts new file mode 100644 index 000000000..29ebae8b2 --- /dev/null +++ b/packages/components/nodes/tools/GoogleSearchAPI/GoogleSearchAPI.ts @@ -0,0 +1,43 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { GoogleCustomSearch } from 'langchain/tools' + +class GoogleCustomSearchAPI_Tools implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Google Custom Search' + this.name = 'googleCustomSearch' + this.version = 1.0 + this.type = 'GoogleCustomSearchAPI' + this.icon = 'google.png' + this.category = 'Tools' + this.description = 'Wrapper around Google Custom Search API - a real-time API to access Google search results' + this.inputs = [] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['googleCustomSearchApi'] + } + 
this.baseClasses = [this.type, ...getBaseClasses(GoogleCustomSearch)] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const googleApiKey = getCredentialParam('googleCustomSearchApiKey', credentialData, nodeData) + const googleCseId = getCredentialParam('googleCustomSearchApiId', credentialData, nodeData) + return new GoogleCustomSearch({ apiKey: googleApiKey, googleCSEId: googleCseId }) + } +} + +module.exports = { nodeClass: GoogleCustomSearchAPI_Tools } diff --git a/packages/components/nodes/tools/GoogleSearchAPI/google.png b/packages/components/nodes/tools/GoogleSearchAPI/google.png new file mode 100644 index 000000000..c7cd4ca11 Binary files /dev/null and b/packages/components/nodes/tools/GoogleSearchAPI/google.png differ diff --git a/packages/components/nodes/tools/MakeWebhook/MakeWebhook.ts b/packages/components/nodes/tools/MakeWebhook/MakeWebhook.ts deleted file mode 100644 index 38e0cdd1e..000000000 --- a/packages/components/nodes/tools/MakeWebhook/MakeWebhook.ts +++ /dev/null @@ -1,48 +0,0 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' -import { MakeWebhookTool } from './core' - -class MakeWebhook_Tools implements INode { - label: string - name: string - description: string - type: string - icon: string - category: string - baseClasses: string[] - inputs: INodeParams[] - - constructor() { - this.label = 'Make.com Webhook' - this.name = 'makeWebhook' - this.type = 'MakeWebhook' - this.icon = 'make.png' - this.category = 'Tools' - this.description = 'Execute webhook calls on Make.com' - this.inputs = [ - { - label: 'Webhook Url', - name: 'url', - type: 'string', - placeholder: 'https://hook.eu1.make.com/abcdefg' - }, - { - label: 'Tool Description', - name: 'desc', - type: 'string', - rows: 4, - placeholder: 'Useful when need to send message to Discord' - 
} - ] - this.baseClasses = [this.type, ...getBaseClasses(MakeWebhookTool)] - } - - async init(nodeData: INodeData): Promise { - const url = nodeData.inputs?.url as string - const desc = nodeData.inputs?.desc as string - - return new MakeWebhookTool(url, desc, 'GET') - } -} - -module.exports = { nodeClass: MakeWebhook_Tools } diff --git a/packages/components/nodes/tools/MakeWebhook/core.ts b/packages/components/nodes/tools/MakeWebhook/core.ts deleted file mode 100644 index 8b04ecb96..000000000 --- a/packages/components/nodes/tools/MakeWebhook/core.ts +++ /dev/null @@ -1,41 +0,0 @@ -import axios, { AxiosRequestConfig, Method } from 'axios' -import { Tool } from 'langchain/tools' -import { ICommonObject } from '../../../src/Interface' - -export class MakeWebhookTool extends Tool { - private url: string - - name: string - - description: string - - method: string - - headers: ICommonObject - - constructor(url: string, description: string, method = 'POST', headers: ICommonObject = {}) { - super() - this.url = url - this.name = 'make_webhook' - this.description = description ?? `useful for when you need to execute tasks on Make` - this.method = method - this.headers = headers - } - - async _call(): Promise { - try { - const axiosConfig: AxiosRequestConfig = { - method: this.method as Method, - url: this.url, - headers: { - ...this.headers, - 'Content-Type': 'application/json' - } - } - const response = await axios(axiosConfig) - return typeof response.data === 'object' ? 
JSON.stringify(response.data) : response.data - } catch (error) { - throw new Error(`HTTP error ${error}`) - } - } -} diff --git a/packages/components/nodes/tools/MakeWebhook/make.png b/packages/components/nodes/tools/MakeWebhook/make.png deleted file mode 100644 index 968afcb58..000000000 Binary files a/packages/components/nodes/tools/MakeWebhook/make.png and /dev/null differ diff --git a/packages/components/nodes/tools/OpenAPIToolkit/OpenAPIToolkit.ts b/packages/components/nodes/tools/OpenAPIToolkit/OpenAPIToolkit.ts new file mode 100644 index 000000000..d1bf38911 --- /dev/null +++ b/packages/components/nodes/tools/OpenAPIToolkit/OpenAPIToolkit.ts @@ -0,0 +1,78 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { OpenApiToolkit } from 'langchain/agents' +import { JsonSpec, JsonObject } from 'langchain/tools' +import { BaseLanguageModel } from 'langchain/base_language' +import { load } from 'js-yaml' +import { getCredentialData, getCredentialParam } from '../../../src' + +class OpenAPIToolkit_Tools implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'OpenAPI Toolkit' + this.name = 'openAPIToolkit' + this.version = 1.0 + this.type = 'OpenAPIToolkit' + this.icon = 'openapi.png' + this.category = 'Tools' + this.description = 'Load OpenAPI specification' + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed if the YAML OpenAPI Spec requires authentication', + optional: true, + credentialNames: ['openAPIAuth'] + } + this.inputs = [ + { + label: 'Language Model', + name: 'model', + type: 'BaseLanguageModel' + }, + { + label: 'YAML File', + name: 'yamlFile', + type: 'file', + fileType: '.yaml' + } + ] + this.baseClasses = [this.type, 'Tool'] + } + + async 
init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const model = nodeData.inputs?.model as BaseLanguageModel + const yamlFileBase64 = nodeData.inputs?.yamlFile as string + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const openAPIToken = getCredentialParam('openAPIToken', credentialData, nodeData) + + const splitDataURI = yamlFileBase64.split(',') + splitDataURI.pop() + const bf = Buffer.from(splitDataURI.pop() || '', 'base64') + const utf8String = bf.toString('utf-8') + const data = load(utf8String) as JsonObject + if (!data) { + throw new Error('Failed to load OpenAPI spec') + } + + const headers: ICommonObject = { + 'Content-Type': 'application/json' + } + if (openAPIToken) headers.Authorization = `Bearer ${openAPIToken}` + const toolkit = new OpenApiToolkit(new JsonSpec(data), model, headers) + + return toolkit.tools + } +} + +module.exports = { nodeClass: OpenAPIToolkit_Tools } diff --git a/packages/components/nodes/tools/OpenAPIToolkit/openapi.png b/packages/components/nodes/tools/OpenAPIToolkit/openapi.png new file mode 100644 index 000000000..457c2e405 Binary files /dev/null and b/packages/components/nodes/tools/OpenAPIToolkit/openapi.png differ diff --git a/packages/components/nodes/tools/ReadFile/ReadFile.ts b/packages/components/nodes/tools/ReadFile/ReadFile.ts index b66789438..2aa2c66e8 100644 --- a/packages/components/nodes/tools/ReadFile/ReadFile.ts +++ b/packages/components/nodes/tools/ReadFile/ReadFile.ts @@ -6,6 +6,7 @@ import { NodeFileStore } from 'langchain/stores/file/node' class ReadFile_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -16,6 +17,7 @@ class ReadFile_Tools implements INode { constructor() { this.label = 'Read File' this.name = 'readFile' + this.version = 1.0 this.type = 'ReadFile' this.icon = 'readfile.svg' this.category = 'Tools' diff --git 
a/packages/components/nodes/tools/RequestsGet/RequestsGet.ts b/packages/components/nodes/tools/RequestsGet/RequestsGet.ts index 0b7f0ac80..91cff5000 100644 --- a/packages/components/nodes/tools/RequestsGet/RequestsGet.ts +++ b/packages/components/nodes/tools/RequestsGet/RequestsGet.ts @@ -5,6 +5,7 @@ import { desc, RequestParameters, RequestsGetTool } from './core' class RequestsGet_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class RequestsGet_Tools implements INode { constructor() { this.label = 'Requests Get' this.name = 'requestsGet' + this.version = 1.0 this.type = 'RequestsGet' this.icon = 'requestsget.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/RequestsPost/RequestsPost.ts b/packages/components/nodes/tools/RequestsPost/RequestsPost.ts index 0e64556fa..9ff3d1426 100644 --- a/packages/components/nodes/tools/RequestsPost/RequestsPost.ts +++ b/packages/components/nodes/tools/RequestsPost/RequestsPost.ts @@ -5,6 +5,7 @@ import { RequestParameters, desc, RequestsPostTool } from './core' class RequestsPost_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -15,6 +16,7 @@ class RequestsPost_Tools implements INode { constructor() { this.label = 'Requests Post' this.name = 'requestsPost' + this.version = 1.0 this.type = 'RequestsPost' this.icon = 'requestspost.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/RetrieverTool/RetrieverTool.ts b/packages/components/nodes/tools/RetrieverTool/RetrieverTool.ts new file mode 100644 index 000000000..6217ca6e6 --- /dev/null +++ b/packages/components/nodes/tools/RetrieverTool/RetrieverTool.ts @@ -0,0 +1,65 @@ +import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses } from '../../../src/utils' +import { DynamicTool } from 'langchain/tools' +import { createRetrieverTool } from 
'langchain/agents/toolkits' +import { BaseRetriever } from 'langchain/schema/retriever' + +class Retriever_Tools implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Retriever Tool' + this.name = 'retrieverTool' + this.version = 1.0 + this.type = 'RetrieverTool' + this.icon = 'retriever-tool.png' + this.category = 'Tools' + this.description = 'Use a retriever as an allowed tool for the agent' + this.baseClasses = [this.type, 'DynamicTool', ...getBaseClasses(DynamicTool)] + this.inputs = [ + { + label: 'Retriever Name', + name: 'name', + type: 'string', + placeholder: 'search_state_of_union' + }, + { + label: 'Retriever Description', + name: 'description', + type: 'string', + description: 'When the agent should use this tool to retrieve documents', + rows: 3, + placeholder: 'Searches and returns documents regarding the state-of-the-union.'
+ }, + { + label: 'Retriever', + name: 'retriever', + type: 'BaseRetriever' + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const name = nodeData.inputs?.name as string + const description = nodeData.inputs?.description as string + const retriever = nodeData.inputs?.retriever as BaseRetriever + + const tool = createRetrieverTool(retriever, { + name, + description + }) + + return tool + } +} + +module.exports = { nodeClass: Retriever_Tools } diff --git a/packages/components/nodes/tools/RetrieverTool/retriever-tool.png b/packages/components/nodes/tools/RetrieverTool/retriever-tool.png new file mode 100644 index 000000000..4814d0075 Binary files /dev/null and b/packages/components/nodes/tools/RetrieverTool/retriever-tool.png differ diff --git a/packages/components/nodes/tools/SerpAPI/SerpAPI.ts b/packages/components/nodes/tools/SerpAPI/SerpAPI.ts index 694324083..b7230c858 100644 --- a/packages/components/nodes/tools/SerpAPI/SerpAPI.ts +++ b/packages/components/nodes/tools/SerpAPI/SerpAPI.ts @@ -1,37 +1,41 @@ -import { INode, INodeData, INodeParams } from '../../../src/Interface' -import { getBaseClasses } from '../../../src/utils' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { SerpAPI } from 'langchain/tools' class SerpAPI_Tools implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] + credential: INodeParams inputs: INodeParams[] constructor() { this.label = 'Serp API' this.name = 'serpAPI' + this.version = 1.0 this.type = 'SerpAPI' this.icon = 'serp.png' this.category = 'Tools' this.description = 'Wrapper around SerpAPI - a real-time API to access Google search results' - this.inputs = [ - { - label: 'Serp Api Key', - name: 'apiKey', - type: 'password' - } - ] + this.inputs = [] + this.credential = { + label: 'Connect
Credential', + name: 'credential', + type: 'credential', + credentialNames: ['serpApi'] + } this.baseClasses = [this.type, ...getBaseClasses(SerpAPI)] } - async init(nodeData: INodeData): Promise<any> { - const apiKey = nodeData.inputs?.apiKey as string - return new SerpAPI(apiKey) + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const serpApiKey = getCredentialParam('serpApiKey', credentialData, nodeData) + return new SerpAPI(serpApiKey) } } diff --git a/packages/components/nodes/tools/Serper/Serper.ts b/packages/components/nodes/tools/Serper/Serper.ts new file mode 100644 index 000000000..1facdb3dd --- /dev/null +++ b/packages/components/nodes/tools/Serper/Serper.ts @@ -0,0 +1,42 @@ +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { Serper } from 'langchain/tools' + +class Serper_Tools implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + credential: INodeParams + inputs: INodeParams[] + + constructor() { + this.label = 'Serper' + this.name = 'serper' + this.version = 1.0 + this.type = 'Serper' + this.icon = 'serper.png' + this.category = 'Tools' + this.description = 'Wrapper around Serper.dev - Google Search API' + this.inputs = [] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['serperApi'] + } + this.baseClasses = [this.type, ...getBaseClasses(Serper)] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const credentialData = await getCredentialData(nodeData.credential ??
'', options) + const serperApiKey = getCredentialParam('serperApiKey', credentialData, nodeData) + return new Serper(serperApiKey) + } +} + +module.exports = { nodeClass: Serper_Tools } diff --git a/packages/components/nodes/tools/Serper/serper.png b/packages/components/nodes/tools/Serper/serper.png new file mode 100644 index 000000000..0b094037b Binary files /dev/null and b/packages/components/nodes/tools/Serper/serper.png differ diff --git a/packages/components/nodes/tools/WebBrowser/WebBrowser.ts b/packages/components/nodes/tools/WebBrowser/WebBrowser.ts index 09478047a..64a093d0d 100644 --- a/packages/components/nodes/tools/WebBrowser/WebBrowser.ts +++ b/packages/components/nodes/tools/WebBrowser/WebBrowser.ts @@ -7,6 +7,7 @@ import { Embeddings } from 'langchain/embeddings/base' class WebBrowser_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -17,6 +18,7 @@ class WebBrowser_Tools implements INode { constructor() { this.label = 'Web Browser' this.name = 'webBrowser' + this.version = 1.0 this.type = 'WebBrowser' this.icon = 'webBrowser.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/WriteFile/WriteFile.ts b/packages/components/nodes/tools/WriteFile/WriteFile.ts index 208166d86..2eb7843f3 100644 --- a/packages/components/nodes/tools/WriteFile/WriteFile.ts +++ b/packages/components/nodes/tools/WriteFile/WriteFile.ts @@ -6,6 +6,7 @@ import { NodeFileStore } from 'langchain/stores/file/node' class WriteFile_Tools implements INode { label: string name: string + version: number description: string type: string icon: string @@ -16,6 +17,7 @@ class WriteFile_Tools implements INode { constructor() { this.label = 'Write File' this.name = 'writeFile' + this.version = 1.0 this.type = 'WriteFile' this.icon = 'writefile.svg' this.category = 'Tools' diff --git a/packages/components/nodes/tools/ZapierNLA/ZapierNLA.ts b/packages/components/nodes/tools/ZapierNLA/ZapierNLA.ts index 
849f5946d..49543136a 100644 --- a/packages/components/nodes/tools/ZapierNLA/ZapierNLA.ts +++ b/packages/components/nodes/tools/ZapierNLA/ZapierNLA.ts @@ -1,39 +1,44 @@ -import { ZapierNLAWrapper, ZapiterNLAWrapperParams } from 'langchain/tools' -import { INode, INodeData, INodeParams } from '../../../src/Interface' +import { ZapierNLAWrapper, ZapierNLAWrapperParams } from 'langchain/tools' +import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface' import { ZapierToolKit } from 'langchain/agents' +import { getCredentialData, getCredentialParam } from '../../../src' class ZapierNLA_Tools implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams constructor() { this.label = 'Zapier NLA' this.name = 'zapierNLA' + this.version = 1.0 this.type = 'ZapierNLA' - this.icon = 'zapier.png' + this.icon = 'zapier.svg' this.category = 'Tools' this.description = "Access to apps and actions on Zapier's platform through a natural language API interface" - this.inputs = [ - { - label: 'Zapier NLA Api Key', - name: 'apiKey', - type: 'password' - } - ] + this.inputs = [] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['zapierNLAApi'] + } this.baseClasses = [this.type, 'Tool'] } - async init(nodeData: INodeData): Promise<any> { - const apiKey = nodeData.inputs?.apiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const credentialData = await getCredentialData(nodeData.credential ??
'', options) + const zapierNLAApiKey = getCredentialParam('zapierNLAApiKey', credentialData, nodeData) - const obj: Partial<ZapiterNLAWrapperParams> = { - apiKey + const obj: Partial<ZapierNLAWrapperParams> = { + apiKey: zapierNLAApiKey } const zapier = new ZapierNLAWrapper(obj) const toolkit = await ZapierToolKit.fromZapierNLAWrapper(zapier) diff --git a/packages/components/nodes/tools/ZapierNLA/zapier.png b/packages/components/nodes/tools/ZapierNLA/zapier.png deleted file mode 100644 index 769716faa..000000000 Binary files a/packages/components/nodes/tools/ZapierNLA/zapier.png and /dev/null differ diff --git a/packages/components/nodes/tools/ZapierNLA/zapier.svg b/packages/components/nodes/tools/ZapierNLA/zapier.svg new file mode 100644 index 000000000..6ed35f295 --- /dev/null +++ b/packages/components/nodes/tools/ZapierNLA/zapier.svg @@ -0,0 +1,8 @@ + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/vectorstores/Chroma_Existing/Chroma_Existing.ts b/packages/components/nodes/vectorstores/Chroma/Chroma_Existing.ts similarity index 59% rename from packages/components/nodes/vectorstores/Chroma_Existing/Chroma_Existing.ts rename to packages/components/nodes/vectorstores/Chroma/Chroma_Existing.ts index fbaa5cbb4..f55faa404 100644 --- a/packages/components/nodes/vectorstores/Chroma_Existing/Chroma_Existing.ts +++ b/packages/components/nodes/vectorstores/Chroma/Chroma_Existing.ts @@ -1,27 +1,39 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { Chroma } from 'langchain/vectorstores/chroma' import { Embeddings } from 'langchain/embeddings/base' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { ChromaExtended } from './core' class Chroma_Existing_VectorStores implements INode { label: string name: string + version: number
description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Chroma Load Existing Index' this.name = 'chromaExistingIndex' + this.version = 1.0 this.type = 'Chroma' this.icon = 'chroma.svg' this.category = 'Vector Stores' this.description = 'Load existing index from Chroma (i.e: Document has been upserted)' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed if you have chroma on cloud services with X-Api-key', + optional: true, + credentialNames: ['chromaApi'] + } this.inputs = [ { label: 'Embeddings', @@ -38,6 +50,15 @@ class Chroma_Existing_VectorStores implements INode { name: 'chromaURL', type: 'string', optional: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -54,24 +75,32 @@ class Chroma_Existing_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise<any> { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { const collectionName = nodeData.inputs?.collectionName as string const embeddings = nodeData.inputs?.embeddings as Embeddings const chromaURL = nodeData.inputs?.chromaURL as string const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ??
'', options) + const chromaApiKey = getCredentialParam('chromaApiKey', credentialData, nodeData) const obj: { collectionName: string url?: string + chromaApiKey?: string } = { collectionName } if (chromaURL) obj.url = chromaURL + if (chromaApiKey) obj.chromaApiKey = chromaApiKey - const vectorStore = await Chroma.fromExistingCollection(embeddings, obj) + const vectorStore = await ChromaExtended.fromExistingCollection(embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Chroma_Upsert/Chroma_Upsert.ts b/packages/components/nodes/vectorstores/Chroma/Chroma_Upsert.ts similarity index 61% rename from packages/components/nodes/vectorstores/Chroma_Upsert/Chroma_Upsert.ts rename to packages/components/nodes/vectorstores/Chroma/Chroma_Upsert.ts index fb7c404e3..0527b7297 100644 --- a/packages/components/nodes/vectorstores/Chroma_Upsert/Chroma_Upsert.ts +++ b/packages/components/nodes/vectorstores/Chroma/Chroma_Upsert.ts @@ -1,28 +1,41 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { Chroma } from 'langchain/vectorstores/chroma' import { Embeddings } from 'langchain/embeddings/base' import { Document } from 'langchain/document' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { flatten } from 'lodash' +import { ChromaExtended } from './core' class ChromaUpsert_VectorStores implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: 
INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Chroma Upsert Document' this.name = 'chromaUpsert' + this.version = 1.0 this.type = 'Chroma' this.icon = 'chroma.svg' this.category = 'Vector Stores' this.description = 'Upsert documents to Chroma' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed if you have chroma on cloud services with X-Api-key', + optional: true, + credentialNames: ['chromaApi'] + } this.inputs = [ { label: 'Document', @@ -45,6 +58,15 @@ class ChromaUpsert_VectorStores implements INode { name: 'chromaURL', type: 'string', optional: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -61,14 +83,19 @@ class ChromaUpsert_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise<any> { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { const collectionName = nodeData.inputs?.collectionName as string const docs = nodeData.inputs?.document as Document[] const embeddings = nodeData.inputs?.embeddings as Embeddings const chromaURL = nodeData.inputs?.chromaURL as string const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 - const flattenDocs = docs && docs.length ? docs.flat() : [] + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const chromaApiKey = getCredentialParam('chromaApiKey', credentialData, nodeData) + + const flattenDocs = docs && docs.length ?
flatten(docs) : [] const finalDocs = [] for (let i = 0; i < flattenDocs.length; i += 1) { finalDocs.push(new Document(flattenDocs[i])) @@ -77,15 +104,18 @@ class ChromaUpsert_VectorStores implements INode { const obj: { collectionName: string url?: string + chromaApiKey?: string } = { collectionName } if (chromaURL) obj.url = chromaURL + if (chromaApiKey) obj.chromaApiKey = chromaApiKey - const vectorStore = await Chroma.fromDocuments(finalDocs, embeddings, obj) + const vectorStore = await ChromaExtended.fromDocuments(finalDocs, embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Chroma_Existing/chroma.svg b/packages/components/nodes/vectorstores/Chroma/chroma.svg similarity index 100% rename from packages/components/nodes/vectorstores/Chroma_Existing/chroma.svg rename to packages/components/nodes/vectorstores/Chroma/chroma.svg diff --git a/packages/components/nodes/vectorstores/Chroma/core.ts b/packages/components/nodes/vectorstores/Chroma/core.ts new file mode 100644 index 000000000..ccdbe03c9 --- /dev/null +++ b/packages/components/nodes/vectorstores/Chroma/core.ts @@ -0,0 +1,49 @@ +import { Chroma, ChromaLibArgs } from 'langchain/vectorstores/chroma' +import { Embeddings } from 'langchain/embeddings/base' +import type { Collection } from 'chromadb' + +interface ChromaAuth { + chromaApiKey?: string +} + +export class ChromaExtended extends Chroma { + chromaApiKey?: string + + constructor(embeddings: Embeddings, args: ChromaLibArgs & Partial<ChromaAuth>) { + super(embeddings, args) + this.chromaApiKey = args.chromaApiKey + } + + static async fromExistingCollection(embeddings: Embeddings, dbConfig: ChromaLibArgs & Partial<ChromaAuth>): Promise<ChromaExtended> { + const instance = new this(embeddings, dbConfig) + await
instance.ensureCollection() + return instance + } + + async ensureCollection(): Promise<Collection> { + if (!this.collection) { + if (!this.index) { + const { ChromaClient } = await Chroma.imports() + const obj: any = { + path: this.url + } + if (this.chromaApiKey) { + obj.fetchOptions = { + headers: { + 'X-Api-Key': this.chromaApiKey + } + } + } + this.index = new ChromaClient(obj) + } + try { + this.collection = await this.index.getOrCreateCollection({ + name: this.collectionName + }) + } catch (err) { + throw new Error(`Chroma getOrCreateCollection error: ${err}`) + } + } + return this.collection + } +} diff --git a/packages/components/nodes/vectorstores/Chroma_Upsert/chroma.svg b/packages/components/nodes/vectorstores/Chroma_Upsert/chroma.svg deleted file mode 100644 index 64090685b..000000000 --- a/packages/components/nodes/vectorstores/Chroma_Upsert/chroma.svg +++ /dev/null @@ -1,7 +0,0 @@ - - - - - - - diff --git a/packages/components/nodes/vectorstores/Faiss_Existing/Faiss_Existing.ts b/packages/components/nodes/vectorstores/Faiss_Existing/Faiss_Existing.ts new file mode 100644 index 000000000..8c8d03a84 --- /dev/null +++ b/packages/components/nodes/vectorstores/Faiss_Existing/Faiss_Existing.ts @@ -0,0 +1,84 @@ +import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { FaissStore } from 'langchain/vectorstores/faiss' +import { Embeddings } from 'langchain/embeddings/base' +import { getBaseClasses } from '../../../src/utils' + +class Faiss_Existing_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Faiss Load Existing Index' + this.name = 'faissExistingIndex' + this.version = 1.0 + this.type = 'Faiss' + this.icon = 'faiss.svg' + this.category = 'Vector Stores' + this.description = 'Load existing index from Faiss
(i.e: Document has been upserted)' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.inputs = [ + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Base Path to load', + name: 'basePath', + description: 'Path to load faiss.index file', + placeholder: `C:\\Users\\User\\Desktop`, + type: 'string' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Faiss Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Faiss Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(FaissStore)] + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const embeddings = nodeData.inputs?.embeddings as Embeddings + const basePath = nodeData.inputs?.basePath as string + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ?
parseFloat(topK) : 4 + + const vectorStore = await FaissStore.load(basePath, embeddings) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: Faiss_Existing_VectorStores } diff --git a/packages/components/nodes/vectorstores/Faiss_Existing/faiss.svg b/packages/components/nodes/vectorstores/Faiss_Existing/faiss.svg new file mode 100644 index 000000000..5fbe98322 --- /dev/null +++ b/packages/components/nodes/vectorstores/Faiss_Existing/faiss.svg @@ -0,0 +1,10 @@ + + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/vectorstores/Faiss_Upsert/Faiss_Upsert.ts b/packages/components/nodes/vectorstores/Faiss_Upsert/Faiss_Upsert.ts new file mode 100644 index 000000000..f56eccdfe --- /dev/null +++ b/packages/components/nodes/vectorstores/Faiss_Upsert/Faiss_Upsert.ts @@ -0,0 +1,100 @@ +import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { Embeddings } from 'langchain/embeddings/base' +import { Document } from 'langchain/document' +import { getBaseClasses } from '../../../src/utils' +import { FaissStore } from 'langchain/vectorstores/faiss' +import { flatten } from 'lodash' + +class FaissUpsert_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Faiss Upsert Document' + this.name = 'faissUpsert' + this.version = 1.0 + this.type = 'Faiss' + this.icon = 'faiss.svg' + this.category = 'Vector Stores' + this.description = 'Upsert documents to Faiss' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.inputs = [ + { + label: 'Document', + name: 'document', + type: 
'Document', + list: true + }, + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Base Path to store', + name: 'basePath', + description: 'Path to store faiss.index file', + placeholder: `C:\\Users\\User\\Desktop`, + type: 'string' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Faiss Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Faiss Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(FaissStore)] + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const docs = nodeData.inputs?.document as Document[] + const embeddings = nodeData.inputs?.embeddings as Embeddings + const output = nodeData.outputs?.output as string + const basePath = nodeData.inputs?.basePath as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const flattenDocs = docs && docs.length ?
flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + const vectorStore = await FaissStore.fromDocuments(finalDocs, embeddings) + await vectorStore.save(basePath) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: FaissUpsert_VectorStores } diff --git a/packages/components/nodes/vectorstores/Faiss_Upsert/faiss.svg b/packages/components/nodes/vectorstores/Faiss_Upsert/faiss.svg new file mode 100644 index 000000000..5fbe98322 --- /dev/null +++ b/packages/components/nodes/vectorstores/Faiss_Upsert/faiss.svg @@ -0,0 +1,10 @@ + + + + + + + + + + \ No newline at end of file diff --git a/packages/components/nodes/vectorstores/InMemory/InMemoryVectorStore.ts b/packages/components/nodes/vectorstores/InMemory/InMemoryVectorStore.ts index 8d89b2ef1..55a01e2b3 100644 --- a/packages/components/nodes/vectorstores/InMemory/InMemoryVectorStore.ts +++ b/packages/components/nodes/vectorstores/InMemory/InMemoryVectorStore.ts @@ -3,10 +3,12 @@ import { MemoryVectorStore } from 'langchain/vectorstores/memory' import { Embeddings } from 'langchain/embeddings/base' import { Document } from 'langchain/document' import { getBaseClasses } from '../../../src/utils' +import { flatten } from 'lodash' class InMemoryVectorStore_VectorStores implements INode { label: string name: string + version: number description: string type: string icon: string @@ -18,6 +20,7 @@ class InMemoryVectorStore_VectorStores implements INode { constructor() { this.label = 'In-Memory Vector Store' this.name = 'memoryVectorStore' + this.version = 1.0 this.type = 'Memory' this.icon = 'memory.svg' this.category = 'Vector Stores' @@ -34,6 +37,14 @@ class InMemoryVectorStore_VectorStores implements INode { label: 'Embeddings', 
name: 'embeddings', type: 'Embeddings' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + optional: true } ] this.outputs = [ @@ -54,8 +65,10 @@ class InMemoryVectorStore_VectorStores implements INode { const docs = nodeData.inputs?.document as Document[] const embeddings = nodeData.inputs?.embeddings as Embeddings const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 - const flattenDocs = docs && docs.length ? docs.flat() : [] + const flattenDocs = docs && docs.length ? flatten(docs) : [] const finalDocs = [] for (let i = 0; i < flattenDocs.length; i += 1) { finalDocs.push(new Document(flattenDocs[i])) @@ -64,9 +77,10 @@ class InMemoryVectorStore_VectorStores implements INode { const vectorStore = await MemoryVectorStore.fromDocuments(finalDocs, embeddings) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/OpenSearch_Existing/OpenSearch_existing.ts b/packages/components/nodes/vectorstores/OpenSearch_Existing/OpenSearch_existing.ts new file mode 100644 index 000000000..c8d09470a --- /dev/null +++ b/packages/components/nodes/vectorstores/OpenSearch_Existing/OpenSearch_existing.ts @@ -0,0 +1,97 @@ +import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { OpenSearchVectorStore } from 'langchain/vectorstores/opensearch' +import { Embeddings } from 'langchain/embeddings/base' +import { Client } from '@opensearch-project/opensearch' +import { getBaseClasses } from '../../../src/utils' + +class OpenSearch_Existing_VectorStores implements INode { + label: string + name: string + version: number + description: 
string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'OpenSearch Load Existing Index' + this.name = 'openSearchExistingIndex' + this.version = 1.0 + this.type = 'OpenSearch' + this.icon = 'opensearch.png' + this.category = 'Vector Stores' + this.description = 'Load existing index from OpenSearch (i.e: Document has been upserted)' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.inputs = [ + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'OpenSearch URL', + name: 'opensearchURL', + type: 'string', + placeholder: 'http://127.0.0.1:9200' + }, + { + label: 'Index Name', + name: 'indexName', + type: 'string' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'OpenSearch Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'OpenSearch Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(OpenSearchVectorStore)] + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const embeddings = nodeData.inputs?.embeddings as Embeddings + const opensearchURL = nodeData.inputs?.opensearchURL as string + const indexName = nodeData.inputs?.indexName as string + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ?
parseFloat(topK) : 4 + + const client = new Client({ + nodes: [opensearchURL] + }) + + const vectorStore = new OpenSearchVectorStore(embeddings, { + client, + indexName + }) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: OpenSearch_Existing_VectorStores } diff --git a/packages/components/nodes/vectorstores/OpenSearch_Existing/opensearch.png b/packages/components/nodes/vectorstores/OpenSearch_Existing/opensearch.png new file mode 100644 index 000000000..3fdcfd3f0 Binary files /dev/null and b/packages/components/nodes/vectorstores/OpenSearch_Existing/opensearch.png differ diff --git a/packages/components/nodes/vectorstores/OpenSearch_Upsert/OpenSearch_Upsert.ts b/packages/components/nodes/vectorstores/OpenSearch_Upsert/OpenSearch_Upsert.ts new file mode 100644 index 000000000..c11d8b115 --- /dev/null +++ b/packages/components/nodes/vectorstores/OpenSearch_Upsert/OpenSearch_Upsert.ts @@ -0,0 +1,112 @@ +import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { OpenSearchVectorStore } from 'langchain/vectorstores/opensearch' +import { Embeddings } from 'langchain/embeddings/base' +import { Document } from 'langchain/document' +import { Client } from '@opensearch-project/opensearch' +import { flatten } from 'lodash' +import { getBaseClasses } from '../../../src/utils' + +class OpenSearchUpsert_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'OpenSearch Upsert Document' + this.name = 'openSearchUpsertDocument' + this.version = 1.0 + this.type = 'OpenSearch' + this.icon = 'opensearch.png' + this.category = 
'Vector Stores' + this.description = 'Upsert documents to OpenSearch' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.inputs = [ + { + label: 'Document', + name: 'document', + type: 'Document', + list: true + }, + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'OpenSearch URL', + name: 'opensearchURL', + type: 'string', + placeholder: 'http://127.0.0.1:9200' + }, + { + label: 'Index Name', + name: 'indexName', + type: 'string' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'OpenSearch Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'OpenSearch Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(OpenSearchVectorStore)] + } + ] + } + + async init(nodeData: INodeData): Promise<any> { + const docs = nodeData.inputs?.document as Document[] + const embeddings = nodeData.inputs?.embeddings as Embeddings + const opensearchURL = nodeData.inputs?.opensearchURL as string + const indexName = nodeData.inputs?.indexName as string + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const flattenDocs = docs && docs.length ?
flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + const client = new Client({ + nodes: [opensearchURL] + }) + + const vectorStore = await OpenSearchVectorStore.fromDocuments(finalDocs, embeddings, { + client, + indexName: indexName + }) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: OpenSearchUpsert_VectorStores } diff --git a/packages/components/nodes/vectorstores/OpenSearch_Upsert/opensearch.png b/packages/components/nodes/vectorstores/OpenSearch_Upsert/opensearch.png new file mode 100644 index 000000000..3fdcfd3f0 Binary files /dev/null and b/packages/components/nodes/vectorstores/OpenSearch_Upsert/opensearch.png differ diff --git a/packages/components/nodes/vectorstores/Pinecone_Existing/Pinecone_Existing.ts b/packages/components/nodes/vectorstores/Pinecone_Existing/Pinecone_Existing.ts index 04706ed07..2369165d8 100644 --- a/packages/components/nodes/vectorstores/Pinecone_Existing/Pinecone_Existing.ts +++ b/packages/components/nodes/vectorstores/Pinecone_Existing/Pinecone_Existing.ts @@ -1,44 +1,43 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { PineconeClient } from '@pinecone-database/pinecone' import { PineconeLibArgs, PineconeStore } from 'langchain/vectorstores/pinecone' import { Embeddings } from 'langchain/embeddings/base' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' class Pinecone_Existing_VectorStores implements INode { label: string name: string + version: number description: string type: 
string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Pinecone Load Existing Index' this.name = 'pineconeExistingIndex' + this.version = 1.0 this.type = 'Pinecone' this.icon = 'pinecone.png' this.category = 'Vector Stores' this.description = 'Load existing index from Pinecone (i.e: Document has been upserted)' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['pineconeApi'] + } this.inputs = [ { label: 'Embeddings', name: 'embeddings', type: 'Embeddings' }, - { - label: 'Pinecone Api Key', - name: 'pineconeApiKey', - type: 'password' - }, - { - label: 'Pinecone Environment', - name: 'pineconeEnv', - type: 'string' - }, { label: 'Pinecone Index', name: 'pineconeIndex', @@ -49,6 +48,7 @@ class Pinecone_Existing_VectorStores implements INode { name: 'pineconeNamespace', type: 'string', placeholder: 'my-first-namespace', + additionalParams: true, optional: true }, { @@ -57,6 +57,15 @@ class Pinecone_Existing_VectorStores implements INode { type: 'json', optional: true, additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. 
Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -73,15 +82,18 @@ class Pinecone_Existing_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise { - const pineconeApiKey = nodeData.inputs?.pineconeApiKey as string - const pineconeEnv = nodeData.inputs?.pineconeEnv as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const index = nodeData.inputs?.pineconeIndex as string const pineconeNamespace = nodeData.inputs?.pineconeNamespace as string const pineconeMetadataFilter = nodeData.inputs?.pineconeMetadataFilter - const embeddings = nodeData.inputs?.embeddings as Embeddings const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const pineconeApiKey = getCredentialParam('pineconeApiKey', credentialData, nodeData) + const pineconeEnv = getCredentialParam('pineconeEnv', credentialData, nodeData) const client = new PineconeClient() await client.init({ @@ -104,9 +116,10 @@ class Pinecone_Existing_VectorStores implements INode { const vectorStore = await PineconeStore.fromExistingIndex(embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Pinecone_Upsert/Pinecone_Upsert.ts b/packages/components/nodes/vectorstores/Pinecone_Upsert/Pinecone_Upsert.ts index d89174d3d..3d2a6497d 100644 --- a/packages/components/nodes/vectorstores/Pinecone_Upsert/Pinecone_Upsert.ts +++ b/packages/components/nodes/vectorstores/Pinecone_Upsert/Pinecone_Upsert.ts @@ -1,29 +1,39 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } 
from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { PineconeClient } from '@pinecone-database/pinecone' import { PineconeLibArgs, PineconeStore } from 'langchain/vectorstores/pinecone' import { Embeddings } from 'langchain/embeddings/base' import { Document } from 'langchain/document' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { flatten } from 'lodash' class PineconeUpsert_VectorStores implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Pinecone Upsert Document' this.name = 'pineconeUpsert' + this.version = 1.0 this.type = 'Pinecone' this.icon = 'pinecone.png' this.category = 'Vector Stores' this.description = 'Upsert documents to Pinecone' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['pineconeApi'] + } this.inputs = [ { label: 'Document', @@ -36,16 +46,6 @@ class PineconeUpsert_VectorStores implements INode { name: 'embeddings', type: 'Embeddings' }, - { - label: 'Pinecone Api Key', - name: 'pineconeApiKey', - type: 'password' - }, - { - label: 'Pinecone Environment', - name: 'pineconeEnv', - type: 'string' - }, { label: 'Pinecone Index', name: 'pineconeIndex', @@ -56,6 +56,16 @@ class PineconeUpsert_VectorStores implements INode { name: 'pineconeNamespace', type: 'string', placeholder: 'my-first-namespace', + additionalParams: true, + optional: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. 
Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, optional: true } ] @@ -73,14 +83,18 @@ class PineconeUpsert_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise { - const pineconeApiKey = nodeData.inputs?.pineconeApiKey as string - const pineconeEnv = nodeData.inputs?.pineconeEnv as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const index = nodeData.inputs?.pineconeIndex as string const pineconeNamespace = nodeData.inputs?.pineconeNamespace as string const docs = nodeData.inputs?.document as Document[] const embeddings = nodeData.inputs?.embeddings as Embeddings const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const pineconeApiKey = getCredentialParam('pineconeApiKey', credentialData, nodeData) + const pineconeEnv = getCredentialParam('pineconeEnv', credentialData, nodeData) const client = new PineconeClient() await client.init({ @@ -90,7 +104,7 @@ class PineconeUpsert_VectorStores implements INode { const pineconeIndex = client.Index(index) - const flattenDocs = docs && docs.length ? docs.flat() : [] + const flattenDocs = docs && docs.length ? 
flatten(docs) : [] const finalDocs = [] for (let i = 0; i < flattenDocs.length; i += 1) { finalDocs.push(new Document(flattenDocs[i])) @@ -105,9 +119,10 @@ class PineconeUpsert_VectorStores implements INode { const vectorStore = await PineconeStore.fromDocuments(finalDocs, embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Qdrant_Existing/Qdrant_Existing.ts b/packages/components/nodes/vectorstores/Qdrant_Existing/Qdrant_Existing.ts new file mode 100644 index 000000000..16f83b086 --- /dev/null +++ b/packages/components/nodes/vectorstores/Qdrant_Existing/Qdrant_Existing.ts @@ -0,0 +1,126 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { QdrantClient } from '@qdrant/js-client-rest' +import { QdrantVectorStore, QdrantLibArgs } from 'langchain/vectorstores/qdrant' +import { Embeddings } from 'langchain/embeddings/base' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' + +class Qdrant_Existing_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Qdrant Load Existing Index' + this.name = 'qdrantExistingIndex' + this.version = 1.0 + this.type = 'Qdrant' + this.icon = 'qdrant.png' + this.category = 'Vector Stores' + this.description = 'Load existing index from Qdrant (i.e., documents have been upserted)' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 
'credential', + description: 'Only needed when using Qdrant cloud hosted', + optional: true, + credentialNames: ['qdrantApi'] + } + this.inputs = [ + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Qdrant Server URL', + name: 'qdrantServerUrl', + type: 'string', + placeholder: 'http://localhost:6333' + }, + { + label: 'Qdrant Collection Name', + name: 'qdrantCollection', + type: 'string' + }, + { + label: 'Qdrant Collection Configuration', + name: 'qdrantCollectionConfiguration', + type: 'json', + optional: true, + additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Qdrant Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Qdrant Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(QdrantVectorStore)] + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const qdrantServerUrl = nodeData.inputs?.qdrantServerUrl as string + const collectionName = nodeData.inputs?.qdrantCollection as string + let qdrantCollectionConfiguration = nodeData.inputs?.qdrantCollectionConfiguration + const embeddings = nodeData.inputs?.embeddings as Embeddings + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const qdrantApiKey = getCredentialParam('qdrantApiKey', credentialData, nodeData) + + const client = new QdrantClient({ + url: qdrantServerUrl, + apiKey: qdrantApiKey + }) + + const dbConfig: QdrantLibArgs = { + client, + collectionName + } + + if (qdrantCollectionConfiguration) { + qdrantCollectionConfiguration = + typeof qdrantCollectionConfiguration === 'object' ?
qdrantCollectionConfiguration : JSON.parse(qdrantCollectionConfiguration) + dbConfig.collectionConfig = qdrantCollectionConfiguration + } + + const vectorStore = await QdrantVectorStore.fromExistingCollection(embeddings, dbConfig) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: Qdrant_Existing_VectorStores } diff --git a/packages/components/nodes/vectorstores/Qdrant_Existing/qdrant.png b/packages/components/nodes/vectorstores/Qdrant_Existing/qdrant.png new file mode 100644 index 000000000..ecb2a56d5 Binary files /dev/null and b/packages/components/nodes/vectorstores/Qdrant_Existing/qdrant.png differ diff --git a/packages/components/nodes/vectorstores/Qdrant_Upsert/Qdrant_Upsert.ts b/packages/components/nodes/vectorstores/Qdrant_Upsert/Qdrant_Upsert.ts new file mode 100644 index 000000000..dcc3099df --- /dev/null +++ b/packages/components/nodes/vectorstores/Qdrant_Upsert/Qdrant_Upsert.ts @@ -0,0 +1,127 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { QdrantClient } from '@qdrant/js-client-rest' +import { QdrantVectorStore, QdrantLibArgs } from 'langchain/vectorstores/qdrant' +import { Embeddings } from 'langchain/embeddings/base' +import { Document } from 'langchain/document' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { flatten } from 'lodash' + +class QdrantUpsert_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Qdrant Upsert Document' + this.name = 'qdrantUpsert' + this.version = 1.0 + this.type =
'Qdrant' + this.icon = 'qdrant.png' + this.category = 'Vector Stores' + this.description = 'Upsert documents to Qdrant' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed when using Qdrant cloud hosted', + optional: true, + credentialNames: ['qdrantApi'] + } + this.inputs = [ + { + label: 'Document', + name: 'document', + type: 'Document', + list: true + }, + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Qdrant Server URL', + name: 'qdrantServerUrl', + type: 'string', + placeholder: 'http://localhost:6333' + }, + { + label: 'Qdrant Collection Name', + name: 'qdrantCollection', + type: 'string' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Qdrant Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Qdrant Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(QdrantVectorStore)] + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const qdrantServerUrl = nodeData.inputs?.qdrantServerUrl as string + const collectionName = nodeData.inputs?.qdrantCollection as string + const docs = nodeData.inputs?.document as Document[] + const embeddings = nodeData.inputs?.embeddings as Embeddings + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const qdrantApiKey = getCredentialParam('qdrantApiKey', credentialData, nodeData) + + const client = new QdrantClient({ + url: qdrantServerUrl, + apiKey: qdrantApiKey + }) + + const flattenDocs = docs && docs.length ? flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + const dbConfig: QdrantLibArgs = { + client, + url: qdrantServerUrl, + collectionName + } + const vectorStore = await QdrantVectorStore.fromDocuments(finalDocs, embeddings, dbConfig) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: QdrantUpsert_VectorStores } diff --git a/packages/components/nodes/vectorstores/Qdrant_Upsert/qdrant.png b/packages/components/nodes/vectorstores/Qdrant_Upsert/qdrant.png new file mode 100644 index 000000000..ecb2a56d5 Binary files /dev/null and b/packages/components/nodes/vectorstores/Qdrant_Upsert/qdrant.png differ diff --git a/packages/components/nodes/vectorstores/Singlestore_Existing/Singlestore_Existing.ts b/packages/components/nodes/vectorstores/Singlestore_Existing/Singlestore_Existing.ts new file mode 100644 index 000000000..c5f6fbceb --- /dev/null +++ b/packages/components/nodes/vectorstores/Singlestore_Existing/Singlestore_Existing.ts @@ -0,0 +1,146 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { Embeddings } from 'langchain/embeddings/base' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { SingleStoreVectorStore, SingleStoreVectorStoreConfig } from 'langchain/vectorstores/singlestore' + +class SingleStoreExisting_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: 
string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'SingleStore Load Existing Table' + this.name = 'singlestoreExisting' + this.version = 1.0 + this.type = 'SingleStore' + this.icon = 'singlestore.svg' + this.category = 'Vector Stores' + this.description = 'Load existing document from SingleStore' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Needed when using SingleStore cloud hosted', + optional: true, + credentialNames: ['singleStoreApi'] + } + this.inputs = [ + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Host', + name: 'host', + type: 'string' + }, + { + label: 'Database', + name: 'database', + type: 'string' + }, + { + label: 'Table Name', + name: 'tableName', + type: 'string', + placeholder: 'embeddings', + additionalParams: true, + optional: true + }, + { + label: 'Content Column Name', + name: 'contentColumnName', + type: 'string', + placeholder: 'content', + additionalParams: true, + optional: true + }, + { + label: 'Vector Column Name', + name: 'vectorColumnName', + type: 'string', + placeholder: 'vector', + additionalParams: true, + optional: true + }, + { + label: 'Metadata Column Name', + name: 'metadataColumnName', + type: 'string', + placeholder: 'metadata', + additionalParams: true, + optional: true + }, + { + label: 'Top K', + name: 'topK', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'SingleStore Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'SingleStore Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(SingleStoreVectorStore)] + } + ] + } + + async init(nodeData: INodeData, _: string, options: 
ICommonObject): Promise<any> { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const user = getCredentialParam('user', credentialData, nodeData) + const password = getCredentialParam('password', credentialData, nodeData) + + const singleStoreConnectionConfig = { + connectionOptions: { + host: nodeData.inputs?.host as string, + port: 3306, + user, + password, + database: nodeData.inputs?.database as string + }, + ...(nodeData.inputs?.tableName ? { tableName: nodeData.inputs.tableName as string } : {}), + ...(nodeData.inputs?.contentColumnName ? { contentColumnName: nodeData.inputs.contentColumnName as string } : {}), + ...(nodeData.inputs?.vectorColumnName ? { vectorColumnName: nodeData.inputs.vectorColumnName as string } : {}), + ...(nodeData.inputs?.metadataColumnName ? { metadataColumnName: nodeData.inputs.metadataColumnName as string } : {}) + } as SingleStoreVectorStoreConfig + + const embeddings = nodeData.inputs?.embeddings as Embeddings + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ?
parseFloat(topK) : 4 + + let vectorStore: SingleStoreVectorStore + + vectorStore = new SingleStoreVectorStore(embeddings, singleStoreConnectionConfig) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: SingleStoreExisting_VectorStores } diff --git a/packages/components/nodes/vectorstores/Singlestore_Existing/singlestore.svg b/packages/components/nodes/vectorstores/Singlestore_Existing/singlestore.svg new file mode 100644 index 000000000..bd8dc8177 --- /dev/null +++ b/packages/components/nodes/vectorstores/Singlestore_Existing/singlestore.svg @@ -0,0 +1,20 @@ + + + SingleStore + + + + + + + + + + + + + + + + + diff --git a/packages/components/nodes/vectorstores/Singlestore_Upsert/Singlestore_Upsert.ts b/packages/components/nodes/vectorstores/Singlestore_Upsert/Singlestore_Upsert.ts new file mode 100644 index 000000000..9889a1545 --- /dev/null +++ b/packages/components/nodes/vectorstores/Singlestore_Upsert/Singlestore_Upsert.ts @@ -0,0 +1,162 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { Embeddings } from 'langchain/embeddings/base' +import { Document } from 'langchain/document' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { SingleStoreVectorStore, SingleStoreVectorStoreConfig } from 'langchain/vectorstores/singlestore' +import { flatten } from 'lodash' + +class SingleStoreUpsert_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'SingleStore Upsert Document' + this.name = 'singlestoreUpsert' + 
this.version = 1.0 + this.type = 'SingleStore' + this.icon = 'singlestore.svg' + this.category = 'Vector Stores' + this.description = 'Upsert documents to SingleStore' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Needed when using SingleStore cloud hosted', + optional: true, + credentialNames: ['singleStoreApi'] + } + this.inputs = [ + { + label: 'Document', + name: 'document', + type: 'Document', + list: true + }, + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Host', + name: 'host', + type: 'string' + }, + { + label: 'Database', + name: 'database', + type: 'string' + }, + { + label: 'Table Name', + name: 'tableName', + type: 'string', + placeholder: 'embeddings', + additionalParams: true, + optional: true + }, + { + label: 'Content Column Name', + name: 'contentColumnName', + type: 'string', + placeholder: 'content', + additionalParams: true, + optional: true + }, + { + label: 'Vector Column Name', + name: 'vectorColumnName', + type: 'string', + placeholder: 'vector', + additionalParams: true, + optional: true + }, + { + label: 'Metadata Column Name', + name: 'metadataColumnName', + type: 'string', + placeholder: 'metadata', + additionalParams: true, + optional: true + }, + { + label: 'Top K', + name: 'topK', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'SingleStore Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'SingleStore Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(SingleStoreVectorStore)] + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const user = getCredentialParam('user', credentialData, nodeData) + const password = getCredentialParam('password', credentialData, nodeData) + + const singleStoreConnectionConfig = { + connectionOptions: { + host: nodeData.inputs?.host as string, + port: 3306, + user, + password, + database: nodeData.inputs?.database as string + }, + ...(nodeData.inputs?.tableName ? { tableName: nodeData.inputs.tableName as string } : {}), + ...(nodeData.inputs?.contentColumnName ? { contentColumnName: nodeData.inputs.contentColumnName as string } : {}), + ...(nodeData.inputs?.vectorColumnName ? { vectorColumnName: nodeData.inputs.vectorColumnName as string } : {}), + ...(nodeData.inputs?.metadataColumnName ? { metadataColumnName: nodeData.inputs.metadataColumnName as string } : {}) + } as SingleStoreVectorStoreConfig + + const docs = nodeData.inputs?.document as Document[] + const embeddings = nodeData.inputs?.embeddings as Embeddings + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const flattenDocs = docs && docs.length ? 
flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + let vectorStore: SingleStoreVectorStore + + vectorStore = new SingleStoreVectorStore(embeddings, singleStoreConnectionConfig) + await vectorStore.addDocuments(finalDocs) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: SingleStoreUpsert_VectorStores } diff --git a/packages/components/nodes/vectorstores/Singlestore_Upsert/singlestore.svg b/packages/components/nodes/vectorstores/Singlestore_Upsert/singlestore.svg new file mode 100644 index 000000000..bd8dc8177 --- /dev/null +++ b/packages/components/nodes/vectorstores/Singlestore_Upsert/singlestore.svg @@ -0,0 +1,20 @@ + + + SingleStore + + + + + + + + + + + + + + + + + diff --git a/packages/components/nodes/vectorstores/Supabase_Existing/Supabase_Exisiting.ts b/packages/components/nodes/vectorstores/Supabase_Existing/Supabase_Exisiting.ts index f97b18873..ed6febb5b 100644 --- a/packages/components/nodes/vectorstores/Supabase_Existing/Supabase_Exisiting.ts +++ b/packages/components/nodes/vectorstores/Supabase_Existing/Supabase_Exisiting.ts @@ -1,39 +1,43 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { Embeddings } from 'langchain/embeddings/base' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { SupabaseLibArgs, SupabaseVectorStore } from 'langchain/vectorstores/supabase' import { createClient } from '@supabase/supabase-js' class Supabase_Existing_VectorStores implements INode { label: string
name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Supabase Load Existing Index' this.name = 'supabaseExistingIndex' + this.version = 1.0 this.type = 'Supabase' this.icon = 'supabase.svg' this.category = 'Vector Stores' this.description = 'Load existing index from Supabase (i.e: Document has been upserted)' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['supabaseApi'] + } this.inputs = [ { label: 'Embeddings', name: 'embeddings', type: 'Embeddings' }, - { - label: 'Supabase API Key', - name: 'supabaseApiKey', - type: 'password' - }, { label: 'Supabase Project URL', name: 'supabaseProjUrl', @@ -55,6 +59,15 @@ class Supabase_Existing_VectorStores implements INode { type: 'json', optional: true, additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -71,14 +84,18 @@ class Supabase_Existing_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise { - const supabaseApiKey = nodeData.inputs?.supabaseApiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const supabaseProjUrl = nodeData.inputs?.supabaseProjUrl as string const tableName = nodeData.inputs?.tableName as string const queryName = nodeData.inputs?.queryName as string const embeddings = nodeData.inputs?.embeddings as Embeddings const supabaseMetadataFilter = nodeData.inputs?.supabaseMetadataFilter const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? 
parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const supabaseApiKey = getCredentialParam('supabaseApiKey', credentialData, nodeData) const client = createClient(supabaseProjUrl, supabaseApiKey) @@ -96,9 +113,10 @@ class Supabase_Existing_VectorStores implements INode { const vectorStore = await SupabaseVectorStore.fromExistingIndex(embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Supabase_Upsert/Supabase_Upsert.ts b/packages/components/nodes/vectorstores/Supabase_Upsert/Supabase_Upsert.ts index 0a8af6fd1..90fe2121f 100644 --- a/packages/components/nodes/vectorstores/Supabase_Upsert/Supabase_Upsert.ts +++ b/packages/components/nodes/vectorstores/Supabase_Upsert/Supabase_Upsert.ts @@ -1,29 +1,39 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { Embeddings } from 'langchain/embeddings/base' import { Document } from 'langchain/document' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { SupabaseVectorStore } from 'langchain/vectorstores/supabase' import { createClient } from '@supabase/supabase-js' +import { flatten } from 'lodash' class SupabaseUpsert_VectorStores implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Supabase Upsert Document' this.name = 'supabaseUpsert' + this.version = 
1.0 this.type = 'Supabase' this.icon = 'supabase.svg' this.category = 'Vector Stores' this.description = 'Upsert documents to Supabase' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['supabaseApi'] + } this.inputs = [ { label: 'Document', @@ -36,11 +46,6 @@ class SupabaseUpsert_VectorStores implements INode { name: 'embeddings', type: 'Embeddings' }, - { - label: 'Supabase API Key', - name: 'supabaseApiKey', - type: 'password' - }, { label: 'Supabase Project URL', name: 'supabaseProjUrl', @@ -55,6 +60,15 @@ class SupabaseUpsert_VectorStores implements INode { label: 'Query Name', name: 'queryName', type: 'string' + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -71,18 +85,22 @@ class SupabaseUpsert_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise { - const supabaseApiKey = nodeData.inputs?.supabaseApiKey as string + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const supabaseProjUrl = nodeData.inputs?.supabaseProjUrl as string const tableName = nodeData.inputs?.tableName as string const queryName = nodeData.inputs?.queryName as string const docs = nodeData.inputs?.document as Document[] const embeddings = nodeData.inputs?.embeddings as Embeddings const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const supabaseApiKey = getCredentialParam('supabaseApiKey', credentialData, nodeData) const client = createClient(supabaseProjUrl, supabaseApiKey) - const flattenDocs = docs && docs.length ? 
docs.flat() : [] + const flattenDocs = docs && docs.length ? flatten(docs) : [] const finalDocs = [] for (let i = 0; i < flattenDocs.length; i += 1) { finalDocs.push(new Document(flattenDocs[i])) @@ -95,9 +113,10 @@ class SupabaseUpsert_VectorStores implements INode { }) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Vectara_Existing/Vectara_Existing.ts b/packages/components/nodes/vectorstores/Vectara_Existing/Vectara_Existing.ts new file mode 100644 index 000000000..3ef04f079 --- /dev/null +++ b/packages/components/nodes/vectorstores/Vectara_Existing/Vectara_Existing.ts @@ -0,0 +1,133 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { VectaraStore, VectaraLibArgs, VectaraFilter, VectaraContextConfig } from 'langchain/vectorstores/vectara' + +class VectaraExisting_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Vectara Load Existing Index' + this.name = 'vectaraExistingIndex' + this.version = 1.0 + this.type = 'Vectara' + this.icon = 'vectara.png' + this.category = 'Vector Stores' + this.description = 'Load existing index from Vectara (i.e: Document has been upserted)' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['vectaraApi'] + } + this.inputs = [ + { + label: 'Vectara 
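The change from `docs.flat()` to lodash's `flatten(docs)` recurs in every upsert node in this diff. A minimal stand-alone sketch of the pattern — with a hand-rolled one-level flatten standing in for lodash, and plain object literals standing in for langchain `Document`s — shows what the node code is doing:

```typescript
// Hypothetical stand-in for lodash.flatten: flattens exactly one level,
// like the real library function.
const flatten = <T>(arr: (T | T[])[]): T[] => ([] as T[]).concat(...(arr as T[][]))

// Upstream "list" inputs can arrive as an array of arrays of documents.
const docs = [
    [{ pageContent: 'a', metadata: {} }],
    [{ pageContent: 'b', metadata: {} }, { pageContent: 'c', metadata: {} }]
]

// Mirrors the node code: guard against an empty input, then flatten.
const flattenDocs = docs && docs.length ? flatten(docs) : []
```

The practical difference is typing rather than runtime behavior: when `docs` is typed `Document[]` but actually holds nested arrays, `docs.flat()` upsets the compiler, while lodash's `flatten` accepts the mixed shape.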
Metadata Filter', + name: 'filter', + description: + 'Filter to apply to Vectara metadata. Refer to the documentation on how to use Vectara filters with Flowise.', + type: 'string', + additionalParams: true, + optional: true + }, + { + label: 'Sentences Before', + name: 'sentencesBefore', + description: 'Number of sentences to fetch before the matched sentence. Defaults to 2.', + type: 'number', + additionalParams: true, + optional: true + }, + { + label: 'Sentences After', + name: 'sentencesAfter', + description: 'Number of sentences to fetch after the matched sentence. Defaults to 2.', + type: 'number', + additionalParams: true, + optional: true + }, + { + label: 'Lambda', + name: 'lambda', + description: + 'Improves retrieval accuracy by adjusting the balance (from 0 to 1) between neural search and keyword-based search factors.', + type: 'number', + additionalParams: true, + optional: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Defaults to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Vectara Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Vectara Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(VectaraStore)] + } + ] + } + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? 
'', options) + const apiKey = getCredentialParam('apiKey', credentialData, nodeData) + const customerId = getCredentialParam('customerID', credentialData, nodeData) + const corpusId = getCredentialParam('corpusID', credentialData, nodeData) + + const vectaraMetadataFilter = nodeData.inputs?.filter as string + const sentencesBefore = nodeData.inputs?.sentencesBefore as number + const sentencesAfter = nodeData.inputs?.sentencesAfter as number + const lambda = nodeData.inputs?.lambda as number + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseInt(topK, 10) : 4 + + const vectaraArgs: VectaraLibArgs = { + apiKey: apiKey, + customerId: customerId, + corpusId: corpusId + } + + const vectaraFilter: VectaraFilter = {} + if (vectaraMetadataFilter) vectaraFilter.filter = vectaraMetadataFilter + if (lambda) vectaraFilter.lambda = lambda + + const vectaraContextConfig: VectaraContextConfig = {} + if (sentencesBefore) vectaraContextConfig.sentencesBefore = sentencesBefore + if (sentencesAfter) vectaraContextConfig.sentencesAfter = sentencesAfter + vectaraFilter.contextConfig = vectaraContextConfig + + const vectorStore = new VectaraStore(vectaraArgs) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k, vectaraFilter) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: VectaraExisting_VectorStores } diff --git a/packages/components/nodes/vectorstores/Vectara_Existing/vectara.png b/packages/components/nodes/vectorstores/Vectara_Existing/vectara.png new file mode 100644 index 000000000..a13a34e6b Binary files /dev/null and b/packages/components/nodes/vectorstores/Vectara_Existing/vectara.png differ diff --git a/packages/components/nodes/vectorstores/Vectara_Upsert/Vectara_Upsert.ts b/packages/components/nodes/vectorstores/Vectara_Upsert/Vectara_Upsert.ts new 
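The Vectara node above assembles its optional inputs into nested filter and context-config objects before querying. A simplified, dependency-free sketch of that assembly (the interfaces are local stand-ins for langchain's `VectaraFilter` and `VectaraContextConfig`, and the helper name is mine):

```typescript
// Local stand-ins for the langchain types; field names follow the diff above.
interface ContextConfig { sentencesBefore?: number; sentencesAfter?: number }
interface Filter { filter?: string; lambda?: number; contextConfig?: ContextConfig }

function buildVectaraFilter(metadataFilter?: string, lambda?: number, before?: number, after?: number): Filter {
    const vectaraFilter: Filter = {}
    if (metadataFilter) vectaraFilter.filter = metadataFilter
    if (lambda) vectaraFilter.lambda = lambda
    const ctx: ContextConfig = {}
    if (before) ctx.sentencesBefore = before
    if (after) ctx.sentencesAfter = after
    vectaraFilter.contextConfig = ctx
    return vectaraFilter
}
```

Note the truthiness checks: a `lambda` of `0` (pure neural search) is silently dropped. That matches the node code as written, but is worth keeping in mind when debugging filter behavior.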
file mode 100644 index 000000000..51fb67ed5 --- /dev/null +++ b/packages/components/nodes/vectorstores/Vectara_Upsert/Vectara_Upsert.ts @@ -0,0 +1,150 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { Embeddings } from 'langchain/embeddings/base' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { VectaraStore, VectaraLibArgs, VectaraFilter, VectaraContextConfig } from 'langchain/vectorstores/vectara' +import { Document } from 'langchain/document' +import { flatten } from 'lodash' + +class VectaraUpsert_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Vectara Upsert Document' + this.name = 'vectaraUpsert' + this.version = 1.0 + this.type = 'Vectara' + this.icon = 'vectara.png' + this.category = 'Vector Stores' + this.description = 'Upsert documents to Vectara' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + credentialNames: ['vectaraApi'] + } + this.inputs = [ + { + label: 'Document', + name: 'document', + type: 'Document', + list: true + }, + { + label: 'Vectara Metadata Filter', + name: 'filter', + description: + 'Filter to apply to Vectara metadata. Refer to the documentation on how to use Vectara filters with Flowise.', + type: 'string', + additionalParams: true, + optional: true + }, + { + label: 'Sentences Before', + name: 'sentencesBefore', + description: 'Number of sentences to fetch before the matched sentence. 
Defaults to 2.', + type: 'number', + additionalParams: true, + optional: true + }, + { + label: 'Sentences After', + name: 'sentencesAfter', + description: 'Number of sentences to fetch after the matched sentence. Defaults to 2.', + type: 'number', + additionalParams: true, + optional: true + }, + { + label: 'Lambda', + name: 'lambda', + description: + 'Improves retrieval accuracy by adjusting the balance (from 0 to 1) between neural search and keyword-based search factors.', + type: 'number', + additionalParams: true, + optional: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Defaults to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Vectara Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Vectara Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(VectaraStore)] + } + ] + } + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const apiKey = getCredentialParam('apiKey', credentialData, nodeData) + const customerId = getCredentialParam('customerID', credentialData, nodeData) + const corpusId = getCredentialParam('corpusID', credentialData, nodeData) + + const docs = nodeData.inputs?.document as Document[] + const embeddings = {} as Embeddings + const vectaraMetadataFilter = nodeData.inputs?.filter as string + const sentencesBefore = nodeData.inputs?.sentencesBefore as number + const sentencesAfter = nodeData.inputs?.sentencesAfter as number + const lambda = nodeData.inputs?.lambda as number + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? 
parseInt(topK, 10) : 4 + + const vectaraArgs: VectaraLibArgs = { + apiKey: apiKey, + customerId: customerId, + corpusId: corpusId + } + + const vectaraFilter: VectaraFilter = {} + if (vectaraMetadataFilter) vectaraFilter.filter = vectaraMetadataFilter + if (lambda) vectaraFilter.lambda = lambda + + const vectaraContextConfig: VectaraContextConfig = {} + if (sentencesBefore) vectaraContextConfig.sentencesBefore = sentencesBefore + if (sentencesAfter) vectaraContextConfig.sentencesAfter = sentencesAfter + vectaraFilter.contextConfig = vectaraContextConfig + + const flattenDocs = docs && docs.length ? flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + const vectorStore = await VectaraStore.fromDocuments(finalDocs, embeddings, vectaraArgs) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k, vectaraFilter) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: VectaraUpsert_VectorStores } diff --git a/packages/components/nodes/vectorstores/Vectara_Upsert/vectara.png b/packages/components/nodes/vectorstores/Vectara_Upsert/vectara.png new file mode 100644 index 000000000..a13a34e6b Binary files /dev/null and b/packages/components/nodes/vectorstores/Vectara_Upsert/vectara.png differ diff --git a/packages/components/nodes/vectorstores/Weaviate_Existing/Weaviate_Existing.ts b/packages/components/nodes/vectorstores/Weaviate_Existing/Weaviate_Existing.ts index ba0ab502c..e35a39171 100644 --- a/packages/components/nodes/vectorstores/Weaviate_Existing/Weaviate_Existing.ts +++ b/packages/components/nodes/vectorstores/Weaviate_Existing/Weaviate_Existing.ts @@ -1,28 +1,39 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } 
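Top K parsing differs slightly between nodes in this diff: the Vectara nodes use `parseInt(topK, 10)` while the Supabase and Weaviate nodes use `parseFloat(topK)`. Since `k` is a result count, an integer parse with a guarded fallback is the safer pattern. A hedged sketch (the helper name is mine, not from the codebase):

```typescript
// Hypothetical helper consolidating the `topK ? parse…(topK) : 4` pattern.
function resolveTopK(topK?: string): number {
    const k = topK ? parseInt(topK, 10) : 4
    // Fall back to the default when the input is not a usable count.
    return Number.isNaN(k) || k <= 0 ? 4 : k
}
```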
from '../../../src/Interface' import { Embeddings } from 'langchain/embeddings/base' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import weaviate, { WeaviateClient, ApiKey } from 'weaviate-ts-client' import { WeaviateLibArgs, WeaviateStore } from 'langchain/vectorstores/weaviate' class Weaviate_Existing_VectorStores implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Weaviate Load Existing Index' this.name = 'weaviateExistingIndex' + this.version = 1.0 this.type = 'Weaviate' this.icon = 'weaviate.png' this.category = 'Vector Stores' this.description = 'Load existing index from Weaviate (i.e: Document has been upserted)' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed when using Weaviate cloud hosted', + optional: true, + credentialNames: ['weaviateApi'] + } this.inputs = [ { label: 'Embeddings', @@ -57,12 +68,6 @@ class Weaviate_Existing_VectorStores implements INode { type: 'string', placeholder: 'Test' }, - { - label: 'Weaviate API Key', - name: 'weaviateApiKey', - type: 'password', - optional: true - }, { label: 'Weaviate Text Key', name: 'weaviateTextKey', @@ -79,6 +84,15 @@ class Weaviate_Existing_VectorStores implements INode { placeholder: `["foo"]`, optional: true, additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. 
Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -95,16 +109,19 @@ class Weaviate_Existing_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const weaviateScheme = nodeData.inputs?.weaviateScheme as string const weaviateHost = nodeData.inputs?.weaviateHost as string const weaviateIndex = nodeData.inputs?.weaviateIndex as string - const weaviateApiKey = nodeData.inputs?.weaviateApiKey as string const weaviateTextKey = nodeData.inputs?.weaviateTextKey as string const weaviateMetadataKeys = nodeData.inputs?.weaviateMetadataKeys as string - const embeddings = nodeData.inputs?.embeddings as Embeddings const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const weaviateApiKey = getCredentialParam('weaviateApiKey', credentialData, nodeData) const clientConfig: any = { scheme: weaviateScheme, @@ -125,9 +142,10 @@ class Weaviate_Existing_VectorStores implements INode { const vectorStore = await WeaviateStore.fromExistingIndex(embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Weaviate_Upsert/Weaviate_Upsert.ts b/packages/components/nodes/vectorstores/Weaviate_Upsert/Weaviate_Upsert.ts index 0528d2492..a2f82831e 100644 --- a/packages/components/nodes/vectorstores/Weaviate_Upsert/Weaviate_Upsert.ts +++ b/packages/components/nodes/vectorstores/Weaviate_Upsert/Weaviate_Upsert.ts @@ -1,29 +1,41 @@ -import { INode, INodeData, INodeOutputsValue, INodeParams } from 
'../../../src/Interface' +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' import { Embeddings } from 'langchain/embeddings/base' import { Document } from 'langchain/document' -import { getBaseClasses } from '../../../src/utils' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' import { WeaviateLibArgs, WeaviateStore } from 'langchain/vectorstores/weaviate' import weaviate, { WeaviateClient, ApiKey } from 'weaviate-ts-client' +import { flatten } from 'lodash' class WeaviateUpsert_VectorStores implements INode { label: string name: string + version: number description: string type: string icon: string category: string baseClasses: string[] inputs: INodeParams[] + credential: INodeParams outputs: INodeOutputsValue[] constructor() { this.label = 'Weaviate Upsert Document' this.name = 'weaviateUpsert' + this.version = 1.0 this.type = 'Weaviate' this.icon = 'weaviate.png' this.category = 'Vector Stores' this.description = 'Upsert documents to Weaviate' this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + description: 'Only needed when using Weaviate cloud hosted', + optional: true, + credentialNames: ['weaviateApi'] + } this.inputs = [ { label: 'Document', @@ -64,12 +76,6 @@ class WeaviateUpsert_VectorStores implements INode { type: 'string', placeholder: 'Test' }, - { - label: 'Weaviate API Key', - name: 'weaviateApiKey', - type: 'password', - optional: true - }, { label: 'Weaviate Text Key', name: 'weaviateTextKey', @@ -86,6 +92,15 @@ class WeaviateUpsert_VectorStores implements INode { placeholder: `["foo"]`, optional: true, additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. 
Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true } ] this.outputs = [ @@ -102,17 +117,20 @@ class WeaviateUpsert_VectorStores implements INode { ] } - async init(nodeData: INodeData): Promise { + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { const weaviateScheme = nodeData.inputs?.weaviateScheme as string const weaviateHost = nodeData.inputs?.weaviateHost as string const weaviateIndex = nodeData.inputs?.weaviateIndex as string - const weaviateApiKey = nodeData.inputs?.weaviateApiKey as string const weaviateTextKey = nodeData.inputs?.weaviateTextKey as string const weaviateMetadataKeys = nodeData.inputs?.weaviateMetadataKeys as string - const docs = nodeData.inputs?.document as Document[] const embeddings = nodeData.inputs?.embeddings as Embeddings const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const weaviateApiKey = getCredentialParam('weaviateApiKey', credentialData, nodeData) const clientConfig: any = { scheme: weaviateScheme, @@ -122,7 +140,7 @@ class WeaviateUpsert_VectorStores implements INode { const client: WeaviateClient = weaviate.client(clientConfig) - const flattenDocs = docs && docs.length ? docs.flat() : [] + const flattenDocs = docs && docs.length ? 
flatten(docs) : [] const finalDocs = [] for (let i = 0; i < flattenDocs.length; i += 1) { finalDocs.push(new Document(flattenDocs[i])) @@ -139,9 +157,10 @@ class WeaviateUpsert_VectorStores implements INode { const vectorStore = await WeaviateStore.fromDocuments(finalDocs, embeddings, obj) if (output === 'retriever') { - const retriever = vectorStore.asRetriever() + const retriever = vectorStore.asRetriever(k) return retriever } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k return vectorStore } return vectorStore diff --git a/packages/components/nodes/vectorstores/Zep/Zep_Existing.ts b/packages/components/nodes/vectorstores/Zep/Zep_Existing.ts new file mode 100644 index 000000000..a2c2261f7 --- /dev/null +++ b/packages/components/nodes/vectorstores/Zep/Zep_Existing.ts @@ -0,0 +1,235 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ZepVectorStore, IZepConfig } from 'langchain/vectorstores/zep' +import { Embeddings } from 'langchain/embeddings/base' +import { Document } from 'langchain/document' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { IDocument, ZepClient } from '@getzep/zep-js' + +class Zep_Existing_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Zep Load Existing Index' + this.name = 'zepExistingIndex' + this.version = 1.0 + this.type = 'Zep' + this.icon = 'zep.png' + this.category = 'Vector Stores' + this.description = 'Load existing index from Zep (i.e: Document has been upserted)' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential', + type: 'credential', + optional: true, + 
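The Weaviate credential above is marked optional (only cloud-hosted instances need it), so the API key header must be attached conditionally. A self-contained sketch of that client-config pattern, with a plain object standing in for `weaviate-ts-client`'s `ApiKey` wrapper and a helper name of my own:

```typescript
interface ClientConfig { scheme: string; host: string; apiKey?: { apiKey: string } }

// Mirrors the node code: the key is only attached when the credential
// resolved to a non-empty value, so local unauthenticated instances work.
function buildClientConfig(scheme: string, host: string, weaviateApiKey?: string): ClientConfig {
    const clientConfig: ClientConfig = { scheme, host }
    if (weaviateApiKey) clientConfig.apiKey = { apiKey: weaviateApiKey }
    return clientConfig
}
```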
description: 'Configure JWT authentication on your Zep instance (Optional)', + credentialNames: ['zepMemoryApi'] + } + this.inputs = [ + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Base URL', + name: 'baseURL', + type: 'string', + default: 'http://127.0.0.1:8000' + }, + { + label: 'Zep Collection', + name: 'zepCollection', + type: 'string', + placeholder: 'my-first-collection' + }, + { + label: 'Zep Metadata Filter', + name: 'zepMetadataFilter', + type: 'json', + optional: true, + additionalParams: true + }, + { + label: 'Embedding Dimension', + name: 'dimension', + type: 'number', + default: 1536, + additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Zep Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Zep Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(ZepVectorStore)] + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> { + const baseURL = nodeData.inputs?.baseURL as string + const zepCollection = nodeData.inputs?.zepCollection as string + const zepMetadataFilter = nodeData.inputs?.zepMetadataFilter + const dimension = nodeData.inputs?.dimension as number + const embeddings = nodeData.inputs?.embeddings as Embeddings + const output = nodeData.outputs?.output as string + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + + const credentialData = await getCredentialData(nodeData.credential ??
'', options) + const apiKey = getCredentialParam('apiKey', credentialData, nodeData) + + const zepConfig: IZepConfig & Partial<ZepFilter> = { + apiUrl: baseURL, + collectionName: zepCollection, + embeddingDimensions: dimension, + isAutoEmbedded: false + } + if (apiKey) zepConfig.apiKey = apiKey + if (zepMetadataFilter) { + const metadatafilter = typeof zepMetadataFilter === 'object' ? zepMetadataFilter : JSON.parse(zepMetadataFilter) + zepConfig.filter = metadatafilter + } + + const vectorStore = await ZepExistingVS.fromExistingIndex(embeddings, zepConfig) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +interface ZepFilter { + filter: Record<string, unknown> +} + +function zepDocsToDocumentsAndScore(results: IDocument[]): [Document, number][] { + return results.map((d) => [ + new Document({ + pageContent: d.content, + metadata: d.metadata + }), + d.score ?
d.score : 0 + ]) +} + +function assignMetadata(value: string | Record<string, unknown> | object | undefined): Record<string, unknown> | undefined { + if (typeof value === 'object' && value !== null) { + return value as Record<string, unknown> + } + if (value !== undefined) { + console.warn('Metadata filters must be an object, Record, or undefined.') + } + return undefined +} + +class ZepExistingVS extends ZepVectorStore { + filter?: Record<string, unknown> + args?: IZepConfig & Partial<ZepFilter> + + constructor(embeddings: Embeddings, args: IZepConfig & Partial<ZepFilter>) { + super(embeddings, args) + this.filter = args.filter + this.args = args + } + + async initalizeCollection(args: IZepConfig & Partial<ZepFilter>) { + this.client = await ZepClient.init(args.apiUrl, args.apiKey) + try { + this.collection = await this.client.document.getCollection(args.collectionName) + } catch (err) { + if (err instanceof Error) { + if (err.name === 'NotFoundError') { + await this.createNewCollection(args) + } else { + throw err + } + } + } + } + + async createNewCollection(args: IZepConfig & Partial<ZepFilter>) { + if (!args.embeddingDimensions) { + throw new Error( + `Collection ${args.collectionName} not found. You can create a new Collection by providing embeddingDimensions.` + ) + } + + this.collection = await this.client.document.addCollection({ + name: args.collectionName, + description: args.description, + metadata: args.metadata, + embeddingDimensions: args.embeddingDimensions, + isAutoEmbedded: false + }) + } + + async similaritySearchVectorWithScore( + query: number[], + k: number, + filter?: Record<string, unknown> | undefined + ): Promise<[Document, number][]> { + if (filter && this.filter) { + throw new Error('cannot provide both `filter` and `this.filter`') + } + const _filters = filter ?? this.filter + const ANDFilters = [] + for (const filterKey in _filters) { + let filterVal = _filters[filterKey] + if (typeof filterVal === 'string') filterVal = `"${filterVal}"` + ANDFilters.push({ jsonpath: `$[*] ? 
(@.${filterKey} == ${filterVal})` }) + } + const newfilter = { + where: { and: ANDFilters } + } + await this.initalizeCollection(this.args!).catch((err) => { + console.error('Error initializing collection:', err) + throw err + }) + const results = await this.collection.search( + { + embedding: new Float32Array(query), + metadata: assignMetadata(newfilter) + }, + k + ) + return zepDocsToDocumentsAndScore(results) + } + + static async fromExistingIndex(embeddings: Embeddings, dbConfig: IZepConfig & Partial<ZepFilter>): Promise<ZepExistingVS> { + const instance = new this(embeddings, dbConfig) + return instance + } +} + +module.exports = { nodeClass: Zep_Existing_VectorStores } diff --git a/packages/components/nodes/vectorstores/Zep/Zep_Upsert.ts b/packages/components/nodes/vectorstores/Zep/Zep_Upsert.ts new file mode 100644 index 000000000..0f976d2b5 --- /dev/null +++ b/packages/components/nodes/vectorstores/Zep/Zep_Upsert.ts @@ -0,0 +1,133 @@ +import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface' +import { ZepVectorStore, IZepConfig } from 'langchain/vectorstores/zep' +import { Embeddings } from 'langchain/embeddings/base' +import { Document } from 'langchain/document' +import { getBaseClasses, getCredentialData, getCredentialParam } from '../../../src/utils' +import { flatten } from 'lodash' + +class Zep_Upsert_VectorStores implements INode { + label: string + name: string + version: number + description: string + type: string + icon: string + category: string + baseClasses: string[] + inputs: INodeParams[] + credential: INodeParams + outputs: INodeOutputsValue[] + + constructor() { + this.label = 'Zep Upsert Document' + this.name = 'zepUpsert' + this.version = 1.0 + this.type = 'Zep' + this.icon = 'zep.png' + this.category = 'Vector Stores' + this.description = 'Upsert documents to Zep' + this.baseClasses = [this.type, 'VectorStoreRetriever', 'BaseRetriever'] + this.credential = { + label: 'Connect Credential', + name: 'credential',
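The custom `similaritySearchVectorWithScore` above turns a flat metadata object into a Zep `where` clause of AND-ed jsonpath expressions, quoting string values before embedding them. That assembly can be exercised in isolation (the function name is mine; the body is extracted from the method above):

```typescript
// Build `$[*] ? (@.key == value)` terms from a flat filter object.
function buildZepWhere(filters: Record<string, unknown>) {
    const andFilters: { jsonpath: string }[] = []
    for (const filterKey in filters) {
        let filterVal = filters[filterKey]
        // String values must be quoted inside the jsonpath expression.
        if (typeof filterVal === 'string') filterVal = `"${filterVal}"`
        andFilters.push({ jsonpath: `$[*] ? (@.${filterKey} == ${filterVal})` })
    }
    return { where: { and: andFilters } }
}
```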
type: 'credential', + optional: true, + description: 'Configure JWT authentication on your Zep instance (Optional)', + credentialNames: ['zepMemoryApi'] + } + this.inputs = [ + { + label: 'Document', + name: 'document', + type: 'Document', + list: true + }, + { + label: 'Embeddings', + name: 'embeddings', + type: 'Embeddings' + }, + { + label: 'Base URL', + name: 'baseURL', + type: 'string', + default: 'http://127.0.0.1:8000' + }, + { + label: 'Zep Collection', + name: 'zepCollection', + type: 'string', + placeholder: 'my-first-collection' + }, + { + label: 'Embedding Dimension', + name: 'dimension', + type: 'number', + default: 1536, + additionalParams: true + }, + { + label: 'Top K', + name: 'topK', + description: 'Number of top results to fetch. Default to 4', + placeholder: '4', + type: 'number', + additionalParams: true, + optional: true + } + ] + this.outputs = [ + { + label: 'Zep Retriever', + name: 'retriever', + baseClasses: this.baseClasses + }, + { + label: 'Zep Vector Store', + name: 'vectorStore', + baseClasses: [this.type, ...getBaseClasses(ZepVectorStore)] + } + ] + } + + async init(nodeData: INodeData, _: string, options: ICommonObject): Promise { + const baseURL = nodeData.inputs?.baseURL as string + const zepCollection = nodeData.inputs?.zepCollection as string + const dimension = (nodeData.inputs?.dimension as number) ?? 1536 + const docs = nodeData.inputs?.document as Document[] + const embeddings = nodeData.inputs?.embeddings as Embeddings + const topK = nodeData.inputs?.topK as string + const k = topK ? parseFloat(topK) : 4 + const output = nodeData.outputs?.output as string + + const credentialData = await getCredentialData(nodeData.credential ?? '', options) + const apiKey = getCredentialParam('apiKey', credentialData, nodeData) + + const flattenDocs = docs && docs.length ? 
flatten(docs) : [] + const finalDocs = [] + for (let i = 0; i < flattenDocs.length; i += 1) { + finalDocs.push(new Document(flattenDocs[i])) + } + + const zepConfig: IZepConfig = { + apiUrl: baseURL, + collectionName: zepCollection, + embeddingDimensions: dimension, + isAutoEmbedded: false + } + if (apiKey) zepConfig.apiKey = apiKey + + const vectorStore = await ZepVectorStore.fromDocuments(finalDocs, embeddings, zepConfig) + + if (output === 'retriever') { + const retriever = vectorStore.asRetriever(k) + return retriever + } else if (output === 'vectorStore') { + ;(vectorStore as any).k = k + return vectorStore + } + return vectorStore + } +} + +module.exports = { nodeClass: Zep_Upsert_VectorStores } diff --git a/packages/components/nodes/vectorstores/Zep/zep.png b/packages/components/nodes/vectorstores/Zep/zep.png new file mode 100644 index 000000000..2fdb23827 Binary files /dev/null and b/packages/components/nodes/vectorstores/Zep/zep.png differ diff --git a/packages/components/package.json b/packages/components/package.json index bbce5cb94..2192dba8c 100644 --- a/packages/components/package.json +++ b/packages/components/package.json @@ -1,6 +1,6 @@ { "name": "flowise-components", - "version": "1.2.7", + "version": "1.3.3", "description": "Flowiseai Components", "main": "dist/src/index", "types": "dist/src/index.d.ts", @@ -16,26 +16,47 @@ }, "license": "SEE LICENSE IN LICENSE.md", "dependencies": { + "@aws-sdk/client-dynamodb": "^3.360.0", "@dqbd/tiktoken": "^1.0.7", - "@huggingface/inference": "1", - "@pinecone-database/pinecone": "^0.0.12", - "@supabase/supabase-js": "^2.21.0", - "@zilliz/milvus2-sdk-node": "^2.2.10", + "@getzep/zep-js": "^0.6.3", + "@huggingface/inference": "^2.6.1", + "@notionhq/client": "^2.2.8", + "@opensearch-project/opensearch": "^1.2.0", + "@pinecone-database/pinecone": "^0.0.14", + "@qdrant/js-client-rest": "^1.2.2", + "@supabase/supabase-js": "^2.29.0", + "@types/js-yaml": "^4.0.5", + "apify-client": "^2.7.1", + "@types/jsdom": 
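Zep_Upsert reads the embedding dimension with a nullish-coalescing default (`?? 1536`), unlike the truthiness checks used elsewhere in this diff — a meaningful distinction, since `??` only falls back on `null`/`undefined` and preserves other falsy values. A small illustration (helper name is mine):

```typescript
// `??` falls back only when the input is null or undefined, so an explicit
// falsy value such as 0 passes through (unlike `||` or a truthiness ternary).
const resolveDimension = (dimension?: number | null): number => dimension ?? 1536
```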
"^21.1.1", "axios": "^0.27.2", "cheerio": "^1.0.0-rc.12", - "chromadb": "^1.3.1", + "chromadb": "^1.5.3", "cohere-ai": "^6.2.0", "d3-dsv": "2", "dotenv": "^16.0.0", "express": "^4.17.3", + "faiss-node": "^0.2.2", "form-data": "^4.0.0", + "google-auth-library": "^9.0.0", "graphql": "^16.6.0", - "langchain": "^0.0.78", + "html-to-text": "^9.0.5", + "langchain": "^0.0.128", "linkifyjs": "^4.1.1", "mammoth": "^1.5.1", "moment": "^2.29.3", + "mysql2": "^3.5.1", "node-fetch": "^2.6.11", + "node-html-markdown": "^1.3.0", + "notion-to-md": "^3.1.1", "pdf-parse": "^1.1.1", + "pdfjs-dist": "^3.7.107", + "playwright": "^1.35.0", + "puppeteer": "^20.7.1", + "pyodide": ">=0.21.0-alpha.2", + "redis": "^4.6.7", + "replicate": "^0.12.3", + "srt-parser-2": "^1.2.3", + "vm2": "^3.9.19", "weaviate-ts-client": "^1.1.0", "ws": "^8.9.0" }, diff --git a/packages/components/src/Interface.ts b/packages/components/src/Interface.ts index 181cc206e..e883d056e 100644 --- a/packages/components/src/Interface.ts +++ b/packages/components/src/Interface.ts @@ -2,7 +2,18 @@ * Types */ -export type NodeParamsType = 'options' | 'string' | 'number' | 'boolean' | 'password' | 'json' | 'code' | 'date' | 'file' | 'folder' +export type NodeParamsType = + | 'asyncOptions' + | 'options' + | 'string' + | 'number' + | 'boolean' + | 'password' + | 'json' + | 'code' + | 'date' + | 'file' + | 'folder' export type CommonType = string | number | boolean | undefined | null @@ -16,6 +27,10 @@ export interface ICommonObject { [key: string]: any | CommonType | ICommonObject | CommonType[] | ICommonObject[] } +export type IDatabaseEntity = { + [key: string]: any +} + export interface IAttachment { content: string contentType: string @@ -42,14 +57,18 @@ export interface INodeParams { type: NodeParamsType | string default?: CommonType | ICommonObject | ICommonObject[] description?: string + warning?: string options?: Array + credentialNames?: Array optional?: boolean | INodeDisplay + step?: number rows?: number list?: 
boolean acceptVariable?: boolean placeholder?: string fileType?: string additionalParams?: boolean + loadMethod?: string } export interface INodeExecutionData { @@ -65,6 +84,7 @@ export interface INodeProperties { name: string type: string icon: string + version: number category: string baseClasses: string[] description?: string @@ -74,15 +94,28 @@ export interface INodeProperties { export interface INode extends INodeProperties { inputs?: INodeParams[] output?: INodeOutputsValue[] + loadMethods?: { + [key: string]: (nodeData: INodeData, options?: ICommonObject) => Promise + } init?(nodeData: INodeData, input: string, options?: ICommonObject): Promise - run?(nodeData: INodeData, input: string, options?: ICommonObject): Promise + run?(nodeData: INodeData, input: string, options?: ICommonObject): Promise + clearSessionMemory?(nodeData: INodeData, options?: ICommonObject): Promise } export interface INodeData extends INodeProperties { id: string inputs?: ICommonObject outputs?: ICommonObject + credential?: string instance?: any + loadMethod?: string // method to load async options +} + +export interface INodeCredential { + label: string + name: string + description?: string + inputs?: INodeParams[] } export interface IMessage { @@ -95,6 +128,7 @@ export interface IMessage { */ import { PromptTemplate as LangchainPromptTemplate, PromptTemplateInput } from 'langchain/prompts' +import { VectorStore } from 'langchain/vectorstores/base' export class PromptTemplate extends LangchainPromptTemplate { promptValues: ICommonObject @@ -103,3 +137,42 @@ export class PromptTemplate extends LangchainPromptTemplate { super(input) } } + +export interface PromptRetrieverInput { + name: string + description: string + systemMessage: string +} + +const fixedTemplate = `Here is a question: +{input} +` +export class PromptRetriever { + name: string + description: string + systemMessage: string + + constructor(fields: PromptRetrieverInput) { + this.name = fields.name + this.description = 
fields.description + this.systemMessage = `${fields.systemMessage}\n${fixedTemplate}` + } +} + +export interface VectorStoreRetrieverInput { + name: string + description: string + vectorStore: VectorStore +} + +export class VectorStoreRetriever { + name: string + description: string + vectorStore: VectorStore + + constructor(fields: VectorStoreRetrieverInput) { + this.name = fields.name + this.description = fields.description + this.vectorStore = fields.vectorStore + } +} diff --git a/packages/components/src/handler.ts b/packages/components/src/handler.ts new file mode 100644 index 000000000..8e3633617 --- /dev/null +++ b/packages/components/src/handler.ts @@ -0,0 +1,180 @@ +import { BaseTracer, Run, BaseCallbackHandler } from 'langchain/callbacks' +import { AgentAction, ChainValues } from 'langchain/schema' +import { Logger } from 'winston' +import { Server } from 'socket.io' + +interface AgentRun extends Run { + actions: AgentAction[] +} + +function tryJsonStringify(obj: unknown, fallback: string) { + try { + return JSON.stringify(obj, null, 2) + } catch (err) { + return fallback + } +} + +function elapsed(run: Run): string { + if (!run.end_time) return '' + const elapsed = run.end_time - run.start_time + if (elapsed < 1000) { + return `${elapsed}ms` + } + return `${(elapsed / 1000).toFixed(2)}s` +} + +export class ConsoleCallbackHandler extends BaseTracer { + name = 'console_callback_handler' as const + logger: Logger + + protected persistRun(_run: Run) { + return Promise.resolve() + } + + constructor(logger: Logger) { + super() + this.logger = logger + } + + // utility methods + + getParents(run: Run) { + const parents: Run[] = [] + let currentRun = run + while (currentRun.parent_run_id) { + const parent = this.runMap.get(currentRun.parent_run_id) + if (parent) { + parents.push(parent) + currentRun = parent + } else { + break + } + } + return parents + } + + getBreadcrumbs(run: Run) { + const parents = this.getParents(run).reverse() + const string = 
[...parents, run] + .map((parent) => { + const name = `${parent.execution_order}:${parent.run_type}:${parent.name}` + return name + }) + .join(' > ') + return string + } + + // logging methods + + onChainStart(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose(`[chain/start] [${crumbs}] Entering Chain run with input: ${tryJsonStringify(run.inputs, '[inputs]')}`) + } + + onChainEnd(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose( + `[chain/end] [${crumbs}] [${elapsed(run)}] Exiting Chain run with output: ${tryJsonStringify(run.outputs, '[outputs]')}` + ) + } + + onChainError(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose( + `[chain/error] [${crumbs}] [${elapsed(run)}] Chain run errored with error: ${tryJsonStringify(run.error, '[error]')}` + ) + } + + onLLMStart(run: Run) { + const crumbs = this.getBreadcrumbs(run) + const inputs = 'prompts' in run.inputs ? { prompts: (run.inputs.prompts as string[]).map((p) => p.trim()) } : run.inputs + this.logger.verbose(`[llm/start] [${crumbs}] Entering LLM run with input: ${tryJsonStringify(inputs, '[inputs]')}`) + } + + onLLMEnd(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose( + `[llm/end] [${crumbs}] [${elapsed(run)}] Exiting LLM run with output: ${tryJsonStringify(run.outputs, '[response]')}` + ) + } + + onLLMError(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose( + `[llm/error] [${crumbs}] [${elapsed(run)}] LLM run errored with error: ${tryJsonStringify(run.error, '[error]')}` + ) + } + + onToolStart(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose(`[tool/start] [${crumbs}] Entering Tool run with input: "${run.inputs.input?.trim()}"`) + } + + onToolEnd(run: Run) { + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose(`[tool/end] [${crumbs}] [${elapsed(run)}] Exiting Tool run with output: "${run.outputs?.output?.trim()}"`) + } + + onToolError(run: Run) { 
+ const crumbs = this.getBreadcrumbs(run) + this.logger.verbose( + `[tool/error] [${crumbs}] [${elapsed(run)}] Tool run errored with error: ${tryJsonStringify(run.error, '[error]')}` + ) + } + + onAgentAction(run: Run) { + const agentRun = run as AgentRun + const crumbs = this.getBreadcrumbs(run) + this.logger.verbose( + `[agent/action] [${crumbs}] Agent selected action: ${tryJsonStringify( + agentRun.actions[agentRun.actions.length - 1], + '[action]' + )}` + ) + } +} + +/** + * Custom chain handler class + */ +export class CustomChainHandler extends BaseCallbackHandler { + name = 'custom_chain_handler' + isLLMStarted = false + socketIO: Server + socketIOClientId = '' + skipK = 0 // Skip streaming for the first K calls to handleLLMStart + returnSourceDocuments = false + + constructor(socketIO: Server, socketIOClientId: string, skipK?: number, returnSourceDocuments?: boolean) { + super() + this.socketIO = socketIO + this.socketIOClientId = socketIOClientId + this.skipK = skipK ?? this.skipK + this.returnSourceDocuments = returnSourceDocuments ??
this.returnSourceDocuments + } + + handleLLMStart() { + if (this.skipK > 0) this.skipK -= 1 + } + + handleLLMNewToken(token: string) { + if (this.skipK === 0) { + if (!this.isLLMStarted) { + this.isLLMStarted = true + this.socketIO.to(this.socketIOClientId).emit('start', token) + } + this.socketIO.to(this.socketIOClientId).emit('token', token) + } + } + + handleLLMEnd() { + this.socketIO.to(this.socketIOClientId).emit('end') + } + + handleChainEnd(outputs: ChainValues): void | Promise { + if (this.returnSourceDocuments) { + this.socketIO.to(this.socketIOClientId).emit('sourceDocuments', outputs?.sourceDocuments) + } + } +} diff --git a/packages/components/src/index.ts b/packages/components/src/index.ts index d04f5bf61..ae2e380ee 100644 --- a/packages/components/src/index.ts +++ b/packages/components/src/index.ts @@ -2,7 +2,7 @@ import dotenv from 'dotenv' import path from 'path' const envPath = path.join(__dirname, '..', '..', '.env') -dotenv.config({ path: envPath }) +dotenv.config({ path: envPath, override: true }) export * from './Interface' export * from './utils' diff --git a/packages/components/src/utils.ts b/packages/components/src/utils.ts index 10091d602..8d06a6501 100644 --- a/packages/components/src/utils.ts +++ b/packages/components/src/utils.ts @@ -2,6 +2,12 @@ import axios from 'axios' import { load } from 'cheerio' import * as fs from 'fs' import * as path from 'path' +import { JSDOM } from 'jsdom' +import { DataSource } from 'typeorm' +import { ICommonObject, IDatabaseEntity, IMessage, INodeData } from './Interface' +import { AES, enc } from 'crypto-js' +import { ChatMessageHistory } from 'langchain/memory' +import { AIMessage, HumanMessage } from 'langchain/schema' export const numberOrExpressionRegex = '^(\\d+\\.?\\d*|{{.*}})$' //return true if string consists only numbers OR expression {{}} export const notEmptyRegex = '(.|\\s)*\\S(.|\\s)*' //return true if string is not empty or blank @@ -15,6 +21,7 @@ export const notEmptyRegex = 
'(.|\\s)*\\S(.|\\s)*' //return true if string is no */ export const getBaseClasses = (targetClass: any) => { const baseClasses: string[] = [] + const skipClassNames = ['BaseLangChain', 'Serializable'] if (targetClass instanceof Function) { let baseClass = targetClass @@ -23,7 +30,7 @@ export const getBaseClasses = (targetClass: any) => { const newBaseClass = Object.getPrototypeOf(baseClass) if (newBaseClass && newBaseClass !== Object && newBaseClass.name) { baseClass = newBaseClass - baseClasses.push(baseClass.name) + if (!skipClassNames.includes(baseClass.name)) baseClasses.push(baseClass.name) } else { break } @@ -129,7 +136,7 @@ export const getInputVariables = (paramValue: string): string[] => { const variableStack = [] const inputVariables = [] let startIdx = 0 - const endIdx = returnVal.length - 1 + const endIdx = returnVal.length while (startIdx < endIdx) { const substr = returnVal.substring(startIdx, startIdx + 1) @@ -152,6 +159,12 @@ export const getInputVariables = (paramValue: string): string[] => { return inputVariables } +/** + * Crawl all available urls given a domain url and limit + * @param {string} url + * @param {number} limit + * @returns {string[]} + */ export const getAvailableURLs = async (url: string, limit: number) => { try { const availableUrls: string[] = [] @@ -190,3 +203,339 @@ export const getAvailableURLs = async (url: string, limit: number) => { throw new Error(`getAvailableURLs: ${err?.message}`) } } + +/** + * Search for href through htmlBody string + * @param {string} htmlBody + * @param {string} baseURL + * @returns {string[]} + */ +function getURLsFromHTML(htmlBody: string, baseURL: string): string[] { + const dom = new JSDOM(htmlBody) + const linkElements = dom.window.document.querySelectorAll('a') + const urls: string[] = [] + for (const linkElement of linkElements) { + if (linkElement.href.slice(0, 1) === '/') { + try { + const urlObj = new URL(baseURL + linkElement.href) + urls.push(urlObj.href) //relative + } catch (err) { + 
if (process.env.DEBUG === 'true') console.error(`error with relative url: ${err.message}`) + continue + } + } else { + try { + const urlObj = new URL(linkElement.href) + urls.push(urlObj.href) //absolute + } catch (err) { + if (process.env.DEBUG === 'true') console.error(`error with absolute url: ${err.message}`) + continue + } + } + } + return urls +} + +/** + * Normalize URL to prevent crawling the same page + * @param {string} urlString + * @returns {string} + */ +function normalizeURL(urlString: string): string { + const urlObj = new URL(urlString) + const hostPath = urlObj.hostname + urlObj.pathname + if (hostPath.length > 0 && hostPath.slice(-1) == '/') { + // handling trailing slash + return hostPath.slice(0, -1) + } + return hostPath +} + +/** + * Recursive crawl using normalizeURL and getURLsFromHTML + * @param {string} baseURL + * @param {string} currentURL + * @param {string[]} pages + * @param {number} limit + * @returns {Promise<string[]>} + */ +async function crawl(baseURL: string, currentURL: string, pages: string[], limit: number): Promise<string[]> { + const baseURLObj = new URL(baseURL) + const currentURLObj = new URL(currentURL) + + if (limit !== 0 && pages.length === limit) return pages + + if (baseURLObj.hostname !== currentURLObj.hostname) return pages + + const normalizeCurrentURL = baseURLObj.protocol + '//' + normalizeURL(currentURL) + if (pages.includes(normalizeCurrentURL)) { + return pages + } + + pages.push(normalizeCurrentURL) + + if (process.env.DEBUG === 'true') console.info(`actively crawling ${currentURL}`) + try { + const resp = await fetch(currentURL) + + if (resp.status > 399) { + if (process.env.DEBUG === 'true') console.error(`error in fetch with status code: ${resp.status}, on page: ${currentURL}`) + return pages + } + + const contentType: string | null = resp.headers.get('content-type') + if ((contentType && !contentType.includes('text/html')) || !contentType) { + if (process.env.DEBUG === 'true') console.error(`non html response, content type:
${contentType}, on page: ${currentURL}`) + return pages + } + + const htmlBody = await resp.text() + const nextURLs = getURLsFromHTML(htmlBody, baseURL) + for (const nextURL of nextURLs) { + pages = await crawl(baseURL, nextURL, pages, limit) + } + } catch (err) { + if (process.env.DEBUG === 'true') console.error(`error in fetch url: ${err.message}, on page: ${currentURL}`) + } + return pages +} + +/** + * Prep URL before passing into recursive crawl function + * @param {string} stringURL + * @param {number} limit + * @returns {Promise<string[]>} + */ +export async function webCrawl(stringURL: string, limit: number): Promise<string[]> { + const URLObj = new URL(stringURL) + const modifyURL = stringURL.slice(-1) === '/' ? stringURL.slice(0, -1) : stringURL + return await crawl(URLObj.protocol + '//' + URLObj.hostname, modifyURL, [], limit) +} + +export function getURLsFromXML(xmlBody: string, limit: number): string[] { + const dom = new JSDOM(xmlBody, { contentType: 'text/xml' }) + const linkElements = dom.window.document.querySelectorAll('url') + const urls: string[] = [] + for (const linkElement of linkElements) { + const locElement = linkElement.querySelector('loc') + if (limit !== 0 && urls.length === limit) break + if (locElement?.textContent) { + urls.push(locElement.textContent) + } + } + return urls +} + +export async function xmlScrape(currentURL: string, limit: number): Promise<string[]> { + let urls: string[] = [] + if (process.env.DEBUG === 'true') console.info(`actively scraping ${currentURL}`) + try { + const resp = await fetch(currentURL) + + if (resp.status > 399) { + if (process.env.DEBUG === 'true') console.error(`error in fetch with status code: ${resp.status}, on page: ${currentURL}`) + return urls + } + + const contentType: string | null = resp.headers.get('content-type') + if ((contentType && !contentType.includes('application/xml') && !contentType.includes('text/xml')) || !contentType) { + if (process.env.DEBUG === 'true') console.error(`non xml response, content type:
${contentType}, on page: ${currentURL}`) + return urls + } + + const xmlBody = await resp.text() + urls = getURLsFromXML(xmlBody, limit) + } catch (err) { + if (process.env.DEBUG === 'true') console.error(`error in fetch url: ${err.message}, on page: ${currentURL}`) + } + return urls +} + +/** + * Get env variables + * @param {string} name + * @returns {string | undefined} + */ +export const getEnvironmentVariable = (name: string): string | undefined => { + try { + return typeof process !== 'undefined' ? process.env?.[name] : undefined + } catch (e) { + return undefined + } +} + +/** + * Returns the path of encryption key + * @returns {string} + */ +const getEncryptionKeyFilePath = (): string => { + const checkPaths = [ + path.join(__dirname, '..', '..', 'encryption.key'), + path.join(__dirname, '..', '..', 'server', 'encryption.key'), + path.join(__dirname, '..', '..', '..', 'encryption.key'), + path.join(__dirname, '..', '..', '..', 'server', 'encryption.key'), + path.join(__dirname, '..', '..', '..', '..', 'encryption.key'), + path.join(__dirname, '..', '..', '..', '..', 'server', 'encryption.key'), + path.join(__dirname, '..', '..', '..', '..', '..', 'encryption.key'), + path.join(__dirname, '..', '..', '..', '..', '..', 'server', 'encryption.key') + ] + for (const checkPath of checkPaths) { + if (fs.existsSync(checkPath)) { + return checkPath + } + } + return '' +} + +const getEncryptionKeyPath = (): string => { + return process.env.SECRETKEY_PATH ? 
path.join(process.env.SECRETKEY_PATH, 'encryption.key') : getEncryptionKeyFilePath() +} + +/** + * Returns the encryption key + * @returns {Promise<string>} + */ +const getEncryptionKey = async (): Promise<string> => { + try { + return await fs.promises.readFile(getEncryptionKeyPath(), 'utf8') + } catch (error) { + throw new Error(error) + } +} + +/** + * Decrypt credential data + * @param {string} encryptedData + * @returns {Promise<ICommonObject>} + */ +const decryptCredentialData = async (encryptedData: string): Promise<ICommonObject> => { + const encryptKey = await getEncryptionKey() + const decryptedData = AES.decrypt(encryptedData, encryptKey) + try { + return JSON.parse(decryptedData.toString(enc.Utf8)) + } catch (e) { + console.error(e) + throw new Error('Credentials could not be decrypted.') + } +} + +/** + * Get credential data + * @param {string} selectedCredentialId + * @param {ICommonObject} options + * @returns {Promise<ICommonObject>} + */ +export const getCredentialData = async (selectedCredentialId: string, options: ICommonObject): Promise<ICommonObject> => { + const appDataSource = options.appDataSource as DataSource + const databaseEntities = options.databaseEntities as IDatabaseEntity + + try { + const credential = await appDataSource.getRepository(databaseEntities['Credential']).findOneBy({ + id: selectedCredentialId + }) + + if (!credential) return {} + + // Decrypt credentialData + const decryptedCredentialData = await decryptCredentialData(credential.encryptedData) + + return decryptedCredentialData + } catch (e) { + throw new Error(e) + } +} + +export const getCredentialParam = (paramName: string, credentialData: ICommonObject, nodeData: INodeData): any => { + return (nodeData.inputs as ICommonObject)[paramName] ?? credentialData[paramName] ??
undefined +} + +// reference https://www.freeformatter.com/json-escape.html +const jsonEscapeCharacters = [ + { escape: '"', value: 'FLOWISE_DOUBLE_QUOTE' }, + { escape: '\n', value: 'FLOWISE_NEWLINE' }, + { escape: '\b', value: 'FLOWISE_BACKSPACE' }, + { escape: '\f', value: 'FLOWISE_FORM_FEED' }, + { escape: '\r', value: 'FLOWISE_CARRIAGE_RETURN' }, + { escape: '\t', value: 'FLOWISE_TAB' }, + { escape: '\\', value: 'FLOWISE_BACKSLASH' } +] + +function handleEscapesJSONParse(input: string, reverse: Boolean): string { + for (const element of jsonEscapeCharacters) { + input = reverse ? input.replaceAll(element.value, element.escape) : input.replaceAll(element.escape, element.value) + } + return input +} + +function iterateEscapesJSONParse(input: any, reverse: Boolean): any { + for (const element in input) { + const type = typeof input[element] + if (type === 'string') input[element] = handleEscapesJSONParse(input[element], reverse) + else if (type === 'object') input[element] = iterateEscapesJSONParse(input[element], reverse) + } + return input +} + +export function handleEscapeCharacters(input: any, reverse: Boolean): any { + const type = typeof input + if (type === 'string') return handleEscapesJSONParse(input, reverse) + else if (type === 'object') return iterateEscapesJSONParse(input, reverse) + return input +} + +/** + * Get user home dir + * @returns {string} + */ +export const getUserHome = (): string => { + let variableName = 'HOME' + if (process.platform === 'win32') { + variableName = 'USERPROFILE' + } + + if (process.env[variableName] === undefined) { + // If for some reason the variable does not exist, fall back to current folder + return process.cwd() + } + return process.env[variableName] as string +} + +/** + * Map incoming chat history to ChatMessageHistory + * @param {ICommonObject} options + * @returns {ChatMessageHistory} + */ +export const mapChatHistory = (options: ICommonObject): ChatMessageHistory => { + const chatHistory = [] + const
histories: IMessage[] = options.chatHistory ?? [] + + for (const message of histories) { + if (message.type === 'apiMessage') { + chatHistory.push(new AIMessage(message.message)) + } else if (message.type === 'userMessage') { + chatHistory.push(new HumanMessage(message.message)) + } + } + return new ChatMessageHistory(chatHistory) +} + +/** + * Convert incoming chat history to string + * @param {IMessage[]} chatHistory + * @returns {string} + */ +export const convertChatHistoryToText = (chatHistory: IMessage[]): string => { + return chatHistory + .map((chatMessage) => { + if (chatMessage.type === 'apiMessage') { + return `Assistant: ${chatMessage.message}` + } else if (chatMessage.type === 'userMessage') { + return `Human: ${chatMessage.message}` + } else { + return `${chatMessage.message}` + } + }) + .join('\n') +} diff --git a/packages/components/tsconfig.json b/packages/components/tsconfig.json index 2002d62f7..d213dabc1 100644 --- a/packages/components/tsconfig.json +++ b/packages/components/tsconfig.json @@ -1,6 +1,6 @@ { "compilerOptions": { - "lib": ["ES2020"], + "lib": ["ES2020", "ES2021.String"], "experimentalDecorators": true /* Enable experimental support for TC39 stage 2 draft decorators. */, "emitDecoratorMetadata": true /* Emit design-type metadata for decorated declarations in source files. 
*/, "target": "ES2020", // or higher @@ -16,5 +16,5 @@ "declaration": true, "module": "commonjs" }, - "include": ["src", "nodes"] + "include": ["src", "nodes", "credentials"] } diff --git a/packages/server/.env.example b/packages/server/.env.example index 2131b8d15..bedbf6381 100644 --- a/packages/server/.env.example +++ b/packages/server/.env.example @@ -1,4 +1,26 @@ PORT=3000 -# USERNAME=user -# PASSWORD=1234 -# EXECUTION_MODE=child or main \ No newline at end of file +PASSPHRASE=MYPASSPHRASE # Passphrase used to create encryption key +# DATABASE_PATH=/your_database_path/.flowise +# APIKEY_PATH=/your_api_key_path/.flowise +# SECRETKEY_PATH=/your_api_key_path/.flowise +# LOG_PATH=/your_log_path/.flowise/logs + +# DATABASE_TYPE=postgres +# DATABASE_PORT="" +# DATABASE_HOST="" +# DATABASE_NAME="flowise" +# DATABASE_USER="" +# DATABASE_PASSWORD="" +# OVERRIDE_DATABASE=true + +# FLOWISE_USERNAME=user +# FLOWISE_PASSWORD=1234 +# DEBUG=true +# LOG_LEVEL=debug (error | warn | info | verbose | debug) +# TOOL_FUNCTION_BUILTIN_DEP=crypto,fs +# TOOL_FUNCTION_EXTERNAL_DEP=moment,lodash + +# LANGCHAIN_TRACING_V2=true +# LANGCHAIN_ENDPOINT=https://api.smith.langchain.com +# LANGCHAIN_API_KEY=your_api_key +# LANGCHAIN_PROJECT=your_project diff --git a/packages/server/README-ZH.md b/packages/server/README-ZH.md new file mode 100644 index 000000000..e58f08bfa --- /dev/null +++ b/packages/server/README-ZH.md @@ -0,0 +1,100 @@ + + +# Flowise - 低代码 LLM 应用程序构建器 + +[English](./README.md) | 中文 + +![Flowise](https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true) + +拖放界面来构建自定义的 LLM 流程 + +## ⚡ 快速入门 + +1. 安装 Flowise + ```bash + npm install -g flowise + ``` +2. 启动 Flowise + + ```bash + npx flowise start + ``` + +3. 
打开[http://localhost:3000](http://localhost:3000) + +## 🔒 身份验证 + +要启用应用级身份验证,请将`FLOWISE_USERNAME`和`FLOWISE_PASSWORD`添加到`.env`文件中: + +``` +FLOWISE_USERNAME=user +FLOWISE_PASSWORD=1234 +``` + +## 🌱 环境变量 + +Flowise 支持不同的环境变量来配置您的实例。您可以在`packages/server`文件夹中的`.env`文件中指定以下变量。阅读[更多](https://docs.flowiseai.com/environment-variables) + +| 变量 | 描述 | 类型 | 默认值 | +| -------------------------- | ------------------------------------------------------ | ----------------------------------------------- | ----------------------------------- | +| PORT | Flowise 运行的 HTTP 端口 | 数字 | 3000 | +| FLOWISE_USERNAME | 登录的用户名 | 字符串 | | +| FLOWISE_PASSWORD | 登录的密码 | 字符串 | | +| DEBUG | 打印组件的日志 | 布尔值 | | +| LOG_PATH | 存储日志文件的位置 | 字符串 | `your-path/Flowise/logs` | +| LOG_LEVEL | 日志的不同级别 | 枚举字符串:`error`、`info`、`verbose`、`debug` | `info` | +| APIKEY_PATH | 存储 API 密钥的位置 | 字符串 | `your-path/Flowise/packages/server` | +| TOOL_FUNCTION_BUILTIN_DEP | 用于工具函数的 NodeJS 内置模块 | 字符串 | | +| TOOL_FUNCTION_EXTERNAL_DEP | 用于工具函数的外部模块 | 字符串 | | +| OVERRIDE_DATABASE | 使用默认值覆盖当前数据库 | 枚举字符串:`true`、`false` | `true` | +| DATABASE_TYPE | 存储 flowise 数据的数据库类型 | 枚举字符串:`sqlite`、`mysql`、`postgres` | `sqlite` | +| DATABASE_PATH | 数据库的保存位置(当 DATABASE_TYPE 为 sqlite 时) | 字符串 | `your-home-dir/.flowise` | +| DATABASE_HOST | 主机 URL 或 IP 地址(当 DATABASE_TYPE 不为 sqlite 时) | 字符串 | | +| DATABASE_PORT | 数据库端口(当 DATABASE_TYPE 不为 sqlite 时) | 字符串 | | +| DATABASE_USERNAME | 数据库用户名(当 DATABASE_TYPE 不为 sqlite 时) | 字符串 | | +| DATABASE_PASSWORD | 数据库密码(当 DATABASE_TYPE 不为 sqlite 时) | 字符串 | | +| DATABASE_NAME | 数据库名称(当 DATABASE_TYPE 不为 sqlite 时) | 字符串 | | + +您还可以在使用`npx`时指定环境变量。例如: + +``` +npx flowise start --PORT=3000 --DEBUG=true +``` + +## 📖 文档 + +[Flowise 文档](https://docs.flowiseai.com/) + +## 🌐 自托管 + +### [Railway](https://docs.flowiseai.com/deployment/railway) + +[![在Railway上部署](https://railway.app/button.svg)](https://railway.app/template/YK7J0v) + +### [Render](https://docs.flowiseai.com/deployment/render) + 
+[![部署到Render](https://render.com/images/deploy-to-render-button.svg)](https://docs.flowiseai.com/deployment/render) + +### [AWS](https://docs.flowiseai.com/deployment/aws) + +### [Azure](https://docs.flowiseai.com/deployment/azure) + +### [DigitalOcean](https://docs.flowiseai.com/deployment/digital-ocean) + +### [GCP](https://docs.flowiseai.com/deployment/gcp) + +## 💻 云托管 + +即将推出 + +## 🙋 支持 + +在[讨论区](https://github.com/FlowiseAI/Flowise/discussions)中随时提出任何问题、报告问题和请求新功能。 + +## 🙌 贡献 + +请参阅[贡献指南](https://github.com/FlowiseAI/Flowise/blob/master/CONTRIBUTING.md)。如果您有任何疑问或问题,请在[Discord](https://discord.gg/jbaHfsRVBW)上与我们联系。 + +## 📄 许可证 + +本仓库中的源代码在[MIT 许可证](https://github.com/FlowiseAI/Flowise/blob/master/LICENSE.md)下提供。 diff --git a/packages/server/README.md b/packages/server/README.md index 1915863b9..de36549c6 100644 --- a/packages/server/README.md +++ b/packages/server/README.md @@ -1,10 +1,12 @@ -# Flowise - LangchainJS UI +# Flowise - Low-Code LLM apps builder + +English | [中文](./README-ZH.md) ![Flowise](https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true) -Drag & drop UI to build your customized LLM flow using [LangchainJS](https://github.com/hwchase17/langchainjs) +Drag & drop UI to build your customized LLM flow ## ⚡Quick Start @@ -22,23 +24,47 @@ Drag & drop UI to build your customized LLM flow using [LangchainJS](https://git ## 🔒 Authentication -To enable app level authentication, add `USERNAME` and `PASSWORD` to the `.env` file: +To enable app level authentication, add `FLOWISE_USERNAME` and `FLOWISE_PASSWORD` to the `.env` file: ``` -USERNAME=user -PASSWORD=1234 +FLOWISE_USERNAME=user +FLOWISE_PASSWORD=1234 +``` + +## 🌱 Env Variables + +Flowise supports different environment variables to configure your instance. You can specify the following variables in the `.env` file inside the `packages/server` folder.
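As a minimal illustration of how the two authentication variables above can gate requests — the helper name and HTTP Basic header handling here are assumptions for the sketch, not Flowise's actual middleware:

```typescript
// Hypothetical sketch: validate an HTTP Basic auth header against the
// FLOWISE_USERNAME / FLOWISE_PASSWORD environment variables.
function isAuthorized(authHeader: string | undefined): boolean {
    const username = process.env.FLOWISE_USERNAME
    const password = process.env.FLOWISE_PASSWORD
    // When either variable is unset, app-level auth is considered disabled
    if (!username || !password) return true
    const expected = 'Basic ' + Buffer.from(`${username}:${password}`).toString('base64')
    return authHeader === expected
}
```

A server would call this in a request middleware and answer `401` when it returns `false`.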
Read [more](https://github.com/FlowiseAI/Flowise/blob/main/CONTRIBUTING.md#-env-variables) + +You can also specify the env variables when using `npx`. For example: + +``` +npx flowise start --PORT=3000 --DEBUG=true ``` ## 📖 Documentation -Coming Soon - -## 💻 Cloud Hosted - -Coming Soon +[Flowise Docs](https://docs.flowiseai.com/) ## 🌐 Self Host +### [Railway](https://docs.flowiseai.com/deployment/railway) + +[![Deploy on Railway](https://railway.app/button.svg)](https://railway.app/template/YK7J0v) + +### [Render](https://docs.flowiseai.com/deployment/render) + +[![Deploy to Render](https://render.com/images/deploy-to-render-button.svg)](https://docs.flowiseai.com/deployment/render) + +### [AWS](https://docs.flowiseai.com/deployment/aws) + +### [Azure](https://docs.flowiseai.com/deployment/azure) + +### [DigitalOcean](https://docs.flowiseai.com/deployment/digital-ocean) + +### [GCP](https://docs.flowiseai.com/deployment/gcp) + +## 💻 Cloud Hosted + Coming Soon ## 🙋 Support diff --git a/packages/server/marketplaces/chatflows/API Agent OpenAI.json b/packages/server/marketplaces/chatflows/API Agent OpenAI.json new file mode 100644 index 000000000..01e3d8f99 --- /dev/null +++ b/packages/server/marketplaces/chatflows/API Agent OpenAI.json @@ -0,0 +1,650 @@ +{ + "description": "Use OpenAI Function Agent and Chain to automatically decide which API to call, generating url and body request from conversation", + "nodes": [ + { + "width": 300, + "height": 510, + "id": "openApiChain_1", + "position": { + "x": 1203.1825726424859, + "y": 300.7226683414998 + }, + "type": "customNode", + "data": { + "id": "openApiChain_1", + "label": "OpenAPI Chain", + "name": "openApiChain", + "version": 1, + "type": "OpenAPIChain", + "baseClasses": ["OpenAPIChain", "BaseChain"], + "category": "Chains", + "description": "Chain that automatically select and call APIs based only on an OpenAPI spec", + "inputParams": [ + { + "label": "YAML Link", + "name": "yamlLink", + "type": "string", + 
"placeholder": "https://api.speak.com/openapi.yaml", + "description": "If YAML link is provided, uploaded YAML File will be ignored and YAML link will be used instead", + "id": "openApiChain_1-input-yamlLink-string" + }, + { + "label": "YAML File", + "name": "yamlFile", + "type": "file", + "fileType": ".yaml", + "description": "If YAML link is provided, uploaded YAML File will be ignored and YAML link will be used instead", + "id": "openApiChain_1-input-yamlFile-file" + }, + { + "label": "Headers", + "name": "headers", + "type": "json", + "additionalParams": true, + "optional": true, + "id": "openApiChain_1-input-headers-json" + } + ], + "inputAnchors": [ + { + "label": "ChatOpenAI Model", + "name": "model", + "type": "ChatOpenAI", + "id": "openApiChain_1-input-model-ChatOpenAI" + } + ], + "inputs": { + "model": "{{chatOpenAI_1.data.instance}}", + "yamlLink": "https://gist.githubusercontent.com/roaldnefs/053e505b2b7a807290908fe9aa3e1f00/raw/0a212622ebfef501163f91e23803552411ed00e4/openapi.yaml", + "headers": "" + }, + "outputAnchors": [ + { + "id": "openApiChain_1-output-openApiChain-OpenAPIChain|BaseChain", + "name": "openApiChain", + "label": "OpenAPIChain", + "type": "OpenAPIChain | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1203.1825726424859, + "y": 300.7226683414998 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_1", + "position": { + "x": 792.3201947594027, + "y": 293.61889966751846 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_1", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": 
["openAIApi"], + "id": "chatOpenAI_1-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_1-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_1-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-basepath-string" + } + ], + 
"inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 792.3201947594027, + "y": 293.61889966751846 + }, + "dragging": false + }, + { + "width": 300, + "height": 602, + "id": "chainTool_0", + "position": { + "x": 1635.3466862861876, + "y": 272.3189405402944 + }, + "type": "customNode", + "data": { + "id": "chainTool_0", + "label": "Chain Tool", + "name": "chainTool", + "version": 1, + "type": "ChainTool", + "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool"], + "category": "Tools", + "description": "Use a chain as allowed tool for agent", + "inputParams": [ + { + "label": "Chain Name", + "name": "name", + "type": "string", + "placeholder": "state-of-union-qa", + "id": "chainTool_0-input-name-string" + }, + { + "label": "Chain Description", + "name": "description", + "type": "string", + "rows": 3, + "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", + "id": "chainTool_0-input-description-string" + }, + { + "label": "Return Direct", + "name": "returnDirect", + "type": "boolean", + "optional": true, + "id": "chainTool_0-input-returnDirect-boolean" + } + ], + "inputAnchors": [ + { + "label": "Base Chain", + "name": "baseChain", + "type": "BaseChain", + "id": "chainTool_0-input-baseChain-BaseChain" + } + ], + "inputs": { + "name": "comic-qa", + "description": "useful for when you need to ask question about comic", + "returnDirect": "", + "baseChain": "{{openApiChain_1.data.instance}}" + }, + 
"outputAnchors": [ + { + "id": "chainTool_0-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool", + "name": "chainTool", + "label": "ChainTool", + "type": "ChainTool | DynamicTool | Tool | StructuredTool" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1635.3466862861876, + "y": 272.3189405402944 + }, + "dragging": false + }, + { + "width": 300, + "height": 383, + "id": "openAIFunctionAgent_0", + "position": { + "x": 2076.1829525256576, + "y": 706.1299276365058 + }, + "type": "customNode", + "data": { + "id": "openAIFunctionAgent_0", + "label": "OpenAI Function Agent", + "name": "openAIFunctionAgent", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain"], + "category": "Agents", + "description": "An agent that uses OpenAI's Function Calling functionality to pick the tool and args to call", + "inputParams": [ + { + "label": "System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "optional": true, + "additionalParams": true, + "id": "openAIFunctionAgent_0-input-systemMessage-string" + } + ], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "openAIFunctionAgent_0-input-tools-Tool" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseChatMemory", + "id": "openAIFunctionAgent_0-input-memory-BaseChatMemory" + }, + { + "label": "OpenAI Chat Model", + "name": "model", + "description": "Only works with gpt-3.5-turbo-0613 and gpt-4-0613. 
Refer docs for more info", + "type": "BaseChatModel", + "id": "openAIFunctionAgent_0-input-model-BaseChatModel" + } + ], + "inputs": { + "tools": ["{{chainTool_0.data.instance}}"], + "memory": "{{bufferMemory_0.data.instance}}", + "model": "{{chatOpenAI_2.data.instance}}", + "systemMessage": "" + }, + "outputAnchors": [ + { + "id": "openAIFunctionAgent_0-output-openAIFunctionAgent-AgentExecutor|BaseChain", + "name": "openAIFunctionAgent", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 2076.1829525256576, + "y": 706.1299276365058 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_2", + "position": { + "x": 1645.450699499575, + "y": 992.6341744217375 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_2", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_2-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + 
"optional": true, + "id": "chatOpenAI_2-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_2-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_2-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1645.450699499575, + "y": 992.6341744217375 + }, + "dragging": false + }, + { + "width": 300, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 
1148.8461056155377, + "y": 967.8215757228843 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "name": "bufferMemory", + "version": 1, + "type": "BufferMemory", + "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": "input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "positionAbsolute": { + "x": 1148.8461056155377, + "y": 967.8215757228843 + }, + "selected": false + } + ], + "edges": [ + { + "source": "chatOpenAI_1", + "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "openApiChain_1", + "targetHandle": "openApiChain_1-input-model-ChatOpenAI", + "type": "buttonedge", + "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-openApiChain_1-openApiChain_1-input-model-ChatOpenAI", + "data": { + "label": "" + } + }, + { + "source": "openApiChain_1", + "sourceHandle": "openApiChain_1-output-openApiChain-OpenAPIChain|BaseChain", + "target": "chainTool_0", + "targetHandle": "chainTool_0-input-baseChain-BaseChain", + "type": "buttonedge", + "id": "openApiChain_1-openApiChain_1-output-openApiChain-OpenAPIChain|BaseChain-chainTool_0-chainTool_0-input-baseChain-BaseChain", + "data": { + "label": "" + } + 
}, + { + "source": "chainTool_0", + "sourceHandle": "chainTool_0-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "chainTool_0-chainTool_0-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool-openAIFunctionAgent_0-openAIFunctionAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_2", + "sourceHandle": "chatOpenAI_2-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-model-BaseChatModel", + "type": "buttonedge", + "id": "chatOpenAI_2-chatOpenAI_2-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-openAIFunctionAgent_0-openAIFunctionAgent_0-input-model-BaseChatModel", + "data": { + "label": "" + } + }, + { + "source": "bufferMemory_0", + "sourceHandle": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-memory-BaseChatMemory", + "type": "buttonedge", + "id": "bufferMemory_0-bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory-openAIFunctionAgent_0-openAIFunctionAgent_0-input-memory-BaseChatMemory", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/chatflows/API Agent.json b/packages/server/marketplaces/chatflows/API Agent.json new file mode 100644 index 000000000..b9862add3 --- /dev/null +++ b/packages/server/marketplaces/chatflows/API Agent.json @@ -0,0 +1,1015 @@ +{ + "description": "Given API docs, the agent automatically decides which API to call, generating the URL and request body from the conversation", + "nodes": [ + { + "width": 300, + "height": 459, + "id": "getApiChain_0", + "position": { + "x": 1222.6923202234623, + "y": 359.97676456347756 + }, + "type": "customNode", + "data": { + "id": "getApiChain_0", + "label": "GET API
Chain", + "name": "getApiChain", + "version": 1, + "type": "GETApiChain", + "baseClasses": ["GETApiChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Chain to run queries against GET API", + "inputParams": [ + { + "label": "API Documentation", + "name": "apiDocs", + "type": "string", + "description": "Description of how API works. Please refer to more examples", + "rows": 4, + "id": "getApiChain_0-input-apiDocs-string" + }, + { + "label": "Headers", + "name": "headers", + "type": "json", + "additionalParams": true, + "optional": true, + "id": "getApiChain_0-input-headers-json" + }, + { + "label": "URL Prompt", + "name": "urlPrompt", + "type": "string", + "description": "Prompt used to tell LLMs how to construct the URL. Must contain {api_docs} and {question}", + "default": "You are given the below API Documentation:\n{api_docs}\nUsing this documentation, generate the full API url to call for answering the user question.\nYou should build the API url in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.\n\nQuestion:{question}\nAPI url:", + "rows": 4, + "additionalParams": true, + "id": "getApiChain_0-input-urlPrompt-string" + }, + { + "label": "Answer Prompt", + "name": "ansPrompt", + "type": "string", + "description": "Prompt used to tell LLMs how to return the API response. Must contain {api_response}, {api_url}, and {question}", + "default": "Given this {api_response} response for {api_url}.
use the given response to answer this {question}", + "rows": 4, + "additionalParams": true, + "id": "getApiChain_0-input-ansPrompt-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "getApiChain_0-input-model-BaseLanguageModel" + } + ], + "inputs": { + "model": "{{chatOpenAI_1.data.instance}}", + "apiDocs": "BASE URL: https://api.open-meteo.com/\n\nAPI Documentation\nThe API endpoint /v1/forecast accepts a geographical coordinate, a list of weather variables and responds with a JSON hourly weather forecast for 7 days. Time always starts at 0:00 today and contains 168 hours. All URL parameters are listed below:\n\nParameter\tFormat\tRequired\tDefault\tDescription\nlatitude, longitude\tFloating point\tYes\t\tGeographical WGS84 coordinate of the location\nhourly\tString array\tNo\t\tA list of weather variables which should be returned. Values can be comma separated, or multiple &hourly= parameter in the URL can be used.\ndaily\tString array\tNo\t\tA list of daily weather variable aggregations which should be returned. Values can be comma separated, or multiple &daily= parameter in the URL can be used. If daily weather variables are specified, parameter timezone is required.\ncurrent_weather\tBool\tNo\tfalse\tInclude current weather conditions in the JSON output.\ntemperature_unit\tString\tNo\tcelsius\tIf fahrenheit is set, all temperature values are converted to Fahrenheit.\nwindspeed_unit\tString\tNo\tkmh\tOther wind speed speed units: ms, mph and kn\nprecipitation_unit\tString\tNo\tmm\tOther precipitation amount units: inch\ntimeformat\tString\tNo\tiso8601\tIf format unixtime is selected, all time values are returned in UNIX epoch time in seconds. Please note that all timestamp are in GMT+0! 
For daily values with unix timestamps, please apply utc_offset_seconds again to get the correct date.\ntimezone\tString\tNo\tGMT\tIf timezone is set, all timestamps are returned as local-time and data is returned starting at 00:00 local-time. Any time zone name from the time zone database is supported. If auto is set as a time zone, the coordinates will be automatically resolved to the local time zone.\npast_days\tInteger (0-2)\tNo\t0\tIf past_days is set, yesterday or the day before yesterday data are also returned.\nstart_date\nend_date\tString (yyyy-mm-dd)\tNo\t\tThe time interval to get weather data. A day must be specified as an ISO8601 date (e.g. 2022-06-30).\nmodels\tString array\tNo\tauto\tManually select one or more weather models. Per default, the best suitable weather models will be combined.\n\nHourly Parameter Definition\nThe parameter &hourly= accepts the following values. Most weather variables are given as an instantaneous value for the indicated hour. Some variables like precipitation are calculated from the preceding hour as an average or sum.\n\nVariable\tValid time\tUnit\tDescription\ntemperature_2m\tInstant\t°C (°F)\tAir temperature at 2 meters above ground\nsnowfall\tPreceding hour sum\tcm (inch)\tSnowfall amount of the preceding hour in centimeters. For the water equivalent in millimeter, divide by 7. E.g. 7 cm snow = 10 mm precipitation water equivalent\nrain\tPreceding hour sum\tmm (inch)\tRain from large scale weather systems of the preceding hour in millimeter\nshowers\tPreceding hour sum\tmm (inch)\tShowers from convective precipitation in millimeters from the preceding hour\nweathercode\tInstant\tWMO code\tWeather condition as a numeric code. Follow WMO weather interpretation codes. See table below for details.\nsnow_depth\tInstant\tmeters\tSnow depth on the ground\nfreezinglevel_height\tInstant\tmeters\tAltitude above sea level of the 0°C level\nvisibility\tInstant\tmeters\tViewing distance in meters. 
Influenced by low clouds, humidity and aerosols. Maximum visibility is approximately 24 km.", + "headers": "", + "urlPrompt": "You are given the below API Documentation:\n{api_docs}\nUsing this documentation, generate the full API url to call for answering the user question.\nYou should build the API url in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.\n\nQuestion:{question}\nAPI url:", + "ansPrompt": "Given this {api_response} response for {api_url}. use the given response to answer this {question}" + }, + "outputAnchors": [ + { + "id": "getApiChain_0-output-getApiChain-GETApiChain|BaseChain|BaseLangChain", + "name": "getApiChain", + "label": "GETApiChain", + "type": "GETApiChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1222.6923202234623, + "y": 359.97676456347756 + }, + "dragging": false + }, + { + "width": 300, + "height": 602, + "id": "chainTool_0", + "position": { + "x": 1600.1485877701232, + "y": 276.38970893436533 + }, + "type": "customNode", + "data": { + "id": "chainTool_0", + "label": "Chain Tool", + "name": "chainTool", + "version": 1, + "type": "ChainTool", + "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool", "BaseLangChain"], + "category": "Tools", + "description": "Use a chain as allowed tool for agent", + "inputParams": [ + { + "label": "Chain Name", + "name": "name", + "type": "string", + "placeholder": "state-of-union-qa", + "id": "chainTool_0-input-name-string" + }, + { + "label": "Chain Description", + "name": "description", + "type": "string", + "rows": 3, + "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", + "id": "chainTool_0-input-description-string" + }, + { + "label": "Return Direct", + 
"name": "returnDirect", + "type": "boolean", + "optional": true, + "id": "chainTool_0-input-returnDirect-boolean" + } + ], + "inputAnchors": [ + { + "label": "Base Chain", + "name": "baseChain", + "type": "BaseChain", + "id": "chainTool_0-input-baseChain-BaseChain" + } + ], + "inputs": { + "name": "weather-qa", + "description": "useful for when you need to ask question about weather", + "returnDirect": "", + "baseChain": "{{getApiChain_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "chainTool_0-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", + "name": "chainTool", + "label": "ChainTool", + "type": "ChainTool | DynamicTool | Tool | StructuredTool | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1600.1485877701232, + "y": 276.38970893436533 + }, + "dragging": false + }, + { + "width": 300, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 1642.0644080121785, + "y": 1715.6131926891728 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "name": "bufferMemory", + "version": 1, + "type": "BufferMemory", + "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": "input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + 
"outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1642.0644080121785, + "y": 1715.6131926891728 + }, + "dragging": false + }, + { + "width": 300, + "height": 602, + "id": "chainTool_1", + "position": { + "x": 1284.7746596034926, + "y": 895.1444797047182 + }, + "type": "customNode", + "data": { + "id": "chainTool_1", + "label": "Chain Tool", + "name": "chainTool", + "version": 1, + "type": "ChainTool", + "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool", "BaseLangChain"], + "category": "Tools", + "description": "Use a chain as allowed tool for agent", + "inputParams": [ + { + "label": "Chain Name", + "name": "name", + "type": "string", + "placeholder": "state-of-union-qa", + "id": "chainTool_1-input-name-string" + }, + { + "label": "Chain Description", + "name": "description", + "type": "string", + "rows": 3, + "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", + "id": "chainTool_1-input-description-string" + }, + { + "label": "Return Direct", + "name": "returnDirect", + "type": "boolean", + "optional": true, + "id": "chainTool_1-input-returnDirect-boolean" + } + ], + "inputAnchors": [ + { + "label": "Base Chain", + "name": "baseChain", + "type": "BaseChain", + "id": "chainTool_1-input-baseChain-BaseChain" + } + ], + "inputs": { + "name": "discord-bot", + "description": "useful for when you need to send message to Discord", + "returnDirect": "", + "baseChain": "{{postApiChain_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "chainTool_1-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", + "name": "chainTool", + "label": "ChainTool", + "type": "ChainTool | DynamicTool | Tool | StructuredTool | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1284.7746596034926, + "y": 895.1444797047182 + }, + "dragging": false + }, + { + 
"width": 300, + "height": 459, + "id": "postApiChain_0", + "position": { + "x": 933.3631140153886, + "y": 974.8756002461283 + }, + "type": "customNode", + "data": { + "id": "postApiChain_0", + "label": "POST API Chain", + "name": "postApiChain", + "version": 1, + "type": "POSTApiChain", + "baseClasses": ["POSTApiChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Chain to run queries against POST API", + "inputParams": [ + { + "label": "API Documentation", + "name": "apiDocs", + "type": "string", + "description": "Description of how API works. Please refer to more examples", + "rows": 4, + "id": "postApiChain_0-input-apiDocs-string" + }, + { + "label": "Headers", + "name": "headers", + "type": "json", + "additionalParams": true, + "optional": true, + "id": "postApiChain_0-input-headers-json" + }, + { + "label": "URL Prompt", + "name": "urlPrompt", + "type": "string", + "description": "Prompt used to tell LLMs how to construct the URL. Must contain {api_docs} and {question}", + "default": "You are given the below API Documentation:\n{api_docs}\nUsing this documentation, generate a json string with two keys: \"url\" and \"data\".\nThe value of \"url\" should be a string, which is the API url to call for answering the user question.\nThe value of \"data\" should be a dictionary of key-value pairs you want to POST to the url as a JSON body.\nBe careful to always use double quotes for strings in the json string.\nYou should build the json string in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.\n\nQuestion:{question}\njson string:", + "rows": 4, + "additionalParams": true, + "id": "postApiChain_0-input-urlPrompt-string" + }, + { + "label": "Answer Prompt", + "name": "ansPrompt", + "type": "string", + "description": "Prompt used to tell LLMs how to return the API response.
Must contain {api_response}, {api_url}, and {question}", + "default": "You are given the below API Documentation:\n{api_docs}\nUsing this documentation, generate a json string with two keys: \"url\" and \"data\".\nThe value of \"url\" should be a string, which is the API url to call for answering the user question.\nThe value of \"data\" should be a dictionary of key-value pairs you want to POST to the url as a JSON body.\nBe careful to always use double quotes for strings in the json string.\nYou should build the json string in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.\n\nQuestion:{question}\njson string: {api_url_body}\n\nHere is the response from the API:\n\n{api_response}\n\nSummarize this response to answer the original question.\n\nSummary:", + "rows": 4, + "additionalParams": true, + "id": "postApiChain_0-input-ansPrompt-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "postApiChain_0-input-model-BaseLanguageModel" + } + ], + "inputs": { + "model": "{{chatOpenAI_2.data.instance}}", + "apiDocs": "API documentation:\nEndpoint: https://eog776prcv6dg0j.m.pipedream.net\n\nThis API is for sending Discord message\n\nQuery body table:\nmessage | string | Message to send | required\n\nResponse schema (string):\nresult | string", + "headers": "", + "urlPrompt": "You are given the below API Documentation:\n{api_docs}\nUsing this documentation, generate a json string with two keys: \"url\" and \"data\".\nThe value of \"url\" should be a string, which is the API url to call for answering the user question.\nThe value of \"data\" should be a dictionary of key-value pairs you want to POST to the url as a JSON body.\nBe careful to always use double quotes for strings in the json string.\nYou should build the json string in order
to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.\n\nQuestion:{question}\njson string:", + "ansPrompt": "You are given the below API Documentation:\n{api_docs}\nUsing this documentation, generate a json string with two keys: \"url\" and \"data\".\nThe value of \"url\" should be a string, which is the API url to call for answering the user question.\nThe value of \"data\" should be a dictionary of key-value pairs you want to POST to the url as a JSON body.\nBe careful to always use double quotes for strings in the json string.\nYou should build the json string in order to get a response that is as short as possible, while still getting the necessary information to answer the question. Pay attention to deliberately exclude any unnecessary pieces of data in the API call.\n\nQuestion:{question}\njson string: {api_url_body}\n\nHere is the response from the API:\n\n{api_response}\n\nSummarize this response to answer the original question.\n\nSummary:" + }, + "outputAnchors": [ + { + "id": "postApiChain_0-output-postApiChain-POSTApiChain|BaseChain|BaseLangChain", + "name": "postApiChain", + "label": "POSTApiChain", + "type": "POSTApiChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 933.3631140153886, + "y": 974.8756002461283 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_2", + "position": { + "x": 572.8941615312035, + "y": 937.8425220917356 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_2", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": 
[ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_2-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_2-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_2-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": 
"string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_2-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_2-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 572.8941615312035, + "y": 937.8425220917356 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_1", + "position": { + "x": 828.7788305309582, + "y": 302.8996144964516 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_1", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_1-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + 
"default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_1-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_1-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 828.7788305309582, + "y": 302.8996144964516 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_3", + 
"position": { + "x": 1148.338912314111, + "y": 1561.0888070167944 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_3", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_3-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_3-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_3-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_3-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_3-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": 
"chatOpenAI_3-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_3-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_3-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_3-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_3-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1148.338912314111, + "y": 1561.0888070167944 + }, + "dragging": false + }, + { + "width": 300, + "height": 383, + "id": "conversationalAgent_0", + "position": { + "x": 2114.071431691489, + "y": 941.7926368551367 + }, + "type": "customNode", + "data": { + "id": "conversationalAgent_0", + "label": "Conversational Agent", + "name": "conversationalAgent", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain"], + "category": "Agents", + "description": "Conversational agent for a chat model. 
It will utilize chat specific prompts", + "inputParams": [ + { + "label": "System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "default": "Assistant is a large language model trained by OpenAI.\n\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\n\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\n\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist.", + "optional": true, + "additionalParams": true, + "id": "conversationalAgent_0-input-systemMessage-string" + } + ], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "conversationalAgent_0-input-tools-Tool" + }, + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalAgent_0-input-model-BaseLanguageModel" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseChatMemory", + "id": "conversationalAgent_0-input-memory-BaseChatMemory" + } + ], + "inputs": { + "tools": ["{{chainTool_0.data.instance}}", "{{chainTool_1.data.instance}}"], + "model": "{{chatOpenAI_3.data.instance}}", + "memory": "{{bufferMemory_0.data.instance}}", + "systemMessage": "Assistant is a large language model trained by OpenAI.\n\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\n\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\n\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist." + }, + "outputAnchors": [ + { + "id": "conversationalAgent_0-output-conversationalAgent-AgentExecutor|BaseChain", + "name": "conversationalAgent", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "dragging": false, + "positionAbsolute": { + "x": 2114.071431691489, + "y": 941.7926368551367 + } + } + ], + "edges": [ + { + "source": "getApiChain_0", + "sourceHandle": "getApiChain_0-output-getApiChain-GETApiChain|BaseChain|BaseLangChain", + "target": "chainTool_0", + "targetHandle": "chainTool_0-input-baseChain-BaseChain", + "type": "buttonedge", + "id": "getApiChain_0-getApiChain_0-output-getApiChain-GETApiChain|BaseChain|BaseLangChain-chainTool_0-chainTool_0-input-baseChain-BaseChain", + "data": { + "label": "" + } + }, + { + "source": "postApiChain_0", + "sourceHandle": "postApiChain_0-output-postApiChain-POSTApiChain|BaseChain|BaseLangChain", + "target": "chainTool_1", + "targetHandle": "chainTool_1-input-baseChain-BaseChain", + "type": "buttonedge", + "id": "postApiChain_0-postApiChain_0-output-postApiChain-POSTApiChain|BaseChain|BaseLangChain-chainTool_1-chainTool_1-input-baseChain-BaseChain", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_2", + "sourceHandle": "chatOpenAI_2-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "postApiChain_0", + "targetHandle": "postApiChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_2-chatOpenAI_2-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-postApiChain_0-postApiChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_1", + "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "getApiChain_0", + "targetHandle": 
"getApiChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-getApiChain_0-getApiChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "chainTool_0", + "sourceHandle": "chainTool_0-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", + "target": "conversationalAgent_0", + "targetHandle": "conversationalAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "chainTool_0-chainTool_0-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain-conversationalAgent_0-conversationalAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "chainTool_1", + "sourceHandle": "chainTool_1-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", + "target": "conversationalAgent_0", + "targetHandle": "conversationalAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "chainTool_1-chainTool_1-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain-conversationalAgent_0-conversationalAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_3", + "sourceHandle": "chatOpenAI_3-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "conversationalAgent_0", + "targetHandle": "conversationalAgent_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_3-chatOpenAI_3-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalAgent_0-conversationalAgent_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "bufferMemory_0", + "sourceHandle": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "target": "conversationalAgent_0", + "targetHandle": "conversationalAgent_0-input-memory-BaseChatMemory", + "type": "buttonedge", + "id": 
"bufferMemory_0-bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory-conversationalAgent_0-conversationalAgent_0-input-memory-BaseChatMemory", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/Antonym.json b/packages/server/marketplaces/chatflows/Antonym.json similarity index 70% rename from packages/server/marketplaces/Antonym.json rename to packages/server/marketplaces/chatflows/Antonym.json index 507015b38..95d3c1517 100644 --- a/packages/server/marketplaces/Antonym.json +++ b/packages/server/marketplaces/chatflows/Antonym.json @@ -3,68 +3,7 @@ "nodes": [ { "width": 300, - "height": 534, - "id": "promptTemplate_1", - "position": { - "x": 532.2791692529131, - "y": -31.128527027841372 - }, - "type": "customNode", - "data": { - "id": "promptTemplate_1", - "label": "Prompt Template", - "name": "promptTemplate", - "type": "PromptTemplate", - "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], - "category": "Prompts", - "description": "Schema to represent a basic prompt for an LLM", - "inputParams": [ - { - "label": "Template", - "name": "template", - "type": "string", - "rows": 4, - "placeholder": "What is a good name for a company that makes {product}?", - "id": "promptTemplate_1-input-template-string" - }, - { - "label": "Format Prompt Values", - "name": "promptValues", - "type": "string", - "rows": 4, - "placeholder": "{\n \"input_language\": \"English\",\n \"output_language\": \"French\"\n}", - "optional": true, - "acceptVariable": true, - "list": true, - "id": "promptTemplate_1-input-promptValues-string" - } - ], - "inputAnchors": [], - "inputs": { - "template": "Word: {word}\\nAntonym: {antonym}\\n", - "promptValues": "" - }, - "outputAnchors": [ - { - "id": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "name": "promptTemplate", - "label": "PromptTemplate", - "type": "PromptTemplate | BaseStringPromptTemplate | 
BasePromptTemplate" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 532.2791692529131, - "y": -31.128527027841372 - }, - "dragging": false - }, - { - "width": 300, - "height": 956, + "height": 955, "id": "fewShotPromptTemplate_1", "position": { "x": 886.3229032369354, @@ -75,6 +14,7 @@ "id": "fewShotPromptTemplate_1", "label": "Few Shot Prompt Template", "name": "fewShotPromptTemplate", + "version": 1, "type": "FewShotPromptTemplate", "baseClasses": ["FewShotPromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], "category": "Prompts", @@ -139,7 +79,7 @@ ], "inputs": { "examples": "[\n { \"word\": \"happy\", \"antonym\": \"sad\" },\n { \"word\": \"tall\", \"antonym\": \"short\" }\n]", - "examplePrompt": "{{promptTemplate_1.data.instance}}", + "examplePrompt": "{{promptTemplate_0.data.instance}}", "prefix": "Give the antonym of every input", "suffix": "Word: {input}\\nAntonym:", "exampleSeparator": "\\n\\n", @@ -165,137 +105,52 @@ }, { "width": 300, - "height": 526, - "id": "openAI_1", + "height": 475, + "id": "promptTemplate_0", "position": { - "x": 1224.5139327142097, - "y": -30.864315286062364 + "x": 540.0140796251119, + "y": -33.31673494170347 }, "type": "customNode", "data": { - "id": "openAI_1", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", + "id": "promptTemplate_0", + "label": "Prompt Template", + "name": "promptTemplate", + "version": 1, + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_1-input-openAIApiKey-password" + "label": "Template", + "name": 
"template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_0-input-template-string" }, { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": "text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - "default": "text-davinci-003", + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", "optional": true, - "id": "openAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-bestOf-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - 
"additionalParams": true, - "id": "openAI_1-input-timeout-number" + "acceptVariable": true, + "list": true, + "id": "promptTemplate_0-input-promptValues-json" } ], "inputAnchors": [], "inputs": { - "modelName": "text-davinci-003", - "temperature": 0.7, - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" + "template": "Word: {word}\\nAntonym: {antonym}\\n", + "promptValues": "" }, "outputAnchors": [ { - "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" + "id": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" } ], "outputs": {}, @@ -303,26 +158,181 @@ }, "selected": false, "positionAbsolute": { - "x": 1224.5139327142097, - "y": -30.864315286062364 + "x": 540.0140796251119, + "y": -33.31673494170347 }, "dragging": false }, { "width": 300, - "height": 407, - "id": "llmChain_1", + "height": 523, + "id": "chatOpenAI_0", "position": { - "x": 1635.363191180743, - "y": 450.00105475193766 + "x": 1226.7977900193628, + "y": 48.01100655894436 }, "type": "customNode", "data": { - "id": "llmChain_1", + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + 
"label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + 
"presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1226.7977900193628, + "y": 48.01100655894436 + }, + "dragging": false + }, + { + "width": 300, + "height": 405, + "id": "llmChain_0", + "position": { + "x": 1573.7490072386481, + "y": 429.1905949837192 + }, + "type": "customNode", + "data": { + "id": "llmChain_0", "label": "LLM Chain", "name": "llmChain", + "version": 1, "type": "LLMChain", - "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], + "baseClasses": ["LLMChain", "BaseChain"], "category": "Chains", "description": "Chain to run queries against LLMs", "inputParams": [ @@ -332,7 +342,7 @@ "type": "string", "placeholder": "Name Your Chain", "optional": true, - "id": "llmChain_1-input-chainName-string" + "id": "llmChain_0-input-chainName-string" } ], "inputAnchors": [ @@ -340,17 +350,17 @@ "label": "Language Model", "name": "model", "type": "BaseLanguageModel", - "id": "llmChain_1-input-model-BaseLanguageModel" + "id": "llmChain_0-input-model-BaseLanguageModel" }, { "label": "Prompt", "name": "prompt", "type": "BasePromptTemplate", - "id": "llmChain_1-input-prompt-BasePromptTemplate" + "id": "llmChain_0-input-prompt-BasePromptTemplate" } ], "inputs": { - "model": "{{openAI_1.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", "prompt": "{{fewShotPromptTemplate_1.data.instance}}", "chainName": "" }, @@ -361,16 +371,16 @@ "type": "options", "options": [ { - "id": "llmChain_1-output-llmChain-LLMChain|BaseChain|BaseLangChain", + "id": "llmChain_0-output-llmChain-LLMChain|BaseChain", "name": "llmChain", "label": "LLM Chain", - "type": "LLMChain | BaseChain | BaseLangChain" + "type": "LLMChain | BaseChain" }, { - "id": 
"llmChain_1-output-outputPrediction-string", + "id": "llmChain_0-output-outputPrediction-string|json", "name": "outputPrediction", "label": "Output Prediction", - "type": "string" + "type": "string | json" } ], "default": "llmChain" @@ -381,33 +391,33 @@ }, "selected": false }, - "positionAbsolute": { - "x": 1635.363191180743, - "y": 450.00105475193766 - }, "selected": false, + "positionAbsolute": { + "x": 1573.7490072386481, + "y": 429.1905949837192 + }, "dragging": false } ], "edges": [ { - "source": "promptTemplate_1", - "sourceHandle": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "source": "promptTemplate_0", + "sourceHandle": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", "target": "fewShotPromptTemplate_1", "targetHandle": "fewShotPromptTemplate_1-input-examplePrompt-PromptTemplate", "type": "buttonedge", - "id": "promptTemplate_1-promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-fewShotPromptTemplate_1-fewShotPromptTemplate_1-input-examplePrompt-PromptTemplate", + "id": "promptTemplate_0-promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-fewShotPromptTemplate_1-fewShotPromptTemplate_1-input-examplePrompt-PromptTemplate", "data": { "label": "" } }, { - "source": "openAI_1", - "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "target": "llmChain_1", - "targetHandle": "llmChain_1-input-model-BaseLanguageModel", + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "llmChain_0", + "targetHandle": "llmChain_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-llmChain_1-llmChain_1-input-model-BaseLanguageModel", + "id": 
"chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-llmChain_0-llmChain_0-input-model-BaseLanguageModel", "data": { "label": "" } @@ -415,10 +425,10 @@ { "source": "fewShotPromptTemplate_1", "sourceHandle": "fewShotPromptTemplate_1-output-fewShotPromptTemplate-FewShotPromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "target": "llmChain_1", - "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + "target": "llmChain_0", + "targetHandle": "llmChain_0-input-prompt-BasePromptTemplate", "type": "buttonedge", - "id": "fewShotPromptTemplate_1-fewShotPromptTemplate_1-output-fewShotPromptTemplate-FewShotPromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "id": "fewShotPromptTemplate_1-fewShotPromptTemplate_1-output-fewShotPromptTemplate-FewShotPromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_0-llmChain_0-input-prompt-BasePromptTemplate", "data": { "label": "" } diff --git a/packages/server/marketplaces/AutoGPT.json b/packages/server/marketplaces/chatflows/AutoGPT.json similarity index 74% rename from packages/server/marketplaces/AutoGPT.json rename to packages/server/marketplaces/chatflows/AutoGPT.json index 1ec202776..538371516 100644 --- a/packages/server/marketplaces/AutoGPT.json +++ b/packages/server/marketplaces/chatflows/AutoGPT.json @@ -3,7 +3,7 @@ "nodes": [ { "width": 300, - "height": 629, + "height": 627, "id": "autoGPT_0", "position": { "x": 1627.8124366169843, @@ -14,6 +14,7 @@ "id": "autoGPT_0", "label": "AutoGPT", "name": "autoGPT", + "version": 1, "type": "AutoGPT", "baseClasses": ["AutoGPT"], "category": "Agents", @@ -67,8 +68,8 @@ ], "inputs": { "tools": ["{{readFile_0.data.instance}}", "{{writeFile_1.data.instance}}", "{{serpAPI_0.data.instance}}"], - "model": "{{chatOpenAI_1.data.instance}}", - "vectorStoreRetriever": "{{pineconeExistingIndex_1.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", + 
"vectorStoreRetriever": "{{pineconeExistingIndex_0.data.instance}}", "aiName": "", "aiRole": "", "maxLoop": 5 @@ -93,148 +94,18 @@ }, { "width": 300, - "height": 526, - "id": "chatOpenAI_1", - "position": { - "x": 168.57515834535457, - "y": -90.74139976987627 - }, - "type": "customNode", - "data": { - "id": "chatOpenAI_1", - "label": "ChatOpenAI", - "name": "chatOpenAI", - "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], - "category": "Chat Models", - "description": "Wrapper around OpenAI large language models that use the Chat endpoint", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "gpt-4", - "name": "gpt-4" - }, - { - "label": "gpt-4-0314", - "name": "gpt-4-0314" - }, - { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" - }, - { - "label": "gpt-3.5-turbo", - "name": "gpt-3.5-turbo" - }, - { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" - } - ], - "default": "gpt-3.5-turbo", - "optional": true, - "id": "chatOpenAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.9, - "optional": true, - "id": "chatOpenAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-topP-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - 
"name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-presencePenalty-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "gpt-3.5-turbo", - "temperature": "0", - "maxTokens": "", - "topP": "", - "frequencyPenalty": "", - "presencePenalty": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "name": "chatOpenAI", - "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 168.57515834535457, - "y": -90.74139976987627 - }, - "dragging": false - }, - { - "width": 300, - "height": 279, + "height": 278, "id": "writeFile_1", "position": { - "x": 546.3440710182241, - "y": 55.28691941459434 + "x": 539.4976647298655, + "y": 36.45930212160803 }, "type": "customNode", "data": { "id": "writeFile_1", "label": "Write File", "name": "writeFile", + "version": 1, "type": "WriteFile", "baseClasses": ["WriteFile", "Tool", "StructuredTool", "BaseLangChain"], "category": "Tools", @@ -265,15 +136,15 @@ "selected": false }, "positionAbsolute": { - "x": 546.3440710182241, - "y": 55.28691941459434 + "x": 539.4976647298655, + "y": 36.45930212160803 }, "selected": false, "dragging": false }, { "width": 300, - "height": 279, + "height": 278, "id": "readFile_0", "position": { "x": 881.2568465391292, @@ -284,6 +155,7 @@ "id": "readFile_0", "label": "Read File", "name": "readFile", + "version": 1, "type": "ReadFile", "baseClasses": ["ReadFile", "Tool", "StructuredTool", "BaseLangChain"], "category": "Tools", @@ -322,37 +194,39 @@ }, { "width": 300, - "height": 279, + "height": 277, "id": "serpAPI_0", "position": 
{ - "x": 1244.740380161344, - "y": -193.9135818023827 + "x": 1247.066832724479, + "y": -193.77467220135756 }, "type": "customNode", "data": { "id": "serpAPI_0", "label": "Serp API", "name": "serpAPI", + "version": 1, "type": "SerpAPI", - "baseClasses": ["SerpAPI", "Tool", "StructuredTool", "BaseLangChain"], + "baseClasses": ["SerpAPI", "Tool", "StructuredTool"], "category": "Tools", "description": "Wrapper around SerpAPI - a real-time API to access Google search results", "inputParams": [ { - "label": "Serp Api Key", - "name": "apiKey", - "type": "password", - "id": "serpAPI_0-input-apiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["serpApi"], + "id": "serpAPI_0-input-credential-credential" } ], "inputAnchors": [], "inputs": {}, "outputAnchors": [ { - "id": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", + "id": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool", "name": "serpAPI", "label": "SerpAPI", - "type": "SerpAPI | Tool | StructuredTool | BaseLangChain" + "type": "SerpAPI | Tool | StructuredTool" } ], "outputs": {}, @@ -360,34 +234,190 @@ }, "selected": false, "positionAbsolute": { - "x": 1244.740380161344, - "y": -193.9135818023827 + "x": 1247.066832724479, + "y": -193.77467220135756 }, "dragging": false }, { "width": 300, - "height": 331, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 176.69787776192283, + "y": -116.3808686218022 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": 
"chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + 
"inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 176.69787776192283, + "y": -116.3808686218022 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, "id": "openAIEmbeddings_0", "position": { - "x": 530.4714276286077, - "y": 487.0228196121594 + "x": 606.7317612889267, + "y": 439.5269912996025 }, "type": "customNode", "data": { "id": "openAIEmbeddings_0", "label": "OpenAI Embeddings", "name": "openAIEmbeddings", + "version": 1, "type": "OpenAIEmbeddings", "baseClasses": ["OpenAIEmbeddings", "Embeddings"], "category": "Embeddings", "description": "OpenAI API to generate embeddings for a given text", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" }, { "label": "Strip New Lines", @@ -412,13 +442,22 @@ "optional": true, "additionalParams": true, "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" } ], "inputAnchors": [], "inputs": { "stripNewLines": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { @@ -431,56 +470,53 @@ "outputs": {}, "selected": false }, - "positionAbsolute": { - "x": 530.4714276286077, - "y": 
487.0228196121594 - }, "selected": false, + "positionAbsolute": { + "x": 606.7317612889267, + "y": 439.5269912996025 + }, "dragging": false }, { "width": 300, - "height": 652, - "id": "pineconeExistingIndex_1", + "height": 505, + "id": "pineconeExistingIndex_0", "position": { - "x": 943.1601557586332, - "y": 404.9622062733608 + "x": 1001.3784758268554, + "y": 415.24072209485803 }, "type": "customNode", "data": { - "id": "pineconeExistingIndex_1", + "id": "pineconeExistingIndex_0", "label": "Pinecone Load Existing Index", "name": "pineconeExistingIndex", + "version": 1, "type": "Pinecone", "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], "category": "Vector Stores", "description": "Load existing index from Pinecone (i.e: Document has been upserted)", "inputParams": [ { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": "pineconeExistingIndex_1-input-pineconeApiKey-password" - }, - { - "label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeExistingIndex_1-input-pineconeEnv-string" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" }, { "label": "Pinecone Index", "name": "pineconeIndex", "type": "string", - "id": "pineconeExistingIndex_1-input-pineconeIndex-string" + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" }, { "label": "Pinecone Namespace", "name": "pineconeNamespace", "type": "string", "placeholder": "my-first-namespace", + "additionalParams": true, "optional": true, - "id": "pineconeExistingIndex_1-input-pineconeNamespace-string" + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" }, { "label": "Pinecone Metadata Filter", @@ -488,7 +524,17 @@ "type": "json", "optional": true, "additionalParams": true, - "id": "pineconeExistingIndex_1-input-pineconeMetadataFilter-json" + "id": 
"pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" } ], "inputAnchors": [ @@ -496,14 +542,15 @@ "label": "Embeddings", "name": "embeddings", "type": "Embeddings", - "id": "pineconeExistingIndex_1-input-embeddings-Embeddings" + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" } ], "inputs": { "embeddings": "{{openAIEmbeddings_0.data.instance}}", - "pineconeEnv": "us-west4-gcp", "pineconeIndex": "", - "pineconeNamespace": "" + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" }, "outputAnchors": [ { @@ -512,13 +559,13 @@ "type": "options", "options": [ { - "id": "pineconeExistingIndex_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", "name": "retriever", "label": "Pinecone Retriever", "type": "Pinecone | VectorStoreRetriever | BaseRetriever" }, { - "id": "pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore", + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", "name": "vectorStore", "label": "Pinecone Vector Store", "type": "Pinecone | VectorStore" @@ -533,47 +580,14 @@ "selected": false }, "selected": false, + "dragging": false, "positionAbsolute": { - "x": 943.1601557586332, - "y": 404.9622062733608 - }, - "dragging": false + "x": 1001.3784758268554, + "y": 415.24072209485803 + } } ], "edges": [ - { - "source": "pineconeExistingIndex_1", - "sourceHandle": "pineconeExistingIndex_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", - "target": "autoGPT_0", - "targetHandle": "autoGPT_0-input-vectorStoreRetriever-BaseRetriever", - "type": "buttonedge", - "id": 
"pineconeExistingIndex_1-pineconeExistingIndex_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-autoGPT_0-autoGPT_0-input-vectorStoreRetriever-BaseRetriever", - "data": { - "label": "" - } - }, - { - "source": "openAIEmbeddings_0", - "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "target": "pineconeExistingIndex_1", - "targetHandle": "pineconeExistingIndex_1-input-embeddings-Embeddings", - "type": "buttonedge", - "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_1-pineconeExistingIndex_1-input-embeddings-Embeddings", - "data": { - "label": "" - } - }, - { - "source": "chatOpenAI_1", - "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "target": "autoGPT_0", - "targetHandle": "autoGPT_0-input-model-BaseChatModel", - "type": "buttonedge", - "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-autoGPT_0-autoGPT_0-input-model-BaseChatModel", - "data": { - "label": "" - } - }, { "source": "writeFile_1", "sourceHandle": "writeFile_1-output-writeFile-WriteFile|Tool|StructuredTool|BaseLangChain", @@ -596,13 +610,46 @@ "label": "" } }, + { + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "target": "autoGPT_0", + "targetHandle": "autoGPT_0-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": "pineconeExistingIndex_0-pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-autoGPT_0-autoGPT_0-input-vectorStoreRetriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": 
"pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "autoGPT_0", + "targetHandle": "autoGPT_0-input-model-BaseChatModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-autoGPT_0-autoGPT_0-input-model-BaseChatModel", + "data": { + "label": "" + } + }, { "source": "serpAPI_0", - "sourceHandle": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", + "sourceHandle": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool", "target": "autoGPT_0", "targetHandle": "autoGPT_0-input-tools-Tool", "type": "buttonedge", - "id": "serpAPI_0-serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain-autoGPT_0-autoGPT_0-input-tools-Tool", + "id": "serpAPI_0-serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool-autoGPT_0-autoGPT_0-input-tools-Tool", "data": { "label": "" } diff --git a/packages/server/marketplaces/BabyAGI.json b/packages/server/marketplaces/chatflows/BabyAGI.json similarity index 64% rename from packages/server/marketplaces/BabyAGI.json rename to packages/server/marketplaces/chatflows/BabyAGI.json index 6cb40519d..c28975314 100644 --- a/packages/server/marketplaces/BabyAGI.json +++ b/packages/server/marketplaces/chatflows/BabyAGI.json @@ -3,312 +3,7 @@ "nodes": [ { "width": 300, - "height": 331, - "id": "openAIEmbeddings_1", - "position": { - "x": -84.60344342694289, - "y": -189.6930708050951 - }, - "type": "customNode", - "data": { - "id": "openAIEmbeddings_1", - "label": "OpenAI Embeddings", - "name": "openAIEmbeddings", - "type": "OpenAIEmbeddings", - "baseClasses": ["OpenAIEmbeddings", 
"Embeddings"], - "category": "Embeddings", - "description": "OpenAI API to generate embeddings for a given text", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_1-input-openAIApiKey-password" - }, - { - "label": "Strip New Lines", - "name": "stripNewLines", - "type": "boolean", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-stripNewLines-boolean" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "stripNewLines": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "name": "openAIEmbeddings", - "label": "OpenAIEmbeddings", - "type": "OpenAIEmbeddings | Embeddings" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": -84.60344342694289, - "y": -189.6930708050951 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 652, - "id": "pineconeExistingIndex_1", - "position": { - "x": 264.729293346415, - "y": -190.36689763560724 - }, - "type": "customNode", - "data": { - "id": "pineconeExistingIndex_1", - "label": "Pinecone Load Existing Index", - "name": "pineconeExistingIndex", - "type": "Pinecone", - "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], - "category": "Vector Stores", - "description": "Load existing index from Pinecone (i.e: Document has been upserted)", - "inputParams": [ - { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": "pineconeExistingIndex_1-input-pineconeApiKey-password" - }, - { - 
"label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeExistingIndex_1-input-pineconeEnv-string" - }, - { - "label": "Pinecone Index", - "name": "pineconeIndex", - "type": "string", - "id": "pineconeExistingIndex_1-input-pineconeIndex-string" - }, - { - "label": "Pinecone Namespace", - "name": "pineconeNamespace", - "type": "string", - "placeholder": "my-first-namespace", - "optional": true, - "id": "pineconeExistingIndex_1-input-pineconeNamespace-string" - }, - { - "label": "Pinecone Metadata Filter", - "name": "pineconeMetadataFilter", - "type": "json", - "optional": true, - "additionalParams": true, - "id": "pineconeExistingIndex_1-input-pineconeMetadataFilter-json" - } - ], - "inputAnchors": [ - { - "label": "Embeddings", - "name": "embeddings", - "type": "Embeddings", - "id": "pineconeExistingIndex_1-input-embeddings-Embeddings" - } - ], - "inputs": { - "embeddings": "{{openAIEmbeddings_1.data.instance}}", - "pineconeEnv": "us-west4-gcp", - "pineconeIndex": "", - "pineconeNamespace": "" - }, - "outputAnchors": [ - { - "name": "output", - "label": "Output", - "type": "options", - "options": [ - { - "id": "pineconeExistingIndex_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", - "name": "retriever", - "label": "Pinecone Retriever", - "type": "Pinecone | VectorStoreRetriever | BaseRetriever" - }, - { - "id": "pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore", - "name": "vectorStore", - "label": "Pinecone Vector Store", - "type": "Pinecone | VectorStore" - } - ], - "default": "retriever" - } - ], - "outputs": { - "output": "vectorStore" - }, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 264.729293346415, - "y": -190.36689763560724 - }, - "dragging": false - }, - { - "width": 300, - "height": 526, - "id": "chatOpenAI_1", - "position": { - "x": 590.3367401418911, - "y": -374.0329977259934 - }, - "type": "customNode", - "data": { - "id": "chatOpenAI_1", - "label": 
"ChatOpenAI", - "name": "chatOpenAI", - "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], - "category": "Chat Models", - "description": "Wrapper around OpenAI large language models that use the Chat endpoint", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "gpt-4", - "name": "gpt-4" - }, - { - "label": "gpt-4-0314", - "name": "gpt-4-0314" - }, - { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" - }, - { - "label": "gpt-3.5-turbo", - "name": "gpt-3.5-turbo" - }, - { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" - } - ], - "default": "gpt-3.5-turbo", - "optional": true, - "id": "chatOpenAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.9, - "optional": true, - "id": "chatOpenAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-topP-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-presencePenalty-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-timeout-number" - } - ], - "inputAnchors": [], 
- "inputs": { - "modelName": "gpt-3.5-turbo", - "temperature": "0", - "maxTokens": "", - "topP": "", - "frequencyPenalty": "", - "presencePenalty": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "name": "chatOpenAI", - "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "dragging": false, - "positionAbsolute": { - "x": 590.3367401418911, - "y": -374.0329977259934 - } - }, - { - "width": 300, - "height": 380, + "height": 379, "id": "babyAGI_1", "position": { "x": 950.8042093214954, @@ -319,6 +14,7 @@ "id": "babyAGI_1", "label": "BabyAGI", "name": "babyAGI", + "version": 1, "type": "BabyAGI", "baseClasses": ["BabyAGI"], "category": "Agents", @@ -347,8 +43,8 @@ } ], "inputs": { - "model": "{{chatOpenAI_1.data.instance}}", - "vectorStore": "{{pineconeExistingIndex_1.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStore": "{{pineconeExistingIndex_0.data.instance}}", "taskLoop": 3 }, "outputAnchors": [ @@ -368,38 +64,385 @@ "x": 950.8042093214954, "y": 66.00028106865324 } + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 587.1798180512677, + "y": -355.9845878719703 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + 
"label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + 
"presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 587.1798180512677, + "y": -355.9845878719703 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_0", + "position": { + "x": -111.82510263637522, + "y": -224.88655030419665 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": 
"openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": -111.82510263637522, + "y": -224.88655030419665 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "pineconeExistingIndex_0", + "position": { + "x": 241.78764591331816, + "y": -38.438460915613945 + }, + "type": "customNode", + "data": { + "id": "pineconeExistingIndex_0", + "label": "Pinecone Load Existing Index", + "name": "pineconeExistingIndex", + "version": 1, + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Pinecone (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" + }, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" + }, + { + "label": "Pinecone Metadata Filter", + "name": "pineconeMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. 
Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "newindex", + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "vectorStore" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 241.78764591331816, + "y": -38.438460915613945 + }, + "dragging": false } ], "edges": [ { - "source": "openAIEmbeddings_1", - "sourceHandle": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "target": "pineconeExistingIndex_1", - "targetHandle": "pineconeExistingIndex_1-input-embeddings-Embeddings", - "type": "buttonedge", - "id": "openAIEmbeddings_1-openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_1-pineconeExistingIndex_1-input-embeddings-Embeddings", - "data": { - "label": "" - } - }, - { - "source": "chatOpenAI_1", - "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "target": "babyAGI_1", - "targetHandle": "babyAGI_1-input-model-BaseChatModel", - "type": "buttonedge", - "id": 
"chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-babyAGI_1-babyAGI_1-input-model-BaseChatModel", - "data": { - "label": "" - } - }, - { - "source": "pineconeExistingIndex_1", - "sourceHandle": "pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore", + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", "target": "babyAGI_1", "targetHandle": "babyAGI_1-input-vectorStore-VectorStore", "type": "buttonedge", - "id": "pineconeExistingIndex_1-pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore-babyAGI_1-babyAGI_1-input-vectorStore-VectorStore", + "id": "pineconeExistingIndex_0-pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore-babyAGI_1-babyAGI_1-input-vectorStore-VectorStore", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "babyAGI_1", + "targetHandle": "babyAGI_1-input-model-BaseChatModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-babyAGI_1-babyAGI_1-input-model-BaseChatModel", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/CSV Agent.json b/packages/server/marketplaces/chatflows/CSV Agent.json new file mode 100644 index 000000000..1515fcad3 --- /dev/null +++ b/packages/server/marketplaces/chatflows/CSV Agent.json 
@@ -0,0 +1,228 @@ +{ + "description": "Analyse and summarize CSV data", + "nodes": [ + { + "width": 300, + "height": 377, + "id": "csvAgent_0", + "position": { + "x": 1064.0780498701288, + "y": 284.44352695304724 + }, + "type": "customNode", + "data": { + "id": "csvAgent_0", + "label": "CSV Agent", + "name": "csvAgent", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain"], + "category": "Agents", + "description": "Agent used to answer queries on CSV data", + "inputParams": [ + { + "label": "Csv File", + "name": "csvFile", + "type": "file", + "fileType": ".csv", + "id": "csvAgent_0-input-csvFile-file" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "csvAgent_0-input-model-BaseLanguageModel" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "csvAgent_0-output-csvAgent-AgentExecutor|BaseChain", + "name": "csvAgent", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1064.0780498701288, + "y": 284.44352695304724 + }, + "dragging": false + }, + { + "width": 300, + "height": 522, + "id": "chatOpenAI_0", + "position": { + "x": 657.3762197414501, + "y": 220.2950766042332 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + {
+ "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + 
"presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 657.3762197414501, + "y": 220.2950766042332 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "csvAgent_0", + "targetHandle": "csvAgent_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-csvAgent_0-csvAgent_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/ChatGPTPlugin.json b/packages/server/marketplaces/chatflows/ChatGPTPlugin.json similarity index 87% rename from packages/server/marketplaces/ChatGPTPlugin.json rename to packages/server/marketplaces/chatflows/ChatGPTPlugin.json index 4eec2ccc1..471853baa 100644 --- a/packages/server/marketplaces/ChatGPTPlugin.json +++ b/packages/server/marketplaces/chatflows/ChatGPTPlugin.json @@ -14,6 +14,7 @@ "id": "aiPlugin_0", "label": "AI Plugin", "name": "aiPlugin", + "version": 1, "type": "AIPlugin", "baseClasses": ["AIPlugin", "Tool"], "category": "Tools", @@ -60,6 +61,7 @@ "id": "requestsGet_0", "label": "Requests Get", "name": "requestsGet", + "version": 1, "type": "RequestsGet", "baseClasses": ["RequestsGet", "Tool", "StructuredTool", "BaseLangChain"], "category": "Tools", @@ -131,6 +133,7 @@ "id": "requestsPost_0", "label": "Requests Post", "name": "requestsPost", + "version": 1, "type": "RequestsPost", "baseClasses": ["RequestsPost", "Tool", "StructuredTool", "BaseLangChain"], "category": "Tools", @@ -201,82 +204,29 @@ }, { "width": 300, - 
"height": 280, - "id": "mrklAgentChat_0", - "position": { - "x": 1416.2054860029416, - "y": 451.43299014109715 - }, - "type": "customNode", - "data": { - "id": "mrklAgentChat_0", - "label": "MRKL Agent for Chat Models", - "name": "mrklAgentChat", - "type": "AgentExecutor", - "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], - "category": "Agents", - "description": "Agent that uses the ReAct Framework to decide what action to take, optimized to be used with Chat Models", - "inputParams": [], - "inputAnchors": [ - { - "label": "Allowed Tools", - "name": "tools", - "type": "Tool", - "list": true, - "id": "mrklAgentChat_0-input-tools-Tool" - }, - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "mrklAgentChat_0-input-model-BaseLanguageModel" - } - ], - "inputs": { - "tools": ["{{requestsGet_0.data.instance}}", "{{requestsPost_0.data.instance}}", "{{aiPlugin_0.data.instance}}"], - "model": "{{chatOpenAI_0.data.instance}}" - }, - "outputAnchors": [ - { - "id": "mrklAgentChat_0-output-mrklAgentChat-AgentExecutor|BaseChain|BaseLangChain", - "name": "mrklAgentChat", - "label": "AgentExecutor", - "type": "AgentExecutor | BaseChain | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1416.2054860029416, - "y": 451.43299014109715 - }, - "dragging": false - }, - { - "width": 300, - "height": 524, + "height": 523, "id": "chatOpenAI_0", "position": { - "x": 797.0574814814245, - "y": 578.7641992971934 + "x": 802.0103755177098, + "y": 576.0760341170851 }, "type": "customNode", "data": { "id": "chatOpenAI_0", "label": "ChatOpenAI", "name": "chatOpenAI", + "version": 1, "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], "category": "Chat Models", "description": "Wrapper around OpenAI large language models that use the Chat endpoint", 
"inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { "label": "Model Name", @@ -288,20 +238,32 @@ "name": "gpt-4" }, { - "label": "gpt-4-0314", - "name": "gpt-4-0314" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" }, { "label": "gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], "default": "gpt-3.5-turbo", @@ -355,6 +317,14 @@ "optional": true, "additionalParams": true, "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" } ], "inputAnchors": [], @@ -365,14 +335,15 @@ "topP": "", "frequencyPenalty": "", "presencePenalty": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "name": "chatOpenAI", "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, @@ -380,24 +351,69 @@ }, "selected": false, "positionAbsolute": { - "x": 797.0574814814245, - "y": 578.7641992971934 + "x": 
802.0103755177098, + "y": 576.0760341170851 + }, + "dragging": false + }, + { + "width": 300, + "height": 280, + "id": "mrklAgentChat_0", + "position": { + "x": 1425.5853300862047, + "y": 441.06218012993924 + }, + "type": "customNode", + "data": { + "id": "mrklAgentChat_0", + "label": "MRKL Agent for Chat Models", + "name": "mrklAgentChat", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain"], + "category": "Agents", + "description": "Agent that uses the ReAct Framework to decide what action to take, optimized to be used with Chat Models", + "inputParams": [], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "mrklAgentChat_0-input-tools-Tool" + }, + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "mrklAgentChat_0-input-model-BaseLanguageModel" + } + ], + "inputs": { + "tools": ["{{requestsGet_0.data.instance}}", "{{requestsPost_0.data.instance}}", "{{aiPlugin_0.data.instance}}"], + "model": "{{chatOpenAI_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "mrklAgentChat_0-output-mrklAgentChat-AgentExecutor|BaseChain", + "name": "mrklAgentChat", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1425.5853300862047, + "y": 441.06218012993924 }, "dragging": false } ], "edges": [ - { - "source": "requestsGet_0", - "sourceHandle": "requestsGet_0-output-requestsGet-RequestsGet|Tool|StructuredTool|BaseLangChain", - "target": "mrklAgentChat_0", - "targetHandle": "mrklAgentChat_0-input-tools-Tool", - "type": "buttonedge", - "id": "requestsGet_0-requestsGet_0-output-requestsGet-RequestsGet|Tool|StructuredTool|BaseLangChain-mrklAgentChat_0-mrklAgentChat_0-input-tools-Tool", - "data": { - "label": "" - } - }, { "source": "aiPlugin_0", "sourceHandle": "aiPlugin_0-output-aiPlugin-AIPlugin|Tool", @@ -409,6 +425,17 @@ 
"label": "" } }, + { + "source": "requestsGet_0", + "sourceHandle": "requestsGet_0-output-requestsGet-RequestsGet|Tool|StructuredTool|BaseLangChain", + "target": "mrklAgentChat_0", + "targetHandle": "mrklAgentChat_0-input-tools-Tool", + "type": "buttonedge", + "id": "requestsGet_0-requestsGet_0-output-requestsGet-RequestsGet|Tool|StructuredTool|BaseLangChain-mrklAgentChat_0-mrklAgentChat_0-input-tools-Tool", + "data": { + "label": "" + } + }, { "source": "requestsPost_0", "sourceHandle": "requestsPost_0-output-requestsPost-RequestsPost|Tool|StructuredTool|BaseLangChain", @@ -422,11 +449,11 @@ }, { "source": "chatOpenAI_0", - "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "mrklAgentChat_0", "targetHandle": "mrklAgentChat_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-mrklAgentChat_0-mrklAgentChat_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-mrklAgentChat_0-mrklAgentChat_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/Claude LLM.json b/packages/server/marketplaces/chatflows/Claude LLM.json new file mode 100644 index 000000000..243d26001 --- /dev/null +++ b/packages/server/marketplaces/chatflows/Claude LLM.json @@ -0,0 +1,416 @@ +{ + "description": "Use Anthropic Claude with 100k context window to ingest whole document for QnA", + "nodes": [ + { + "width": 300, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 451.4449437285705, + "y": 118.30026803362762 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "name": "bufferMemory", + "version": 1, + "type": "BufferMemory", + 
"baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": "input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 451.4449437285705, + "y": 118.30026803362762 + }, + "dragging": false + }, + { + "width": 300, + "height": 383, + "id": "conversationChain_0", + "position": { + "x": 1176.1569322079652, + "y": 303.56879146735974 + }, + "type": "customNode", + "data": { + "id": "conversationChain_0", + "label": "Conversation Chain", + "name": "conversationChain", + "version": 1, + "type": "ConversationChain", + "baseClasses": ["ConversationChain", "LLMChain", "BaseChain"], + "category": "Chains", + "description": "Chat models specific conversational chain with memory", + "inputParams": [ + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "You are a helpful assistant that write codes", + "id": "conversationChain_0-input-systemMessagePrompt-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseChatModel", + "id": "conversationChain_0-input-model-BaseChatModel" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + 
"id": "conversationChain_0-input-memory-BaseMemory" + }, + { + "label": "Document", + "name": "document", + "type": "Document", + "description": "Include whole document into the context window", + "optional": true, + "list": true, + "id": "conversationChain_0-input-document-Document" + } + ], + "inputs": { + "model": "{{chatAnthropic_0.data.instance}}", + "memory": "{{bufferMemory_0.data.instance}}", + "document": ["{{pdfFile_0.data.instance}}"], + "systemMessagePrompt": "" + }, + "outputAnchors": [ + { + "id": "conversationChain_0-output-conversationChain-ConversationChain|LLMChain|BaseChain", + "name": "conversationChain", + "label": "ConversationChain", + "type": "ConversationChain | LLMChain | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1176.1569322079652, + "y": 303.56879146735974 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatAnthropic_0", + "position": { + "x": 800.5525382783799, + "y": -76.7988221837009 + }, + "type": "customNode", + "data": { + "id": "chatAnthropic_0", + "label": "ChatAnthropic", + "name": "chatAnthropic", + "version": 1, + "type": "ChatAnthropic", + "baseClasses": ["ChatAnthropic", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around ChatAnthropic large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["anthropicApi"], + "id": "chatAnthropic_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "claude-2", + "name": "claude-2", + "description": "Claude 2 latest major version, automatically get updates to the model as they are released" + }, + { + "label": "claude-instant-1", + "name": "claude-instant-1", + "description": "Claude Instant latest major version, automatically get updates to 
the model as they are released" + }, + { + "label": "claude-v1", + "name": "claude-v1" + }, + { + "label": "claude-v1-100k", + "name": "claude-v1-100k" + }, + { + "label": "claude-v1.0", + "name": "claude-v1.0" + }, + { + "label": "claude-v1.2", + "name": "claude-v1.2" + }, + { + "label": "claude-v1.3", + "name": "claude-v1.3" + }, + { + "label": "claude-v1.3-100k", + "name": "claude-v1.3-100k" + }, + { + "label": "claude-instant-v1", + "name": "claude-instant-v1" + }, + { + "label": "claude-instant-v1-100k", + "name": "claude-instant-v1-100k" + }, + { + "label": "claude-instant-v1.0", + "name": "claude-instant-v1.0" + }, + { + "label": "claude-instant-v1.1", + "name": "claude-instant-v1.1" + }, + { + "label": "claude-instant-v1.1-100k", + "name": "claude-instant-v1.1-100k" + } + ], + "default": "claude-v1", + "optional": true, + "id": "chatAnthropic_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatAnthropic_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokensToSample", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatAnthropic_0-input-maxTokensToSample-number" + }, + { + "label": "Top P", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatAnthropic_0-input-topP-number" + }, + { + "label": "Top K", + "name": "topK", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatAnthropic_0-input-topK-number" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "claude-2", + "temperature": 0.9, + "maxTokensToSample": "", + "topP": "", + "topK": "" + }, + "outputAnchors": [ + { + "id": "chatAnthropic_0-output-chatAnthropic-ChatAnthropic|BaseChatModel|BaseLanguageModel", + "name": "chatAnthropic", + "label": "ChatAnthropic", + "type": "ChatAnthropic | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": 
false + }, + "selected": false, + "positionAbsolute": { + "x": 800.5525382783799, + "y": -76.7988221837009 + }, + "dragging": false + }, + { + "width": 300, + "height": 507, + "id": "pdfFile_0", + "position": { + "x": 94.16886576108482, + "y": 37.12056504707391 + }, + "type": "customNode", + "data": { + "id": "pdfFile_0", + "label": "Pdf File", + "name": "pdfFile", + "version": 1, + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Load data from PDF files", + "inputParams": [ + { + "label": "Pdf File", + "name": "pdfFile", + "type": "file", + "fileType": ".pdf", + "id": "pdfFile_0-input-pdfFile-file" + }, + { + "label": "Usage", + "name": "usage", + "type": "options", + "options": [ + { + "label": "One document per page", + "name": "perPage" + }, + { + "label": "One document per file", + "name": "perFile" + } + ], + "default": "perPage", + "id": "pdfFile_0-input-usage-options" + }, + { + "label": "Use Legacy Build", + "name": "legacyBuild", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "pdfFile_0-input-legacyBuild-boolean" + }, + { + "label": "Metadata", + "name": "metadata", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pdfFile_0-input-metadata-json" + } + ], + "inputAnchors": [ + { + "label": "Text Splitter", + "name": "textSplitter", + "type": "TextSplitter", + "optional": true, + "id": "pdfFile_0-input-textSplitter-TextSplitter" + } + ], + "inputs": { + "textSplitter": "", + "usage": "perPage", + "legacyBuild": "", + "metadata": "" + }, + "outputAnchors": [ + { + "id": "pdfFile_0-output-pdfFile-Document", + "name": "pdfFile", + "label": "Document", + "type": "Document" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 94.16886576108482, + "y": 37.12056504707391 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "bufferMemory_0", + "sourceHandle": 
"bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "target": "conversationChain_0", + "targetHandle": "conversationChain_0-input-memory-BaseMemory", + "type": "buttonedge", + "id": "bufferMemory_0-bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory-conversationChain_0-conversationChain_0-input-memory-BaseMemory", + "data": { + "label": "" + } + }, + { + "source": "chatAnthropic_0", + "sourceHandle": "chatAnthropic_0-output-chatAnthropic-ChatAnthropic|BaseChatModel|BaseLanguageModel", + "target": "conversationChain_0", + "targetHandle": "conversationChain_0-input-model-BaseChatModel", + "type": "buttonedge", + "id": "chatAnthropic_0-chatAnthropic_0-output-chatAnthropic-ChatAnthropic|BaseChatModel|BaseLanguageModel-conversationChain_0-conversationChain_0-input-model-BaseChatModel", + "data": { + "label": "" + } + }, + { + "source": "pdfFile_0", + "sourceHandle": "pdfFile_0-output-pdfFile-Document", + "target": "conversationChain_0", + "targetHandle": "conversationChain_0-input-document-Document", + "type": "buttonedge", + "id": "pdfFile_0-pdfFile_0-output-pdfFile-Document-conversationChain_0-conversationChain_0-input-document-Document", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/Conversational Agent.json b/packages/server/marketplaces/chatflows/Conversational Agent.json similarity index 65% rename from packages/server/marketplaces/Conversational Agent.json rename to packages/server/marketplaces/chatflows/Conversational Agent.json index b47b73f05..55475b3e9 100644 --- a/packages/server/marketplaces/Conversational Agent.json +++ b/packages/server/marketplaces/chatflows/Conversational Agent.json @@ -1,182 +1,6 @@ { "description": "A conversational agent for a chat model which utilize chat specific prompts", "nodes": [ - { - "width": 300, - "height": 524, - "id": "chatOpenAI_1", - "position": { - "x": 56.646518061018355, - "y": 71.07043412525425 - }, - "type": "customNode", - 
"data": { - "id": "chatOpenAI_1", - "label": "ChatOpenAI", - "name": "chatOpenAI", - "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], - "category": "Chat Models", - "description": "Wrapper around OpenAI large language models that use the Chat endpoint", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "gpt-4", - "name": "gpt-4" - }, - { - "label": "gpt-4-0314", - "name": "gpt-4-0314" - }, - { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" - }, - { - "label": "gpt-3.5-turbo", - "name": "gpt-3.5-turbo" - }, - { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" - } - ], - "default": "gpt-3.5-turbo", - "optional": true, - "id": "chatOpenAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.9, - "optional": true, - "id": "chatOpenAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-topP-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-presencePenalty-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": 
"chatOpenAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "gpt-3.5-turbo", - "temperature": "0", - "maxTokens": "", - "topP": "", - "frequencyPenalty": "", - "presencePenalty": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "name": "chatOpenAI", - "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 56.646518061018355, - "y": 71.07043412525425 - }, - "dragging": false - }, - { - "width": 300, - "height": 278, - "id": "serpAPI_1", - "position": { - "x": 436.94138168947336, - "y": 39.517825311262044 - }, - "type": "customNode", - "data": { - "id": "serpAPI_1", - "label": "Serp API", - "name": "serpAPI", - "type": "SerpAPI", - "baseClasses": ["SerpAPI", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Wrapper around SerpAPI - a real-time API to access Google search results", - "inputParams": [ - { - "label": "Serp Api Key", - "name": "apiKey", - "type": "password", - "id": "serpAPI_1-input-apiKey-password" - } - ], - "inputAnchors": [], - "inputs": {}, - "outputAnchors": [ - { - "id": "serpAPI_1-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", - "name": "serpAPI", - "label": "SerpAPI", - "type": "SerpAPI | Tool | StructuredTool | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 436.94138168947336, - "y": 39.517825311262044 - }, - "dragging": false - }, { "width": 300, "height": 143, @@ -190,6 +14,7 @@ "id": "calculator_1", "label": "Calculator", "name": "calculator", + "version": 1, "type": "Calculator", "baseClasses": ["Calculator", "Tool", "StructuredTool", "BaseLangChain"], "category": "Tools", @@ -220,14 +45,15 @@ "height": 376, "id": "bufferMemory_1", "position": { 
- "x": 573.479796337051, - "y": 575.8843338367278 + "x": 607.6260576768354, + "y": 584.7920541862369 }, "type": "customNode", "data": { "id": "bufferMemory_1", "label": "Buffer Memory", "name": "bufferMemory", + "version": 1, "type": "BufferMemory", "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], "category": "Memory", @@ -265,27 +91,229 @@ "selected": false }, "positionAbsolute": { - "x": 573.479796337051, - "y": 575.8843338367278 + "x": 607.6260576768354, + "y": 584.7920541862369 }, "selected": false, "dragging": false }, + { + "width": 300, + "height": 277, + "id": "serpAPI_0", + "position": { + "x": 451.83740798447855, + "y": 53.2843022150486 + }, + "type": "customNode", + "data": { + "id": "serpAPI_0", + "label": "Serp API", + "name": "serpAPI", + "version": 1, + "type": "SerpAPI", + "baseClasses": ["SerpAPI", "Tool", "StructuredTool"], + "category": "Tools", + "description": "Wrapper around SerpAPI - a real-time API to access Google search results", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["serpApi"], + "id": "serpAPI_0-input-credential-credential" + } + ], + "inputAnchors": [], + "inputs": {}, + "outputAnchors": [ + { + "id": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool", + "name": "serpAPI", + "label": "SerpAPI", + "type": "SerpAPI | Tool | StructuredTool" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 451.83740798447855, + "y": 53.2843022150486 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 97.01321406237057, + "y": 63.67664262280914 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large 
language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { 
+ "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 97.01321406237057, + "y": 63.67664262280914 + }, + "dragging": false + }, { "width": 300, "height": 383, "id": "conversationalAgent_0", "position": { - "x": 1206.1996037716035, - "y": 227.39579577603587 + "x": 1164.4550359451973, + "y": 283.40041124403075 }, "type": "customNode", "data": { "id": "conversationalAgent_0", "label": "Conversational Agent", "name": "conversationalAgent", + "version": 1, "type": "AgentExecutor", - "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], + "baseClasses": ["AgentExecutor", "BaseChain"], "category": "Agents", "description": "Conversational agent for a chat model. It will utilize chat specific prompts", "inputParams": [ @@ -294,18 +322,10 @@ "name": "systemMessage", "type": "string", "rows": 4, + "default": "Assistant is a large language model trained by OpenAI.\n\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. 
As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\n\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\n\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist.", "optional": true, "additionalParams": true, "id": "conversationalAgent_0-input-systemMessage-string" - }, - { - "label": "Human Message", - "name": "humanMessage", - "type": "string", - "rows": 4, - "optional": true, - "additionalParams": true, - "id": "conversationalAgent_0-input-humanMessage-string" } ], "inputAnchors": [ @@ -330,18 +350,17 @@ } ], "inputs": { - "tools": ["{{calculator_1.data.instance}}", "{{serpAPI_1.data.instance}}"], - "model": "{{chatOpenAI_1.data.instance}}", + "tools": ["{{calculator_1.data.instance}}", "{{serpAPI_0.data.instance}}"], + "model": "{{chatOpenAI_0.data.instance}}", "memory": "{{bufferMemory_1.data.instance}}", - "systemMessage": "", - "humanMessage": "" + "systemMessage": "Assistant is a large language model trained by OpenAI.\n\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. 
As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\n\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\n\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist." }, "outputAnchors": [ { - "id": "conversationalAgent_0-output-conversationalAgent-AgentExecutor|BaseChain|BaseLangChain", + "id": "conversationalAgent_0-output-conversationalAgent-AgentExecutor|BaseChain", "name": "conversationalAgent", "label": "AgentExecutor", - "type": "AgentExecutor | BaseChain | BaseLangChain" + "type": "AgentExecutor | BaseChain" } ], "outputs": {}, @@ -349,8 +368,8 @@ }, "selected": false, "positionAbsolute": { - "x": 1206.1996037716035, - "y": 227.39579577603587 + "x": 1164.4550359451973, + "y": 283.40041124403075 }, "dragging": false } @@ -368,23 +387,23 @@ } }, { - "source": "serpAPI_1", - "sourceHandle": "serpAPI_1-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", + "source": "serpAPI_0", + "sourceHandle": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool", "target": "conversationalAgent_0", "targetHandle": "conversationalAgent_0-input-tools-Tool", "type": "buttonedge", - "id": 
"serpAPI_1-serpAPI_1-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain-conversationalAgent_0-conversationalAgent_0-input-tools-Tool", + "id": "serpAPI_0-serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool-conversationalAgent_0-conversationalAgent_0-input-tools-Tool", "data": { "label": "" } }, { - "source": "chatOpenAI_1", - "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "conversationalAgent_0", "targetHandle": "conversationalAgent_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-conversationalAgent_0-conversationalAgent_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalAgent_0-conversationalAgent_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/Conversational Retrieval Agent.json b/packages/server/marketplaces/chatflows/Conversational Retrieval Agent.json new file mode 100644 index 000000000..dcf344d12 --- /dev/null +++ b/packages/server/marketplaces/chatflows/Conversational Retrieval Agent.json @@ -0,0 +1,607 @@ +{ + "description": "Agent optimized for vector retrieval during conversation and answering questions based on previous dialogue.", + "nodes": [ + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 1381.867549919116, + "y": 212.76900895393834 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "version": 1, + "name": "chatOpenAI", + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use 
the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + 
"name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo-16k", + "temperature": "0", + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1381.867549919116, + "y": 212.76900895393834 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_0", + "position": { + "x": 954.0674802999345, + "y": -196.7034956445692 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "version": 1, + "name": "openAIEmbeddings", + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": 
"basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 954.0674802999345, + "y": -196.7034956445692 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "pineconeExistingIndex_0", + "position": { + "x": 1362.0018461011314, + "y": -334.0373537488481 + }, + "type": "customNode", + "data": { + "id": "pineconeExistingIndex_0", + "label": "Pinecone Load Existing Index", + "version": 1, + "name": "pineconeExistingIndex", + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Pinecone (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" + }, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" + }, + { + "label": "Pinecone Metadata Filter", + "name": "pineconeMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + 
"label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "newindex", + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "retriever" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1362.0018461011314, + "y": -334.0373537488481 + }, + "dragging": false + }, + { + "width": 300, + "height": 383, + "id": "conversationalRetrievalAgent_0", + "position": { + "x": 2345.912948267881, + "y": 357.97363342258217 + }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalAgent_0", + "label": "Conversational Retrieval Agent", + "version": 1, + "name": "conversationalRetrievalAgent", + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain", "Runnable"], + "category": "Agents", + "description": "An agent optimized for retrieval during conversation, answering questions based on past dialogue, all using OpenAI's Function Calling", + "inputParams": [ + { + "label": "System Message", + "name": 
"systemMessage", + "type": "string", + "rows": 4, + "optional": true, + "additionalParams": true, + "id": "conversationalRetrievalAgent_0-input-systemMessage-string" + } + ], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "conversationalRetrievalAgent_0-input-tools-Tool" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseChatMemory", + "id": "conversationalRetrievalAgent_0-input-memory-BaseChatMemory" + }, + { + "label": "OpenAI Chat Model", + "name": "model", + "type": "ChatOpenAI", + "id": "conversationalRetrievalAgent_0-input-model-ChatOpenAI" + } + ], + "inputs": { + "tools": ["{{retrieverTool_0.data.instance}}"], + "memory": "{{bufferMemory_0.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", + "systemMessage": "" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalAgent_0-output-conversationalRetrievalAgent-AgentExecutor|BaseChain|Runnable", + "name": "conversationalRetrievalAgent", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 2345.912948267881, + "y": 357.97363342258217 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "retrieverTool_0", + "position": { + "x": 1770.1485365275375, + "y": -321.14473253480946 + }, + "type": "customNode", + "data": { + "id": "retrieverTool_0", + "label": "Retriever Tool", + "version": 1, + "name": "retrieverTool", + "type": "RetrieverTool", + "baseClasses": ["RetrieverTool", "DynamicTool", "Tool", "StructuredTool", "Runnable"], + "category": "Tools", + "description": "Use a retriever as allowed tool for agent", + "inputParams": [ + { + "label": "Retriever Name", + "name": "name", + "type": "string", + "placeholder": "search_state_of_union", + "id": "retrieverTool_0-input-name-string" + }, + { + "label": "Retriever Description", + "name": "description", + "type": "string", 
+ "description": "When should agent uses to retrieve documents", + "rows": 3, + "placeholder": "Searches and returns documents regarding the state-of-the-union.", + "id": "retrieverTool_0-input-description-string" + } + ], + "inputAnchors": [ + { + "label": "Retriever", + "name": "retriever", + "type": "BaseRetriever", + "id": "retrieverTool_0-input-retriever-BaseRetriever" + } + ], + "inputs": { + "name": "search_website", + "description": "Searches and return documents regarding Jane - a culinary institution that offers top quality coffee, pastries, breakfast, lunch, and a variety of baked goods. They have multiple locations, including Jane on Fillmore, Jane on Larkin, Jane the Bakery, Toy Boat By Jane, and Little Jane on Grant. They emphasize healthy eating with a focus on flavor and quality ingredients. They bake everything in-house and work with local suppliers to source ingredients directly from farmers. They also offer catering services and delivery options.", + "retriever": "{{pineconeExistingIndex_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "retrieverTool_0-output-retrieverTool-RetrieverTool|DynamicTool|Tool|StructuredTool|Runnable", + "name": "retrieverTool", + "label": "RetrieverTool", + "type": "RetrieverTool | DynamicTool | Tool | StructuredTool | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "dragging": false, + "positionAbsolute": { + "x": 1770.1485365275375, + "y": -321.14473253480946 + } + }, + { + "width": 300, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 1771.396291209036, + "y": 216.94151328212496 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "version": 1, + "name": "bufferMemory", + "type": "BufferMemory", + "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + 
"name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": "input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1771.396291209036, + "y": 216.94151328212496 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "target": "retrieverTool_0", + "targetHandle": "retrieverTool_0-input-retriever-BaseRetriever", + "type": "buttonedge", + "id": "pineconeExistingIndex_0-pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-retrieverTool_0-retrieverTool_0-input-retriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "retrieverTool_0", + "sourceHandle": "retrieverTool_0-output-retrieverTool-RetrieverTool|DynamicTool|Tool|StructuredTool|Runnable", + "target": "conversationalRetrievalAgent_0", + "targetHandle": 
"conversationalRetrievalAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "retrieverTool_0-retrieverTool_0-output-retrieverTool-RetrieverTool|DynamicTool|Tool|StructuredTool|Runnable-conversationalRetrievalAgent_0-conversationalRetrievalAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "conversationalRetrievalAgent_0", + "targetHandle": "conversationalRetrievalAgent_0-input-model-ChatOpenAI", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalAgent_0-conversationalRetrievalAgent_0-input-model-ChatOpenAI", + "data": { + "label": "" + } + }, + { + "source": "bufferMemory_0", + "sourceHandle": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "target": "conversationalRetrievalAgent_0", + "targetHandle": "conversationalRetrievalAgent_0-input-memory-BaseChatMemory", + "type": "buttonedge", + "id": "bufferMemory_0-bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory-conversationalRetrievalAgent_0-conversationalRetrievalAgent_0-input-memory-BaseChatMemory", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/Conversational Retrieval QA Chain.json b/packages/server/marketplaces/chatflows/Conversational Retrieval QA Chain.json similarity index 60% rename from packages/server/marketplaces/Conversational Retrieval QA Chain.json rename to packages/server/marketplaces/chatflows/Conversational Retrieval QA Chain.json index daa5235ee..bf27e443e 100644 --- a/packages/server/marketplaces/Conversational Retrieval QA Chain.json +++ b/packages/server/marketplaces/chatflows/Conversational Retrieval QA Chain.json @@ -3,322 +3,29 @@ "nodes": [ { "width": 300, - "height": 376, - "id": "recursiveCharacterTextSplitter_1", - "position": { - "x": 422.81091375202413, 
- "y": 122.99825010325736 - }, - "type": "customNode", - "data": { - "id": "recursiveCharacterTextSplitter_1", - "label": "Recursive Character Text Splitter", - "name": "recursiveCharacterTextSplitter", - "type": "RecursiveCharacterTextSplitter", - "baseClasses": ["RecursiveCharacterTextSplitter", "TextSplitter"], - "category": "Text Splitters", - "description": "Split documents recursively by different characters - starting with \"\n\n\", then \"\n\", then \" \"", - "inputParams": [ - { - "label": "Chunk Size", - "name": "chunkSize", - "type": "number", - "default": 1000, - "optional": true, - "id": "recursiveCharacterTextSplitter_1-input-chunkSize-number" - }, - { - "label": "Chunk Overlap", - "name": "chunkOverlap", - "type": "number", - "optional": true, - "id": "recursiveCharacterTextSplitter_1-input-chunkOverlap-number" - } - ], - "inputAnchors": [], - "inputs": { - "chunkSize": 1000, - "chunkOverlap": "" - }, - "outputAnchors": [ - { - "id": "recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", - "name": "recursiveCharacterTextSplitter", - "label": "RecursiveCharacterTextSplitter", - "type": "RecursiveCharacterTextSplitter | TextSplitter" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 422.81091375202413, - "y": 122.99825010325736 - }, - "dragging": false - }, - { - "width": 300, - "height": 392, - "id": "textFile_1", - "position": { - "x": 810.6456923854021, - "y": 61.45989039390216 - }, - "type": "customNode", - "data": { - "id": "textFile_1", - "label": "Text File", - "name": "textFile", - "type": "Document", - "baseClasses": ["Document"], - "category": "Document Loaders", - "description": "Load data from text files", - "inputParams": [ - { - "label": "Txt File", - "name": "txtFile", - "type": "file", - "fileType": ".txt", - "id": "textFile_1-input-txtFile-file" - }, - { - "label": "Metadata", - "name": "metadata", - "type": "json", - 
"optional": true, - "additionalParams": true, - "id": "textFile_1-input-metadata-json" - } - ], - "inputAnchors": [ - { - "label": "Text Splitter", - "name": "textSplitter", - "type": "TextSplitter", - "optional": true, - "id": "textFile_1-input-textSplitter-TextSplitter" - } - ], - "inputs": { - "textSplitter": "{{recursiveCharacterTextSplitter_1.data.instance}}" - }, - "outputAnchors": [ - { - "id": "textFile_1-output-textFile-Document", - "name": "textFile", - "label": "Document", - "type": "Document" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 810.6456923854021, - "y": 61.45989039390216 - }, - "dragging": false - }, - { - "width": 300, - "height": 330, - "id": "openAIEmbeddings_1", - "position": { - "x": 817.2208258595176, - "y": 586.8095386455508 - }, - "type": "customNode", - "data": { - "id": "openAIEmbeddings_1", - "label": "OpenAI Embeddings", - "name": "openAIEmbeddings", - "type": "OpenAIEmbeddings", - "baseClasses": ["OpenAIEmbeddings", "Embeddings"], - "category": "Embeddings", - "description": "OpenAI API to generate embeddings for a given text", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_1-input-openAIApiKey-password" - }, - { - "label": "Strip New Lines", - "name": "stripNewLines", - "type": "boolean", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-stripNewLines-boolean" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "stripNewLines": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": 
"openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "name": "openAIEmbeddings", - "label": "OpenAIEmbeddings", - "type": "OpenAIEmbeddings | Embeddings" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 817.2208258595176, - "y": 586.8095386455508 - }, - "dragging": false - }, - { - "width": 300, - "height": 702, - "id": "pineconeUpsert_1", - "position": { - "x": 1201.3427203075867, - "y": 545.1800202023215 - }, - "type": "customNode", - "data": { - "id": "pineconeUpsert_1", - "label": "Pinecone Upsert Document", - "name": "pineconeUpsert", - "type": "Pinecone", - "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], - "category": "Vector Stores", - "description": "Upsert documents to Pinecone", - "inputParams": [ - { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": "pineconeUpsert_1-input-pineconeApiKey-password" - }, - { - "label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeUpsert_1-input-pineconeEnv-string" - }, - { - "label": "Pinecone Index", - "name": "pineconeIndex", - "type": "string", - "id": "pineconeUpsert_1-input-pineconeIndex-string" - }, - { - "label": "Pinecone Namespace", - "name": "pineconeNamespace", - "type": "string", - "placeholder": "my-first-namespace", - "optional": true, - "id": "pineconeUpsert_1-input-pineconeNamespace-string" - } - ], - "inputAnchors": [ - { - "label": "Document", - "name": "document", - "type": "Document", - "list": true, - "id": "pineconeUpsert_1-input-document-Document" - }, - { - "label": "Embeddings", - "name": "embeddings", - "type": "Embeddings", - "id": "pineconeUpsert_1-input-embeddings-Embeddings" - } - ], - "inputs": { - "document": ["{{textFile_1.data.instance}}"], - "embeddings": "{{openAIEmbeddings_1.data.instance}}", - "pineconeEnv": "us-west4-gcp", - "pineconeIndex": "myindex", - "pineconeNamespace": "mynamespace" - }, - 
"outputAnchors": [ - { - "name": "output", - "label": "Output", - "type": "options", - "options": [ - { - "id": "pineconeUpsert_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", - "name": "retriever", - "label": "Pinecone Retriever", - "type": "Pinecone | VectorStoreRetriever | BaseRetriever" - }, - { - "id": "pineconeUpsert_1-output-vectorStore-Pinecone|VectorStore", - "name": "vectorStore", - "label": "Pinecone Vector Store", - "type": "Pinecone | VectorStore" - } - ], - "default": "retriever" - } - ], - "outputs": { - "output": "retriever" - }, - "selected": false - }, - "selected": false, - "dragging": false, - "positionAbsolute": { - "x": 1201.3427203075867, - "y": 545.1800202023215 - } - }, - { - "width": 300, - "height": 524, + "height": 522, "id": "chatOpenAI_0", "position": { - "x": 1200.565568471151, - "y": -33.648143275380406 + "x": 1184.1176114500388, + "y": -44.15535835370571 }, "type": "customNode", "data": { "id": "chatOpenAI_0", "label": "ChatOpenAI", "name": "chatOpenAI", + "version": 1, "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], "category": "Chat Models", "description": "Wrapper around OpenAI large language models that use the Chat endpoint", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { "label": "Model Name", @@ -330,20 +37,32 @@ "name": "gpt-4" }, { - "label": "gpt-4-0314", - "name": "gpt-4-0314" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" }, { "label": 
"gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], "default": "gpt-3.5-turbo", @@ -397,24 +116,117 @@ "optional": true, "additionalParams": true, "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" } ], "inputAnchors": [], "inputs": { "modelName": "gpt-3.5-turbo", - "temperature": "0.5", + "temperature": "0", "maxTokens": "", "topP": "", "frequencyPenalty": "", "presencePenalty": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "name": "chatOpenAI", "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "positionAbsolute": { + "x": 1184.1176114500388, + "y": -44.15535835370571 + }, + "selected": false, + "dragging": false + }, + { + "width": 300, + "height": 328, + "id": "openAIEmbeddings_0", + "position": { + "x": 795.6162477805387, + "y": 603.260214150876 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": 
"credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" } ], "outputs": {}, @@ -422,29 +234,305 @@ }, "selected": false, "positionAbsolute": { - "x": 1200.565568471151, - "y": -33.648143275380406 + "x": 795.6162477805387, + "y": 603.260214150876 }, "dragging": false }, { "width": 300, - "height": 280, + "height": 554, + "id": "pineconeUpsert_0", + "position": { + "x": 1191.1792786926865, + "y": 514.2126330994578 + }, + "type": "customNode", + "data": { + "id": "pineconeUpsert_0", + "label": "Pinecone Upsert Document", + "name": "pineconeUpsert", + "version": 1, + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Upsert documents to Pinecone", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": 
"pineconeUpsert_0-input-credential-credential" + }, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeUpsert_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeUpsert_0-input-pineconeNamespace-string" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeUpsert_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Document", + "name": "document", + "type": "Document", + "list": true, + "id": "pineconeUpsert_0-input-document-Document" + }, + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeUpsert_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "document": ["{{textFile_0.data.instance}}"], + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeUpsert_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeUpsert_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "retriever" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1191.1792786926865, + "y": 514.2126330994578 + }, + "dragging": false + }, + { + "width": 300, + "height": 376, + "id": "recursiveCharacterTextSplitter_0", + "position": { + "x": 
406.08456707531263, + "y": 197.66460328693972 + }, + "type": "customNode", + "data": { + "id": "recursiveCharacterTextSplitter_0", + "label": "Recursive Character Text Splitter", + "name": "recursiveCharacterTextSplitter", + "version": 1, + "type": "RecursiveCharacterTextSplitter", + "baseClasses": ["RecursiveCharacterTextSplitter", "TextSplitter"], + "category": "Text Splitters", + "description": "Split documents recursively by different characters - starting with \"\\n\\n\", then \"\\n\", then \" \"", + "inputParams": [ + { + "label": "Chunk Size", + "name": "chunkSize", + "type": "number", + "default": 1000, + "optional": true, + "id": "recursiveCharacterTextSplitter_0-input-chunkSize-number" + }, + { + "label": "Chunk Overlap", + "name": "chunkOverlap", + "type": "number", + "optional": true, + "id": "recursiveCharacterTextSplitter_0-input-chunkOverlap-number" + } + ], + "inputAnchors": [], + "inputs": { + "chunkSize": 1000, + "chunkOverlap": "" + }, + "outputAnchors": [ + { + "id": "recursiveCharacterTextSplitter_0-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", + "name": "recursiveCharacterTextSplitter", + "label": "RecursiveCharacterTextSplitter", + "type": "RecursiveCharacterTextSplitter | TextSplitter" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 406.08456707531263, + "y": 197.66460328693972 + }, + "dragging": false + }, + { + "width": 300, + "height": 410, + "id": "textFile_0", + "position": { + "x": 786.5497697231324, + "y": 140.09563157584407 + }, + "type": "customNode", + "data": { + "id": "textFile_0", + "label": "Text File", + "name": "textFile", + "version": 1, + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Load data from text files", + "inputParams": [ + { + "label": "Txt File", + "name": "txtFile", + "type": "file", + "fileType": ".txt", + "id": "textFile_0-input-txtFile-file" + }, + { + "label": 
"Metadata", + "name": "metadata", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "textFile_0-input-metadata-json" + } + ], + "inputAnchors": [ + { + "label": "Text Splitter", + "name": "textSplitter", + "type": "TextSplitter", + "optional": true, + "id": "textFile_0-input-textSplitter-TextSplitter" + } + ], + "inputs": { + "textSplitter": "{{recursiveCharacterTextSplitter_0.data.instance}}", + "metadata": "" + }, + "outputAnchors": [ + { + "id": "textFile_0-output-textFile-Document", + "name": "textFile", + "label": "Document", + "type": "Document" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 786.5497697231324, + "y": 140.09563157584407 + }, + "dragging": false + }, + { + "width": 300, + "height": 479, "id": "conversationalRetrievalQAChain_0", "position": { - "x": 1627.1855024401737, - "y": 394.42287890442145 + "x": 1558.6564094656787, + "y": 386.60217819991124 }, "type": "customNode", "data": { "id": "conversationalRetrievalQAChain_0", "label": "Conversational Retrieval QA Chain", "name": "conversationalRetrievalQAChain", + "version": 1, "type": "ConversationalRetrievalQAChain", - "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain"], "category": "Chains", "description": "Document QA - built on RetrievalQAChain to provide a chat history component", - "inputParams": [], + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". 
You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." + } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], "inputAnchors": [ { "label": "Language Model", @@ -457,83 +545,94 @@ "name": "vectorStoreRetriever", "type": "BaseRetriever", "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" } ], "inputs": { "model": "{{chatOpenAI_0.data.instance}}", - "vectorStoreRetriever": "{{pineconeUpsert_1.data.instance}}" + "vectorStoreRetriever": "{{pineconeUpsert_0.data.instance}}", + "memory": "", + "returnSourceDocuments": "", + "systemMessagePrompt": "", + "chainOption": "" }, "outputAnchors": [ { - "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", + "id": 
"conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain", "name": "conversationalRetrievalQAChain", "label": "ConversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + "type": "ConversationalRetrievalQAChain | BaseChain" } ], "outputs": {}, "selected": false }, - "selected": false, "positionAbsolute": { - "x": 1627.1855024401737, - "y": 394.42287890442145 + "x": 1558.6564094656787, + "y": 386.60217819991124 }, - "dragging": false + "selected": false } ], "edges": [ { - "source": "openAIEmbeddings_1", - "sourceHandle": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "target": "pineconeUpsert_1", - "targetHandle": "pineconeUpsert_1-input-embeddings-Embeddings", + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeUpsert_0", + "targetHandle": "pineconeUpsert_0-input-embeddings-Embeddings", "type": "buttonedge", - "id": "openAIEmbeddings_1-openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeUpsert_1-pineconeUpsert_1-input-embeddings-Embeddings", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeUpsert_0-pineconeUpsert_0-input-embeddings-Embeddings", "data": { "label": "" } }, { - "source": "textFile_1", - "sourceHandle": "textFile_1-output-textFile-Document", - "target": "pineconeUpsert_1", - "targetHandle": "pineconeUpsert_1-input-document-Document", + "source": "recursiveCharacterTextSplitter_0", + "sourceHandle": "recursiveCharacterTextSplitter_0-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", + "target": "textFile_0", + "targetHandle": "textFile_0-input-textSplitter-TextSplitter", "type": "buttonedge", - "id": "textFile_1-textFile_1-output-textFile-Document-pineconeUpsert_1-pineconeUpsert_1-input-document-Document", + "id": 
"recursiveCharacterTextSplitter_0-recursiveCharacterTextSplitter_0-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter-textFile_0-textFile_0-input-textSplitter-TextSplitter", "data": { "label": "" } }, { - "source": "recursiveCharacterTextSplitter_1", - "sourceHandle": "recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", - "target": "textFile_1", - "targetHandle": "textFile_1-input-textSplitter-TextSplitter", + "source": "textFile_0", + "sourceHandle": "textFile_0-output-textFile-Document", + "target": "pineconeUpsert_0", + "targetHandle": "pineconeUpsert_0-input-document-Document", "type": "buttonedge", - "id": "recursiveCharacterTextSplitter_1-recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter-textFile_1-textFile_1-input-textSplitter-TextSplitter", + "id": "textFile_0-textFile_0-output-textFile-Document-pineconeUpsert_0-pineconeUpsert_0-input-document-Document", "data": { "label": "" } }, { "source": "chatOpenAI_0", - "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "conversationalRetrievalQAChain_0", "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "data": { "label": "" } }, { - "source": "pineconeUpsert_1", - "sourceHandle": 
"pineconeUpsert_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "source": "pineconeUpsert_0", + "sourceHandle": "pineconeUpsert_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", "target": "conversationalRetrievalQAChain_0", "targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", "type": "buttonedge", - "id": "pineconeUpsert_1-pineconeUpsert_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "id": "pineconeUpsert_0-pineconeUpsert_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", "data": { "label": "" } diff --git a/packages/server/marketplaces/Github Repo QnA.json b/packages/server/marketplaces/chatflows/Flowise Docs QnA.json similarity index 53% rename from packages/server/marketplaces/Github Repo QnA.json rename to packages/server/marketplaces/chatflows/Flowise Docs QnA.json index a198bbfcb..6d11f3d29 100644 --- a/packages/server/marketplaces/Github Repo QnA.json +++ b/packages/server/marketplaces/chatflows/Flowise Docs QnA.json @@ -1,23 +1,24 @@ { - "description": "Github repo QnA using conversational retrieval QA chain", + "description": "Flowise Docs Github QnA using conversational retrieval QA chain", "nodes": [ { "width": 300, "height": 376, - "id": "recursiveCharacterTextSplitter_1", + "id": "markdownTextSplitter_0", "position": { - "x": 447.1038086695898, - "y": 126.52301921543597 + "x": 1081.1540334344143, + "y": -113.73571627207801 }, "type": "customNode", "data": { - "id": "recursiveCharacterTextSplitter_1", - "label": "Recursive Character Text Splitter", - "name": "recursiveCharacterTextSplitter", - "type": "RecursiveCharacterTextSplitter", - "baseClasses": ["RecursiveCharacterTextSplitter", "TextSplitter"], + "id": "markdownTextSplitter_0", + 
"label": "Markdown Text Splitter", + "name": "markdownTextSplitter", + "version": 1, + "type": "MarkdownTextSplitter", + "baseClasses": ["MarkdownTextSplitter", "RecursiveCharacterTextSplitter", "TextSplitter", "BaseDocumentTransformer"], "category": "Text Splitters", - "description": "Split documents recursively by different characters - starting with \"\n\n\", then \"\n\", then \" \"", + "description": "Split your content into documents based on the Markdown headers", "inputParams": [ { "label": "Chunk Size", @@ -25,108 +26,27 @@ "type": "number", "default": 1000, "optional": true, - "id": "recursiveCharacterTextSplitter_1-input-chunkSize-number" + "id": "markdownTextSplitter_0-input-chunkSize-number" }, { "label": "Chunk Overlap", "name": "chunkOverlap", "type": "number", "optional": true, - "id": "recursiveCharacterTextSplitter_1-input-chunkOverlap-number" + "id": "markdownTextSplitter_0-input-chunkOverlap-number" } ], "inputAnchors": [], "inputs": { - "chunkSize": 1000, + "chunkSize": "4000", "chunkOverlap": "" }, "outputAnchors": [ { - "id": "recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", - "name": "recursiveCharacterTextSplitter", - "label": "RecursiveCharacterTextSplitter", - "type": "RecursiveCharacterTextSplitter | TextSplitter" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 447.1038086695898, - "y": 126.52301921543597 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 578, - "id": "github_1", - "position": { - "x": 836.9660489009947, - "y": -44.04171088580361 - }, - "type": "customNode", - "data": { - "id": "github_1", - "label": "Github", - "name": "github", - "type": "Document", - "baseClasses": ["Document"], - "category": "Document Loaders", - "description": "Load data from a GitHub repository", - "inputParams": [ - { - "label": "Repo Link", - "name": "repoLink", - "type": "string", - "placeholder": 
"https://github.com/FlowiseAI/Flowise", - "id": "github_1-input-repoLink-string" - }, - { - "label": "Branch", - "name": "branch", - "type": "string", - "default": "main", - "id": "github_1-input-branch-string" - }, - { - "label": "Access Token", - "name": "accessToken", - "type": "password", - "placeholder": "", - "optional": true, - "id": "github_1-input-accessToken-password" - }, - { - "label": "Metadata", - "name": "metadata", - "type": "json", - "optional": true, - "additionalParams": true, - "id": "github_1-input-metadata-json" - } - ], - "inputAnchors": [ - { - "label": "Text Splitter", - "name": "textSplitter", - "type": "TextSplitter", - "optional": true, - "id": "github_1-input-textSplitter-TextSplitter" - } - ], - "inputs": { - "repoLink": "", - "branch": "main", - "textSplitter": "{{recursiveCharacterTextSplitter_1.data.instance}}" - }, - "outputAnchors": [ - { - "id": "github_1-output-github-Document", - "name": "github", - "label": "Document", - "type": "Document" + "id": "markdownTextSplitter_0-output-markdownTextSplitter-MarkdownTextSplitter|RecursiveCharacterTextSplitter|TextSplitter|BaseDocumentTransformer", + "name": "markdownTextSplitter", + "label": "MarkdownTextSplitter", + "type": "MarkdownTextSplitter | RecursiveCharacterTextSplitter | TextSplitter | BaseDocumentTransformer" } ], "outputs": {}, @@ -134,127 +54,38 @@ }, "selected": false, "positionAbsolute": { - "x": 836.9660489009947, - "y": -44.04171088580361 + "x": 1081.1540334344143, + "y": -113.73571627207801 }, "dragging": false }, { "width": 300, - "height": 330, - "id": "openAIEmbeddings_1", + "height": 405, + "id": "memoryVectorStore_0", "position": { - "x": 833.4085562012468, - "y": 541.7875676090047 + "x": 1844.88052464165, + "y": 484.60473328470243 }, "type": "customNode", "data": { - "id": "openAIEmbeddings_1", - "label": "OpenAI Embeddings", - "name": "openAIEmbeddings", - "type": "OpenAIEmbeddings", - "baseClasses": ["OpenAIEmbeddings", "Embeddings"], - "category": 
"Embeddings", - "description": "OpenAI API to generate embeddings for a given text", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_1-input-openAIApiKey-password" - }, - { - "label": "Strip New Lines", - "name": "stripNewLines", - "type": "boolean", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-stripNewLines-boolean" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "stripNewLines": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "name": "openAIEmbeddings", - "label": "OpenAIEmbeddings", - "type": "OpenAIEmbeddings | Embeddings" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 833.4085562012468, - "y": 541.7875676090047 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 702, - "id": "pineconeUpsert_1", - "position": { - "x": 1268.7946529279823, - "y": 382.77997896801634 - }, - "type": "customNode", - "data": { - "id": "pineconeUpsert_1", - "label": "Pinecone Upsert Document", - "name": "pineconeUpsert", - "type": "Pinecone", - "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "id": "memoryVectorStore_0", + "label": "In-Memory Vector Store", + "name": "memoryVectorStore", + "version": 1, + "type": "Memory", + "baseClasses": ["Memory", "VectorStoreRetriever", "BaseRetriever"], "category": "Vector Stores", - "description": "Upsert documents to Pinecone", + "description": "In-memory vectorstore that stores embeddings and does an 
exact, linear search for the most similar embeddings.", "inputParams": [ { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": "pineconeUpsert_1-input-pineconeApiKey-password" - }, - { - "label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeUpsert_1-input-pineconeEnv-string" - }, - { - "label": "Pinecone Index", - "name": "pineconeIndex", - "type": "string", - "id": "pineconeUpsert_1-input-pineconeIndex-string" - }, - { - "label": "Pinecone Namespace", - "name": "pineconeNamespace", - "type": "string", - "placeholder": "my-first-namespace", + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", "optional": true, - "id": "pineconeUpsert_1-input-pineconeNamespace-string" + "id": "memoryVectorStore_0-input-topK-number" } ], "inputAnchors": [ @@ -263,21 +94,19 @@ "name": "document", "type": "Document", "list": true, - "id": "pineconeUpsert_1-input-document-Document" + "id": "memoryVectorStore_0-input-document-Document" }, { "label": "Embeddings", "name": "embeddings", "type": "Embeddings", - "id": "pineconeUpsert_1-input-embeddings-Embeddings" + "id": "memoryVectorStore_0-input-embeddings-Embeddings" } ], "inputs": { - "document": ["{{github_1.data.instance}}"], - "embeddings": "{{openAIEmbeddings_1.data.instance}}", - "pineconeEnv": "us-west4-gcp", - "pineconeIndex": "myindex", - "pineconeNamespace": "mynamespace" + "document": ["{{github_0.data.instance}}"], + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "topK": "" }, "outputAnchors": [ { @@ -286,16 +115,16 @@ "type": "options", "options": [ { - "id": "pineconeUpsert_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "id": "memoryVectorStore_0-output-retriever-Memory|VectorStoreRetriever|BaseRetriever", "name": "retriever", - "label": "Pinecone Retriever", - "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + "label": 
"Memory Retriever", + "type": "Memory | VectorStoreRetriever | BaseRetriever" }, { - "id": "pineconeUpsert_1-output-vectorStore-Pinecone|VectorStore", + "id": "memoryVectorStore_0-output-vectorStore-Memory|VectorStore", "name": "vectorStore", - "label": "Pinecone Vector Store", - "type": "Pinecone | VectorStore" + "label": "Memory Vector Store", + "type": "Memory | VectorStore" } ], "default": "retriever" @@ -307,35 +136,239 @@ "selected": false }, "selected": false, + "positionAbsolute": { + "x": 1844.88052464165, + "y": 484.60473328470243 + }, + "dragging": false + }, + { + "width": 300, + "height": 479, + "id": "conversationalRetrievalQAChain_0", + "position": { + "x": 2311.697827287373, + "y": 228.14841720207832 + }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "name": "conversationalRetrievalQAChain", + "version": 1, + "type": "ConversationalRetrievalQAChain", + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. 
Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." + } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": "{{memoryVectorStore_0.data.instance}}", + "memory": "", + "returnSourceDocuments": true, + "systemMessagePrompt": "", + "chainOption": "" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + 
"selected": false, "dragging": false, "positionAbsolute": { - "x": 1268.7946529279823, - "y": 382.77997896801634 + "x": 2311.697827287373, + "y": 228.14841720207832 } }, { "width": 300, - "height": 524, + "height": 673, + "id": "github_0", + "position": { + "x": 1460.1858988997, + "y": -137.83585695472374 + }, + "type": "customNode", + "data": { + "id": "github_0", + "label": "Github", + "name": "github", + "version": 1, + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Load data from a GitHub repository", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "description": "Only needed when accessing private repo", + "optional": true, + "credentialNames": ["githubApi"], + "id": "github_0-input-credential-credential" + }, + { + "label": "Repo Link", + "name": "repoLink", + "type": "string", + "placeholder": "https://github.com/FlowiseAI/Flowise", + "id": "github_0-input-repoLink-string" + }, + { + "label": "Branch", + "name": "branch", + "type": "string", + "default": "main", + "id": "github_0-input-branch-string" + }, + { + "label": "Recursive", + "name": "recursive", + "type": "boolean", + "optional": true, + "id": "github_0-input-recursive-boolean" + }, + { + "label": "Metadata", + "name": "metadata", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "github_0-input-metadata-json" + } + ], + "inputAnchors": [ + { + "label": "Text Splitter", + "name": "textSplitter", + "type": "TextSplitter", + "optional": true, + "id": "github_0-input-textSplitter-TextSplitter" + } + ], + "inputs": { + "repoLink": "https://github.com/FlowiseAI/FlowiseDocs", + "branch": "main", + "recursive": true, + "textSplitter": "{{markdownTextSplitter_0.data.instance}}", + "metadata": "" + }, + "outputAnchors": [ + { + "id": "github_0-output-github-Document", + "name": "github", + "label": "Document", + "type": "Document" + } + ], + "outputs": {}, + "selected": 
false + }, + "selected": false, + "positionAbsolute": { + "x": 1460.1858988997, + "y": -137.83585695472374 + }, + "dragging": false + }, + { + "width": 300, + "height": 522, "id": "chatOpenAI_0", "position": { - "x": 1271.1300438358664, - "y": -169.75707425097968 + "x": 1857.367353502965, + "y": -104.25095383414119 }, "type": "customNode", "data": { "id": "chatOpenAI_0", "label": "ChatOpenAI", "name": "chatOpenAI", + "version": 1, "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], "category": "Chat Models", "description": "Wrapper around OpenAI large language models that use the Chat endpoint", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { "label": "Model Name", @@ -347,20 +380,32 @@ "name": "gpt-4" }, { - "label": "gpt-4-0314", - "name": "gpt-4-0314" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" }, { "label": "gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], "default": "gpt-3.5-turbo", @@ -414,24 +459,33 @@ "optional": true, "additionalParams": true, "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": 
"chatOpenAI_0-input-basepath-string" } ], "inputAnchors": [], "inputs": { "modelName": "gpt-3.5-turbo", - "temperature": "0.5", + "temperature": 0.9, "maxTokens": "", "topP": "", "frequencyPenalty": "", "presencePenalty": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "name": "chatOpenAI", "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, @@ -439,118 +493,148 @@ }, "selected": false, "positionAbsolute": { - "x": 1271.1300438358664, - "y": -169.75707425097968 + "x": 1857.367353502965, + "y": -104.25095383414119 }, "dragging": false }, { "width": 300, - "height": 280, - "id": "conversationalRetrievalQAChain_0", + "height": 328, + "id": "openAIEmbeddings_0", "position": { - "x": 1653.6177539108153, - "y": 266.4856653480158 + "x": 1299.9983863833309, + "y": 581.8406384863323 }, "type": "customNode", "data": { - "id": "conversationalRetrievalQAChain_0", - "label": "Conversational Retrieval QA Chain", - "name": "conversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain", - "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "Document QA - built on RetrievalQAChain to provide a chat history component", - "inputParams": [], - "inputAnchors": [ + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": 
"conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" }, { - "label": "Vector Store Retriever", - "name": "vectorStoreRetriever", - "type": "BaseRetriever", - "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" } ], + "inputAnchors": [], "inputs": { - "model": "{{chatOpenAI_0.data.instance}}", - "vectorStoreRetriever": "{{pineconeUpsert_1.data.instance}}" + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", - "name": "conversationalRetrievalQAChain", - "label": "ConversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" } ], "outputs": {}, "selected": false }, "selected": false, + "dragging": false, "positionAbsolute": { - "x": 1653.6177539108153, - "y": 
266.4856653480158 - }, - "dragging": false + "x": 1299.9983863833309, + "y": 581.8406384863323 + } } ], "edges": [ { - "source": "github_1", - "sourceHandle": "github_1-output-github-Document", - "target": "pineconeUpsert_1", - "targetHandle": "pineconeUpsert_1-input-document-Document", + "source": "memoryVectorStore_0", + "sourceHandle": "memoryVectorStore_0-output-retriever-Memory|VectorStoreRetriever|BaseRetriever", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", "type": "buttonedge", - "id": "github_1-github_1-output-github-Document-pineconeUpsert_1-pineconeUpsert_1-input-document-Document", + "id": "memoryVectorStore_0-memoryVectorStore_0-output-retriever-Memory|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", "data": { "label": "" } }, { - "source": "openAIEmbeddings_1", - "sourceHandle": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "target": "pineconeUpsert_1", - "targetHandle": "pineconeUpsert_1-input-embeddings-Embeddings", + "source": "markdownTextSplitter_0", + "sourceHandle": "markdownTextSplitter_0-output-markdownTextSplitter-MarkdownTextSplitter|RecursiveCharacterTextSplitter|TextSplitter|BaseDocumentTransformer", + "target": "github_0", + "targetHandle": "github_0-input-textSplitter-TextSplitter", "type": "buttonedge", - "id": "openAIEmbeddings_1-openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeUpsert_1-pineconeUpsert_1-input-embeddings-Embeddings", + "id": "markdownTextSplitter_0-markdownTextSplitter_0-output-markdownTextSplitter-MarkdownTextSplitter|RecursiveCharacterTextSplitter|TextSplitter|BaseDocumentTransformer-github_0-github_0-input-textSplitter-TextSplitter", "data": { "label": "" } }, { - "source": "recursiveCharacterTextSplitter_1", - "sourceHandle": 
"recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", - "target": "github_1", - "targetHandle": "github_1-input-textSplitter-TextSplitter", + "source": "github_0", + "sourceHandle": "github_0-output-github-Document", + "target": "memoryVectorStore_0", + "targetHandle": "memoryVectorStore_0-input-document-Document", "type": "buttonedge", - "id": "recursiveCharacterTextSplitter_1-recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter-github_1-github_1-input-textSplitter-TextSplitter", + "id": "github_0-github_0-output-github-Document-memoryVectorStore_0-memoryVectorStore_0-input-document-Document", "data": { "label": "" } }, { "source": "chatOpenAI_0", - "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "conversationalRetrievalQAChain_0", "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "data": { "label": "" } }, { - "source": "pineconeUpsert_1", - "sourceHandle": "pineconeUpsert_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", - "target": "conversationalRetrievalQAChain_0", - "targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": 
"memoryVectorStore_0", + "targetHandle": "memoryVectorStore_0-input-embeddings-Embeddings", "type": "buttonedge", - "id": "pineconeUpsert_1-pineconeUpsert_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-memoryVectorStore_0-memoryVectorStore_0-input-embeddings-Embeddings", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/HuggingFace LLM Chain.json b/packages/server/marketplaces/chatflows/HuggingFace LLM Chain.json new file mode 100644 index 000000000..6e159a286 --- /dev/null +++ b/packages/server/marketplaces/chatflows/HuggingFace LLM Chain.json @@ -0,0 +1,288 @@ +{ + "description": "Simple LLM Chain using HuggingFace Inference API on falcon-7b-instruct model", + "nodes": [ + { + "width": 300, + "height": 405, + "id": "llmChain_1", + "position": { + "x": 970.9254258940236, + "y": 320.56761595884564 + }, + "type": "customNode", + "data": { + "id": "llmChain_1", + "label": "LLM Chain", + "name": "llmChain", + "version": 1, + "type": "LLMChain", + "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Chain to run queries against LLMs", + "inputParams": [ + { + "label": "Chain Name", + "name": "chainName", + "type": "string", + "placeholder": "Name Your Chain", + "optional": true, + "id": "llmChain_1-input-chainName-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "llmChain_1-input-model-BaseLanguageModel" + }, + { + "label": "Prompt", + "name": "prompt", + "type": "BasePromptTemplate", + "id": "llmChain_1-input-prompt-BasePromptTemplate" + } + ], + "inputs": { + "model": "{{huggingFaceInference_LLMs_0.data.instance}}", + "prompt": "{{promptTemplate_0.data.instance}}", + "chainName": "" + }, + 
"outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "llmChain_1-output-llmChain-LLMChain|BaseChain|BaseLangChain", + "name": "llmChain", + "label": "LLM Chain", + "type": "LLMChain | BaseChain | BaseLangChain" + }, + { + "id": "llmChain_1-output-outputPrediction-string|json", + "name": "outputPrediction", + "label": "Output Prediction", + "type": "string | json" + } + ], + "default": "llmChain" + } + ], + "outputs": { + "output": "llmChain" + }, + "selected": false + }, + "positionAbsolute": { + "x": 970.9254258940236, + "y": 320.56761595884564 + }, + "selected": false, + "dragging": false + }, + { + "width": 300, + "height": 475, + "id": "promptTemplate_0", + "position": { + "x": 506.50436294210306, + "y": 504.50766458127396 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_0", + "label": "Prompt Template", + "name": "promptTemplate", + "version": 1, + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_0-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_0-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "Question: {question}\n\nAnswer: Let's think step by step.", + "promptValues": "" + }, + "outputAnchors": [ + { + "id": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" + } + ], + 
"outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 506.50436294210306, + "y": 504.50766458127396 + }, + "dragging": false + }, + { + "width": 300, + "height": 526, + "id": "huggingFaceInference_LLMs_0", + "position": { + "x": 498.8594464193537, + "y": -44.91050256311678 + }, + "type": "customNode", + "data": { + "id": "huggingFaceInference_LLMs_0", + "label": "HuggingFace Inference", + "name": "huggingFaceInference_LLMs", + "version": 1, + "type": "HuggingFaceInference", + "baseClasses": ["HuggingFaceInference", "LLM", "BaseLLM", "BaseLanguageModel"], + "category": "LLMs", + "description": "Wrapper around HuggingFace large language models", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["huggingFaceApi"], + "id": "huggingFaceInference_LLMs_0-input-credential-credential" + }, + { + "label": "Model", + "name": "model", + "type": "string", + "description": "If using own inference endpoint, leave this blank", + "placeholder": "gpt2", + "optional": true, + "id": "huggingFaceInference_LLMs_0-input-model-string" + }, + { + "label": "Endpoint", + "name": "endpoint", + "type": "string", + "placeholder": "https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2", + "description": "Using your own inference endpoint", + "optional": true, + "id": "huggingFaceInference_LLMs_0-input-endpoint-string" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "description": "Temperature parameter may not apply to certain model. Please check available model parameters", + "optional": true, + "additionalParams": true, + "id": "huggingFaceInference_LLMs_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "description": "Max Tokens parameter may not apply to certain model. 
Please check available model parameters", + "optional": true, + "additionalParams": true, + "id": "huggingFaceInference_LLMs_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "description": "Top Probability parameter may not apply to certain model. Please check available model parameters", + "optional": true, + "additionalParams": true, + "id": "huggingFaceInference_LLMs_0-input-topP-number" + }, + { + "label": "Top K", + "name": "hfTopK", + "type": "number", + "description": "Top K parameter may not apply to certain model. Please check available model parameters", + "optional": true, + "additionalParams": true, + "id": "huggingFaceInference_LLMs_0-input-hfTopK-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "description": "Frequency Penalty parameter may not apply to certain model. Please check available model parameters", + "optional": true, + "additionalParams": true, + "id": "huggingFaceInference_LLMs_0-input-frequencyPenalty-number" + } + ], + "inputAnchors": [], + "inputs": { + "model": "tiiuae/falcon-7b-instruct", + "endpoint": "", + "temperature": "", + "maxTokens": "", + "topP": "", + "hfTopK": "", + "frequencyPenalty": "" + }, + "outputAnchors": [ + { + "id": "huggingFaceInference_LLMs_0-output-huggingFaceInference_LLMs-HuggingFaceInference|LLM|BaseLLM|BaseLanguageModel", + "name": "huggingFaceInference_LLMs", + "label": "HuggingFaceInference", + "type": "HuggingFaceInference | LLM | BaseLLM | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 498.8594464193537, + "y": -44.91050256311678 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "promptTemplate_0", + "sourceHandle": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + 
"type": "buttonedge", + "id": "promptTemplate_0-promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "data": { + "label": "" + } + }, + { + "source": "huggingFaceInference_LLMs_0", + "sourceHandle": "huggingFaceInference_LLMs_0-output-huggingFaceInference_LLMs-HuggingFaceInference|LLM|BaseLLM|BaseLanguageModel", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "huggingFaceInference_LLMs_0-huggingFaceInference_LLMs_0-output-huggingFaceInference_LLMs-HuggingFaceInference|LLM|BaseLLM|BaseLanguageModel-llmChain_1-llmChain_1-input-model-BaseLanguageModel", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/chatflows/Local QnA.json b/packages/server/marketplaces/chatflows/Local QnA.json new file mode 100644 index 000000000..9d9f5ec80 --- /dev/null +++ b/packages/server/marketplaces/chatflows/Local QnA.json @@ -0,0 +1,536 @@ +{ + "description": "QnA chain using local LLM, Embedding models, and Faiss local vector store", + "nodes": [ + { + "width": 300, + "height": 376, + "id": "recursiveCharacterTextSplitter_1", + "position": { + "x": 422.81091375202413, + "y": 122.99825010325736 + }, + "type": "customNode", + "data": { + "id": "recursiveCharacterTextSplitter_1", + "label": "Recursive Character Text Splitter", + "name": "recursiveCharacterTextSplitter", + "version": 1, + "type": "RecursiveCharacterTextSplitter", + "baseClasses": ["RecursiveCharacterTextSplitter", "TextSplitter"], + "category": "Text Splitters", + "description": "Split documents recursively by different characters - starting with \"\n\n\", then \"\n\", then \" \"", + "inputParams": [ + { + "label": "Chunk Size", + "name": "chunkSize", + "type": "number", + "default": 1000, + "optional": true, + "id": "recursiveCharacterTextSplitter_1-input-chunkSize-number" + }, + { + "label": "Chunk Overlap", + "name": 
"chunkOverlap", + "type": "number", + "optional": true, + "id": "recursiveCharacterTextSplitter_1-input-chunkOverlap-number" + } + ], + "inputAnchors": [], + "inputs": { + "chunkSize": 1000, + "chunkOverlap": "" + }, + "outputAnchors": [ + { + "id": "recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", + "name": "recursiveCharacterTextSplitter", + "label": "RecursiveCharacterTextSplitter", + "type": "RecursiveCharacterTextSplitter | TextSplitter" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 422.81091375202413, + "y": 122.99825010325736 + }, + "dragging": false + }, + { + "width": 300, + "height": 428, + "id": "conversationalRetrievalQAChain_0", + "position": { + "x": 1634.455879160561, + "y": 428.77742668929807 + }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "name": "conversationalRetrievalQAChain", + "version": 1, + "type": "ConversationalRetrievalQAChain", + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. 
Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." + } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatLocalAI_0.data.instance}}", + "vectorStoreRetriever": "{{faissUpsert_0.data.instance}}", + "memory": "" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 
1634.455879160561, + "y": 428.77742668929807 + }, + "dragging": false + }, + { + "width": 300, + "height": 456, + "id": "faissUpsert_0", + "position": { + "x": 1204.6898035516715, + "y": 521.0933926644659 + }, + "type": "customNode", + "data": { + "id": "faissUpsert_0", + "label": "Faiss Upsert Document", + "name": "faissUpsert", + "version": 1, + "type": "Faiss", + "baseClasses": ["Faiss", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Upsert documents to Faiss", + "inputParams": [ + { + "label": "Base Path to store", + "name": "basePath", + "description": "Path to store faiss.index file", + "placeholder": "C:\\Users\\User\\Desktop", + "type": "string", + "id": "faissUpsert_0-input-basePath-string" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "faissUpsert_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Document", + "name": "document", + "type": "Document", + "list": true, + "id": "faissUpsert_0-input-document-Document" + }, + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "faissUpsert_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "document": ["{{textFile_0.data.instance}}"], + "embeddings": "{{localAIEmbeddings_0.data.instance}}", + "basePath": "C:\\Users\\your-folder", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "faissUpsert_0-output-retriever-Faiss|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Faiss Retriever", + "type": "Faiss | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "faissUpsert_0-output-vectorStore-Faiss|SaveableVectorStore|VectorStore", + "name": "vectorStore", + "label": "Faiss Vector Store", + "type": "Faiss | SaveableVectorStore | VectorStore" + } + ], + 
"default": "retriever" + } + ], + "outputs": { + "output": "retriever" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1204.6898035516715, + "y": 521.0933926644659 + }, + "dragging": false + }, + { + "width": 300, + "height": 526, + "id": "chatLocalAI_0", + "position": { + "x": 1191.9512064167336, + "y": -44.05401001663306 + }, + "type": "customNode", + "data": { + "id": "chatLocalAI_0", + "label": "ChatLocalAI", + "name": "chatLocalAI", + "version": 1, + "type": "ChatLocalAI", + "baseClasses": ["ChatLocalAI", "BaseChatModel", "LLM", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], + "category": "Chat Models", + "description": "Use local LLMs like llama.cpp, gpt4all using LocalAI", + "inputParams": [ + { + "label": "Base Path", + "name": "basePath", + "type": "string", + "placeholder": "http://localhost:8080/v1", + "id": "chatLocalAI_0-input-basePath-string" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "string", + "placeholder": "gpt4all-lora-quantized.bin", + "id": "chatLocalAI_0-input-modelName-string" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatLocalAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatLocalAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatLocalAI_0-input-topP-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatLocalAI_0-input-timeout-number" + } + ], + "inputAnchors": [], + "inputs": { + "basePath": "http://localhost:8080/v1", + "modelName": "ggml-gpt4all-j.bin", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "timeout": "" + }, + "outputAnchors": [ + { + "id": 
"chatLocalAI_0-output-chatLocalAI-ChatLocalAI|BaseChatModel|LLM|BaseLLM|BaseLanguageModel|BaseLangChain", + "name": "chatLocalAI", + "label": "ChatLocalAI", + "type": "ChatLocalAI | BaseChatModel | LLM | BaseLLM | BaseLanguageModel | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1191.9512064167336, + "y": -44.05401001663306 + }, + "dragging": false + }, + { + "width": 300, + "height": 410, + "id": "textFile_0", + "position": { + "x": 809.5432731751458, + "y": 55.85095796777051 + }, + "type": "customNode", + "data": { + "id": "textFile_0", + "label": "Text File", + "name": "textFile", + "version": 1, + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Load data from text files", + "inputParams": [ + { + "label": "Txt File", + "name": "txtFile", + "type": "file", + "fileType": ".txt", + "id": "textFile_0-input-txtFile-file" + }, + { + "label": "Metadata", + "name": "metadata", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "textFile_0-input-metadata-json" + } + ], + "inputAnchors": [ + { + "label": "Text Splitter", + "name": "textSplitter", + "type": "TextSplitter", + "optional": true, + "id": "textFile_0-input-textSplitter-TextSplitter" + } + ], + "inputs": { + "textSplitter": "{{recursiveCharacterTextSplitter_1.data.instance}}", + "metadata": "" + }, + "outputAnchors": [ + { + "id": "textFile_0-output-textFile-Document", + "name": "textFile", + "label": "Document", + "type": "Document" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 809.5432731751458, + "y": 55.85095796777051 + }, + "dragging": false + }, + { + "width": 300, + "height": 376, + "id": "localAIEmbeddings_0", + "position": { + "x": 809.5432731751458, + "y": 507.4586304746849 + }, + "type": "customNode", + "data": { + "id": "localAIEmbeddings_0", + "label": "LocalAI Embeddings", + "name": 
"localAIEmbeddings", + "version": 1, + "type": "LocalAI Embeddings", + "baseClasses": ["LocalAI Embeddings", "Embeddings"], + "category": "Embeddings", + "description": "Use local embeddings models like llama.cpp", + "inputParams": [ + { + "label": "Base Path", + "name": "basePath", + "type": "string", + "placeholder": "http://localhost:8080/v1", + "id": "localAIEmbeddings_0-input-basePath-string" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "string", + "placeholder": "text-embedding-ada-002", + "id": "localAIEmbeddings_0-input-modelName-string" + } + ], + "inputAnchors": [], + "inputs": { + "basePath": "http://localhost:8080/v1", + "modelName": "text-embedding-ada-002" + }, + "outputAnchors": [ + { + "id": "localAIEmbeddings_0-output-localAIEmbeddings-LocalAI Embeddings|Embeddings", + "name": "localAIEmbeddings", + "label": "LocalAI Embeddings", + "type": "LocalAI Embeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 809.5432731751458, + "y": 507.4586304746849 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "faissUpsert_0", + "sourceHandle": "faissUpsert_0-output-retriever-Faiss|VectorStoreRetriever|BaseRetriever", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": "faissUpsert_0-faissUpsert_0-output-retriever-Faiss|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "chatLocalAI_0", + "sourceHandle": "chatLocalAI_0-output-chatLocalAI-ChatLocalAI|BaseChatModel|LLM|BaseLLM|BaseLanguageModel|BaseLangChain", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": 
"chatLocalAI_0-chatLocalAI_0-output-chatLocalAI-ChatLocalAI|BaseChatModel|LLM|BaseLLM|BaseLanguageModel|BaseLangChain-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "recursiveCharacterTextSplitter_1", + "sourceHandle": "recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter", + "target": "textFile_0", + "targetHandle": "textFile_0-input-textSplitter-TextSplitter", + "type": "buttonedge", + "id": "recursiveCharacterTextSplitter_1-recursiveCharacterTextSplitter_1-output-recursiveCharacterTextSplitter-RecursiveCharacterTextSplitter|TextSplitter-textFile_0-textFile_0-input-textSplitter-TextSplitter", + "data": { + "label": "" + } + }, + { + "source": "textFile_0", + "sourceHandle": "textFile_0-output-textFile-Document", + "target": "faissUpsert_0", + "targetHandle": "faissUpsert_0-input-document-Document", + "type": "buttonedge", + "id": "textFile_0-textFile_0-output-textFile-Document-faissUpsert_0-faissUpsert_0-input-document-Document", + "data": { + "label": "" + } + }, + { + "source": "localAIEmbeddings_0", + "sourceHandle": "localAIEmbeddings_0-output-localAIEmbeddings-LocalAI Embeddings|Embeddings", + "target": "faissUpsert_0", + "targetHandle": "faissUpsert_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "localAIEmbeddings_0-localAIEmbeddings_0-output-localAIEmbeddings-LocalAI Embeddings|Embeddings-faissUpsert_0-faissUpsert_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/chatflows/Long Term Memory.json b/packages/server/marketplaces/chatflows/Long Term Memory.json new file mode 100644 index 000000000..07669f82b --- /dev/null +++ b/packages/server/marketplaces/chatflows/Long Term Memory.json @@ -0,0 +1,647 @@ +{ + "description": "Use long term memory Zep to differentiate conversations between users with sessionId", + 
"nodes": [ + { + "width": 300, + "height": 480, + "id": "conversationalRetrievalQAChain_0", + "position": { + "x": 1999.7302950816731, + "y": 365.33064907894243 + }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "name": "conversationalRetrievalQAChain", + "version": 1, + "type": "ConversationalRetrievalQAChain", + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." 
+ } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": "{{pineconeExistingIndex_0.data.instance}}", + "memory": "{{ZepMemory_0.data.instance}}", + "returnSourceDocuments": true + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1999.7302950816731, + "y": 365.33064907894243 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 1554.3875781165111, + "y": -14.792508259787212 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": 
"credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": 
true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": "0", + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1554.3875781165111, + "y": -14.792508259787212 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_0", + "position": { + "x": 789.6839176356616, + "y": 167.70165941305987 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + 
"id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 789.6839176356616, + "y": 167.70165941305987 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "pineconeExistingIndex_0", + "position": { + "x": 1167.128201355349, + "y": 71.89355115516406 + }, + "type": "customNode", + "data": { + "id": "pineconeExistingIndex_0", + "label": "Pinecone Load Existing Index", + "name": "pineconeExistingIndex", + "version": 1, + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Pinecone (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" + }, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" + }, + { + "label": "Pinecone Metadata Filter", + "name": "pineconeMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. 
Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "retriever" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1167.128201355349, + "y": 71.89355115516406 + }, + "dragging": false + }, + { + "width": 300, + "height": 623, + "id": "ZepMemory_0", + "position": { + "x": 1552.2067611642792, + "y": 560.8352147865392 + }, + "type": "customNode", + "data": { + "id": "ZepMemory_0", + "label": "Zep Memory", + "name": "ZepMemory", + "version": 1, + "type": "ZepMemory", + "baseClasses": ["ZepMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Summarizes the conversation and stores the memory in zep server", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "optional": true, + "description": "Configure JWT authentication on your Zep instance (Optional)", + "credentialNames": ["zepMemoryApi"], + "id": "ZepMemory_0-input-credential-credential" + }, + { + 
"label": "Base URL", + "name": "baseURL", + "type": "string", + "default": "http://127.0.0.1:8000", + "id": "ZepMemory_0-input-baseURL-string" + }, + { + "label": "Auto Summary", + "name": "autoSummary", + "type": "boolean", + "default": true, + "id": "ZepMemory_0-input-autoSummary-boolean" + }, + { + "label": "Session Id", + "name": "sessionId", + "type": "string", + "description": "if empty, chatId will be used automatically", + "default": "", + "additionalParams": true, + "optional": true, + "id": "ZepMemory_0-input-sessionId-string" + }, + { + "label": "Size", + "name": "k", + "type": "number", + "default": "10", + "step": 1, + "description": "Window of size k to surface the last k back-and-forths to use as memory.", + "id": "ZepMemory_0-input-k-number" + }, + { + "label": "Auto Summary Template", + "name": "autoSummaryTemplate", + "type": "string", + "default": "This is the summary of the following conversation:\n{summary}", + "additionalParams": true, + "id": "ZepMemory_0-input-autoSummaryTemplate-string" + }, + { + "label": "AI Prefix", + "name": "aiPrefix", + "type": "string", + "default": "ai", + "additionalParams": true, + "id": "ZepMemory_0-input-aiPrefix-string" + }, + { + "label": "Human Prefix", + "name": "humanPrefix", + "type": "string", + "default": "human", + "additionalParams": true, + "id": "ZepMemory_0-input-humanPrefix-string" + }, + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "additionalParams": true, + "id": "ZepMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "additionalParams": true, + "id": "ZepMemory_0-input-inputKey-string" + }, + { + "label": "Output Key", + "name": "outputKey", + "type": "string", + "default": "text", + "additionalParams": true, + "id": "ZepMemory_0-input-outputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "baseURL": "http://127.0.0.1:8000", + "autoSummary": true, + 
"sessionId": "", + "k": "10", + "autoSummaryTemplate": "This is the summary of the following conversation:\n{summary}", + "aiPrefix": "ai", + "humanPrefix": "human", + "memoryKey": "chat_history", + "inputKey": "input", + "outputKey": "text" + }, + "outputAnchors": [ + { + "id": "ZepMemory_0-output-ZepMemory-ZepMemory|BaseChatMemory|BaseMemory", + "name": "ZepMemory", + "label": "ZepMemory", + "type": "ZepMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1552.2067611642792, + "y": 560.8352147865392 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": 
"pineconeExistingIndex_0-pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "ZepMemory_0", + "sourceHandle": "ZepMemory_0-output-ZepMemory-ZepMemory|BaseChatMemory|BaseMemory", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-memory-BaseMemory", + "type": "buttonedge", + "id": "ZepMemory_0-ZepMemory_0-output-ZepMemory-ZepMemory|BaseChatMemory|BaseMemory-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-memory-BaseMemory", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/MRKLAgent.json b/packages/server/marketplaces/chatflows/MRKLAgent.json similarity index 62% rename from packages/server/marketplaces/MRKLAgent.json rename to packages/server/marketplaces/chatflows/MRKLAgent.json index c0790a269..f851b0ede 100644 --- a/packages/server/marketplaces/MRKLAgent.json +++ b/packages/server/marketplaces/chatflows/MRKLAgent.json @@ -1,51 +1,6 @@ { "description": "An agent that uses the React Framework to decide what action to take", "nodes": [ - { - "width": 300, - "height": 278, - "id": "serpAPI_1", - "position": { - "x": 312.0655985817535, - "y": 112.09909989842703 - }, - "type": "customNode", - "data": { - "id": "serpAPI_1", - "label": "Serp API", - "name": "serpAPI", - "type": "SerpAPI", - "baseClasses": ["SerpAPI", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Wrapper around SerpAPI - a real-time API to access Google search results", - "inputParams": [ - { - "label": "Serp Api Key", - "name": "apiKey", - "type": "password", - "id": "serpAPI_1-input-apiKey-password" - } - ], - "inputAnchors": [], - "inputs": {}, - "outputAnchors": [ - { - "id": "serpAPI_1-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", - "name": 
"serpAPI", - "label": "SerpAPI", - "type": "SerpAPI | Tool | StructuredTool | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 312.0655985817535, - "y": 112.09909989842703 - }, - "dragging": false - }, { "width": 300, "height": 143, @@ -59,6 +14,7 @@ "id": "calculator_1", "label": "Calculator", "name": "calculator", + "version": 1, "type": "Calculator", "baseClasses": ["Calculator", "Tool", "StructuredTool", "BaseLangChain"], "category": "Tools", @@ -84,151 +40,6 @@ "selected": false, "dragging": false }, - { - "width": 300, - "height": 524, - "id": "openAI_1", - "position": { - "x": 663.1307301893027, - "y": 394.7618562930441 - }, - "type": "customNode", - "data": { - "id": "openAI_1", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": "text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - "default": "text-davinci-003", - "optional": true, - "id": "openAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": 
"number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-bestOf-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "text-davinci-003", - "temperature": 0.7, - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 663.1307301893027, - "y": 394.7618562930441 - }, - "dragging": false - }, { "width": 300, "height": 280, @@ -242,6 +53,7 @@ "id": "mrklAgentLLM_0", "label": "MRKL Agent for LLMs", "name": "mrklAgentLLM", + "version": 1, "type": "AgentExecutor", "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], "category": "Agents", @@ -263,8 +75,8 @@ } ], "inputs": { - "tools": ["{{calculator_1.data.instance}}", "{{serpAPI_1.data.instance}}"], - "model": "{{openAI_1.data.instance}}" + 
"tools": ["{{calculator_1.data.instance}}", "{{serper_0.data.instance}}"], + "model": "{{chatOpenAI_0.data.instance}}" }, "outputAnchors": [ { @@ -283,6 +95,207 @@ "y": 245.36098016819074 }, "dragging": false + }, + { + "width": 300, + "height": 277, + "id": "serper_0", + "position": { + "x": 330.964079024626, + "y": 109.83185250619351 + }, + "type": "customNode", + "data": { + "id": "serper_0", + "label": "Serper", + "name": "serper", + "version": 1, + "type": "Serper", + "baseClasses": ["Serper", "Tool", "StructuredTool"], + "category": "Tools", + "description": "Wrapper around Serper.dev - Google Search API", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["serperApi"], + "id": "serper_0-input-credential-credential" + } + ], + "inputAnchors": [], + "inputs": {}, + "outputAnchors": [ + { + "id": "serper_0-output-serper-Serper|Tool|StructuredTool", + "name": "serper", + "label": "Serper", + "type": "Serper | Tool | StructuredTool" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 330.964079024626, + "y": 109.83185250619351 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 333.58931284721206, + "y": 416.98420974875927 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": 
"gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + 
"basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 333.58931284721206, + "y": 416.98420974875927 + }, + "dragging": false } ], "edges": [ @@ -298,23 +311,23 @@ } }, { - "source": "serpAPI_1", - "sourceHandle": "serpAPI_1-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", + "source": "serper_0", + "sourceHandle": "serper_0-output-serper-Serper|Tool|StructuredTool", "target": "mrklAgentLLM_0", "targetHandle": "mrklAgentLLM_0-input-tools-Tool", "type": "buttonedge", - "id": "serpAPI_1-serpAPI_1-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain-mrklAgentLLM_0-mrklAgentLLM_0-input-tools-Tool", + "id": "serper_0-serper_0-output-serper-Serper|Tool|StructuredTool-mrklAgentLLM_0-mrklAgentLLM_0-input-tools-Tool", "data": { "label": "" } }, { - "source": "openAI_1", - "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "mrklAgentLLM_0", "targetHandle": "mrklAgentLLM_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-mrklAgentLLM_0-mrklAgentLLM_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-mrklAgentLLM_0-mrklAgentLLM_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/Metadata Filter Load.json b/packages/server/marketplaces/chatflows/Metadata Filter Load.json similarity index 57% rename from packages/server/marketplaces/Metadata Filter Load.json rename to 
packages/server/marketplaces/chatflows/Metadata Filter Load.json index cacabe715..b6ca91e30 100644 --- a/packages/server/marketplaces/Metadata Filter Load.json +++ b/packages/server/marketplaces/chatflows/Metadata Filter Load.json @@ -3,27 +3,135 @@ "nodes": [ { "width": 300, - "height": 524, - "id": "openAI_1", + "height": 480, + "id": "conversationalRetrievalQAChain_0", "position": { - "x": 1195.6182217299724, - "y": -12.958591115085468 + "x": 1643.035168558474, + "y": 360.96295365212774 }, "type": "customNode", "data": { - "id": "openAI_1", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "name": "conversationalRetrievalQAChain", + "version": 1, + "type": "ConversationalRetrievalQAChain", + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_1-input-openAIApiKey-password" + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. 
Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." + } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": "{{pineconeExistingIndex_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1643.035168558474, + 
"y": 360.96295365212774 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 1197.7264239788542, + "y": -16.177600120515933 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { "label": "Model Name", @@ -31,33 +139,49 @@ "type": "options", "options": [ { - "label": "text-davinci-003", - "name": "text-davinci-003" + "label": "gpt-4", + "name": "gpt-4" }, { - "label": "text-davinci-002", - "name": "text-davinci-002" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "text-curie-001", - "name": "text-curie-001" + "label": "gpt-4-32k", + "name": "gpt-4-32k" }, { - "label": "text-babbage-001", - "name": "text-babbage-001" + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], - "default": "text-davinci-003", + "default": "gpt-3.5-turbo", "optional": true, - "id": "openAI_1-input-modelName-options" + "id": "chatOpenAI_0-input-modelName-options" }, { "label": "Temperature", "name": "temperature", "type": "number", - "default": 0.7, + "default": 0.9, "optional": true, - "id": "openAI_1-input-temperature-number" + "id": "chatOpenAI_0-input-temperature-number" }, { "label": "Max Tokens", @@ -65,7 +189,7 @@ "type": "number", "optional": 
true, "additionalParams": true, - "id": "openAI_1-input-maxTokens-number" + "id": "chatOpenAI_0-input-maxTokens-number" }, { "label": "Top Probability", @@ -73,15 +197,7 @@ "type": "number", "optional": true, "additionalParams": true, - "id": "openAI_1-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-bestOf-number" + "id": "chatOpenAI_0-input-topP-number" }, { "label": "Frequency Penalty", @@ -89,7 +205,7 @@ "type": "number", "optional": true, "additionalParams": true, - "id": "openAI_1-input-frequencyPenalty-number" + "id": "chatOpenAI_0-input-frequencyPenalty-number" }, { "label": "Presence Penalty", @@ -97,15 +213,7 @@ "type": "number", "optional": true, "additionalParams": true, - "id": "openAI_1-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-batchSize-number" + "id": "chatOpenAI_0-input-presencePenalty-number" }, { "label": "Timeout", @@ -113,62 +221,71 @@ "type": "number", "optional": true, "additionalParams": true, - "id": "openAI_1-input-timeout-number" + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" } ], "inputAnchors": [], "inputs": { - "modelName": "text-davinci-003", + "modelName": "gpt-3.5-turbo", "temperature": "0", "maxTokens": "", "topP": "", - "bestOf": "", "frequencyPenalty": "", "presencePenalty": "", - "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" + "id": 
"chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, "selected": false }, - "positionAbsolute": { - "x": 1195.6182217299724, - "y": -12.958591115085468 - }, "selected": false, + "positionAbsolute": { + "x": 1197.7264239788542, + "y": -16.177600120515933 + }, "dragging": false }, { "width": 300, - "height": 330, - "id": "openAIEmbeddings_1", + "height": 329, + "id": "openAIEmbeddings_0", "position": { - "x": 777.5098693425334, - "y": 308.4221448953297 + "x": 805.2662010688601, + "y": 389.3163571296623 }, "type": "customNode", "data": { - "id": "openAIEmbeddings_1", + "id": "openAIEmbeddings_0", "label": "OpenAI Embeddings", "name": "openAIEmbeddings", + "version": 1, "type": "OpenAIEmbeddings", "baseClasses": ["OpenAIEmbeddings", "Embeddings"], "category": "Embeddings", "description": "OpenAI API to generate embeddings for a given text", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_1-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" }, { "label": "Strip New Lines", @@ -176,7 +293,7 @@ "type": "boolean", "optional": true, "additionalParams": true, - "id": "openAIEmbeddings_1-input-stripNewLines-boolean" + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" }, { "label": "Batch Size", @@ -184,7 +301,7 @@ "type": "number", "optional": true, "additionalParams": true, - "id": "openAIEmbeddings_1-input-batchSize-number" + "id": "openAIEmbeddings_0-input-batchSize-number" }, { "label": "Timeout", @@ -192,18 +309,27 @@ "type": "number", "optional": true, "additionalParams": true, - "id": "openAIEmbeddings_1-input-timeout-number" + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + 
"label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" } ], "inputAnchors": [], "inputs": { "stripNewLines": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", "name": "openAIEmbeddings", "label": "OpenAIEmbeddings", "type": "OpenAIEmbeddings | Embeddings" @@ -214,40 +340,36 @@ }, "selected": false, "positionAbsolute": { - "x": 777.5098693425334, - "y": 308.4221448953297 + "x": 805.2662010688601, + "y": 389.3163571296623 }, "dragging": false }, { "width": 300, - "height": 703, + "height": 505, "id": "pineconeExistingIndex_0", "position": { - "x": 1187.519066203033, - "y": 542.6635399602128 + "x": 1194.8300385699242, + "y": 542.8247838029442 }, "type": "customNode", "data": { "id": "pineconeExistingIndex_0", "label": "Pinecone Load Existing Index", "name": "pineconeExistingIndex", + "version": 1, "type": "Pinecone", "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], "category": "Vector Stores", "description": "Load existing index from Pinecone (i.e: Document has been upserted)", "inputParams": [ { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": "pineconeExistingIndex_0-input-pineconeApiKey-password" - }, - { - "label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeExistingIndex_0-input-pineconeEnv-string" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" }, { "label": "Pinecone Index", @@ -260,6 +382,7 @@ "name": "pineconeNamespace", "type": "string", "placeholder": "my-first-namespace", + "additionalParams": true, "optional": 
true, "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" }, @@ -270,6 +393,16 @@ "optional": true, "additionalParams": true, "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" } ], "inputAnchors": [ @@ -281,11 +414,11 @@ } ], "inputs": { - "embeddings": "{{openAIEmbeddings_1.data.instance}}", - "pineconeEnv": "northamerica-northeast1-gcp", - "pineconeIndex": "myindex", - "pineconeNamespace": "my-namespace", - "pineconeMetadataFilter": "{\"id\":\"doc1\"}" + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" }, "outputAnchors": [ { @@ -316,85 +449,31 @@ }, "selected": false, "positionAbsolute": { - "x": 1187.519066203033, - "y": 542.6635399602128 - }, - "dragging": false - }, - { - "width": 300, - "height": 280, - "id": "conversationalRetrievalQAChain_0", - "position": { - "x": 1585.900129303412, - "y": 405.9784391258126 - }, - "type": "customNode", - "data": { - "id": "conversationalRetrievalQAChain_0", - "label": "Conversational Retrieval QA Chain", - "name": "conversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain", - "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "Document QA - built on RetrievalQAChain to provide a chat history component", - "inputParams": [], - "inputAnchors": [ - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" - }, - { - "label": "Vector Store Retriever", - "name": "vectorStoreRetriever", - "type": "BaseRetriever", - "id": 
"conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" - } - ], - "inputs": { - "model": "{{openAI_1.data.instance}}", - "vectorStoreRetriever": "{{pineconeExistingIndex_0.data.instance}}" - }, - "outputAnchors": [ - { - "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", - "name": "conversationalRetrievalQAChain", - "label": "ConversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1585.900129303412, - "y": 405.9784391258126 + "x": 1194.8300385699242, + "y": 542.8247838029442 }, "dragging": false } ], "edges": [ { - "source": "openAIEmbeddings_1", - "sourceHandle": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", "target": "pineconeExistingIndex_0", "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", "type": "buttonedge", - "id": "openAIEmbeddings_1-openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", "data": { "label": "" } }, { - "source": "openAI_1", - "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "conversationalRetrievalQAChain_0", "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": 
"openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/Metadata Filter Upsert.json b/packages/server/marketplaces/chatflows/Metadata Filter Upsert.json similarity index 71% rename from packages/server/marketplaces/Metadata Filter Upsert.json rename to packages/server/marketplaces/chatflows/Metadata Filter Upsert.json index ab66cf743..e70b11f74 100644 --- a/packages/server/marketplaces/Metadata Filter Upsert.json +++ b/packages/server/marketplaces/chatflows/Metadata Filter Upsert.json @@ -14,6 +14,7 @@ "id": "recursiveCharacterTextSplitter_1", "label": "Recursive Character Text Splitter", "name": "recursiveCharacterTextSplitter", + "version": 1, "type": "RecursiveCharacterTextSplitter", "baseClasses": ["RecursiveCharacterTextSplitter", "TextSplitter"], "category": "Text Splitters", @@ -58,224 +59,6 @@ }, "dragging": false }, - { - "width": 300, - "height": 524, - "id": "openAI_1", - "position": { - "x": 1159.184721109528, - "y": -38.76565405456694 - }, - "type": "customNode", - "data": { - "id": "openAI_1", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": 
"text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - "default": "text-davinci-003", - "optional": true, - "id": "openAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-bestOf-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "text-davinci-003", - "temperature": "0", - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - 
"type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 1159.184721109528, - "y": -38.76565405456694 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 330, - "id": "openAIEmbeddings_1", - "position": { - "x": 749.4044250705479, - "y": 858.4858399327618 - }, - "type": "customNode", - "data": { - "id": "openAIEmbeddings_1", - "label": "OpenAI Embeddings", - "name": "openAIEmbeddings", - "type": "OpenAIEmbeddings", - "baseClasses": ["OpenAIEmbeddings", "Embeddings"], - "category": "Embeddings", - "description": "OpenAI API to generate embeddings for a given text", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_1-input-openAIApiKey-password" - }, - { - "label": "Strip New Lines", - "name": "stripNewLines", - "type": "boolean", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-stripNewLines-boolean" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "stripNewLines": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "name": "openAIEmbeddings", - "label": "OpenAIEmbeddings", - "type": "OpenAIEmbeddings | Embeddings" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 749.4044250705479, - "y": 858.4858399327618 - }, - "dragging": false - }, { "width": 300, "height": 392, @@ -289,6 +72,7 @@ "id": "textFile_0", "label": "Text File", "name": "textFile", + 
"version": 1, "type": "Document", "baseClasses": ["Document"], "category": "Document Loaders", @@ -354,6 +138,7 @@ "id": "pdfFile_0", "label": "Pdf File", "name": "pdfFile", + "version": 1, "type": "Document", "baseClasses": ["Document"], "category": "Document Loaders", @@ -426,11 +211,117 @@ }, { "width": 300, - "height": 702, + "height": 480, + "id": "conversationalRetrievalQAChain_0", + "position": { + "x": 1570.3859788160953, + "y": 423.6687850109136 + }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "name": "conversationalRetrievalQAChain", + "version": 1, + "type": "ConversationalRetrievalQAChain", + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. 
Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." + } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": "{{pineconeUpsert_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1570.3859788160953, + "y": 
423.6687850109136 + }, + "dragging": false + }, + { + "width": 300, + "height": 555, "id": "pineconeUpsert_0", "position": { - "x": 1161.8813042660154, - "y": 537.0216614326227 + "x": 1161.2426252201622, + "y": 549.7917156049002 }, "type": "customNode", "data": { @@ -438,21 +329,17 @@ "label": "Pinecone Upsert Document", "name": "pineconeUpsert", "type": "Pinecone", + "version": 1, "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], "category": "Vector Stores", "description": "Upsert documents to Pinecone", "inputParams": [ { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": "pineconeUpsert_0-input-pineconeApiKey-password" - }, - { - "label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeUpsert_0-input-pineconeEnv-string" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeUpsert_0-input-credential-credential" }, { "label": "Pinecone Index", @@ -465,8 +352,19 @@ "name": "pineconeNamespace", "type": "string", "placeholder": "my-first-namespace", + "additionalParams": true, "optional": true, "id": "pineconeUpsert_0-input-pineconeNamespace-string" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. 
Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeUpsert_0-input-topK-number" } ], "inputAnchors": [ @@ -485,11 +383,11 @@ } ], "inputs": { - "document": ["{{pdfFile_0.data.instance}}", "{{textFile_0.data.instance}}"], - "embeddings": "{{openAIEmbeddings_1.data.instance}}", - "pineconeEnv": "northamerica-northeast1-gcp", - "pineconeIndex": "myindex", - "pineconeNamespace": "my-namespace" + "document": ["{{textFile_0.data.instance}}", "{{pdfFile_0.data.instance}}"], + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "topK": "" }, "outputAnchors": [ { @@ -520,53 +418,153 @@ }, "selected": false, "positionAbsolute": { - "x": 1161.8813042660154, - "y": 537.0216614326227 + "x": 1161.2426252201622, + "y": 549.7917156049002 }, "dragging": false }, { "width": 300, - "height": 280, - "id": "conversationalRetrievalQAChain_0", + "height": 523, + "id": "chatOpenAI_0", "position": { - "x": 1570.3859788160953, - "y": 423.6687850109136 + "x": 1164.9667590264419, + "y": -44.2076264967032 }, "type": "customNode", "data": { - "id": "conversationalRetrievalQAChain_0", - "label": "Conversational Retrieval QA Chain", - "name": "conversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain", - "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "Document QA - built on RetrievalQAChain to provide a chat history component", - "inputParams": [], - "inputAnchors": [ + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": 
"conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { - "label": "Vector Store Retriever", - "name": "vectorStoreRetriever", - "type": "BaseRetriever", - "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": 
"chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" } ], + "inputAnchors": [], "inputs": { - "model": "{{openAI_1.data.instance}}", - "vectorStoreRetriever": "{{pineconeUpsert_0.data.instance}}" + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain|BaseLangChain", - "name": "conversationalRetrievalQAChain", - "label": "ConversationalRetrievalQAChain", - "type": "ConversationalRetrievalQAChain | BaseChain | BaseLangChain" + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, @@ -574,8 +572,92 @@ }, "selected": false, "positionAbsolute": { - "x": 1570.3859788160953, - "y": 423.6687850109136 + "x": 1164.9667590264419, + "y": -44.2076264967032 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_0", + "position": { + "x": 772.0706424639393, + "y": 862.6189553323906 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + 
"credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 772.0706424639393, + "y": 862.6189553323906 }, "dragging": false } @@ -604,23 +686,12 @@ } }, { - "source": "openAIEmbeddings_1", - "sourceHandle": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", "target": "pineconeUpsert_0", "targetHandle": "pineconeUpsert_0-input-embeddings-Embeddings", "type": "buttonedge", - "id": "openAIEmbeddings_1-openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeUpsert_0-pineconeUpsert_0-input-embeddings-Embeddings", - "data": { - "label": "" - } - }, - { - "source": "pdfFile_0", - "sourceHandle": "pdfFile_0-output-pdfFile-Document", - "target": 
"pineconeUpsert_0", - "targetHandle": "pineconeUpsert_0-input-document-Document", - "type": "buttonedge", - "id": "pdfFile_0-pdfFile_0-output-pdfFile-Document-pineconeUpsert_0-pineconeUpsert_0-input-document-Document", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeUpsert_0-pineconeUpsert_0-input-embeddings-Embeddings", "data": { "label": "" } @@ -637,12 +708,23 @@ } }, { - "source": "openAI_1", - "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "source": "pdfFile_0", + "sourceHandle": "pdfFile_0-output-pdfFile-Document", + "target": "pineconeUpsert_0", + "targetHandle": "pineconeUpsert_0-input-document-Document", + "type": "buttonedge", + "id": "pdfFile_0-pdfFile_0-output-pdfFile-Document-pineconeUpsert_0-pineconeUpsert_0-input-document-Document", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "conversationalRetrievalQAChain_0", "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/Multi Prompt Chain.json b/packages/server/marketplaces/chatflows/Multi Prompt Chain.json new file mode 100644 index 000000000..cf86df5be --- /dev/null +++ b/packages/server/marketplaces/chatflows/Multi Prompt Chain.json @@ -0,0 +1,469 @@ +{ + "description": "A chain that automatically picks an appropriate prompt from multiple prompts", + "nodes": [ + { + "width": 300, 
+ "height": 632, + "id": "promptRetriever_0", + "position": { + "x": 197.46642699727397, + "y": 25.945621297410923 + }, + "type": "customNode", + "data": { + "id": "promptRetriever_0", + "label": "Prompt Retriever", + "name": "promptRetriever", + "version": 1, + "type": "PromptRetriever", + "baseClasses": ["PromptRetriever"], + "category": "Retrievers", + "description": "Store prompt template with name & description to be later queried by MultiPromptChain", + "inputParams": [ + { + "label": "Prompt Name", + "name": "name", + "type": "string", + "placeholder": "physics-qa", + "id": "promptRetriever_0-input-name-string" + }, + { + "label": "Prompt Description", + "name": "description", + "type": "string", + "rows": 3, + "description": "Description of what the prompt does and when it should be used", + "placeholder": "Good for answering questions about physics", + "id": "promptRetriever_0-input-description-string" + }, + { + "label": "Prompt System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "placeholder": "You are a very smart physics professor. You are great at answering questions about physics in a concise and easy to understand manner. When you don't know the answer to a question you admit that you don't know.", + "id": "promptRetriever_0-input-systemMessage-string" + } + ], + "inputAnchors": [], + "inputs": { + "name": "physics", + "description": "Good for answering questions about physics", + "systemMessage": "You are a very smart physics professor. You are great at answering questions about physics in a concise and easy to understand manner. When you don't know the answer to a question you admit that you don't know." 
+ }, + "outputAnchors": [ + { + "id": "promptRetriever_0-output-promptRetriever-PromptRetriever", + "name": "promptRetriever", + "label": "PromptRetriever", + "type": "PromptRetriever" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 197.46642699727397, + "y": 25.945621297410923 + }, + "dragging": false + }, + { + "width": 300, + "height": 280, + "id": "multiPromptChain_0", + "position": { + "x": 1619.1305522575494, + "y": 210.28103293821243 + }, + "type": "customNode", + "data": { + "id": "multiPromptChain_0", + "label": "Multi Prompt Chain", + "name": "multiPromptChain", + "version": 1, + "type": "MultiPromptChain", + "baseClasses": ["MultiPromptChain", "MultiRouteChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Chain automatically picks an appropriate prompt from multiple prompt templates", + "inputParams": [], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "multiPromptChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Prompt Retriever", + "name": "promptRetriever", + "type": "PromptRetriever", + "list": true, + "id": "multiPromptChain_0-input-promptRetriever-PromptRetriever" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "promptRetriever": [ + "{{promptRetriever_0.data.instance}}", + "{{promptRetriever_2.data.instance}}", + "{{promptRetriever_1.data.instance}}" + ] + }, + "outputAnchors": [ + { + "id": "multiPromptChain_0-output-multiPromptChain-MultiPromptChain|MultiRouteChain|BaseChain|BaseLangChain", + "name": "multiPromptChain", + "label": "MultiPromptChain", + "type": "MultiPromptChain | MultiRouteChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "positionAbsolute": { + "x": 1619.1305522575494, + "y": 210.28103293821243 + }, + "selected": false, + "dragging": false + }, + { + "width": 300, + "height": 632, + "id": "promptRetriever_1", + 
"position": { + "x": 539.1322780233141, + "y": -250.72967142925938 + }, + "type": "customNode", + "data": { + "id": "promptRetriever_1", + "label": "Prompt Retriever", + "name": "promptRetriever", + "version": 1, + "type": "PromptRetriever", + "baseClasses": ["PromptRetriever"], + "category": "Retrievers", + "description": "Store prompt template with name & description to be later queried by MultiPromptChain", + "inputParams": [ + { + "label": "Prompt Name", + "name": "name", + "type": "string", + "placeholder": "physics-qa", + "id": "promptRetriever_1-input-name-string" + }, + { + "label": "Prompt Description", + "name": "description", + "type": "string", + "rows": 3, + "description": "Description of what the prompt does and when it should be used", + "placeholder": "Good for answering questions about physics", + "id": "promptRetriever_1-input-description-string" + }, + { + "label": "Prompt System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "placeholder": "You are a very smart physics professor. You are great at answering questions about physics in a concise and easy to understand manner. When you don't know the answer to a question you admit that you don't know.", + "id": "promptRetriever_1-input-systemMessage-string" + } + ], + "inputAnchors": [], + "inputs": { + "name": "math", + "description": "Good for answering math questions", + "systemMessage": "You are a very good mathematician. You are great at answering math questions. You are so good because you are able to break down hard problems into their component parts, answer the component parts, and then put them together to answer the broader question." 
+ }, + "outputAnchors": [ + { + "id": "promptRetriever_1-output-promptRetriever-PromptRetriever", + "name": "promptRetriever", + "label": "PromptRetriever", + "type": "PromptRetriever" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 539.1322780233141, + "y": -250.72967142925938 + }, + "dragging": false + }, + { + "width": 300, + "height": 632, + "id": "promptRetriever_2", + "position": { + "x": 872.6184534864304, + "y": -366.9443140594265 + }, + "type": "customNode", + "data": { + "id": "promptRetriever_2", + "label": "Prompt Retriever", + "name": "promptRetriever", + "version": 1, + "type": "PromptRetriever", + "baseClasses": ["PromptRetriever"], + "category": "Retrievers", + "description": "Store prompt template with name & description to be later queried by MultiPromptChain", + "inputParams": [ + { + "label": "Prompt Name", + "name": "name", + "type": "string", + "placeholder": "physics-qa", + "id": "promptRetriever_2-input-name-string" + }, + { + "label": "Prompt Description", + "name": "description", + "type": "string", + "rows": 3, + "description": "Description of what the prompt does and when it should be used", + "placeholder": "Good for answering questions about physics", + "id": "promptRetriever_2-input-description-string" + }, + { + "label": "Prompt System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "placeholder": "You are a very smart physics professor. You are great at answering questions about physics in a concise and easy to understand manner. When you don't know the answer to a question you admit that you don't know.", + "id": "promptRetriever_2-input-systemMessage-string" + } + ], + "inputAnchors": [], + "inputs": { + "name": "history", + "description": "Good for answering questions about history", + "systemMessage": "You are a very smart history professor. You are great at answering questions about history in a concise and easy to understand manner. 
When you don't know the answer to a question you admit that you don't know." + }, + "outputAnchors": [ + { + "id": "promptRetriever_2-output-promptRetriever-PromptRetriever", + "name": "promptRetriever", + "label": "PromptRetriever", + "type": "PromptRetriever" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 872.6184534864304, + "y": -366.9443140594265 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 1228.4059611466973, + "y": -326.46419383157513 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": 
"Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1228.4059611466973, + "y": -326.46419383157513 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "promptRetriever_0", + "sourceHandle": "promptRetriever_0-output-promptRetriever-PromptRetriever", + "target": "multiPromptChain_0", + "targetHandle": "multiPromptChain_0-input-promptRetriever-PromptRetriever", + "type": "buttonedge", + "id": 
"promptRetriever_0-promptRetriever_0-output-promptRetriever-PromptRetriever-multiPromptChain_0-multiPromptChain_0-input-promptRetriever-PromptRetriever", + "data": { + "label": "" + } + }, + { + "source": "promptRetriever_2", + "sourceHandle": "promptRetriever_2-output-promptRetriever-PromptRetriever", + "target": "multiPromptChain_0", + "targetHandle": "multiPromptChain_0-input-promptRetriever-PromptRetriever", + "type": "buttonedge", + "id": "promptRetriever_2-promptRetriever_2-output-promptRetriever-PromptRetriever-multiPromptChain_0-multiPromptChain_0-input-promptRetriever-PromptRetriever", + "data": { + "label": "" + } + }, + { + "source": "promptRetriever_1", + "sourceHandle": "promptRetriever_1-output-promptRetriever-PromptRetriever", + "target": "multiPromptChain_0", + "targetHandle": "multiPromptChain_0-input-promptRetriever-PromptRetriever", + "type": "buttonedge", + "id": "promptRetriever_1-promptRetriever_1-output-promptRetriever-PromptRetriever-multiPromptChain_0-multiPromptChain_0-input-promptRetriever-PromptRetriever", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "multiPromptChain_0", + "targetHandle": "multiPromptChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-multiPromptChain_0-multiPromptChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/chatflows/Multi Retrieval QA Chain.json b/packages/server/marketplaces/chatflows/Multi Retrieval QA Chain.json new file mode 100644 index 000000000..f5604bf60 --- /dev/null +++ b/packages/server/marketplaces/chatflows/Multi Retrieval QA Chain.json @@ -0,0 +1,936 @@ +{ + "description": "A chain that automatically picks an appropriate retriever from multiple different vector databases", + "nodes": [ + { + 
"width": 300, + "height": 505, + "id": "vectorStoreRetriever_0", + "position": { + "x": 712.9322670298264, + "y": 860.5462810572917 + }, + "type": "customNode", + "data": { + "id": "vectorStoreRetriever_0", + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "version": 1, + "type": "VectorStoreRetriever", + "baseClasses": ["VectorStoreRetriever"], + "category": "Retrievers", + "description": "Store vector store as retriever. Used with MultiRetrievalQAChain", + "inputParams": [ + { + "label": "Retriever Name", + "name": "name", + "type": "string", + "placeholder": "netflix movies", + "id": "vectorStoreRetriever_0-input-name-string" + }, + { + "label": "Retriever Description", + "name": "description", + "type": "string", + "rows": 3, + "description": "Description of when to use the vector store retriever", + "placeholder": "Good for answering questions about netflix movies", + "id": "vectorStoreRetriever_0-input-description-string" + } + ], + "inputAnchors": [ + { + "label": "Vector Store", + "name": "vectorStore", + "type": "VectorStore", + "id": "vectorStoreRetriever_0-input-vectorStore-VectorStore" + } + ], + "inputs": { + "vectorStore": "{{supabaseExistingIndex_0.data.instance}}", + "name": "aqua teen", + "description": "Good for answering questions about Aqua Teen Hunger Force theme song" + }, + "outputAnchors": [ + { + "id": "vectorStoreRetriever_0-output-vectorStoreRetriever-VectorStoreRetriever", + "name": "vectorStoreRetriever", + "label": "VectorStoreRetriever", + "type": "VectorStoreRetriever" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 712.9322670298264, + "y": 860.5462810572917 + }, + "dragging": false + }, + { + "width": 300, + "height": 377, + "id": "multiRetrievalQAChain_0", + "position": { + "x": 1563.0150452201099, + "y": 460.78375893303934 + }, + "type": "customNode", + "data": { + "id": "multiRetrievalQAChain_0", + "label": "Multi Retrieval QA Chain", + "name": 
"multiRetrievalQAChain", + "version": 1, + "type": "MultiRetrievalQAChain", + "baseClasses": ["MultiRetrievalQAChain", "MultiRouteChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "QA Chain that automatically picks an appropriate vector store from multiple retrievers", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "multiRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "VectorStoreRetriever", + "list": true, + "id": "multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": [ + "{{vectorStoreRetriever_0.data.instance}}", + "{{vectorStoreRetriever_1.data.instance}}", + "{{vectorStoreRetriever_2.data.instance}}" + ] + }, + "outputAnchors": [ + { + "id": "multiRetrievalQAChain_0-output-multiRetrievalQAChain-MultiRetrievalQAChain|MultiRouteChain|BaseChain|BaseLangChain", + "name": "multiRetrievalQAChain", + "label": "MultiRetrievalQAChain", + "type": "MultiRetrievalQAChain | MultiRouteChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1563.0150452201099, + "y": 460.78375893303934 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "vectorStoreRetriever_1", + "position": { + "x": 711.4902931206071, + "y": 315.2414600651632 + }, + "type": "customNode", + "data": { + "id": "vectorStoreRetriever_1", + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "version": 1, + "type": "VectorStoreRetriever", + "baseClasses": ["VectorStoreRetriever"], + "category": "Retrievers", + "description": "Store vector store as 
retriever. Used with MultiRetrievalQAChain", + "inputParams": [ + { + "label": "Retriever Name", + "name": "name", + "type": "string", + "placeholder": "netflix movies", + "id": "vectorStoreRetriever_1-input-name-string" + }, + { + "label": "Retriever Description", + "name": "description", + "type": "string", + "rows": 3, + "description": "Description of when to use the vector store retriever", + "placeholder": "Good for answering questions about netflix movies", + "id": "vectorStoreRetriever_1-input-description-string" + } + ], + "inputAnchors": [ + { + "label": "Vector Store", + "name": "vectorStore", + "type": "VectorStore", + "id": "vectorStoreRetriever_1-input-vectorStore-VectorStore" + } + ], + "inputs": { + "vectorStore": "{{chromaExistingIndex_0.data.instance}}", + "name": "mst3k", + "description": "Good for answering questions about Mystery Science Theater 3000 theme song" + }, + "outputAnchors": [ + { + "id": "vectorStoreRetriever_1-output-vectorStoreRetriever-VectorStoreRetriever", + "name": "vectorStoreRetriever", + "label": "VectorStoreRetriever", + "type": "VectorStoreRetriever" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 711.4902931206071, + "y": 315.2414600651632 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "vectorStoreRetriever_2", + "position": { + "x": 706.0716220151372, + "y": -217.51566869136752 + }, + "type": "customNode", + "data": { + "id": "vectorStoreRetriever_2", + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "version": 1, + "type": "VectorStoreRetriever", + "baseClasses": ["VectorStoreRetriever"], + "category": "Retrievers", + "description": "Store vector store as retriever. 
Used with MultiRetrievalQAChain", + "inputParams": [ + { + "label": "Retriever Name", + "name": "name", + "type": "string", + "placeholder": "netflix movies", + "id": "vectorStoreRetriever_2-input-name-string" + }, + { + "label": "Retriever Description", + "name": "description", + "type": "string", + "rows": 3, + "description": "Description of when to use the vector store retriever", + "placeholder": "Good for answering questions about netflix movies", + "id": "vectorStoreRetriever_2-input-description-string" + } + ], + "inputAnchors": [ + { + "label": "Vector Store", + "name": "vectorStore", + "type": "VectorStore", + "id": "vectorStoreRetriever_2-input-vectorStore-VectorStore" + } + ], + "inputs": { + "vectorStore": "{{pineconeExistingIndex_0.data.instance}}", + "name": "animaniacs", + "description": "Good for answering questions about Animaniacs theme song" + }, + "outputAnchors": [ + { + "id": "vectorStoreRetriever_2-output-vectorStoreRetriever-VectorStoreRetriever", + "name": "vectorStoreRetriever", + "label": "VectorStoreRetriever", + "type": "VectorStoreRetriever" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 706.0716220151372, + "y": -217.51566869136752 + }, + "dragging": false + }, + { + "width": 300, + "height": 505, + "id": "pineconeExistingIndex_0", + "position": { + "x": 267.45589163840236, + "y": -300.13817634747346 + }, + "type": "customNode", + "data": { + "id": "pineconeExistingIndex_0", + "label": "Pinecone Load Existing Index", + "name": "pineconeExistingIndex", + "version": 1, + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Pinecone (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" + 
}, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" + }, + { + "label": "Pinecone Metadata Filter", + "name": "pineconeMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "vectorStore" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 267.45589163840236, + "y": -300.13817634747346 + }, + "dragging": false + }, + { + "width": 300, + "height": 506, + "id": 
"chromaExistingIndex_0", + "position": { + "x": 264.5271545331116, + "y": 246.32716342844174 + }, + "type": "customNode", + "data": { + "id": "chromaExistingIndex_0", + "label": "Chroma Load Existing Index", + "name": "chromaExistingIndex", + "version": 1, + "type": "Chroma", + "baseClasses": ["Chroma", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Chroma (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Collection Name", + "name": "collectionName", + "type": "string", + "id": "chromaExistingIndex_0-input-collectionName-string" + }, + { + "label": "Chroma URL", + "name": "chromaURL", + "type": "string", + "optional": true, + "id": "chromaExistingIndex_0-input-chromaURL-string" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "chromaExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "chromaExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "collectionName": "", + "chromaURL": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "chromaExistingIndex_0-output-retriever-Chroma|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Chroma Retriever", + "type": "Chroma | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "chromaExistingIndex_0-output-vectorStore-Chroma|VectorStore", + "name": "vectorStore", + "label": "Chroma Vector Store", + "type": "Chroma | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "vectorStore" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 
264.5271545331116, + "y": 246.32716342844174 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_0", + "position": { + "x": -212.46977797044045, + "y": 252.45726960585722 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": -212.46977797044045, + "y": 252.45726960585722 + }, + "dragging": false + }, + { + "width": 300, + "height": 
702, + "id": "supabaseExistingIndex_0", + "position": { + "x": 270.90499551102573, + "y": 783.5053782099461 + }, + "type": "customNode", + "data": { + "id": "supabaseExistingIndex_0", + "label": "Supabase Load Existing Index", + "name": "supabaseExistingIndex", + "version": 1, + "type": "Supabase", + "baseClasses": ["Supabase", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Supabase (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["supabaseApi"], + "id": "supabaseExistingIndex_0-input-credential-credential" + }, + { + "label": "Supabase Project URL", + "name": "supabaseProjUrl", + "type": "string", + "id": "supabaseExistingIndex_0-input-supabaseProjUrl-string" + }, + { + "label": "Table Name", + "name": "tableName", + "type": "string", + "id": "supabaseExistingIndex_0-input-tableName-string" + }, + { + "label": "Query Name", + "name": "queryName", + "type": "string", + "id": "supabaseExistingIndex_0-input-queryName-string" + }, + { + "label": "Supabase Metadata Filter", + "name": "supabaseMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "supabaseExistingIndex_0-input-supabaseMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. 
Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "supabaseExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "supabaseExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "supabaseProjUrl": "", + "tableName": "", + "queryName": "", + "supabaseMetadataFilter": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "supabaseExistingIndex_0-output-retriever-Supabase|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Supabase Retriever", + "type": "Supabase | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "supabaseExistingIndex_0-output-vectorStore-Supabase|VectorStore", + "name": "vectorStore", + "label": "Supabase Vector Store", + "type": "Supabase | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "vectorStore" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 270.90499551102573, + "y": 783.5053782099461 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 1154.0989175770958, + "y": -255.77769163789395 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": 
"options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + 
"frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1154.0989175770958, + "y": -255.77769163789395 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "vectorStoreRetriever_0", + "sourceHandle": "vectorStoreRetriever_0-output-vectorStoreRetriever-VectorStoreRetriever", + "target": "multiRetrievalQAChain_0", + "targetHandle": "multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever", + "type": "buttonedge", + "id": "vectorStoreRetriever_0-vectorStoreRetriever_0-output-vectorStoreRetriever-VectorStoreRetriever-multiRetrievalQAChain_0-multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever", + "data": { + "label": "" + } + }, + { + "source": "vectorStoreRetriever_1", + "sourceHandle": "vectorStoreRetriever_1-output-vectorStoreRetriever-VectorStoreRetriever", + "target": "multiRetrievalQAChain_0", + "targetHandle": "multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever", + "type": "buttonedge", + "id": "vectorStoreRetriever_1-vectorStoreRetriever_1-output-vectorStoreRetriever-VectorStoreRetriever-multiRetrievalQAChain_0-multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever", + "data": { + "label": "" + } + }, + { + "source": "vectorStoreRetriever_2", + "sourceHandle": "vectorStoreRetriever_2-output-vectorStoreRetriever-VectorStoreRetriever", + "target": "multiRetrievalQAChain_0", + "targetHandle": "multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever", + "type": "buttonedge", + "id": 
"vectorStoreRetriever_2-vectorStoreRetriever_2-output-vectorStoreRetriever-VectorStoreRetriever-multiRetrievalQAChain_0-multiRetrievalQAChain_0-input-vectorStoreRetriever-VectorStoreRetriever", + "data": { + "label": "" + } + }, + { + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "target": "vectorStoreRetriever_2", + "targetHandle": "vectorStoreRetriever_2-input-vectorStore-VectorStore", + "type": "buttonedge", + "id": "pineconeExistingIndex_0-pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore-vectorStoreRetriever_2-vectorStoreRetriever_2-input-vectorStore-VectorStore", + "data": { + "label": "" + } + }, + { + "source": "chromaExistingIndex_0", + "sourceHandle": "chromaExistingIndex_0-output-vectorStore-Chroma|VectorStore", + "target": "vectorStoreRetriever_1", + "targetHandle": "vectorStoreRetriever_1-input-vectorStore-VectorStore", + "type": "buttonedge", + "id": "chromaExistingIndex_0-chromaExistingIndex_0-output-vectorStore-Chroma|VectorStore-vectorStoreRetriever_1-vectorStoreRetriever_1-input-vectorStore-VectorStore", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "chromaExistingIndex_0", + "targetHandle": "chromaExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": 
"openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-chromaExistingIndex_0-chromaExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "supabaseExistingIndex_0", + "targetHandle": "supabaseExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-supabaseExistingIndex_0-supabaseExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "supabaseExistingIndex_0", + "sourceHandle": "supabaseExistingIndex_0-output-vectorStore-Supabase|VectorStore", + "target": "vectorStoreRetriever_0", + "targetHandle": "vectorStoreRetriever_0-input-vectorStore-VectorStore", + "type": "buttonedge", + "id": "supabaseExistingIndex_0-supabaseExistingIndex_0-output-vectorStore-Supabase|VectorStore-vectorStoreRetriever_0-vectorStoreRetriever_0-input-vectorStore-VectorStore", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "multiRetrievalQAChain_0", + "targetHandle": "multiRetrievalQAChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-multiRetrievalQAChain_0-multiRetrievalQAChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/Multiple VectorDB.json b/packages/server/marketplaces/chatflows/Multiple VectorDB.json similarity index 71% rename from packages/server/marketplaces/Multiple VectorDB.json rename to packages/server/marketplaces/chatflows/Multiple VectorDB.json index 72eb3081c..101a683b6 100644 --- a/packages/server/marketplaces/Multiple 
VectorDB.json +++ b/packages/server/marketplaces/chatflows/Multiple VectorDB.json @@ -1,29 +1,676 @@ { - "description": "Use the agent to choose between multiple different vector databases", + "description": "Use the agent to choose between multiple different vector databases, with the ability to use other tools", "nodes": [ { "width": 300, - "height": 330, - "id": "openAIEmbeddings_2", + "height": 602, + "id": "chainTool_2", "position": { - "x": 155.07832615625986, - "y": -778.383353751991 + "x": 1251.240972921597, + "y": -922.9180420195128 }, "type": "customNode", "data": { - "id": "openAIEmbeddings_2", + "id": "chainTool_2", + "label": "Chain Tool", + "name": "chainTool", + "version": 1, + "type": "ChainTool", + "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool", "BaseLangChain"], + "category": "Tools", + "description": "Use a chain as allowed tool for agent", + "inputParams": [ + { + "label": "Chain Name", + "name": "name", + "type": "string", + "placeholder": "state-of-union-qa", + "id": "chainTool_2-input-name-string" + }, + { + "label": "Chain Description", + "name": "description", + "type": "string", + "rows": 3, + "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", + "id": "chainTool_2-input-description-string" + }, + { + "label": "Return Direct", + "name": "returnDirect", + "type": "boolean", + "optional": true, + "id": "chainTool_2-input-returnDirect-boolean" + } + ], + "inputAnchors": [ + { + "label": "Base Chain", + "name": "baseChain", + "type": "BaseChain", + "id": "chainTool_2-input-baseChain-BaseChain" + } + ], + "inputs": { + "name": "ai-paper-qa", + "description": "AI Paper QA - useful for when you need to ask questions about the AI-Generated Content paper.", + "returnDirect": "", + "baseChain": "{{retrievalQAChain_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": 
"chainTool_2-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", + "name": "chainTool", + "label": "ChainTool", + "type": "ChainTool | DynamicTool | Tool | StructuredTool | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1251.240972921597, + "y": -922.9180420195128 + }, + "dragging": false + }, + { + "width": 300, + "height": 602, + "id": "chainTool_3", + "position": { + "x": 1267.7142132085273, + "y": -85.7749282485849 + }, + "type": "customNode", + "data": { + "id": "chainTool_3", + "label": "Chain Tool", + "name": "chainTool", + "version": 1, + "type": "ChainTool", + "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool", "BaseLangChain"], + "category": "Tools", + "description": "Use a chain as allowed tool for agent", + "inputParams": [ + { + "label": "Chain Name", + "name": "name", + "type": "string", + "placeholder": "state-of-union-qa", + "id": "chainTool_3-input-name-string" + }, + { + "label": "Chain Description", + "name": "description", + "type": "string", + "rows": 3, + "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", + "id": "chainTool_3-input-description-string" + }, + { + "label": "Return Direct", + "name": "returnDirect", + "type": "boolean", + "optional": true, + "id": "chainTool_3-input-returnDirect-boolean" + } + ], + "inputAnchors": [ + { + "label": "Base Chain", + "name": "baseChain", + "type": "BaseChain", + "id": "chainTool_3-input-baseChain-BaseChain" + } + ], + "inputs": { + "name": "state-of-union-qa", + "description": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", + "returnDirect": "", + "baseChain": "{{retrievalQAChain_1.data.instance}}" + }, + "outputAnchors": [ + { + "id": "chainTool_3-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", + "name": 
"chainTool", + "label": "ChainTool", + "type": "ChainTool | DynamicTool | Tool | StructuredTool | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "dragging": false, + "positionAbsolute": { + "x": 1267.7142132085273, + "y": -85.7749282485849 + } + }, + { + "width": 300, + "height": 280, + "id": "mrklAgentLLM_0", + "position": { + "x": 2061.891333395338, + "y": -140.0694021759809 + }, + "type": "customNode", + "data": { + "id": "mrklAgentLLM_0", + "label": "MRKL Agent for LLMs", + "name": "mrklAgentLLM", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], + "category": "Agents", + "description": "Agent that uses the ReAct Framework to decide what action to take, optimized to be used with LLMs", + "inputParams": [], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "mrklAgentLLM_0-input-tools-Tool" + }, + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "mrklAgentLLM_0-input-model-BaseLanguageModel" + } + ], + "inputs": { + "tools": ["{{chainTool_2.data.instance}}", "{{chainTool_3.data.instance}}"], + "model": "{{openAI_4.data.instance}}" + }, + "outputAnchors": [ + { + "id": "mrklAgentLLM_0-output-mrklAgentLLM-AgentExecutor|BaseChain|BaseLangChain", + "name": "mrklAgentLLM", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 2061.891333395338, + "y": -140.0694021759809 + }, + "dragging": false + }, + { + "width": 300, + "height": 280, + "id": "retrievalQAChain_0", + "position": { + "x": 898.1253096948574, + "y": -859.1174013418433 + }, + "type": "customNode", + "data": { + "id": "retrievalQAChain_0", + "label": "Retrieval QA Chain", + "name": "retrievalQAChain", + "version": 1, + "type": "RetrievalQAChain", + "baseClasses": 
["RetrievalQAChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "QA chain to answer a question based on the retrieved documents", + "inputParams": [], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "retrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "retrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + } + ], + "inputs": { + "model": "{{openAI_2.data.instance}}", + "vectorStoreRetriever": "{{chromaExistingIndex_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "retrievalQAChain_0-output-retrievalQAChain-RetrievalQAChain|BaseChain|BaseLangChain", + "name": "retrievalQAChain", + "label": "RetrievalQAChain", + "type": "RetrievalQAChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 898.1253096948574, + "y": -859.1174013418433 + }, + "dragging": false + }, + { + "width": 300, + "height": 280, + "id": "retrievalQAChain_1", + "position": { + "x": 895.4349543765911, + "y": 166.60331503487222 + }, + "type": "customNode", + "data": { + "id": "retrievalQAChain_1", + "label": "Retrieval QA Chain", + "name": "retrievalQAChain", + "version": 1, + "type": "RetrievalQAChain", + "baseClasses": ["RetrievalQAChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "QA chain to answer a question based on the retrieved documents", + "inputParams": [], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "retrievalQAChain_1-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "retrievalQAChain_1-input-vectorStoreRetriever-BaseRetriever" + } + ], + "inputs": { + "model": "{{openAI_3.data.instance}}", + 
"vectorStoreRetriever": "{{pineconeExistingIndex_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "retrievalQAChain_1-output-retrievalQAChain-RetrievalQAChain|BaseChain|BaseLangChain", + "name": "retrievalQAChain", + "label": "RetrievalQAChain", + "type": "RetrievalQAChain | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 895.4349543765911, + "y": 166.60331503487222 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "openAI_2", + "position": { + "x": 520.8471510168988, + "y": -1282.1183473852964 + }, + "type": "customNode", + "data": { + "id": "openAI_2", + "label": "OpenAI", + "name": "openAI", + "version": 1, + "type": "OpenAI", + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], + "category": "LLMs", + "description": "Wrapper around OpenAI large language models", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAI_2-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "text-davinci-003", + "name": "text-davinci-003" + }, + { + "label": "text-davinci-002", + "name": "text-davinci-002" + }, + { + "label": "text-curie-001", + "name": "text-curie-001" + }, + { + "label": "text-babbage-001", + "name": "text-babbage-001" + } + ], + "default": "text-davinci-003", + "optional": true, + "id": "openAI_2-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.7, + "optional": true, + "id": "openAI_2-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": 
true, + "id": "openAI_2-input-topP-number" + }, + { + "label": "Best Of", + "name": "bestOf", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-bestOf-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-presencePenalty-number" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "text-davinci-003", + "temperature": 0.7, + "maxTokens": "", + "topP": "", + "bestOf": "", + "frequencyPenalty": "", + "presencePenalty": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "name": "openAI", + "label": "OpenAI", + "type": "OpenAI | BaseLLM | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 520.8471510168988, + "y": -1282.1183473852964 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_1", + "position": { + "x": 148.65789308409916, + "y": -915.1982675859331 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_1", "label": "OpenAI Embeddings", "name": "openAIEmbeddings", + "version": 1, "type": "OpenAIEmbeddings", 
"baseClasses": ["OpenAIEmbeddings", "Embeddings"], "category": "Embeddings", "description": "OpenAI API to generate embeddings for a given text", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_2-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_1-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_1-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_1-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_1-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_1-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 148.65789308409916, + "y": -915.1982675859331 + }, + "dragging": false + }, + { + "width": 300, + "height": 506, + "id": "chromaExistingIndex_0", + "position": { + "x": 509.55198017578016, + "y": -732.42003311752 + }, + "type": "customNode", + "data": { + "id": "chromaExistingIndex_0", + "label": "Chroma Load Existing Index", + "name": "chromaExistingIndex", + "version": 
1, + "type": "Chroma", + "baseClasses": ["Chroma", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Chroma (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Collection Name", + "name": "collectionName", + "type": "string", + "id": "chromaExistingIndex_0-input-collectionName-string" + }, + { + "label": "Chroma URL", + "name": "chromaURL", + "type": "string", + "optional": true, + "id": "chromaExistingIndex_0-input-chromaURL-string" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "chromaExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "chromaExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_1.data.instance}}", + "collectionName": "", + "chromaURL": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "chromaExistingIndex_0-output-retriever-Chroma|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Chroma Retriever", + "type": "Chroma | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "chromaExistingIndex_0-output-vectorStore-Chroma|VectorStore", + "name": "vectorStore", + "label": "Chroma Vector Store", + "type": "Chroma | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "retriever" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 509.55198017578016, + "y": -732.42003311752 + }, + "dragging": false + }, + { + "width": 300, + "height": 329, + "id": "openAIEmbeddings_2", + "position": { + "x": 128.85404348918783, + "y": 155.96043384682295 + }, + "type": "customNode", + "data": { + "id": 
"openAIEmbeddings_2", + "label": "OpenAI Embeddings", + "name": "openAIEmbeddings", + "version": 1, + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_2-input-credential-credential" }, { "label": "Strip New Lines", @@ -48,13 +695,22 @@ "optional": true, "additionalParams": true, "id": "openAIEmbeddings_2-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_2-input-basepath-string" } ], "inputAnchors": [], "inputs": { "stripNewLines": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { @@ -69,105 +725,36 @@ }, "selected": false, "positionAbsolute": { - "x": 155.07832615625986, - "y": -778.383353751991 + "x": 128.85404348918783, + "y": 155.96043384682295 }, "dragging": false }, { "width": 300, - "height": 355, - "id": "chromaExistingIndex_1", - "position": { - "x": 522.8177328694987, - "y": -548.8355398674973 - }, - "type": "customNode", - "data": { - "id": "chromaExistingIndex_1", - "label": "Chroma Load Existing Index", - "name": "chromaExistingIndex", - "type": "Chroma", - "baseClasses": ["Chroma", "VectorStoreRetriever", "BaseRetriever"], - "category": "Vector Stores", - "description": "Load existing index from Chroma (i.e: Document has been upserted)", - "inputParams": [ - { - "label": "Collection Name", - "name": "collectionName", - "type": "string", - "id": "chromaExistingIndex_1-input-collectionName-string" - } - ], - "inputAnchors": [ - { - "label": "Embeddings", - "name": "embeddings", - "type": "Embeddings", - "id": "chromaExistingIndex_1-input-embeddings-Embeddings" - } - ], - "inputs": { - "embeddings": 
"{{openAIEmbeddings_2.data.instance}}", - "collectionName": "ai-paper" - }, - "outputAnchors": [ - { - "name": "output", - "label": "Output", - "type": "options", - "options": [ - { - "id": "chromaExistingIndex_1-output-retriever-Chroma|VectorStoreRetriever|BaseRetriever", - "name": "retriever", - "label": "Chroma Retriever", - "type": "Chroma | VectorStoreRetriever | BaseRetriever" - }, - { - "id": "chromaExistingIndex_1-output-vectorStore-Chroma|VectorStore", - "name": "vectorStore", - "label": "Chroma Vector Store", - "type": "Chroma | VectorStore" - } - ], - "default": "retriever" - } - ], - "outputs": { - "output": "vectorStore" - }, - "selected": false - }, - "positionAbsolute": { - "x": 522.8177328694987, - "y": -548.8355398674973 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 524, + "height": 523, "id": "openAI_3", "position": { - "x": 512.7434966474709, - "y": -1107.9938317347255 + "x": 504.808358369027, + "y": -197.78194663790197 }, "type": "customNode", "data": { "id": "openAI_3", "label": "OpenAI", "name": "openAI", + "version": 1, "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], "category": "LLMs", "description": "Wrapper around OpenAI large language models", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_3-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAI_3-input-credential-credential" }, { "label": "Model Name", @@ -258,6 +845,14 @@ "optional": true, "additionalParams": true, "id": "openAI_3-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAI_3-input-basepath-string" } ], "inputAnchors": [], @@ -270,332 +865,161 @@ 
"frequencyPenalty": "", "presencePenalty": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "id": "openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "name": "openAI", "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" + "type": "OpenAI | BaseLLM | BaseLanguageModel" } ], "outputs": {}, "selected": false }, - "positionAbsolute": { - "x": 512.7434966474709, - "y": -1107.9938317347255 - }, "selected": false, + "positionAbsolute": { + "x": 504.808358369027, + "y": -197.78194663790197 + }, "dragging": false }, { "width": 300, - "height": 280, - "id": "vectorDBQAChain_2", + "height": 505, + "id": "pineconeExistingIndex_0", "position": { - "x": 880.7795222381183, - "y": -823.6550506138045 + "x": 507.5206146177215, + "y": 343.07818128024616 }, "type": "customNode", "data": { - "id": "vectorDBQAChain_2", - "label": "VectorDB QA Chain", - "name": "vectorDBQAChain", - "type": "VectorDBQAChain", - "baseClasses": ["VectorDBQAChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "QA chain for vector databases", - "inputParams": [], - "inputAnchors": [ - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "vectorDBQAChain_2-input-model-BaseLanguageModel" - }, - { - "label": "Vector Store", - "name": "vectorStore", - "type": "VectorStore", - "id": "vectorDBQAChain_2-input-vectorStore-VectorStore" - } - ], - "inputs": { - "model": "{{openAI_3.data.instance}}", - "vectorStore": "{{chromaExistingIndex_1.data.instance}}" - }, - "outputAnchors": [ - { - "id": "vectorDBQAChain_2-output-vectorDBQAChain-VectorDBQAChain|BaseChain|BaseLangChain", - "name": "vectorDBQAChain", - "label": "VectorDBQAChain", - "type": "VectorDBQAChain | BaseChain | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 880.7795222381183, - 
"y": -823.6550506138045 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 602, - "id": "chainTool_2", - "position": { - "x": 1251.240972921597, - "y": -922.9180420195128 - }, - "type": "customNode", - "data": { - "id": "chainTool_2", - "label": "Chain Tool", - "name": "chainTool", - "type": "ChainTool", - "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Use a chain as allowed tool for agent", + "id": "pineconeExistingIndex_0", + "label": "Pinecone Load Existing Index", + "name": "pineconeExistingIndex", + "version": 1, + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Pinecone (i.e: Document has been upserted)", "inputParams": [ { - "label": "Chain Name", - "name": "name", - "type": "string", - "placeholder": "state-of-union-qa", - "id": "chainTool_2-input-name-string" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" }, { - "label": "Chain Description", - "name": "description", + "label": "Pinecone Index", + "name": "pineconeIndex", "type": "string", - "rows": 3, - "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", - "id": "chainTool_2-input-description-string" + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" }, { - "label": "Return Direct", - "name": "returnDirect", - "type": "boolean", + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, "optional": true, - "id": "chainTool_2-input-returnDirect-boolean" + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" + }, + { + "label": "Pinecone Metadata Filter", + 
"name": "pineconeMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" } ], "inputAnchors": [ { - "label": "Base Chain", - "name": "baseChain", - "type": "BaseChain", - "id": "chainTool_2-input-baseChain-BaseChain" + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" } ], "inputs": { - "name": "ai-paper-qa", - "description": "AI Paper QA - useful for when you need to ask questions about the AI-Generated Content paper.", - "returnDirect": "", - "baseChain": "{{vectorDBQAChain_2.data.instance}}" + "embeddings": "{{openAIEmbeddings_2.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "pineconeMetadataFilter": "", + "topK": "" }, "outputAnchors": [ { - "id": "chainTool_2-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", - "name": "chainTool", - "label": "ChainTool", - "type": "ChainTool | DynamicTool | Tool | StructuredTool | BaseLangChain" + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" } ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1251.240972921597, - "y": -922.9180420195128 - }, - "dragging": 
false - }, - { - "width": 300, - "height": 143, - "id": "calculator_1", - "position": { - "x": 1649.5389102641816, - "y": -835.8729983638877 - }, - "type": "customNode", - "data": { - "id": "calculator_1", - "label": "Calculator", - "name": "calculator", - "type": "Calculator", - "baseClasses": ["Calculator", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Perform calculations on response", - "inputParams": [], - "inputAnchors": [], - "inputs": {}, - "outputAnchors": [ - { - "id": "calculator_1-output-calculator-Calculator|Tool|StructuredTool|BaseLangChain", - "name": "calculator", - "label": "Calculator", - "type": "Calculator | Tool | StructuredTool | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 1649.5389102641816, - "y": -835.8729983638877 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 278, - "id": "serpAPI_0", - "position": { - "x": 1654.5273488033688, - "y": -622.1607096176143 - }, - "type": "customNode", - "data": { - "id": "serpAPI_0", - "label": "Serp API", - "name": "serpAPI", - "type": "SerpAPI", - "baseClasses": ["SerpAPI", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Wrapper around SerpAPI - a real-time API to access Google search results", - "inputParams": [ - { - "label": "Serp Api Key", - "name": "apiKey", - "type": "password", - "id": "serpAPI_0-input-apiKey-password" - } - ], - "inputAnchors": [], - "inputs": {}, - "outputAnchors": [ - { - "id": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", - "name": "serpAPI", - "label": "SerpAPI", - "type": "SerpAPI | Tool | StructuredTool | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1654.5273488033688, - "y": -622.1607096176143 - }, - "dragging": false - }, - { - "width": 300, - "height": 330, - "id": "openAIEmbeddings_3", - "position": { - "x": 
163.902196956619, - "y": 318.66096921035574 - }, - "type": "customNode", - "data": { - "id": "openAIEmbeddings_3", - "label": "OpenAI Embeddings", - "name": "openAIEmbeddings", - "type": "OpenAIEmbeddings", - "baseClasses": ["OpenAIEmbeddings", "Embeddings"], - "category": "Embeddings", - "description": "OpenAI API to generate embeddings for a given text", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_3-input-openAIApiKey-password" - }, - { - "label": "Strip New Lines", - "name": "stripNewLines", - "type": "boolean", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_3-input-stripNewLines-boolean" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_3-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAIEmbeddings_3-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "stripNewLines": "", - "batchSize": "", - "timeout": "" + "outputs": { + "output": "retriever" }, - "outputAnchors": [ - { - "id": "openAIEmbeddings_3-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "name": "openAIEmbeddings", - "label": "OpenAIEmbeddings", - "type": "OpenAIEmbeddings | Embeddings" - } - ], - "outputs": {}, "selected": false }, "selected": false, "positionAbsolute": { - "x": 163.902196956619, - "y": 318.66096921035574 + "x": 507.5206146177215, + "y": 343.07818128024616 }, "dragging": false }, { "width": 300, - "height": 524, + "height": 523, "id": "openAI_4", "position": { - "x": 529.8870809493459, - "y": -137.8839994127831 + "x": 1619.5346765785587, + "y": 292.29615581180684 }, "type": "customNode", "data": { "id": "openAI_4", "label": "OpenAI", "name": "openAI", + "version": 1, "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", 
"BaseLangChain"], + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], "category": "LLMs", "description": "Wrapper around OpenAI large language models", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_4-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAI_4-input-credential-credential" }, { "label": "Model Name", @@ -686,6 +1110,14 @@ "optional": true, "additionalParams": true, "id": "openAI_4-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAI_4-input-basepath-string" } ], "inputAnchors": [], @@ -698,447 +1130,15 @@ "frequencyPenalty": "", "presencePenalty": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "openAI_4-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "id": "openAI_4-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "name": "openAI", "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 529.8870809493459, - "y": -137.8839994127831 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 703, - "id": "pineconeExistingIndex_1", - "position": { - "x": 539.4840212380209, - "y": 452.3690065882661 - }, - "type": "customNode", - "data": { - "id": "pineconeExistingIndex_1", - "label": "Pinecone Load Existing Index", - "name": "pineconeExistingIndex", - "type": "Pinecone", - "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], - "category": "Vector Stores", - "description": "Load existing index from Pinecone (i.e: Document has been upserted)", - "inputParams": [ - { - "label": "Pinecone Api Key", - "name": "pineconeApiKey", - "type": "password", - "id": 
"pineconeExistingIndex_1-input-pineconeApiKey-password" - }, - { - "label": "Pinecone Environment", - "name": "pineconeEnv", - "type": "string", - "id": "pineconeExistingIndex_1-input-pineconeEnv-string" - }, - { - "label": "Pinecone Index", - "name": "pineconeIndex", - "type": "string", - "id": "pineconeExistingIndex_1-input-pineconeIndex-string" - }, - { - "label": "Pinecone Namespace", - "name": "pineconeNamespace", - "type": "string", - "placeholder": "my-first-namespace", - "optional": true, - "id": "pineconeExistingIndex_1-input-pineconeNamespace-string" - }, - { - "label": "Pinecone Metadata Filter", - "name": "pineconeMetadataFilter", - "type": "json", - "optional": true, - "additionalParams": true, - "id": "pineconeExistingIndex_1-input-pineconeMetadataFilter-json" - } - ], - "inputAnchors": [ - { - "label": "Embeddings", - "name": "embeddings", - "type": "Embeddings", - "id": "pineconeExistingIndex_1-input-embeddings-Embeddings" - } - ], - "inputs": { - "embeddings": "{{openAIEmbeddings_3.data.instance}}", - "pineconeEnv": "us-west4-gcp", - "pineconeIndex": "state-of-union", - "pineconeNamespace": "" - }, - "outputAnchors": [ - { - "name": "output", - "label": "Output", - "type": "options", - "options": [ - { - "id": "pineconeExistingIndex_1-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", - "name": "retriever", - "label": "Pinecone Retriever", - "type": "Pinecone | VectorStoreRetriever | BaseRetriever" - }, - { - "id": "pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore", - "name": "vectorStore", - "label": "Pinecone Vector Store", - "type": "Pinecone | VectorStore" - } - ], - "default": "retriever" - } - ], - "outputs": { - "output": "vectorStore" - }, - "selected": false - }, - "selected": false, - "dragging": false, - "positionAbsolute": { - "x": 539.4840212380209, - "y": 452.3690065882661 - } - }, - { - "width": 300, - "height": 280, - "id": "vectorDBQAChain_3", - "position": { - "x": 896.3238465010572, - "y": 
173.57643605877104 - }, - "type": "customNode", - "data": { - "id": "vectorDBQAChain_3", - "label": "VectorDB QA Chain", - "name": "vectorDBQAChain", - "type": "VectorDBQAChain", - "baseClasses": ["VectorDBQAChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "QA chain for vector databases", - "inputParams": [], - "inputAnchors": [ - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "vectorDBQAChain_3-input-model-BaseLanguageModel" - }, - { - "label": "Vector Store", - "name": "vectorStore", - "type": "VectorStore", - "id": "vectorDBQAChain_3-input-vectorStore-VectorStore" - } - ], - "inputs": { - "model": "{{openAI_4.data.instance}}", - "vectorStore": "{{pineconeExistingIndex_1.data.instance}}" - }, - "outputAnchors": [ - { - "id": "vectorDBQAChain_3-output-vectorDBQAChain-VectorDBQAChain|BaseChain|BaseLangChain", - "name": "vectorDBQAChain", - "label": "VectorDBQAChain", - "type": "VectorDBQAChain | BaseChain | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 896.3238465010572, - "y": 173.57643605877104 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 602, - "id": "chainTool_3", - "position": { - "x": 1260.8044270644157, - "y": -244.7000095631508 - }, - "type": "customNode", - "data": { - "id": "chainTool_3", - "label": "Chain Tool", - "name": "chainTool", - "type": "ChainTool", - "baseClasses": ["ChainTool", "DynamicTool", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Use a chain as allowed tool for agent", - "inputParams": [ - { - "label": "Chain Name", - "name": "name", - "type": "string", - "placeholder": "state-of-union-qa", - "id": "chainTool_3-input-name-string" - }, - { - "label": "Chain Description", - "name": "description", - "type": "string", - "rows": 3, - "placeholder": "State of the Union QA - useful for when you need to ask questions about the most recent state of 
the union address.", - "id": "chainTool_3-input-description-string" - }, - { - "label": "Return Direct", - "name": "returnDirect", - "type": "boolean", - "optional": true, - "id": "chainTool_3-input-returnDirect-boolean" - } - ], - "inputAnchors": [ - { - "label": "Base Chain", - "name": "baseChain", - "type": "BaseChain", - "id": "chainTool_3-input-baseChain-BaseChain" - } - ], - "inputs": { - "name": "state-of-union-qa", - "description": "State of the Union QA - useful for when you need to ask questions about the most recent state of the union address.", - "returnDirect": "", - "baseChain": "{{vectorDBQAChain_3.data.instance}}" - }, - "outputAnchors": [ - { - "id": "chainTool_3-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", - "name": "chainTool", - "label": "ChainTool", - "type": "ChainTool | DynamicTool | Tool | StructuredTool | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "dragging": false, - "positionAbsolute": { - "x": 1260.8044270644157, - "y": -244.7000095631508 - } - }, - { - "width": 300, - "height": 524, - "id": "openAI_5", - "position": { - "x": 1683.95439713088, - "y": 329.0556949149878 - }, - "type": "customNode", - "data": { - "id": "openAI_5", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_5-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": "text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - 
"default": "text-davinci-003", - "optional": true, - "id": "openAI_5-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_5-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-bestOf-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_5-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "text-davinci-003", - "temperature": "0", - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAI_5-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 1683.95439713088, - 
"y": 329.0556949149878 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 280, - "id": "mrklAgentLLM_0", - "position": { - "x": 2061.891333395338, - "y": -140.0694021759809 - }, - "type": "customNode", - "data": { - "id": "mrklAgentLLM_0", - "label": "MRKL Agent for LLMs", - "name": "mrklAgentLLM", - "type": "AgentExecutor", - "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], - "category": "Agents", - "description": "Agent that uses the ReAct Framework to decide what action to take, optimized to be used with LLMs", - "inputParams": [], - "inputAnchors": [ - { - "label": "Allowed Tools", - "name": "tools", - "type": "Tool", - "list": true, - "id": "mrklAgentLLM_0-input-tools-Tool" - }, - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "mrklAgentLLM_0-input-model-BaseLanguageModel" - } - ], - "inputs": { - "tools": [ - "{{serpAPI_0.data.instance}}", - "{{calculator_1.data.instance}}", - "{{chainTool_2.data.instance}}", - "{{chainTool_3.data.instance}}" - ], - "model": "{{openAI_5.data.instance}}" - }, - "outputAnchors": [ - { - "id": "mrklAgentLLM_0-output-mrklAgentLLM-AgentExecutor|BaseChain|BaseLangChain", - "name": "mrklAgentLLM", - "label": "AgentExecutor", - "type": "AgentExecutor | BaseChain | BaseLangChain" + "type": "OpenAI | BaseLLM | BaseLanguageModel" } ], "outputs": {}, @@ -1146,123 +1146,13 @@ }, "selected": false, "positionAbsolute": { - "x": 2061.891333395338, - "y": -140.0694021759809 + "x": 1619.5346765785587, + "y": 292.29615581180684 }, "dragging": false } ], "edges": [ - { - "source": "openAIEmbeddings_2", - "sourceHandle": "openAIEmbeddings_2-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "target": "chromaExistingIndex_1", - "targetHandle": "chromaExistingIndex_1-input-embeddings-Embeddings", - "type": "buttonedge", - "id": 
"openAIEmbeddings_2-openAIEmbeddings_2-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-chromaExistingIndex_1-chromaExistingIndex_1-input-embeddings-Embeddings", - "data": { - "label": "" - } - }, - { - "source": "chromaExistingIndex_1", - "sourceHandle": "chromaExistingIndex_1-output-vectorStore-Chroma|VectorStore", - "target": "vectorDBQAChain_2", - "targetHandle": "vectorDBQAChain_2-input-vectorStore-VectorStore", - "type": "buttonedge", - "id": "chromaExistingIndex_1-chromaExistingIndex_1-output-vectorStore-Chroma|VectorStore-vectorDBQAChain_2-vectorDBQAChain_2-input-vectorStore-VectorStore", - "data": { - "label": "" - } - }, - { - "source": "openAI_3", - "sourceHandle": "openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "target": "vectorDBQAChain_2", - "targetHandle": "vectorDBQAChain_2-input-model-BaseLanguageModel", - "type": "buttonedge", - "id": "openAI_3-openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-vectorDBQAChain_2-vectorDBQAChain_2-input-model-BaseLanguageModel", - "data": { - "label": "" - } - }, - { - "source": "vectorDBQAChain_2", - "sourceHandle": "vectorDBQAChain_2-output-vectorDBQAChain-VectorDBQAChain|BaseChain|BaseLangChain", - "target": "chainTool_2", - "targetHandle": "chainTool_2-input-baseChain-BaseChain", - "type": "buttonedge", - "id": "vectorDBQAChain_2-vectorDBQAChain_2-output-vectorDBQAChain-VectorDBQAChain|BaseChain|BaseLangChain-chainTool_2-chainTool_2-input-baseChain-BaseChain", - "data": { - "label": "" - } - }, - { - "source": "openAI_4", - "sourceHandle": "openAI_4-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "target": "vectorDBQAChain_3", - "targetHandle": "vectorDBQAChain_3-input-model-BaseLanguageModel", - "type": "buttonedge", - "id": "openAI_4-openAI_4-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-vectorDBQAChain_3-vectorDBQAChain_3-input-model-BaseLanguageModel", - "data": { - "label": "" - } - }, - { - "source": "openAIEmbeddings_3", - 
"sourceHandle": "openAIEmbeddings_3-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", - "target": "pineconeExistingIndex_1", - "targetHandle": "pineconeExistingIndex_1-input-embeddings-Embeddings", - "type": "buttonedge", - "id": "openAIEmbeddings_3-openAIEmbeddings_3-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_1-pineconeExistingIndex_1-input-embeddings-Embeddings", - "data": { - "label": "" - } - }, - { - "source": "vectorDBQAChain_3", - "sourceHandle": "vectorDBQAChain_3-output-vectorDBQAChain-VectorDBQAChain|BaseChain|BaseLangChain", - "target": "chainTool_3", - "targetHandle": "chainTool_3-input-baseChain-BaseChain", - "type": "buttonedge", - "id": "vectorDBQAChain_3-vectorDBQAChain_3-output-vectorDBQAChain-VectorDBQAChain|BaseChain|BaseLangChain-chainTool_3-chainTool_3-input-baseChain-BaseChain", - "data": { - "label": "" - } - }, - { - "source": "pineconeExistingIndex_1", - "sourceHandle": "pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore", - "target": "vectorDBQAChain_3", - "targetHandle": "vectorDBQAChain_3-input-vectorStore-VectorStore", - "type": "buttonedge", - "id": "pineconeExistingIndex_1-pineconeExistingIndex_1-output-vectorStore-Pinecone|VectorStore-vectorDBQAChain_3-vectorDBQAChain_3-input-vectorStore-VectorStore", - "data": { - "label": "" - } - }, - { - "source": "serpAPI_0", - "sourceHandle": "serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain", - "target": "mrklAgentLLM_0", - "targetHandle": "mrklAgentLLM_0-input-tools-Tool", - "type": "buttonedge", - "id": "serpAPI_0-serpAPI_0-output-serpAPI-SerpAPI|Tool|StructuredTool|BaseLangChain-mrklAgentLLM_0-mrklAgentLLM_0-input-tools-Tool", - "data": { - "label": "" - } - }, - { - "source": "calculator_1", - "sourceHandle": "calculator_1-output-calculator-Calculator|Tool|StructuredTool|BaseLangChain", - "target": "mrklAgentLLM_0", - "targetHandle": "mrklAgentLLM_0-input-tools-Tool", - "type": "buttonedge", - "id": 
"calculator_1-calculator_1-output-calculator-Calculator|Tool|StructuredTool|BaseLangChain-mrklAgentLLM_0-mrklAgentLLM_0-input-tools-Tool", - "data": { - "label": "" - } - }, { "source": "chainTool_2", "sourceHandle": "chainTool_2-output-chainTool-ChainTool|DynamicTool|Tool|StructuredTool|BaseLangChain", @@ -1286,12 +1176,100 @@ } }, { - "source": "openAI_5", - "sourceHandle": "openAI_5-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "source": "retrievalQAChain_0", + "sourceHandle": "retrievalQAChain_0-output-retrievalQAChain-RetrievalQAChain|BaseChain|BaseLangChain", + "target": "chainTool_2", + "targetHandle": "chainTool_2-input-baseChain-BaseChain", + "type": "buttonedge", + "id": "retrievalQAChain_0-retrievalQAChain_0-output-retrievalQAChain-RetrievalQAChain|BaseChain|BaseLangChain-chainTool_2-chainTool_2-input-baseChain-BaseChain", + "data": { + "label": "" + } + }, + { + "source": "retrievalQAChain_1", + "sourceHandle": "retrievalQAChain_1-output-retrievalQAChain-RetrievalQAChain|BaseChain|BaseLangChain", + "target": "chainTool_3", + "targetHandle": "chainTool_3-input-baseChain-BaseChain", + "type": "buttonedge", + "id": "retrievalQAChain_1-retrievalQAChain_1-output-retrievalQAChain-RetrievalQAChain|BaseChain|BaseLangChain-chainTool_3-chainTool_3-input-baseChain-BaseChain", + "data": { + "label": "" + } + }, + { + "source": "openAI_2", + "sourceHandle": "openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "target": "retrievalQAChain_0", + "targetHandle": "retrievalQAChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "openAI_2-openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-retrievalQAChain_0-retrievalQAChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_1", + "sourceHandle": "openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "chromaExistingIndex_0", + "targetHandle": 
"chromaExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_1-openAIEmbeddings_1-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-chromaExistingIndex_0-chromaExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "chromaExistingIndex_0", + "sourceHandle": "chromaExistingIndex_0-output-retriever-Chroma|VectorStoreRetriever|BaseRetriever", + "target": "retrievalQAChain_0", + "targetHandle": "retrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": "chromaExistingIndex_0-chromaExistingIndex_0-output-retriever-Chroma|VectorStoreRetriever|BaseRetriever-retrievalQAChain_0-retrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_2", + "sourceHandle": "openAIEmbeddings_2-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_2-openAIEmbeddings_2-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "openAI_3", + "sourceHandle": "openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "target": "retrievalQAChain_1", + "targetHandle": "retrievalQAChain_1-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "openAI_3-openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-retrievalQAChain_1-retrievalQAChain_1-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "target": "retrievalQAChain_1", + "targetHandle": "retrievalQAChain_1-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": 
"pineconeExistingIndex_0-pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-retrievalQAChain_1-retrievalQAChain_1-input-vectorStoreRetriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "openAI_4", + "sourceHandle": "openAI_4-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "target": "mrklAgentLLM_0", "targetHandle": "mrklAgentLLM_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_5-openAI_5-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-mrklAgentLLM_0-mrklAgentLLM_0-input-model-BaseLanguageModel", + "id": "openAI_4-openAI_4-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-mrklAgentLLM_0-mrklAgentLLM_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/OpenAI Agent.json b/packages/server/marketplaces/chatflows/OpenAI Agent.json new file mode 100644 index 000000000..91d5d38ce --- /dev/null +++ b/packages/server/marketplaces/chatflows/OpenAI Agent.json @@ -0,0 +1,483 @@ +{ + "description": "An agent that uses OpenAI's Function Calling functionality to pick the tool and args to call", + "nodes": [ + { + "width": 300, + "height": 143, + "id": "calculator_0", + "position": { + "x": 288.06681362611545, + "y": 289.1385194199715 + }, + "type": "customNode", + "data": { + "id": "calculator_0", + "label": "Calculator", + "name": "calculator", + "version": 1, + "type": "Calculator", + "baseClasses": ["Calculator", "Tool", "StructuredTool", "BaseLangChain", "Serializable"], + "category": "Tools", + "description": "Perform calculations on response", + "inputParams": [], + "inputAnchors": [], + "inputs": {}, + "outputAnchors": [ + { + "id": "calculator_0-output-calculator-Calculator|Tool|StructuredTool|BaseLangChain|Serializable", + "name": "calculator", + "label": "Calculator", + "type": "Calculator | Tool | StructuredTool | BaseLangChain | Serializable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + 
"positionAbsolute": { + "x": 288.06681362611545, + "y": 289.1385194199715 + }, + "dragging": false + }, + { + "width": 300, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 285.7750469157585, + "y": 465.1140427303788 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "name": "bufferMemory", + "version": 1, + "type": "BufferMemory", + "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": "input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 285.7750469157585, + "y": 465.1140427303788 + }, + "dragging": false + }, + { + "width": 300, + "height": 277, + "id": "customTool_0", + "position": { + "x": 883.9529939431576, + "y": -32.32503903826486 + }, + "type": "customNode", + "data": { + "id": "customTool_0", + "label": "Custom Tool", + "name": "customTool", + "version": 1, + "type": "CustomTool", + "baseClasses": ["CustomTool", "Tool", "StructuredTool"], + "category": "Tools", + "description": "Use custom tool you've created in Flowise within chatflow", + "inputParams": [ + { + "label": "Select Tool", + "name": "selectedTool", + "type": "asyncOptions", + "loadMethod": "listTools", + "id": 
"customTool_0-input-selectedTool-asyncOptions" + } + ], + "inputAnchors": [], + "inputs": { + "selectedTool": "" + }, + "outputAnchors": [ + { + "id": "customTool_0-output-customTool-CustomTool|Tool|StructuredTool", + "name": "customTool", + "label": "CustomTool", + "type": "CustomTool | Tool | StructuredTool" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 883.9529939431576, + "y": -32.32503903826486 + }, + "dragging": false + }, + { + "width": 300, + "height": 277, + "id": "serper_0", + "position": { + "x": 504.3508341937219, + "y": -10.324432507151982 + }, + "type": "customNode", + "data": { + "id": "serper_0", + "label": "Serper", + "name": "serper", + "version": 1, + "type": "Serper", + "baseClasses": ["Serper", "Tool", "StructuredTool"], + "category": "Tools", + "description": "Wrapper around Serper.dev - Google Search API", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["serperApi"], + "id": "serper_0-input-credential-credential" + } + ], + "inputAnchors": [], + "inputs": {}, + "outputAnchors": [ + { + "id": "serper_0-output-serper-Serper|Tool|StructuredTool", + "name": "serper", + "label": "Serper", + "type": "Serper | Tool | StructuredTool" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 504.3508341937219, + "y": -10.324432507151982 + }, + "dragging": false + }, + { + "width": 300, + "height": 383, + "id": "openAIFunctionAgent_0", + "position": { + "x": 1241.9739093293213, + "y": 359.3158950327101 + }, + "type": "customNode", + "data": { + "id": "openAIFunctionAgent_0", + "label": "OpenAI Function Agent", + "name": "openAIFunctionAgent", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain"], + "category": "Agents", + "description": "An agent that uses OpenAI's Function Calling functionality to pick the tool and args to call", + 
"inputParams": [ + { + "label": "System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "optional": true, + "additionalParams": true, + "id": "openAIFunctionAgent_0-input-systemMessage-string" + } + ], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "openAIFunctionAgent_0-input-tools-Tool" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseChatMemory", + "id": "openAIFunctionAgent_0-input-memory-BaseChatMemory" + }, + { + "label": "OpenAI Chat Model", + "name": "model", + "description": "Only works with gpt-3.5-turbo-0613 and gpt-4-0613. Refer docs for more info", + "type": "BaseChatModel", + "id": "openAIFunctionAgent_0-input-model-BaseChatModel" + } + ], + "inputs": { + "tools": ["{{calculator_0.data.instance}}", "{{serper_0.data.instance}}", "{{customTool_0.data.instance}}"], + "memory": "{{bufferMemory_0.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", + "systemMessage": "" + }, + "outputAnchors": [ + { + "id": "openAIFunctionAgent_0-output-openAIFunctionAgent-AgentExecutor|BaseChain", + "name": "openAIFunctionAgent", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1241.9739093293213, + "y": 359.3158950327101 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 817.8210275868742, + "y": 627.7677030233751 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": 
["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + 
"inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 817.8210275868742, + "y": 627.7677030233751 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "calculator_0", + "sourceHandle": "calculator_0-output-calculator-Calculator|Tool|StructuredTool|BaseLangChain|Serializable", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "calculator_0-calculator_0-output-calculator-Calculator|Tool|StructuredTool|BaseLangChain|Serializable-openAIFunctionAgent_0-openAIFunctionAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "serper_0", + "sourceHandle": "serper_0-output-serper-Serper|Tool|StructuredTool", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "serper_0-serper_0-output-serper-Serper|Tool|StructuredTool-openAIFunctionAgent_0-openAIFunctionAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "customTool_0", + "sourceHandle": "customTool_0-output-customTool-CustomTool|Tool|StructuredTool", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-tools-Tool", + "type": "buttonedge", + "id": "customTool_0-customTool_0-output-customTool-CustomTool|Tool|StructuredTool-openAIFunctionAgent_0-openAIFunctionAgent_0-input-tools-Tool", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": 
"chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-model-BaseChatModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-openAIFunctionAgent_0-openAIFunctionAgent_0-input-model-BaseChatModel", + "data": { + "label": "" + } + }, + { + "source": "bufferMemory_0", + "sourceHandle": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "target": "openAIFunctionAgent_0", + "targetHandle": "openAIFunctionAgent_0-input-memory-BaseChatMemory", + "type": "buttonedge", + "id": "bufferMemory_0-bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory-openAIFunctionAgent_0-openAIFunctionAgent_0-input-memory-BaseChatMemory", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/chatflows/Prompt Chaining with VectorStore.json b/packages/server/marketplaces/chatflows/Prompt Chaining with VectorStore.json new file mode 100644 index 000000000..9d6838eb3 --- /dev/null +++ b/packages/server/marketplaces/chatflows/Prompt Chaining with VectorStore.json @@ -0,0 +1,966 @@ +{ + "description": "Use chat history to rephrase user question, and answer the rephrased question using retrieved docs from vector store", + "nodes": [ + { + "width": 300, + "height": 503, + "id": "pineconeExistingIndex_0", + "position": { + "x": 1062.7418678410986, + "y": -109.27680365777141 + }, + "type": "customNode", + "data": { + "id": "pineconeExistingIndex_0", + "label": "Pinecone Load Existing Index", + "version": 1, + "name": "pineconeExistingIndex", + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Load existing index from Pinecone (i.e: Document has been upserted)", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + 
"credentialNames": ["pineconeApi"], + "id": "pineconeExistingIndex_0-input-credential-credential" + }, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeExistingIndex_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-pineconeNamespace-string" + }, + { + "label": "Pinecone Metadata Filter", + "name": "pineconeMetadataFilter", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pineconeExistingIndex_0-input-pineconeMetadataFilter-json" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeExistingIndex_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeExistingIndex_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "newindex", + "pineconeNamespace": "", + "pineconeMetadataFilter": "{}", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeExistingIndex_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "vectorStore" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 
1062.7418678410986, + "y": -109.27680365777141 + }, + "dragging": false + }, + { + "width": 300, + "height": 327, + "id": "openAIEmbeddings_0", + "position": { + "x": 711.3971966563331, + "y": 7.7184225021727 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "version": 1, + "name": "openAIEmbeddings", + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 711.3971966563331, + "y": 7.7184225021727 + }, + "dragging": false + }, + { + "width": 300, + "height": 473, + 
"id": "promptTemplate_0", + "position": { + "x": 348.2881107399286, + "y": -97.74510214137423 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_0", + "label": "Prompt Template", + "version": 1, + "name": "promptTemplate", + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate", "Runnable"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_0-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_0-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.\n\nChat History:\n{chat_history}\nFollow Up Input: {question}\nStandalone question:", + "promptValues": "{\"question\":\"{{question}}\",\"chat_history\":\"{{chat_history}}\"}" + }, + "outputAnchors": [ + { + "id": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate|Runnable", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 348.2881107399286, + "y": -97.74510214137423 + }, + "dragging": false + }, + { + "width": 300, + "height": 522, + "id": "chatOpenAI_0", + "position": { + "x": 335.7621848973805, + "y": -651.7411273245009 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "version": 1, + "name": "chatOpenAI", + "type": "ChatOpenAI", + 
"baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "Runnable"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "step": 0.1, + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "step": 1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": 
true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "step": 1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo-16k", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|Runnable", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "dragging": false, + "positionAbsolute": { + "x": 335.7621848973805, + "y": -651.7411273245009 + } + }, + { + "width": 300, + "height": 522, + "id": "chatOpenAI_1", + "position": { + "x": 1765.2801848172305, + "y": -667.9261054149061 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_1", + "label": "ChatOpenAI", + "version": 1, + "name": "chatOpenAI", + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "Runnable"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_1-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": 
"gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_1-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "step": 0.1, + "default": 0.9, + "optional": true, + "id": "chatOpenAI_1-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "step": 1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "step": 1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo-16k", + "temperature": 0.9, + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": 
"chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|Runnable", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "dragging": false, + "positionAbsolute": { + "x": 1765.2801848172305, + "y": -667.9261054149061 + } + }, + { + "width": 300, + "height": 473, + "id": "promptTemplate_1", + "position": { + "x": 1773.720934090435, + "y": -116.71323227575395 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_1", + "label": "Prompt Template", + "version": 1, + "name": "promptTemplate", + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate", "Runnable"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_1-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_1-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "Use the following pieces of context to answer the question at the end.\n\n{context}\n\nQuestion: {question}\nHelpful Answer:", + "promptValues": "{\"context\":\"{{vectorStoreToDocument_0.data.instance}}\",\"question\":\"{{llmChain_0.data.instance}}\"}" + }, + "outputAnchors": [ + { + "id": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate|Runnable", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + 
"positionAbsolute": { + "x": 1773.720934090435, + "y": -116.71323227575395 + }, + "dragging": false + }, + { + "width": 300, + "height": 404, + "id": "llmChain_0", + "position": { + "x": 756.1670091985342, + "y": -592.5151355056942 + }, + "type": "customNode", + "data": { + "id": "llmChain_0", + "label": "LLM Chain", + "version": 1, + "name": "llmChain", + "type": "LLMChain", + "baseClasses": ["LLMChain", "BaseChain", "Runnable"], + "category": "Chains", + "description": "Chain to run queries against LLMs", + "inputParams": [ + { + "label": "Chain Name", + "name": "chainName", + "type": "string", + "placeholder": "Name Your Chain", + "optional": true, + "id": "llmChain_0-input-chainName-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "llmChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Prompt", + "name": "prompt", + "type": "BasePromptTemplate", + "id": "llmChain_0-input-prompt-BasePromptTemplate" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "prompt": "{{promptTemplate_0.data.instance}}", + "chainName": "QuestionChain" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "llmChain_0-output-llmChain-LLMChain|BaseChain|Runnable", + "name": "llmChain", + "label": "LLM Chain", + "type": "LLMChain | BaseChain | Runnable" + }, + { + "id": "llmChain_0-output-outputPrediction-string|json", + "name": "outputPrediction", + "label": "Output Prediction", + "type": "string | json" + } + ], + "default": "llmChain" + } + ], + "outputs": { + "output": "outputPrediction" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 756.1670091985342, + "y": -592.5151355056942 + }, + "dragging": false + }, + { + "width": 300, + "height": 404, + "id": "llmChain_1", + "position": { + "x": 2200.1274896215496, + "y": -144.29167974642334 + }, + "type": "customNode", + "data": { + "id": 
"llmChain_1", + "label": "LLM Chain", + "version": 1, + "name": "llmChain", + "type": "LLMChain", + "baseClasses": ["LLMChain", "BaseChain", "Runnable"], + "category": "Chains", + "description": "Chain to run queries against LLMs", + "inputParams": [ + { + "label": "Chain Name", + "name": "chainName", + "type": "string", + "placeholder": "Name Your Chain", + "optional": true, + "id": "llmChain_1-input-chainName-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "llmChain_1-input-model-BaseLanguageModel" + }, + { + "label": "Prompt", + "name": "prompt", + "type": "BasePromptTemplate", + "id": "llmChain_1-input-prompt-BasePromptTemplate" + } + ], + "inputs": { + "model": "{{chatOpenAI_1.data.instance}}", + "prompt": "{{promptTemplate_1.data.instance}}", + "chainName": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "llmChain_1-output-llmChain-LLMChain|BaseChain|Runnable", + "name": "llmChain", + "label": "LLM Chain", + "type": "LLMChain | BaseChain | Runnable" + }, + { + "id": "llmChain_1-output-outputPrediction-string|json", + "name": "outputPrediction", + "label": "Output Prediction", + "type": "string | json" + } + ], + "default": "llmChain" + } + ], + "outputs": { + "output": "llmChain" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 2200.1274896215496, + "y": -144.29167974642334 + }, + "dragging": false + }, + { + "width": 300, + "height": 353, + "id": "vectorStoreToDocument_0", + "position": { + "x": 1407.7038120189868, + "y": -26.16468811205081 + }, + "type": "customNode", + "data": { + "id": "vectorStoreToDocument_0", + "label": "VectorStore To Document", + "version": 1, + "name": "vectorStoreToDocument", + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Search documents with scores from vector store", + "inputParams": [ + 
{ + "label": "Minimum Score (%)", + "name": "minScore", + "type": "number", + "optional": true, + "placeholder": "75", + "step": 1, + "description": "Minimum score for embeddings documents to be included", + "id": "vectorStoreToDocument_0-input-minScore-number" + } + ], + "inputAnchors": [ + { + "label": "Vector Store", + "name": "vectorStore", + "type": "VectorStore", + "id": "vectorStoreToDocument_0-input-vectorStore-VectorStore" + } + ], + "inputs": { + "vectorStore": "{{pineconeExistingIndex_0.data.instance}}", + "minScore": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "vectorStoreToDocument_0-output-document-Document", + "name": "document", + "label": "Document", + "type": "Document" + }, + { + "id": "vectorStoreToDocument_0-output-text-string|json", + "name": "text", + "label": "Text", + "type": "string | json" + } + ], + "default": "document" + } + ], + "outputs": { + "output": "text" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1407.7038120189868, + "y": -26.16468811205081 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeExistingIndex_0", + "targetHandle": "pineconeExistingIndex_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeExistingIndex_0-pineconeExistingIndex_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "pineconeExistingIndex_0", + "sourceHandle": "pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore", + "target": "vectorStoreToDocument_0", + "targetHandle": "vectorStoreToDocument_0-input-vectorStore-VectorStore", + "type": "buttonedge", + "id": 
"pineconeExistingIndex_0-pineconeExistingIndex_0-output-vectorStore-Pinecone|VectorStore-vectorStoreToDocument_0-vectorStoreToDocument_0-input-vectorStore-VectorStore", + "data": { + "label": "" + } + }, + { + "source": "vectorStoreToDocument_0", + "sourceHandle": "vectorStoreToDocument_0-output-text-string|json", + "target": "promptTemplate_1", + "targetHandle": "promptTemplate_1-input-promptValues-json", + "type": "buttonedge", + "id": "vectorStoreToDocument_0-vectorStoreToDocument_0-output-text-string|json-promptTemplate_1-promptTemplate_1-input-promptValues-json", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|Runnable", + "target": "llmChain_0", + "targetHandle": "llmChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|Runnable-llmChain_0-llmChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "promptTemplate_0", + "sourceHandle": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate|Runnable", + "target": "llmChain_0", + "targetHandle": "llmChain_0-input-prompt-BasePromptTemplate", + "type": "buttonedge", + "id": "promptTemplate_0-promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate|Runnable-llmChain_0-llmChain_0-input-prompt-BasePromptTemplate", + "data": { + "label": "" + } + }, + { + "source": "llmChain_0", + "sourceHandle": "llmChain_0-output-outputPrediction-string|json", + "target": "promptTemplate_1", + "targetHandle": "promptTemplate_1-input-promptValues-json", + "type": "buttonedge", + "id": "llmChain_0-llmChain_0-output-outputPrediction-string|json-promptTemplate_1-promptTemplate_1-input-promptValues-json", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_1", + "sourceHandle": 
"chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|Runnable", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|Runnable-llmChain_1-llmChain_1-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "promptTemplate_1", + "sourceHandle": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate|Runnable", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + "type": "buttonedge", + "id": "promptTemplate_1-promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate|Runnable-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/Prompt Chaining.json b/packages/server/marketplaces/chatflows/Prompt Chaining.json similarity index 72% rename from packages/server/marketplaces/Prompt Chaining.json rename to packages/server/marketplaces/chatflows/Prompt Chaining.json index 69f9370ed..e0491cc1c 100644 --- a/packages/server/marketplaces/Prompt Chaining.json +++ b/packages/server/marketplaces/chatflows/Prompt Chaining.json @@ -3,27 +3,467 @@ "nodes": [ { "width": 300, - "height": 526, + "height": 475, + "id": "promptTemplate_0", + "position": { + "x": 792.9464838535649, + "y": 527.1718536712464 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_0", + "label": "Prompt Template", + "name": "promptTemplate", + "version": 1, + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name 
for a company that makes {product}?", + "id": "promptTemplate_0-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_0-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "You are an AI who performs one task based on the following objective: {objective}.\nRespond with how you would complete this task:", + "promptValues": "{\"objective\":\"{{question}}\"}" + }, + "outputAnchors": [ + { + "id": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 792.9464838535649, + "y": 527.1718536712464 + }, + "dragging": false + }, + { + "width": 300, + "height": 475, + "id": "promptTemplate_1", + "position": { + "x": 1571.0896874449775, + "y": 522.8455116403258 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_1", + "label": "Prompt Template", + "name": "promptTemplate", + "version": 1, + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_1-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_1-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "You are a task creation AI that uses the result 
of an execution agent to create new tasks with the following objective: {objective}.\nThe last completed task has the result: {result}.\nBased on the result, create new tasks to be completed by the AI system that do not overlap with result.\nReturn the tasks as an array.", + "promptValues": "{\"objective\":\"{{question}}\",\"result\":\"{{llmChain_0.data.instance}}\"}" + }, + "outputAnchors": [ + { + "id": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" + } + ], + "outputs": {}, + "selected": false + }, + "positionAbsolute": { + "x": 1571.0896874449775, + "y": 522.8455116403258 + }, + "selected": false, + "dragging": false + }, + { + "width": 300, + "height": 405, + "id": "llmChain_0", + "position": { + "x": 1192.835706086358, + "y": 367.49653955405995 + }, + "type": "customNode", + "data": { + "id": "llmChain_0", + "label": "LLM Chain", + "name": "llmChain", + "version": 1, + "type": "LLMChain", + "baseClasses": ["LLMChain", "BaseChain"], + "category": "Chains", + "description": "Chain to run queries against LLMs", + "inputParams": [ + { + "label": "Chain Name", + "name": "chainName", + "type": "string", + "placeholder": "Name Your Chain", + "optional": true, + "id": "llmChain_0-input-chainName-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "llmChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Prompt", + "name": "prompt", + "type": "BasePromptTemplate", + "id": "llmChain_0-input-prompt-BasePromptTemplate" + } + ], + "inputs": { + "model": "{{openAI_1.data.instance}}", + "prompt": "{{promptTemplate_0.data.instance}}", + "chainName": "FirstChain" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": 
"llmChain_0-output-llmChain-LLMChain|BaseChain", + "name": "llmChain", + "label": "LLM Chain", + "type": "LLMChain | BaseChain" + }, + { + "id": "llmChain_0-output-outputPrediction-string|json", + "name": "outputPrediction", + "label": "Output Prediction", + "type": "string | json" + } + ], + "default": "llmChain" + } + ], + "outputs": { + "output": "outputPrediction" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1192.835706086358, + "y": 367.49653955405995 + }, + "dragging": false + }, + { + "width": 300, + "height": 405, + "id": "llmChain_1", + "position": { + "x": 1956.8236771865425, + "y": 359.10696865911547 + }, + "type": "customNode", + "data": { + "id": "llmChain_1", + "label": "LLM Chain", + "name": "llmChain", + "version": 1, + "type": "LLMChain", + "baseClasses": ["LLMChain", "BaseChain"], + "category": "Chains", + "description": "Chain to run queries against LLMs", + "inputParams": [ + { + "label": "Chain Name", + "name": "chainName", + "type": "string", + "placeholder": "Name Your Chain", + "optional": true, + "id": "llmChain_1-input-chainName-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "llmChain_1-input-model-BaseLanguageModel" + }, + { + "label": "Prompt", + "name": "prompt", + "type": "BasePromptTemplate", + "id": "llmChain_1-input-prompt-BasePromptTemplate" + } + ], + "inputs": { + "model": "{{openAI_2.data.instance}}", + "prompt": "{{promptTemplate_1.data.instance}}", + "chainName": "LastChain" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "llmChain_1-output-llmChain-LLMChain|BaseChain", + "name": "llmChain", + "label": "LLM Chain", + "type": "LLMChain | BaseChain" + }, + { + "id": "llmChain_1-output-outputPrediction-string|json", + "name": "outputPrediction", + "label": "Output Prediction", + "type": "string | json" + } + ], + "default": "llmChain" + } + ], 
+ "outputs": { + "output": "llmChain" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1956.8236771865425, + "y": 359.10696865911547 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "openAI_1", + "position": { + "x": 791.6102007244282, + "y": -13.71386876566092 + }, + "type": "customNode", + "data": { + "id": "openAI_1", + "label": "OpenAI", + "name": "openAI", + "version": 1, + "type": "OpenAI", + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], + "category": "LLMs", + "description": "Wrapper around OpenAI large language models", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAI_1-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "text-davinci-003", + "name": "text-davinci-003" + }, + { + "label": "text-davinci-002", + "name": "text-davinci-002" + }, + { + "label": "text-curie-001", + "name": "text-curie-001" + }, + { + "label": "text-babbage-001", + "name": "text-babbage-001" + } + ], + "default": "text-davinci-003", + "optional": true, + "id": "openAI_1-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.7, + "optional": true, + "id": "openAI_1-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-topP-number" + }, + { + "label": "Best Of", + "name": "bestOf", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-bestOf-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + 
"optional": true, + "additionalParams": true, + "id": "openAI_1-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-presencePenalty-number" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAI_1-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "text-davinci-003", + "temperature": 0.7, + "maxTokens": "", + "topP": "", + "bestOf": "", + "frequencyPenalty": "", + "presencePenalty": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "name": "openAI", + "label": "OpenAI", + "type": "OpenAI | BaseLLM | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 791.6102007244282, + "y": -13.71386876566092 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, "id": "openAI_2", "position": { - "x": 793.6674026500068, - "y": -20.826430802683774 + "x": 1571.148617508543, + "y": -20.372437481171687 }, "type": "customNode", "data": { "id": "openAI_2", "label": "OpenAI", "name": "openAI", + "version": 1, "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], "category": "LLMs", "description": "Wrapper around OpenAI large language models", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": 
"password", - "id": "openAI_2-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAI_2-input-credential-credential" }, { "label": "Model Name", @@ -114,6 +554,14 @@ "optional": true, "additionalParams": true, "id": "openAI_2-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAI_2-input-basepath-string" } ], "inputAnchors": [], @@ -126,75 +574,15 @@ "frequencyPenalty": "", "presencePenalty": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "id": "openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "name": "openAI", "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 793.6674026500068, - "y": -20.826430802683774 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 534, - "id": "promptTemplate_2", - "position": { - "x": 796.3399644963663, - "y": 512.349657546027 - }, - "type": "customNode", - "data": { - "id": "promptTemplate_2", - "label": "Prompt Template", - "name": "promptTemplate", - "type": "PromptTemplate", - "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], - "category": "Prompts", - "description": "Schema to represent a basic prompt for an LLM", - "inputParams": [ - { - "label": "Template", - "name": "template", - "type": "string", - "rows": 4, - "placeholder": "What is a good name for a company that makes {product}?", - "id": "promptTemplate_2-input-template-string" - }, - { - "label": "Format Prompt Values", - "name": "promptValues", - "type": "string", - "rows": 4, - "placeholder": "{\n \"input_language\": \"English\",\n 
\"output_language\": \"French\"\n}", - "optional": true, - "acceptVariable": true, - "list": true, - "id": "promptTemplate_2-input-promptValues-string" - } - ], - "inputAnchors": [], - "inputs": { - "template": "You are an AI who performs one task based on the following objective: {objective}.\nRespond with how you would complete this task:", - "promptValues": "{\n \"objective\": \"{{question}}\"\n}" - }, - "outputAnchors": [ - { - "id": "promptTemplate_2-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "name": "promptTemplate", - "label": "PromptTemplate", - "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" + "type": "OpenAI | BaseLLM | BaseLanguageModel" } ], "outputs": {}, @@ -202,430 +590,64 @@ }, "selected": false, "positionAbsolute": { - "x": 796.3399644963663, - "y": 512.349657546027 + "x": 1571.148617508543, + "y": -20.372437481171687 }, "dragging": false - }, - { - "width": 300, - "height": 407, - "id": "llmChain_2", - "position": { - "x": 1225.2861408370582, - "y": 485.62403908243243 - }, - "type": "customNode", - "data": { - "id": "llmChain_2", - "label": "LLM Chain", - "name": "llmChain", - "type": "LLMChain", - "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "Chain to run queries against LLMs", - "inputParams": [ - { - "label": "Chain Name", - "name": "chainName", - "type": "string", - "placeholder": "Name Your Chain", - "optional": true, - "id": "llmChain_2-input-chainName-string" - } - ], - "inputAnchors": [ - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "llmChain_2-input-model-BaseLanguageModel" - }, - { - "label": "Prompt", - "name": "prompt", - "type": "BasePromptTemplate", - "id": "llmChain_2-input-prompt-BasePromptTemplate" - } - ], - "inputs": { - "model": "{{openAI_2.data.instance}}", - "prompt": "{{promptTemplate_2.data.instance}}", - "chainName": "First Chain" - }, - "outputAnchors": [ - { 
- "name": "output", - "label": "Output", - "type": "options", - "options": [ - { - "id": "llmChain_2-output-llmChain-LLMChain|BaseChain|BaseLangChain", - "name": "llmChain", - "label": "LLM Chain", - "type": "LLMChain | BaseChain | BaseLangChain" - }, - { - "id": "llmChain_2-output-outputPrediction-string", - "name": "outputPrediction", - "label": "Output Prediction", - "type": "string" - } - ], - "default": "llmChain" - } - ], - "outputs": { - "output": "outputPrediction" - }, - "selected": false - }, - "selected": false, - "dragging": false, - "positionAbsolute": { - "x": 1225.2861408370582, - "y": 485.62403908243243 - } - }, - { - "width": 300, - "height": 534, - "id": "promptTemplate_3", - "position": { - "x": 1589.206555911206, - "y": 460.23470154201766 - }, - "type": "customNode", - "data": { - "id": "promptTemplate_3", - "label": "Prompt Template", - "name": "promptTemplate", - "type": "PromptTemplate", - "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], - "category": "Prompts", - "description": "Schema to represent a basic prompt for an LLM", - "inputParams": [ - { - "label": "Template", - "name": "template", - "type": "string", - "rows": 4, - "placeholder": "What is a good name for a company that makes {product}?", - "id": "promptTemplate_3-input-template-string" - }, - { - "label": "Format Prompt Values", - "name": "promptValues", - "type": "string", - "rows": 4, - "placeholder": "{\n \"input_language\": \"English\",\n \"output_language\": \"French\"\n}", - "optional": true, - "acceptVariable": true, - "list": true, - "id": "promptTemplate_3-input-promptValues-string" - } - ], - "inputAnchors": [], - "inputs": { - "template": "You are a task creation AI that uses the result of an execution agent to create new tasks with the following objective: {objective}.\nThe last completed task has the result: {result}.\nBased on the result, create new tasks to be completed by the AI system that do not overlap with result.\nReturn 
the tasks as an array.", - "promptValues": "{\n \"objective\": \"{{question}}\",\n \"result\": \"\"\n}" - }, - "outputAnchors": [ - { - "id": "promptTemplate_3-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "name": "promptTemplate", - "label": "PromptTemplate", - "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1589.206555911206, - "y": 460.23470154201766 - }, - "dragging": false - }, - { - "width": 300, - "height": 526, - "id": "openAI_3", - "position": { - "x": 1225.2861408370586, - "y": -62.7856517905272 - }, - "type": "customNode", - "data": { - "id": "openAI_3", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_3-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": "text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - "default": "text-davinci-003", - "optional": true, - "id": "openAI_3-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_3-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_3-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": 
true, - "additionalParams": true, - "id": "openAI_3-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_3-input-bestOf-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_3-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_3-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_3-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_3-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "text-davinci-003", - "temperature": 0.7, - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 1225.2861408370586, - "y": -62.7856517905272 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 407, - "id": "llmChain_3", - "position": { - "x": 1972.2671768945252, - "y": 142.73435419451476 - }, - "type": "customNode", - "data": { - "id": "llmChain_3", - "label": "LLM Chain", - "name": "llmChain", - "type": "LLMChain", - "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], - "category": "Chains", - "description": "Chain to run queries against LLMs", - "inputParams": [ - { - "label": "Chain 
Name", - "name": "chainName", - "type": "string", - "placeholder": "Name Your Chain", - "optional": true, - "id": "llmChain_3-input-chainName-string" - } - ], - "inputAnchors": [ - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "llmChain_3-input-model-BaseLanguageModel" - }, - { - "label": "Prompt", - "name": "prompt", - "type": "BasePromptTemplate", - "id": "llmChain_3-input-prompt-BasePromptTemplate" - } - ], - "inputs": { - "model": "{{openAI_3.data.instance}}", - "prompt": "{{promptTemplate_3.data.instance}}", - "chainName": "LastChain" - }, - "outputAnchors": [ - { - "name": "output", - "label": "Output", - "type": "options", - "options": [ - { - "id": "llmChain_3-output-llmChain-LLMChain|BaseChain|BaseLangChain", - "name": "llmChain", - "label": "LLM Chain", - "type": "LLMChain | BaseChain | BaseLangChain" - }, - { - "id": "llmChain_3-output-outputPrediction-string", - "name": "outputPrediction", - "label": "Output Prediction", - "type": "string" - } - ], - "default": "llmChain" - } - ], - "outputs": { - "output": "llmChain" - }, - "selected": false - }, - "selected": false, - "dragging": false, - "positionAbsolute": { - "x": 1972.2671768945252, - "y": 142.73435419451476 - } } ], "edges": [ { - "source": "llmChain_2", - "sourceHandle": "llmChain_2-output-outputPrediction-string", - "target": "promptTemplate_3", - "targetHandle": "promptTemplate_3-input-promptValues-string", + "source": "promptTemplate_0", + "sourceHandle": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "target": "llmChain_0", + "targetHandle": "llmChain_0-input-prompt-BasePromptTemplate", "type": "buttonedge", - "id": "llmChain_2-llmChain_2-output-outputPrediction-string-promptTemplate_3-promptTemplate_3-input-promptValues-string", + "id": 
"promptTemplate_0-promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_0-llmChain_0-input-prompt-BasePromptTemplate", + "data": { + "label": "" + } + }, + { + "source": "llmChain_0", + "sourceHandle": "llmChain_0-output-outputPrediction-string|json", + "target": "promptTemplate_1", + "targetHandle": "promptTemplate_1-input-promptValues-json", + "type": "buttonedge", + "id": "llmChain_0-llmChain_0-output-outputPrediction-string|json-promptTemplate_1-promptTemplate_1-input-promptValues-json", + "data": { + "label": "" + } + }, + { + "source": "promptTemplate_1", + "sourceHandle": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + "type": "buttonedge", + "id": "promptTemplate_1-promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "data": { + "label": "" + } + }, + { + "source": "openAI_1", + "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "target": "llmChain_0", + "targetHandle": "llmChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-llmChain_0-llmChain_0-input-model-BaseLanguageModel", "data": { "label": "" } }, { "source": "openAI_2", - "sourceHandle": "openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "target": "llmChain_2", - "targetHandle": "llmChain_2-input-model-BaseLanguageModel", + "sourceHandle": "openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_2-openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-llmChain_2-llmChain_2-input-model-BaseLanguageModel", - "data": { - "label": "" 
- } - }, - { - "source": "promptTemplate_2", - "sourceHandle": "promptTemplate_2-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "target": "llmChain_2", - "targetHandle": "llmChain_2-input-prompt-BasePromptTemplate", - "type": "buttonedge", - "id": "promptTemplate_2-promptTemplate_2-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_2-llmChain_2-input-prompt-BasePromptTemplate", - "data": { - "label": "" - } - }, - { - "source": "openAI_3", - "sourceHandle": "openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "target": "llmChain_3", - "targetHandle": "llmChain_3-input-model-BaseLanguageModel", - "type": "buttonedge", - "id": "openAI_3-openAI_3-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-llmChain_3-llmChain_3-input-model-BaseLanguageModel", - "data": { - "label": "" - } - }, - { - "source": "promptTemplate_3", - "sourceHandle": "promptTemplate_3-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "target": "llmChain_3", - "targetHandle": "llmChain_3-input-prompt-BasePromptTemplate", - "type": "buttonedge", - "id": "promptTemplate_3-promptTemplate_3-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_3-llmChain_3-input-prompt-BasePromptTemplate", + "id": "openAI_2-openAI_2-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-llmChain_1-llmChain_1-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/Replicate LLM.json b/packages/server/marketplaces/chatflows/Replicate LLM.json new file mode 100644 index 000000000..c5a0ac8ff --- /dev/null +++ b/packages/server/marketplaces/chatflows/Replicate LLM.json @@ -0,0 +1,281 @@ +{ + "description": "Use Replicate API that runs Llama 13b v2 model with LLMChain", + "nodes": [ + { + "width": 300, + "height": 405, + "id": "llmChain_1", + "position": { + "x": 967.581544453458, + "y": 320.56761595884564 
+ }, + "type": "customNode", + "data": { + "id": "llmChain_1", + "label": "LLM Chain", + "version": 1, + "name": "llmChain", + "type": "LLMChain", + "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], + "category": "Chains", + "description": "Chain to run queries against LLMs", + "inputParams": [ + { + "label": "Chain Name", + "name": "chainName", + "type": "string", + "placeholder": "Name Your Chain", + "optional": true, + "id": "llmChain_1-input-chainName-string" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "llmChain_1-input-model-BaseLanguageModel" + }, + { + "label": "Prompt", + "name": "prompt", + "type": "BasePromptTemplate", + "id": "llmChain_1-input-prompt-BasePromptTemplate" + } + ], + "inputs": { + "model": "{{replicate_0.data.instance}}", + "prompt": "{{promptTemplate_0.data.instance}}", + "chainName": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "llmChain_1-output-llmChain-LLMChain|BaseChain|BaseLangChain", + "name": "llmChain", + "label": "LLM Chain", + "type": "LLMChain | BaseChain | BaseLangChain" + }, + { + "id": "llmChain_1-output-outputPrediction-string|json", + "name": "outputPrediction", + "label": "Output Prediction", + "type": "string | json" + } + ], + "default": "llmChain" + } + ], + "outputs": { + "output": "llmChain" + }, + "selected": false + }, + "positionAbsolute": { + "x": 967.581544453458, + "y": 320.56761595884564 + }, + "selected": false, + "dragging": false + }, + { + "width": 300, + "height": 474, + "id": "promptTemplate_0", + "position": { + "x": 269.2203229225663, + "y": 129.02909641085535 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_0", + "label": "Prompt Template", + "version": 1, + "name": "promptTemplate", + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + 
"description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_0-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_0-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "Assistant: You are a helpful assistant. You do not respond as 'User' or pretend to be 'User'. You only respond once as Assistant.\nUser: {query}\nAssistant:", + "promptValues": "{\"query\":\"{{question}}\"}" + }, + "outputAnchors": [ + { + "id": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 269.2203229225663, + "y": 129.02909641085535 + }, + "dragging": false + }, + { + "width": 300, + "height": 526, + "id": "replicate_0", + "position": { + "x": 623.313978186024, + "y": -72.92788335022428 + }, + "type": "customNode", + "data": { + "id": "replicate_0", + "label": "Replicate", + "version": 1, + "name": "replicate", + "type": "Replicate", + "baseClasses": ["Replicate", "BaseChatModel", "LLM", "BaseLLM", "BaseLanguageModel", "Runnable"], + "category": "LLMs", + "description": "Use Replicate to run open source models on cloud", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["replicateApi"], + "id": "replicate_0-input-credential-credential" + }, + { + "label": "Model", + "name": "model", + "type": "string", + "placeholder": 
"a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5", + "optional": true, + "id": "replicate_0-input-model-string" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "step": 0.1, + "description": "Adjusts randomness of outputs, greater than 1 is random and 0 is deterministic, 0.75 is a good starting value.", + "default": 0.7, + "optional": true, + "id": "replicate_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "step": 1, + "description": "Maximum number of tokens to generate. A word is generally 2-3 tokens", + "optional": true, + "additionalParams": true, + "id": "replicate_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "step": 0.1, + "description": "When decoding text, samples from the top p percentage of most likely tokens; lower to ignore less likely tokens", + "optional": true, + "additionalParams": true, + "id": "replicate_0-input-topP-number" + }, + { + "label": "Repetition Penalty", + "name": "repetitionPenalty", + "type": "number", + "step": 0.1, + "description": "Penalty for repeated words in generated text; 1 is no penalty, values greater than 1 discourage repetition, less than 1 encourage it. (minimum: 0.01; maximum: 5)", + "optional": true, + "additionalParams": true, + "id": "replicate_0-input-repetitionPenalty-number" + }, + { + "label": "Additional Inputs", + "name": "additionalInputs", + "type": "json", + "description": "Each model has different parameters, refer to the specific model accepted inputs. 
For example: llama13b-v2", + "additionalParams": true, + "optional": true, + "id": "replicate_0-input-additionalInputs-json" + } + ], + "inputAnchors": [], + "inputs": { + "model": "a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5", + "temperature": 0.7, + "maxTokens": "", + "topP": "", + "repetitionPenalty": "", + "additionalInputs": "" + }, + "outputAnchors": [ + { + "id": "replicate_0-output-replicate-Replicate|BaseChatModel|LLM|BaseLLM|BaseLanguageModel|Runnable", + "name": "replicate", + "label": "Replicate", + "type": "Replicate | BaseChatModel | LLM | BaseLLM | BaseLanguageModel | Runnable" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 623.313978186024, + "y": -72.92788335022428 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "promptTemplate_0", + "sourceHandle": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + "type": "buttonedge", + "id": "promptTemplate_0-promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "data": { + "label": "" + } + }, + { + "source": "replicate_0", + "sourceHandle": "replicate_0-output-replicate-Replicate|BaseChatModel|LLM|BaseLLM|BaseLanguageModel|Runnable", + "target": "llmChain_1", + "targetHandle": "llmChain_1-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "replicate_0-replicate_0-output-replicate-Replicate|BaseChatModel|LLM|BaseLLM|BaseLanguageModel|Runnable-llmChain_1-llmChain_1-input-model-BaseLanguageModel", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/SQL DB Chain.json b/packages/server/marketplaces/chatflows/SQL DB Chain.json similarity index 62% rename from packages/server/marketplaces/SQL DB Chain.json rename to 
packages/server/marketplaces/chatflows/SQL DB Chain.json index 90f8814c6..b37dc7ce7 100644 --- a/packages/server/marketplaces/SQL DB Chain.json +++ b/packages/server/marketplaces/chatflows/SQL DB Chain.json @@ -1,151 +1,6 @@ { "description": "Answer questions over a SQL database", "nodes": [ - { - "width": 300, - "height": 524, - "id": "openAI_1", - "position": { - "x": 835.4668837832456, - "y": 182.4724119898708 - }, - "type": "customNode", - "data": { - "id": "openAI_1", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": "text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - "default": "text-davinci-003", - "optional": true, - "id": "openAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-topP-number" - }, - { - "label": "Best Of", - "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-bestOf-number" - }, - { - "label": "Frequency 
Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "text-davinci-003", - "temperature": 0.7, - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 835.4668837832456, - "y": 182.4724119898708 - }, - "dragging": false - }, { "width": 300, "height": 424, @@ -159,6 +14,7 @@ "id": "sqlDatabaseChain_0", "label": "Sql Database Chain", "name": "sqlDatabaseChain", + "version": 1, "type": "SqlDatabaseChain", "baseClasses": ["SqlDatabaseChain", "BaseChain", "BaseLangChain"], "category": "Chains", @@ -194,7 +50,7 @@ } ], "inputs": { - "model": "{{openAI_1.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", "database": "sqlite", "dbFilePath": "" }, @@ -215,16 +71,170 @@ "y": 217.507437391498 }, "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 855.0396169649254, + "y": 179.29430548099504 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + 
"label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + 
"additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": "0", + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 855.0396169649254, + "y": 179.29430548099504 + }, + "dragging": false } ], "edges": [ { - "source": "openAI_1", - "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "sqlDatabaseChain_0", "targetHandle": "sqlDatabaseChain_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-sqlDatabaseChain_0-sqlDatabaseChain_0-input-model-BaseLanguageModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-sqlDatabaseChain_0-sqlDatabaseChain_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/Simple Conversation Chain.json b/packages/server/marketplaces/chatflows/Simple Conversation Chain.json similarity index 78% rename from packages/server/marketplaces/Simple Conversation Chain.json rename to 
packages/server/marketplaces/chatflows/Simple Conversation Chain.json index bb1a5fff7..2c41a54f6 100644 --- a/packages/server/marketplaces/Simple Conversation Chain.json +++ b/packages/server/marketplaces/chatflows/Simple Conversation Chain.json @@ -3,27 +3,86 @@ "nodes": [ { "width": 300, - "height": 524, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 753.4300788823234, + "y": 479.5336426526603 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "name": "bufferMemory", + "version": 1, + "type": "BufferMemory", + "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": "input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 753.4300788823234, + "y": 479.5336426526603 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, "id": "chatOpenAI_0", "position": { - "x": 750.6529856117049, - "y": -75.72544375812092 + "x": 754.8942497823595, + "y": -70.76607584232393 }, "type": "customNode", "data": { "id": "chatOpenAI_0", "label": "ChatOpenAI", "name": "chatOpenAI", + "version": 1, "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["ChatOpenAI", 
"BaseChatModel", "BaseLanguageModel"], "category": "Chat Models", "description": "Wrapper around OpenAI large language models that use the Chat endpoint", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { "label": "Model Name", @@ -35,20 +94,32 @@ "name": "gpt-4" }, { - "label": "gpt-4-0314", - "name": "gpt-4-0314" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" }, { "label": "gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], "default": "gpt-3.5-turbo", @@ -102,6 +173,14 @@ "optional": true, "additionalParams": true, "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" } ], "inputAnchors": [], @@ -112,70 +191,15 @@ "topP": "", "frequencyPenalty": "", "presencePenalty": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "name": "chatOpenAI", "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - 
"positionAbsolute": { - "x": 750.6529856117049, - "y": -75.72544375812092 - }, - "selected": false, - "dragging": false - }, - { - "width": 300, - "height": 376, - "id": "bufferMemory_0", - "position": { - "x": 753.4300788823234, - "y": 479.5336426526603 - }, - "type": "customNode", - "data": { - "id": "bufferMemory_0", - "label": "Buffer Memory", - "name": "bufferMemory", - "type": "BufferMemory", - "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], - "category": "Memory", - "description": "Remembers previous conversational back and forths directly", - "inputParams": [ - { - "label": "Memory Key", - "name": "memoryKey", - "type": "string", - "default": "chat_history", - "id": "bufferMemory_0-input-memoryKey-string" - }, - { - "label": "Input Key", - "name": "inputKey", - "type": "string", - "default": "input", - "id": "bufferMemory_0-input-inputKey-string" - } - ], - "inputAnchors": [], - "inputs": { - "memoryKey": "chat_history", - "inputKey": "input" - }, - "outputAnchors": [ - { - "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", - "name": "bufferMemory", - "label": "BufferMemory", - "type": "BufferMemory | BaseChatMemory | BaseMemory" + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, @@ -183,26 +207,27 @@ }, "selected": false, "positionAbsolute": { - "x": 753.4300788823234, - "y": 479.5336426526603 + "x": 754.8942497823595, + "y": -70.76607584232393 }, "dragging": false }, { "width": 300, - "height": 332, + "height": 383, "id": "conversationChain_0", "position": { - "x": 1201.6630991237407, - "y": 291.86981791303066 + "x": 1174.6496397666272, + "y": 311.1052536740497 }, "type": "customNode", "data": { "id": "conversationChain_0", "label": "Conversation Chain", "name": "conversationChain", + "version": 1, "type": "ConversationChain", - "baseClasses": ["ConversationChain", "LLMChain", "BaseChain", "BaseLangChain"], + "baseClasses": ["ConversationChain", "LLMChain", "BaseChain"], 
"category": "Chains", "description": "Chat models specific conversational chain with memory", "inputParams": [ @@ -229,19 +254,29 @@ "name": "memory", "type": "BaseMemory", "id": "conversationChain_0-input-memory-BaseMemory" + }, + { + "label": "Document", + "name": "document", + "type": "Document", + "description": "Include whole document into the context window", + "optional": true, + "list": true, + "id": "conversationChain_0-input-document-Document" } ], "inputs": { "model": "{{chatOpenAI_0.data.instance}}", "memory": "{{bufferMemory_0.data.instance}}", + "document": "", "systemMessagePrompt": "" }, "outputAnchors": [ { - "id": "conversationChain_0-output-conversationChain-ConversationChain|LLMChain|BaseChain|BaseLangChain", + "id": "conversationChain_0-output-conversationChain-ConversationChain|LLMChain|BaseChain", "name": "conversationChain", "label": "ConversationChain", - "type": "ConversationChain | LLMChain | BaseChain | BaseLangChain" + "type": "ConversationChain | LLMChain | BaseChain" } ], "outputs": {}, @@ -249,8 +284,8 @@ }, "selected": false, "positionAbsolute": { - "x": 1201.6630991237407, - "y": 291.86981791303066 + "x": 1174.6496397666272, + "y": 311.1052536740497 }, "dragging": false } @@ -258,11 +293,11 @@ "edges": [ { "source": "chatOpenAI_0", - "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "conversationChain_0", "targetHandle": "conversationChain_0-input-model-BaseChatModel", "type": "buttonedge", - "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-conversationChain_0-conversationChain_0-input-model-BaseChatModel", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationChain_0-conversationChain_0-input-model-BaseChatModel", "data": { "label": "" } diff --git 
a/packages/server/marketplaces/Simple LLM Chain.json b/packages/server/marketplaces/chatflows/Simple LLM Chain.json similarity index 77% rename from packages/server/marketplaces/Simple LLM Chain.json rename to packages/server/marketplaces/chatflows/Simple LLM Chain.json index 26a3a2eed..0fc648c66 100644 --- a/packages/server/marketplaces/Simple LLM Chain.json +++ b/packages/server/marketplaces/chatflows/Simple LLM Chain.json @@ -3,213 +3,7 @@ "nodes": [ { "width": 300, - "height": 526, - "id": "openAI_1", - "position": { - "x": 510.75932526856377, - "y": -44.80152395958956 - }, - "type": "customNode", - "data": { - "id": "openAI_1", - "label": "OpenAI", - "name": "openAI", - "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], - "category": "LLMs", - "description": "Wrapper around OpenAI large language models", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "text-davinci-003", - "name": "text-davinci-003" - }, - { - "label": "text-davinci-002", - "name": "text-davinci-002" - }, - { - "label": "text-curie-001", - "name": "text-curie-001" - }, - { - "label": "text-babbage-001", - "name": "text-babbage-001" - } - ], - "default": "text-davinci-003", - "optional": true, - "id": "openAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": "number", - "default": 0.7, - "optional": true, - "id": "openAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-topP-number" - }, - { - "label": "Best Of", 
- "name": "bestOf", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-bestOf-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-presencePenalty-number" - }, - { - "label": "Batch Size", - "name": "batchSize", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-batchSize-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "openAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "text-davinci-003", - "temperature": 0.7, - "maxTokens": "", - "topP": "", - "bestOf": "", - "frequencyPenalty": "", - "presencePenalty": "", - "batchSize": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", - "name": "openAI", - "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 510.75932526856377, - "y": -44.80152395958956 - }, - "dragging": false - }, - { - "width": 300, - "height": 534, - "id": "promptTemplate_1", - "position": { - "x": 514.5434056794296, - "y": 507.47798128037107 - }, - "type": "customNode", - "data": { - "id": "promptTemplate_1", - "label": "Prompt Template", - "name": "promptTemplate", - "type": "PromptTemplate", - "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], - "category": "Prompts", - "description": "Schema to represent a basic prompt for an LLM", - "inputParams": [ - { - "label": "Template", - "name": "template", - 
"type": "string", - "rows": 4, - "placeholder": "What is a good name for a company that makes {product}?", - "id": "promptTemplate_1-input-template-string" - }, - { - "label": "Format Prompt Values", - "name": "promptValues", - "type": "string", - "rows": 4, - "placeholder": "{\n \"input_language\": \"English\",\n \"output_language\": \"French\"\n}", - "optional": true, - "acceptVariable": true, - "list": true, - "id": "promptTemplate_1-input-promptValues-string" - } - ], - "inputAnchors": [], - "inputs": { - "template": "", - "promptValues": "" - }, - "outputAnchors": [ - { - "id": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", - "name": "promptTemplate", - "label": "PromptTemplate", - "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 514.5434056794296, - "y": 507.47798128037107 - }, - "dragging": false - }, - { - "width": 300, - "height": 407, + "height": 405, "id": "llmChain_1", "position": { "x": 970.9254258940236, @@ -220,6 +14,7 @@ "id": "llmChain_1", "label": "LLM Chain", "name": "llmChain", + "version": 1, "type": "LLMChain", "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], "category": "Chains", @@ -249,8 +44,8 @@ } ], "inputs": { - "model": "{{openAI_1.data.instance}}", - "prompt": "{{promptTemplate_1.data.instance}}", + "model": "{{openAI_0.data.instance}}", + "prompt": "{{promptTemplate_0.data.instance}}", "chainName": "" }, "outputAnchors": [ @@ -266,10 +61,10 @@ "type": "LLMChain | BaseChain | BaseLangChain" }, { - "id": "llmChain_1-output-outputPrediction-string", + "id": "llmChain_1-output-outputPrediction-string|json", "name": "outputPrediction", "label": "Output Prediction", - "type": "string" + "type": "string | json" } ], "default": "llmChain" @@ -286,27 +81,243 @@ }, "selected": false, "dragging": false + }, + { + "width": 300, + "height": 475, + "id": 
"promptTemplate_0", + "position": { + "x": 517.7412884791509, + "y": 506.7411400888471 + }, + "type": "customNode", + "data": { + "id": "promptTemplate_0", + "label": "Prompt Template", + "name": "promptTemplate", + "version": 1, + "type": "PromptTemplate", + "baseClasses": ["PromptTemplate", "BaseStringPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + "description": "Schema to represent a basic prompt for an LLM", + "inputParams": [ + { + "label": "Template", + "name": "template", + "type": "string", + "rows": 4, + "placeholder": "What is a good name for a company that makes {product}?", + "id": "promptTemplate_0-input-template-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "promptTemplate_0-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "template": "What is a good name for a company that makes {product}?", + "promptValues": "" + }, + "outputAnchors": [ + { + "id": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "name": "promptTemplate", + "label": "PromptTemplate", + "type": "PromptTemplate | BaseStringPromptTemplate | BasePromptTemplate" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 517.7412884791509, + "y": 506.7411400888471 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "openAI_0", + "position": { + "x": 513.3297923232442, + "y": -42.67554802812833 + }, + "type": "customNode", + "data": { + "id": "openAI_0", + "label": "OpenAI", + "name": "openAI", + "version": 1, + "type": "OpenAI", + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], + "category": "LLMs", + "description": "Wrapper around OpenAI large language models", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], 
+ "id": "openAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "text-davinci-003", + "name": "text-davinci-003" + }, + { + "label": "text-davinci-002", + "name": "text-davinci-002" + }, + { + "label": "text-curie-001", + "name": "text-curie-001" + }, + { + "label": "text-babbage-001", + "name": "text-babbage-001" + } + ], + "default": "text-davinci-003", + "optional": true, + "id": "openAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.7, + "optional": true, + "id": "openAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-topP-number" + }, + { + "label": "Best Of", + "name": "bestOf", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-bestOf-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-presencePenalty-number" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": 
"openAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "text-davinci-003", + "temperature": 0.7, + "maxTokens": "", + "topP": "", + "bestOf": "", + "frequencyPenalty": "", + "presencePenalty": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", + "name": "openAI", + "label": "OpenAI", + "type": "OpenAI | BaseLLM | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 513.3297923232442, + "y": -42.67554802812833 + }, + "dragging": false } ], "edges": [ { - "source": "openAI_1", - "sourceHandle": "openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "source": "promptTemplate_0", + "sourceHandle": "promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", "target": "llmChain_1", - "targetHandle": "llmChain_1-input-model-BaseLanguageModel", + "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", "type": "buttonedge", - "id": "openAI_1-openAI_1-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-llmChain_1-llmChain_1-input-model-BaseLanguageModel", + "id": "promptTemplate_0-promptTemplate_0-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", "data": { "label": "" } }, { - "source": "promptTemplate_1", - "sourceHandle": "promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate", + "source": "openAI_0", + "sourceHandle": "openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "target": "llmChain_1", - "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + "targetHandle": "llmChain_1-input-model-BaseLanguageModel", "type": "buttonedge", - "id": 
"promptTemplate_1-promptTemplate_1-output-promptTemplate-PromptTemplate|BaseStringPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "id": "openAI_0-openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-llmChain_1-llmChain_1-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/Translator.json b/packages/server/marketplaces/chatflows/Translator.json similarity index 71% rename from packages/server/marketplaces/Translator.json rename to packages/server/marketplaces/chatflows/Translator.json index d9bc4f256..dc2ee6ba0 100644 --- a/packages/server/marketplaces/Translator.json +++ b/packages/server/marketplaces/chatflows/Translator.json @@ -3,208 +3,7 @@ "nodes": [ { "width": 300, - "height": 711, - "id": "chatPromptTemplate_1", - "position": { - "x": 441.8516979620723, - "y": 636.1108860994266 - }, - "type": "customNode", - "data": { - "id": "chatPromptTemplate_1", - "label": "Chat Prompt Template", - "name": "chatPromptTemplate", - "type": "ChatPromptTemplate", - "baseClasses": ["ChatPromptTemplate", "BaseChatPromptTemplate", "BasePromptTemplate"], - "category": "Prompts", - "description": "Schema to represent a chat prompt", - "inputParams": [ - { - "label": "System Message", - "name": "systemMessagePrompt", - "type": "string", - "rows": 4, - "placeholder": "You are a helpful assistant that translates {input_language} to {output_language}.", - "id": "chatPromptTemplate_1-input-systemMessagePrompt-string" - }, - { - "label": "Human Message", - "name": "humanMessagePrompt", - "type": "string", - "rows": 4, - "placeholder": "{text}", - "id": "chatPromptTemplate_1-input-humanMessagePrompt-string" - }, - { - "label": "Format Prompt Values", - "name": "promptValues", - "type": "string", - "rows": 4, - "placeholder": "{\n \"input_language\": \"English\",\n \"output_language\": \"French\"\n}", - "optional": true, - "acceptVariable": true, - "list": true, - "id": 
"chatPromptTemplate_1-input-promptValues-string" - } - ], - "inputAnchors": [], - "inputs": { - "systemMessagePrompt": "You are a helpful assistant that translates {input_language} to {output_language}.", - "humanMessagePrompt": "{input}", - "promptValues": "{\n \"input_language\": \"English\",\n \"output_language\": \"French\"\n}" - }, - "outputAnchors": [ - { - "id": "chatPromptTemplate_1-output-chatPromptTemplate-ChatPromptTemplate|BaseChatPromptTemplate|BasePromptTemplate", - "name": "chatPromptTemplate", - "label": "ChatPromptTemplate", - "type": "ChatPromptTemplate | BaseChatPromptTemplate | BasePromptTemplate" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 441.8516979620723, - "y": 636.1108860994266 - }, - "dragging": false - }, - { - "width": 300, - "height": 526, - "id": "chatOpenAI_1", - "position": { - "x": 439.5219561593599, - "y": 93.61600226758335 - }, - "type": "customNode", - "data": { - "id": "chatOpenAI_1", - "label": "ChatOpenAI", - "name": "chatOpenAI", - "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], - "category": "Chat Models", - "description": "Wrapper around OpenAI large language models that use the Chat endpoint", - "inputParams": [ - { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_1-input-openAIApiKey-password" - }, - { - "label": "Model Name", - "name": "modelName", - "type": "options", - "options": [ - { - "label": "gpt-4", - "name": "gpt-4" - }, - { - "label": "gpt-4-0314", - "name": "gpt-4-0314" - }, - { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" - }, - { - "label": "gpt-3.5-turbo", - "name": "gpt-3.5-turbo" - }, - { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" - } - ], - "default": "gpt-3.5-turbo", - "optional": true, - "id": "chatOpenAI_1-input-modelName-options" - }, - { - "label": "Temperature", - "name": "temperature", - "type": 
"number", - "default": 0.9, - "optional": true, - "id": "chatOpenAI_1-input-temperature-number" - }, - { - "label": "Max Tokens", - "name": "maxTokens", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-maxTokens-number" - }, - { - "label": "Top Probability", - "name": "topP", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-topP-number" - }, - { - "label": "Frequency Penalty", - "name": "frequencyPenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-frequencyPenalty-number" - }, - { - "label": "Presence Penalty", - "name": "presencePenalty", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-presencePenalty-number" - }, - { - "label": "Timeout", - "name": "timeout", - "type": "number", - "optional": true, - "additionalParams": true, - "id": "chatOpenAI_1-input-timeout-number" - } - ], - "inputAnchors": [], - "inputs": { - "modelName": "gpt-3.5-turbo", - "temperature": 0.9, - "maxTokens": "", - "topP": "", - "frequencyPenalty": "", - "presencePenalty": "", - "timeout": "" - }, - "outputAnchors": [ - { - "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "name": "chatOpenAI", - "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 439.5219561593599, - "y": 93.61600226758335 - }, - "dragging": false - }, - { - "width": 300, - "height": 407, + "height": 405, "id": "llmChain_1", "position": { "x": 865.7775572410412, @@ -215,6 +14,7 @@ "id": "llmChain_1", "label": "LLM Chain", "name": "llmChain", + "version": 1, "type": "LLMChain", "baseClasses": ["LLMChain", "BaseChain", "BaseLangChain"], "category": "Chains", @@ -244,8 +44,8 @@ } ], "inputs": { - "model": "{{chatOpenAI_1.data.instance}}", - 
"prompt": "{{chatPromptTemplate_1.data.instance}}", + "model": "{{chatOpenAI_0.data.instance}}", + "prompt": "{{chatPromptTemplate_0.data.instance}}", "chainName": "Language Translation" }, "outputAnchors": [ @@ -261,10 +61,10 @@ "type": "LLMChain | BaseChain | BaseLangChain" }, { - "id": "llmChain_1-output-outputPrediction-string", + "id": "llmChain_1-output-outputPrediction-string|json", "name": "outputPrediction", "label": "Output Prediction", - "type": "string" + "type": "string | json" } ], "default": "llmChain" @@ -281,27 +81,250 @@ "y": 543.9211372857111 }, "dragging": false + }, + { + "width": 300, + "height": 652, + "id": "chatPromptTemplate_0", + "position": { + "x": 437.51367850489396, + "y": 649.7619214034173 + }, + "type": "customNode", + "data": { + "id": "chatPromptTemplate_0", + "label": "Chat Prompt Template", + "name": "chatPromptTemplate", + "version": 1, + "type": "ChatPromptTemplate", + "baseClasses": ["ChatPromptTemplate", "BaseChatPromptTemplate", "BasePromptTemplate"], + "category": "Prompts", + "description": "Schema to represent a chat prompt", + "inputParams": [ + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "placeholder": "You are a helpful assistant that translates {input_language} to {output_language}.", + "id": "chatPromptTemplate_0-input-systemMessagePrompt-string" + }, + { + "label": "Human Message", + "name": "humanMessagePrompt", + "type": "string", + "rows": 4, + "placeholder": "{text}", + "id": "chatPromptTemplate_0-input-humanMessagePrompt-string" + }, + { + "label": "Format Prompt Values", + "name": "promptValues", + "type": "json", + "optional": true, + "acceptVariable": true, + "list": true, + "id": "chatPromptTemplate_0-input-promptValues-json" + } + ], + "inputAnchors": [], + "inputs": { + "systemMessagePrompt": "You are a helpful assistant that translates {input_language} to {output_language}.", + "humanMessagePrompt": "{text}", + "promptValues": 
"{\"input_language\":\"English\",\"output_language\":\"French\",\"text\":\"{{question}}\"}" + }, + "outputAnchors": [ + { + "id": "chatPromptTemplate_0-output-chatPromptTemplate-ChatPromptTemplate|BaseChatPromptTemplate|BasePromptTemplate", + "name": "chatPromptTemplate", + "label": "ChatPromptTemplate", + "type": "ChatPromptTemplate | BaseChatPromptTemplate | BasePromptTemplate" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 437.51367850489396, + "y": 649.7619214034173 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 436.97058562345904, + "y": 99.96180150605153 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "name": "chatOpenAI", + "version": 1, + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + 
"type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": "0", + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 436.97058562345904, + "y": 99.96180150605153 + }, + "dragging": false } ], "edges": [ { - "source": "chatOpenAI_1", - "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "source": "chatPromptTemplate_0", + "sourceHandle": 
"chatPromptTemplate_0-output-chatPromptTemplate-ChatPromptTemplate|BaseChatPromptTemplate|BasePromptTemplate", "target": "llmChain_1", - "targetHandle": "llmChain_1-input-model-BaseLanguageModel", + "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", "type": "buttonedge", - "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-llmChain_1-llmChain_1-input-model-BaseLanguageModel", + "id": "chatPromptTemplate_0-chatPromptTemplate_0-output-chatPromptTemplate-ChatPromptTemplate|BaseChatPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", "data": { "label": "" } }, { - "source": "chatPromptTemplate_1", - "sourceHandle": "chatPromptTemplate_1-output-chatPromptTemplate-ChatPromptTemplate|BaseChatPromptTemplate|BasePromptTemplate", + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "target": "llmChain_1", - "targetHandle": "llmChain_1-input-prompt-BasePromptTemplate", + "targetHandle": "llmChain_1-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "chatPromptTemplate_1-chatPromptTemplate_1-output-chatPromptTemplate-ChatPromptTemplate|BaseChatPromptTemplate|BasePromptTemplate-llmChain_1-llmChain_1-input-prompt-BasePromptTemplate", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-llmChain_1-llmChain_1-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/chatflows/Vectara LLM Chain Upload.json b/packages/server/marketplaces/chatflows/Vectara LLM Chain Upload.json new file mode 100644 index 000000000..784ad2406 --- /dev/null +++ b/packages/server/marketplaces/chatflows/Vectara LLM Chain Upload.json @@ -0,0 +1,448 @@ +{ + "description": "A simple LLM chain that uses Vectara to enable conversations with uploaded documents", + "nodes": [ + { + "width": 300, + "height": 525, + "id": "chatOpenAI_0", + 
"position": { "x": 514.1088940275924, "y": 199.574479681537 }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "version": 1, + "name": "chatOpenAI", + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { "label": "gpt-4", "name": "gpt-4" }, + { "label": "gpt-4-0613", "name": "gpt-4-0613" }, + { "label": "gpt-4-32k", "name": "gpt-4-32k" }, + { "label": "gpt-4-32k-0613", "name": "gpt-4-32k-0613" }, + { "label": "gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, + { "label": "gpt-3.5-turbo-0613", "name": "gpt-3.5-turbo-0613" }, + { "label": "gpt-3.5-turbo-16k", "name": "gpt-3.5-turbo-16k" }, + { "label": "gpt-3.5-turbo-16k-0613", "name": "gpt-3.5-turbo-16k-0613" } + ], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "step": 0.1, + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "step": 1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": 
"chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "step": 0.1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "step": 1, + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo", + "temperature": "0.5", + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { "x": 514.1088940275924, "y": 199.574479681537 }, + "dragging": false + }, + { + "width": 300, + "height": 481, + "id": "conversationalRetrievalQAChain_0", + "position": { "x": 900.4793407261002, "y": 205.9476004518217 }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "version": 1, + "name": "conversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain", + "baseClasses": ["ConversationalRetrievalQAChain", "BaseChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + 
"label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." 
+ } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": "{{vectaraUpsert_0.data.instance}}", + "memory": "", + "returnSourceDocuments": "", + "systemMessagePrompt": "", + "chainOption": "" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { "x": 900.4793407261002, "y": 205.9476004518217 }, + "dragging": false + }, + { + "width": 300, + "height": 509, + "id": "pdfFile_0", + "position": { "x": -210.44158723479913, "y": 236.6627524951051 }, + "type": "customNode", + "data": { + "id": "pdfFile_0", + "label": "Pdf File", + "version": 1, + "name": "pdfFile", + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Load data from PDF files", + "inputParams": [ + { "label": "Pdf File", "name": "pdfFile", "type": "file", "fileType": ".pdf", "id": "pdfFile_0-input-pdfFile-file" }, + { + "label": "Usage", + "name": "usage", + "type": 
"options", + "options": [ + { "label": "One document per page", "name": "perPage" }, + { "label": "One document per file", "name": "perFile" } + ], + "default": "perPage", + "id": "pdfFile_0-input-usage-options" + }, + { + "label": "Use Legacy Build", + "name": "legacyBuild", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "pdfFile_0-input-legacyBuild-boolean" + }, + { + "label": "Metadata", + "name": "metadata", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "pdfFile_0-input-metadata-json" + } + ], + "inputAnchors": [ + { + "label": "Text Splitter", + "name": "textSplitter", + "type": "TextSplitter", + "optional": true, + "id": "pdfFile_0-input-textSplitter-TextSplitter" + } + ], + "inputs": { "textSplitter": "", "usage": "perPage", "legacyBuild": "", "metadata": "" }, + "outputAnchors": [ + { "id": "pdfFile_0-output-pdfFile-Document", "name": "pdfFile", "label": "Document", "type": "Document" } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { "x": -210.44158723479913, "y": 236.6627524951051 }, + "dragging": false + }, + { + "width": 300, + "height": 408, + "id": "vectaraUpsert_0", + "position": { "x": 172.06946164914868, "y": 373.11406233089934 }, + "type": "customNode", + "data": { + "id": "vectaraUpsert_0", + "label": "Vectara Upsert Document", + "version": 1, + "name": "vectaraUpsert", + "type": "Vectara", + "baseClasses": ["Vectara", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Upsert documents to Vectara", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["vectaraApi"], + "id": "vectaraUpsert_0-input-credential-credential" + }, + { + "label": "Vectara Metadata Filter", + "name": "filter", + "description": "Filter to apply to Vectara metadata. 
Refer to the documentation on how to use Vectara filters with Flowise.", + "type": "string", + "additionalParams": true, + "optional": true, + "id": "vectaraUpsert_0-input-filter-string" + }, + { + "label": "Sentences Before", + "name": "sentencesBefore", + "description": "Number of sentences to fetch before the matched sentence. Defaults to 2.", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "vectaraUpsert_0-input-sentencesBefore-number" + }, + { + "label": "Sentences After", + "name": "sentencesAfter", + "description": "Number of sentences to fetch after the matched sentence. Defaults to 2.", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "vectaraUpsert_0-input-sentencesAfter-number" + }, + { + "label": "Lambda", + "name": "lambda", + "description": "Improves retrieval accuracy by adjusting the balance (from 0 to 1) between neural search and keyword-based search factors.", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "vectaraUpsert_0-input-lambda-number" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. 
Defaults to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "vectaraUpsert_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Document", + "name": "document", + "type": "Document", + "list": true, + "id": "vectaraUpsert_0-input-document-Document" + } + ], + "inputs": { + "document": ["{{pdfFile_0.data.instance}}"], + "filter": "", + "sentencesBefore": "", + "sentencesAfter": "", + "lambda": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "vectaraUpsert_0-output-retriever-Vectara|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Vectara Retriever", + "type": "Vectara | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "vectaraUpsert_0-output-vectorStore-Vectara|VectorStore", + "name": "vectorStore", + "label": "Vectara Vector Store", + "type": "Vectara | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { "output": "retriever" }, + "selected": false + }, + "positionAbsolute": { "x": 172.06946164914868, "y": 373.11406233089934 }, + "selected": false + } + ], + "edges": [ + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "data": { "label": "" } + }, + { + "source": "pdfFile_0", + "sourceHandle": "pdfFile_0-output-pdfFile-Document", + "target": "vectaraUpsert_0", + "targetHandle": "vectaraUpsert_0-input-document-Document", + "type": "buttonedge", + "id": 
"pdfFile_0-pdfFile_0-output-pdfFile-Document-vectaraUpsert_0-vectaraUpsert_0-input-document-Document", + "data": { "label": "" } + }, + { + "source": "vectaraUpsert_0", + "sourceHandle": "vectaraUpsert_0-output-retriever-Vectara|VectorStoreRetriever|BaseRetriever", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": "vectaraUpsert_0-vectaraUpsert_0-output-retriever-Vectara|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "data": { "label": "" } + } + ] +} diff --git a/packages/server/marketplaces/WebBrowser.json b/packages/server/marketplaces/chatflows/WebBrowser.json similarity index 72% rename from packages/server/marketplaces/WebBrowser.json rename to packages/server/marketplaces/chatflows/WebBrowser.json index bd3671612..95743f9f4 100644 --- a/packages/server/marketplaces/WebBrowser.json +++ b/packages/server/marketplaces/chatflows/WebBrowser.json @@ -3,27 +3,216 @@ "nodes": [ { "width": 300, - "height": 524, + "height": 376, + "id": "bufferMemory_0", + "position": { + "x": 457.04304716743604, + "y": 362.4048129799687 + }, + "type": "customNode", + "data": { + "id": "bufferMemory_0", + "label": "Buffer Memory", + "name": "bufferMemory", + "version": 1, + "type": "BufferMemory", + "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Remembers previous conversational back and forths directly", + "inputParams": [ + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "id": "bufferMemory_0-input-memoryKey-string" + }, + { + "label": "Input Key", + "name": "inputKey", + "type": "string", + "default": "input", + "id": "bufferMemory_0-input-inputKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "memoryKey": "chat_history", + "inputKey": 
"input" + }, + "outputAnchors": [ + { + "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", + "name": "bufferMemory", + "label": "BufferMemory", + "type": "BufferMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 457.04304716743604, + "y": 362.4048129799687 + }, + "dragging": false + }, + { + "width": 300, + "height": 280, + "id": "webBrowser_0", + "position": { + "x": 1091.0866823400172, + "y": -16.43806989958216 + }, + "type": "customNode", + "data": { + "id": "webBrowser_0", + "label": "Web Browser", + "name": "webBrowser", + "version": 1, + "type": "WebBrowser", + "baseClasses": ["WebBrowser", "Tool", "StructuredTool", "BaseLangChain"], + "category": "Tools", + "description": "Gives agent the ability to visit a website and extract information", + "inputParams": [], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "webBrowser_0-input-model-BaseLanguageModel" + }, + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "webBrowser_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "embeddings": "{{openAIEmbeddings_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "webBrowser_0-output-webBrowser-WebBrowser|Tool|StructuredTool|BaseLangChain", + "name": "webBrowser", + "label": "WebBrowser", + "type": "WebBrowser | Tool | StructuredTool | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1091.0866823400172, + "y": -16.43806989958216 + }, + "dragging": false + }, + { + "width": 300, + "height": 383, + "id": "conversationalAgent_0", + "position": { + "x": 1464.513303631911, + "y": 155.73036805253955 + }, + "type": "customNode", + "data": { + "id": "conversationalAgent_0", + "label": "Conversational Agent", + "name": 
"conversationalAgent", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain"], + "category": "Agents", + "description": "Conversational agent for a chat model. It will utilize chat specific prompts", + "inputParams": [ + { + "label": "System Message", + "name": "systemMessage", + "type": "string", + "rows": 4, + "default": "Assistant is a large language model trained by OpenAI.\n\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\n\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\n\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist.", + "optional": true, + "additionalParams": true, + "id": "conversationalAgent_0-input-systemMessage-string" + } + ], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "conversationalAgent_0-input-tools-Tool" + }, + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalAgent_0-input-model-BaseLanguageModel" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseChatMemory", + "id": "conversationalAgent_0-input-memory-BaseChatMemory" + } + ], + "inputs": { + "tools": ["{{webBrowser_0.data.instance}}"], + "model": "{{chatOpenAI_1.data.instance}}", + "memory": "{{bufferMemory_0.data.instance}}", + "systemMessage": "Assistant is a large language model trained by OpenAI.\n\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\n\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\n\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. 
Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist." + }, + "outputAnchors": [ + { + "id": "conversationalAgent_0-output-conversationalAgent-AgentExecutor|BaseChain", + "name": "conversationalAgent", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1464.513303631911, + "y": 155.73036805253955 + }, + "dragging": false + }, + { + "width": 300, + "height": 523, "id": "chatOpenAI_0", "position": { - "x": 348.0817836845733, - "y": -86.56099395751443 + "x": 734.7477982032904, + "y": -400.9979556765114 }, "type": "customNode", "data": { "id": "chatOpenAI_0", "label": "ChatOpenAI", "name": "chatOpenAI", + "version": 1, "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], "category": "Chat Models", "description": "Wrapper around OpenAI large language models that use the Chat endpoint", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" }, { "label": "Model Name", @@ -35,20 +224,32 @@ "name": "gpt-4" }, { - "label": "gpt-4-0314", - "name": "gpt-4-0314" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" }, { "label": "gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": 
"gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], "default": "gpt-3.5-turbo", @@ -102,6 +303,14 @@ "optional": true, "additionalParams": true, "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" } ], "inputAnchors": [], @@ -112,14 +321,15 @@ "topP": "", "frequencyPenalty": "", "presencePenalty": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "name": "chatOpenAI", "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, @@ -127,90 +337,36 @@ }, "selected": false, "positionAbsolute": { - "x": 348.0817836845733, - "y": -86.56099395751443 + "x": 734.7477982032904, + "y": -400.9979556765114 }, "dragging": false }, { "width": 300, - "height": 376, - "id": "bufferMemory_0", - "position": { - "x": 15.045898260926037, - "y": 114.13407401971622 - }, - "type": "customNode", - "data": { - "id": "bufferMemory_0", - "label": "Buffer Memory", - "name": "bufferMemory", - "type": "BufferMemory", - "baseClasses": ["BufferMemory", "BaseChatMemory", "BaseMemory"], - "category": "Memory", - "description": "Remembers previous conversational back and forths directly", - "inputParams": [ - { - "label": "Memory Key", - "name": "memoryKey", - "type": "string", - "default": "chat_history", - "id": "bufferMemory_0-input-memoryKey-string" - }, - { - "label": "Input Key", - "name": "inputKey", - "type": "string", - "default": "input", - "id": "bufferMemory_0-input-inputKey-string" - } - ], - "inputAnchors": [], - "inputs": { - 
"memoryKey": "chat_history", - "inputKey": "input" - }, - "outputAnchors": [ - { - "id": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", - "name": "bufferMemory", - "label": "BufferMemory", - "type": "BufferMemory | BaseChatMemory | BaseMemory" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 15.045898260926037, - "y": 114.13407401971622 - }, - "dragging": false - }, - { - "width": 300, - "height": 330, + "height": 329, "id": "openAIEmbeddings_0", "position": { - "x": 693.9266260641734, - "y": 37.098856540087496 + "x": 403.72014625628697, + "y": -103.82540449681527 }, "type": "customNode", "data": { "id": "openAIEmbeddings_0", "label": "OpenAI Embeddings", "name": "openAIEmbeddings", + "version": 1, "type": "OpenAIEmbeddings", "baseClasses": ["OpenAIEmbeddings", "Embeddings"], "category": "Embeddings", "description": "OpenAI API to generate embeddings for a given text", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAIEmbeddings_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" }, { "label": "Strip New Lines", @@ -235,13 +391,22 @@ "optional": true, "additionalParams": true, "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" } ], "inputAnchors": [], "inputs": { "stripNewLines": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { @@ -256,34 +421,36 @@ }, "selected": false, "positionAbsolute": { - "x": 693.9266260641734, - "y": 37.098856540087496 + "x": 403.72014625628697, + "y": -103.82540449681527 }, "dragging": false }, { "width": 300, - "height": 524, + 
"height": 523, "id": "chatOpenAI_1", "position": { - "x": 691.5132411896494, - "y": -533.1696369549378 + "x": 68.312124033115, + "y": -169.65476709991256 }, "type": "customNode", "data": { "id": "chatOpenAI_1", "label": "ChatOpenAI", "name": "chatOpenAI", + "version": 1, "type": "ChatOpenAI", - "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], "category": "Chat Models", "description": "Wrapper around OpenAI large language models that use the Chat endpoint", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "chatOpenAI_1-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_1-input-credential-credential" }, { "label": "Model Name", @@ -295,20 +462,32 @@ "name": "gpt-4" }, { - "label": "gpt-4-0314", - "name": "gpt-4-0314" + "label": "gpt-4-0613", + "name": "gpt-4-0613" }, { - "label": "gpt-4-32k-0314", - "name": "gpt-4-32k-0314" + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" }, { "label": "gpt-3.5-turbo", "name": "gpt-3.5-turbo" }, { - "label": "gpt-3.5-turbo-0301", - "name": "gpt-3.5-turbo-0301" + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" } ], "default": "gpt-3.5-turbo", @@ -362,6 +541,14 @@ "optional": true, "additionalParams": true, "id": "chatOpenAI_1-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_1-input-basepath-string" } ], "inputAnchors": [], @@ -372,14 +559,15 @@ "topP": "", "frequencyPenalty": "", "presencePenalty": "", - "timeout": "" + 
"timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", + "id": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", "name": "chatOpenAI", "label": "ChatOpenAI", - "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel | BaseLangChain" + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" } ], "outputs": {}, @@ -387,161 +575,13 @@ }, "selected": false, "positionAbsolute": { - "x": 691.5132411896494, - "y": -533.1696369549378 - }, - "dragging": false - }, - { - "width": 300, - "height": 280, - "id": "webBrowser_0", - "position": { - "x": 1091.0866823400172, - "y": -16.43806989958216 - }, - "type": "customNode", - "data": { - "id": "webBrowser_0", - "label": "Web Browser", - "name": "webBrowser", - "type": "WebBrowser", - "baseClasses": ["WebBrowser", "Tool", "StructuredTool", "BaseLangChain"], - "category": "Tools", - "description": "Gives agent the ability to visit a website and extract information", - "inputParams": [], - "inputAnchors": [ - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "webBrowser_0-input-model-BaseLanguageModel" - }, - { - "label": "Embeddings", - "name": "embeddings", - "type": "Embeddings", - "id": "webBrowser_0-input-embeddings-Embeddings" - } - ], - "inputs": { - "model": "{{chatOpenAI_1.data.instance}}", - "embeddings": "{{openAIEmbeddings_0.data.instance}}" - }, - "outputAnchors": [ - { - "id": "webBrowser_0-output-webBrowser-WebBrowser|Tool|StructuredTool|BaseLangChain", - "name": "webBrowser", - "label": "WebBrowser", - "type": "WebBrowser | Tool | StructuredTool | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1091.0866823400172, - "y": -16.43806989958216 - }, - "dragging": false - }, - { - "width": 300, - "height": 383, - "id": "conversationalAgent_0", - "position": { - "x": 1451.6222493253506, - 
"y": 239.69137914100338 - }, - "type": "customNode", - "data": { - "id": "conversationalAgent_0", - "label": "Conversational Agent", - "name": "conversationalAgent", - "type": "AgentExecutor", - "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], - "category": "Agents", - "description": "Conversational agent for a chat model. It will utilize chat specific prompts", - "inputParams": [ - { - "label": "System Message", - "name": "systemMessage", - "type": "string", - "rows": 4, - "optional": true, - "additionalParams": true, - "id": "conversationalAgent_0-input-systemMessage-string" - }, - { - "label": "Human Message", - "name": "humanMessage", - "type": "string", - "rows": 4, - "optional": true, - "additionalParams": true, - "id": "conversationalAgent_0-input-humanMessage-string" - } - ], - "inputAnchors": [ - { - "label": "Allowed Tools", - "name": "tools", - "type": "Tool", - "list": true, - "id": "conversationalAgent_0-input-tools-Tool" - }, - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "conversationalAgent_0-input-model-BaseLanguageModel" - }, - { - "label": "Memory", - "name": "memory", - "type": "BaseChatMemory", - "id": "conversationalAgent_0-input-memory-BaseChatMemory" - } - ], - "inputs": { - "tools": ["{{webBrowser_0.data.instance}}"], - "model": "{{chatOpenAI_0.data.instance}}", - "memory": "{{bufferMemory_0.data.instance}}", - "systemMessage": "", - "humanMessage": "" - }, - "outputAnchors": [ - { - "id": "conversationalAgent_0-output-conversationalAgent-AgentExecutor|BaseChain|BaseLangChain", - "name": "conversationalAgent", - "label": "AgentExecutor", - "type": "AgentExecutor | BaseChain | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "selected": false, - "positionAbsolute": { - "x": 1451.6222493253506, - "y": 239.69137914100338 + "x": 68.312124033115, + "y": -169.65476709991256 }, "dragging": false } ], "edges": [ - { - "source": "chatOpenAI_1", - "sourceHandle": 
"chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "target": "webBrowser_0", - "targetHandle": "webBrowser_0-input-model-BaseLanguageModel", - "type": "buttonedge", - "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-webBrowser_0-webBrowser_0-input-model-BaseLanguageModel", - "data": { - "label": "" - } - }, { "source": "openAIEmbeddings_0", "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", @@ -553,6 +593,28 @@ "label": "" } }, + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "webBrowser_0", + "targetHandle": "webBrowser_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-webBrowser_0-webBrowser_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "chatOpenAI_1", + "sourceHandle": "chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "conversationalAgent_0", + "targetHandle": "conversationalAgent_0-input-model-BaseLanguageModel", + "type": "buttonedge", + "id": "chatOpenAI_1-chatOpenAI_1-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalAgent_0-conversationalAgent_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, { "source": "webBrowser_0", "sourceHandle": "webBrowser_0-output-webBrowser-WebBrowser|Tool|StructuredTool|BaseLangChain", @@ -564,17 +626,6 @@ "label": "" } }, - { - "source": "chatOpenAI_0", - "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain", - "target": "conversationalAgent_0", - "targetHandle": "conversationalAgent_0-input-model-BaseLanguageModel", - "type": "buttonedge", - "id": 
"chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel|BaseLangChain-conversationalAgent_0-conversationalAgent_0-input-model-BaseLanguageModel", - "data": { - "label": "" - } - }, { "source": "bufferMemory_0", "sourceHandle": "bufferMemory_0-output-bufferMemory-BufferMemory|BaseChatMemory|BaseMemory", diff --git a/packages/server/marketplaces/chatflows/WebPage QnA.json b/packages/server/marketplaces/chatflows/WebPage QnA.json new file mode 100644 index 000000000..812f0bd50 --- /dev/null +++ b/packages/server/marketplaces/chatflows/WebPage QnA.json @@ -0,0 +1,771 @@ +{ + "description": "Scrape web pages for QnA with long term memory Motorhead and return source documents", + "nodes": [ + { + "width": 300, + "height": 523, + "id": "chatOpenAI_0", + "position": { + "x": 1542.965468159417, + "y": -200.10756989974368 + }, + "type": "customNode", + "data": { + "id": "chatOpenAI_0", + "label": "ChatOpenAI", + "version": 1, + "name": "chatOpenAI", + "type": "ChatOpenAI", + "baseClasses": ["ChatOpenAI", "BaseChatModel", "BaseLanguageModel"], + "category": "Chat Models", + "description": "Wrapper around OpenAI large language models that use the Chat endpoint", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "chatOpenAI_0-input-credential-credential" + }, + { + "label": "Model Name", + "name": "modelName", + "type": "options", + "options": [ + { + "label": "gpt-4", + "name": "gpt-4" + }, + { + "label": "gpt-4-0613", + "name": "gpt-4-0613" + }, + { + "label": "gpt-4-32k", + "name": "gpt-4-32k" + }, + { + "label": "gpt-4-32k-0613", + "name": "gpt-4-32k-0613" + }, + { + "label": "gpt-3.5-turbo", + "name": "gpt-3.5-turbo" + }, + { + "label": "gpt-3.5-turbo-0613", + "name": "gpt-3.5-turbo-0613" + }, + { + "label": "gpt-3.5-turbo-16k", + "name": "gpt-3.5-turbo-16k" + }, + { + "label": "gpt-3.5-turbo-16k-0613", + "name": "gpt-3.5-turbo-16k-0613" + } + 
], + "default": "gpt-3.5-turbo", + "optional": true, + "id": "chatOpenAI_0-input-modelName-options" + }, + { + "label": "Temperature", + "name": "temperature", + "type": "number", + "default": 0.9, + "optional": true, + "id": "chatOpenAI_0-input-temperature-number" + }, + { + "label": "Max Tokens", + "name": "maxTokens", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-maxTokens-number" + }, + { + "label": "Top Probability", + "name": "topP", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-topP-number" + }, + { + "label": "Frequency Penalty", + "name": "frequencyPenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-frequencyPenalty-number" + }, + { + "label": "Presence Penalty", + "name": "presencePenalty", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-presencePenalty-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "chatOpenAI_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "modelName": "gpt-3.5-turbo-16k", + "temperature": "0.9", + "maxTokens": "", + "topP": "", + "frequencyPenalty": "", + "presencePenalty": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "name": "chatOpenAI", + "label": "ChatOpenAI", + "type": "ChatOpenAI | BaseChatModel | BaseLanguageModel" + } + ], + "outputs": {}, + "selected": false + }, + "positionAbsolute": { + "x": 1542.965468159417, + "y": -200.10756989974368 + }, + "selected": false, + "dragging": false + }, + { + "width": 300, + "height": 328, + "id": 
"openAIEmbeddings_0", + "position": { + "x": 827.6835380475393, + "y": 253.8955254525015 + }, + "type": "customNode", + "data": { + "id": "openAIEmbeddings_0", + "label": "OpenAI Embeddings", + "version": 1, + "name": "openAIEmbeddings", + "type": "OpenAIEmbeddings", + "baseClasses": ["OpenAIEmbeddings", "Embeddings"], + "category": "Embeddings", + "description": "OpenAI API to generate embeddings for a given text", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAIEmbeddings_0-input-credential-credential" + }, + { + "label": "Strip New Lines", + "name": "stripNewLines", + "type": "boolean", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-stripNewLines-boolean" + }, + { + "label": "Batch Size", + "name": "batchSize", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-batchSize-number" + }, + { + "label": "Timeout", + "name": "timeout", + "type": "number", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAIEmbeddings_0-input-basepath-string" + } + ], + "inputAnchors": [], + "inputs": { + "stripNewLines": "", + "batchSize": "", + "timeout": "", + "basepath": "" + }, + "outputAnchors": [ + { + "id": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "name": "openAIEmbeddings", + "label": "OpenAIEmbeddings", + "type": "OpenAIEmbeddings | Embeddings" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 827.6835380475393, + "y": 253.8955254525015 + }, + "dragging": false + }, + { + "width": 300, + "height": 376, + "id": "htmlToMarkdownTextSplitter_0", + "position": { + "x": 465.86869036784685, + "y": -17.41141011530891 + }, + 
"type": "customNode", + "data": { + "id": "htmlToMarkdownTextSplitter_0", + "label": "HtmlToMarkdown Text Splitter", + "version": 1, + "name": "htmlToMarkdownTextSplitter", + "type": "HtmlToMarkdownTextSplitter", + "baseClasses": [ + "HtmlToMarkdownTextSplitter", + "MarkdownTextSplitter", + "RecursiveCharacterTextSplitter", + "TextSplitter", + "BaseDocumentTransformer" + ], + "category": "Text Splitters", + "description": "Converts Html to Markdown and then split your content into documents based on the Markdown headers", + "inputParams": [ + { + "label": "Chunk Size", + "name": "chunkSize", + "type": "number", + "default": 1000, + "optional": true, + "id": "htmlToMarkdownTextSplitter_0-input-chunkSize-number" + }, + { + "label": "Chunk Overlap", + "name": "chunkOverlap", + "type": "number", + "optional": true, + "id": "htmlToMarkdownTextSplitter_0-input-chunkOverlap-number" + } + ], + "inputAnchors": [], + "inputs": { + "chunkSize": "4000", + "chunkOverlap": "" + }, + "outputAnchors": [ + { + "id": "htmlToMarkdownTextSplitter_0-output-htmlToMarkdownTextSplitter-HtmlToMarkdownTextSplitter|MarkdownTextSplitter|RecursiveCharacterTextSplitter|TextSplitter|BaseDocumentTransformer", + "name": "htmlToMarkdownTextSplitter", + "label": "HtmlToMarkdownTextSplitter", + "type": "HtmlToMarkdownTextSplitter | MarkdownTextSplitter | RecursiveCharacterTextSplitter | TextSplitter | BaseDocumentTransformer" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 465.86869036784685, + "y": -17.41141011530891 + }, + "dragging": false + }, + { + "width": 300, + "height": 479, + "id": "conversationalRetrievalQAChain_0", + "position": { + "x": 1882.5543981868987, + "y": 305.08959224761225 + }, + "type": "customNode", + "data": { + "id": "conversationalRetrievalQAChain_0", + "label": "Conversational Retrieval QA Chain", + "version": 1, + "name": "conversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain", + "baseClasses": 
["ConversationalRetrievalQAChain", "BaseChain"], + "category": "Chains", + "description": "Document QA - built on RetrievalQAChain to provide a chat history component", + "inputParams": [ + { + "label": "Return Source Documents", + "name": "returnSourceDocuments", + "type": "boolean", + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-returnSourceDocuments-boolean" + }, + { + "label": "System Message", + "name": "systemMessagePrompt", + "type": "string", + "rows": 4, + "additionalParams": true, + "optional": true, + "placeholder": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given info. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Refuse to answer any question not about the info. Never break character.", + "id": "conversationalRetrievalQAChain_0-input-systemMessagePrompt-string" + }, + { + "label": "Chain Option", + "name": "chainOption", + "type": "options", + "options": [ + { + "label": "MapReduceDocumentsChain", + "name": "map_reduce", + "description": "Suitable for QA tasks over larger documents and can run the preprocessing step in parallel, reducing the running time" + }, + { + "label": "RefineDocumentsChain", + "name": "refine", + "description": "Suitable for QA tasks over a large number of documents." + }, + { + "label": "StuffDocumentsChain", + "name": "stuff", + "description": "Suitable for QA tasks over a small number of documents." 
+ } + ], + "additionalParams": true, + "optional": true, + "id": "conversationalRetrievalQAChain_0-input-chainOption-options" + } + ], + "inputAnchors": [ + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel" + }, + { + "label": "Vector Store Retriever", + "name": "vectorStoreRetriever", + "type": "BaseRetriever", + "id": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever" + }, + { + "label": "Memory", + "name": "memory", + "type": "BaseMemory", + "optional": true, + "description": "If left empty, a default BufferMemory will be used", + "id": "conversationalRetrievalQAChain_0-input-memory-BaseMemory" + } + ], + "inputs": { + "model": "{{chatOpenAI_0.data.instance}}", + "vectorStoreRetriever": "{{pineconeUpsert_0.data.instance}}", + "memory": "{{motorheadMemory_0.data.instance}}", + "returnSourceDocuments": true, + "systemMessagePrompt": "I want you to act as a document that I am having a conversation with. Your name is \"AI Assistant\". You will provide me with answers from the given context. If the answer is not included, say exactly \"Hmm, I am not sure.\" and stop after that. Do not make up any information that is not in the context. Refuse to answer any question not about the info. 
Never break character.", + "chainOption": "" + }, + "outputAnchors": [ + { + "id": "conversationalRetrievalQAChain_0-output-conversationalRetrievalQAChain-ConversationalRetrievalQAChain|BaseChain", + "name": "conversationalRetrievalQAChain", + "label": "ConversationalRetrievalQAChain", + "type": "ConversationalRetrievalQAChain | BaseChain" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1882.5543981868987, + "y": 305.08959224761225 + }, + "dragging": false + }, + { + "width": 300, + "height": 380, + "id": "cheerioWebScraper_0", + "position": { + "x": 831.9867292136466, + "y": -181.92350323746112 + }, + "type": "customNode", + "data": { + "id": "cheerioWebScraper_0", + "label": "Cheerio Web Scraper", + "version": 1, + "name": "cheerioWebScraper", + "type": "Document", + "baseClasses": ["Document"], + "category": "Document Loaders", + "description": "Load data from webpages", + "inputParams": [ + { + "label": "URL", + "name": "url", + "type": "string", + "id": "cheerioWebScraper_0-input-url-string" + }, + { + "label": "Get Relative Links Method", + "name": "relativeLinksMethod", + "type": "options", + "description": "Select a method to retrieve relative links", + "options": [ + { + "label": "Web Crawl", + "name": "webCrawl", + "description": "Crawl relative links from HTML URL" + }, + { + "label": "Scrape XML Sitemap", + "name": "scrapeXMLSitemap", + "description": "Scrape relative links from XML sitemap URL" + } + ], + "optional": true, + "additionalParams": true, + "id": "cheerioWebScraper_0-input-relativeLinksMethod-options" + }, + { + "label": "Get Relative Links Limit", + "name": "limit", + "type": "number", + "optional": true, + "additionalParams": true, + "description": "Only used when \"Get Relative Links Method\" is selected. 
Set 0 to retrieve all relative links; the default limit is 10.", + "warning": "Retrieving all links might take a long time, and all links will be upserted again if the flow's state changes (e.g. different URL, chunk size, etc.)", + "id": "cheerioWebScraper_0-input-limit-number" + }, + { + "label": "Metadata", + "name": "metadata", + "type": "json", + "optional": true, + "additionalParams": true, + "id": "cheerioWebScraper_0-input-metadata-json" + } + ], + "inputAnchors": [ + { + "label": "Text Splitter", + "name": "textSplitter", + "type": "TextSplitter", + "optional": true, + "id": "cheerioWebScraper_0-input-textSplitter-TextSplitter" + } + ], + "inputs": { + "url": "https://flowiseai.com/", + "textSplitter": "{{htmlToMarkdownTextSplitter_0.data.instance}}", + "relativeLinksMethod": "", + "limit": "", + "metadata": "" + }, + "outputAnchors": [ + { + "id": "cheerioWebScraper_0-output-cheerioWebScraper-Document", + "name": "cheerioWebScraper", + "label": "Document", + "type": "Document" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 831.9867292136466, + "y": -181.92350323746112 + }, + "dragging": false + }, + { + "width": 300, + "height": 555, + "id": "pineconeUpsert_0", + "position": { + "x": 1179.6228496246993, + "y": -167.023255532671 + }, + "type": "customNode", + "data": { + "id": "pineconeUpsert_0", + "label": "Pinecone Upsert Document", + "version": 1, + "name": "pineconeUpsert", + "type": "Pinecone", + "baseClasses": ["Pinecone", "VectorStoreRetriever", "BaseRetriever"], + "category": "Vector Stores", + "description": "Upsert documents to Pinecone", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["pineconeApi"], + "id": "pineconeUpsert_0-input-credential-credential" + }, + { + "label": "Pinecone Index", + "name": "pineconeIndex", + "type": "string", + "id": "pineconeUpsert_0-input-pineconeIndex-string" + }, + { + "label": "Pinecone
Namespace", + "name": "pineconeNamespace", + "type": "string", + "placeholder": "my-first-namespace", + "additionalParams": true, + "optional": true, + "id": "pineconeUpsert_0-input-pineconeNamespace-string" + }, + { + "label": "Top K", + "name": "topK", + "description": "Number of top results to fetch. Default to 4", + "placeholder": "4", + "type": "number", + "additionalParams": true, + "optional": true, + "id": "pineconeUpsert_0-input-topK-number" + } + ], + "inputAnchors": [ + { + "label": "Document", + "name": "document", + "type": "Document", + "list": true, + "id": "pineconeUpsert_0-input-document-Document" + }, + { + "label": "Embeddings", + "name": "embeddings", + "type": "Embeddings", + "id": "pineconeUpsert_0-input-embeddings-Embeddings" + } + ], + "inputs": { + "document": ["{{cheerioWebScraper_0.data.instance}}"], + "embeddings": "{{openAIEmbeddings_0.data.instance}}", + "pineconeIndex": "", + "pineconeNamespace": "", + "topK": "" + }, + "outputAnchors": [ + { + "name": "output", + "label": "Output", + "type": "options", + "options": [ + { + "id": "pineconeUpsert_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "name": "retriever", + "label": "Pinecone Retriever", + "type": "Pinecone | VectorStoreRetriever | BaseRetriever" + }, + { + "id": "pineconeUpsert_0-output-vectorStore-Pinecone|VectorStore", + "name": "vectorStore", + "label": "Pinecone Vector Store", + "type": "Pinecone | VectorStore" + } + ], + "default": "retriever" + } + ], + "outputs": { + "output": "retriever" + }, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1179.6228496246993, + "y": -167.023255532671 + }, + "dragging": false + }, + { + "width": 300, + "height": 427, + "id": "motorheadMemory_0", + "position": { + "x": 1202.1545938923578, + "y": 425.69055061366237 + }, + "type": "customNode", + "data": { + "id": "motorheadMemory_0", + "label": "Motorhead Memory", + "version": 1, + "name": "motorheadMemory", + "type": "MotorheadMemory", + 
"baseClasses": ["MotorheadMemory", "BaseChatMemory", "BaseMemory"], + "category": "Memory", + "description": "Use Motorhead Memory to store chat conversations", + "inputParams": [ + { + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "optional": true, + "description": "Only needed when using hosted solution - https://getmetal.io", + "credentialNames": ["motorheadMemoryApi"], + "id": "motorheadMemory_0-input-credential-credential" + }, + { + "label": "Base URL", + "name": "baseURL", + "type": "string", + "optional": true, + "description": "To use the online version, leave the URL blank. More details at https://getmetal.io.", + "id": "motorheadMemory_0-input-baseURL-string" + }, + { + "label": "Session Id", + "name": "sessionId", + "type": "string", + "description": "If not specified, the first CHAT_MESSAGE_ID will be used as sessionId", + "default": "", + "additionalParams": true, + "optional": true, + "id": "motorheadMemory_0-input-sessionId-string" + }, + { + "label": "Memory Key", + "name": "memoryKey", + "type": "string", + "default": "chat_history", + "additionalParams": true, + "id": "motorheadMemory_0-input-memoryKey-string" + } + ], + "inputAnchors": [], + "inputs": { + "baseURL": "", + "sessionId": "", + "memoryKey": "chat_history" + }, + "outputAnchors": [ + { + "id": "motorheadMemory_0-output-motorheadMemory-MotorheadMemory|BaseChatMemory|BaseMemory", + "name": "motorheadMemory", + "label": "MotorheadMemory", + "type": "MotorheadMemory | BaseChatMemory | BaseMemory" + } + ], + "outputs": {}, + "selected": false + }, + "selected": false, + "positionAbsolute": { + "x": 1202.1545938923578, + "y": 425.69055061366237 + }, + "dragging": false + } + ], + "edges": [ + { + "source": "chatOpenAI_0", + "sourceHandle": "chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "type": 
"buttonedge", + "id": "chatOpenAI_0-chatOpenAI_0-output-chatOpenAI-ChatOpenAI|BaseChatModel|BaseLanguageModel-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-model-BaseLanguageModel", + "data": { + "label": "" + } + }, + { + "source": "htmlToMarkdownTextSplitter_0", + "sourceHandle": "htmlToMarkdownTextSplitter_0-output-htmlToMarkdownTextSplitter-HtmlToMarkdownTextSplitter|MarkdownTextSplitter|RecursiveCharacterTextSplitter|TextSplitter|BaseDocumentTransformer", + "target": "cheerioWebScraper_0", + "targetHandle": "cheerioWebScraper_0-input-textSplitter-TextSplitter", + "type": "buttonedge", + "id": "htmlToMarkdownTextSplitter_0-htmlToMarkdownTextSplitter_0-output-htmlToMarkdownTextSplitter-HtmlToMarkdownTextSplitter|MarkdownTextSplitter|RecursiveCharacterTextSplitter|TextSplitter|BaseDocumentTransformer-cheerioWebScraper_0-cheerioWebScraper_0-input-textSplitter-TextSplitter", + "data": { + "label": "" + } + }, + { + "source": "cheerioWebScraper_0", + "sourceHandle": "cheerioWebScraper_0-output-cheerioWebScraper-Document", + "target": "pineconeUpsert_0", + "targetHandle": "pineconeUpsert_0-input-document-Document", + "type": "buttonedge", + "id": "cheerioWebScraper_0-cheerioWebScraper_0-output-cheerioWebScraper-Document-pineconeUpsert_0-pineconeUpsert_0-input-document-Document", + "data": { + "label": "" + } + }, + { + "source": "openAIEmbeddings_0", + "sourceHandle": "openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings", + "target": "pineconeUpsert_0", + "targetHandle": "pineconeUpsert_0-input-embeddings-Embeddings", + "type": "buttonedge", + "id": "openAIEmbeddings_0-openAIEmbeddings_0-output-openAIEmbeddings-OpenAIEmbeddings|Embeddings-pineconeUpsert_0-pineconeUpsert_0-input-embeddings-Embeddings", + "data": { + "label": "" + } + }, + { + "source": "pineconeUpsert_0", + "sourceHandle": "pineconeUpsert_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever", + "target": "conversationalRetrievalQAChain_0", + 
"targetHandle": "conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "type": "buttonedge", + "id": "pineconeUpsert_0-pineconeUpsert_0-output-retriever-Pinecone|VectorStoreRetriever|BaseRetriever-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-vectorStoreRetriever-BaseRetriever", + "data": { + "label": "" + } + }, + { + "source": "motorheadMemory_0", + "sourceHandle": "motorheadMemory_0-output-motorheadMemory-MotorheadMemory|BaseChatMemory|BaseMemory", + "target": "conversationalRetrievalQAChain_0", + "targetHandle": "conversationalRetrievalQAChain_0-input-memory-BaseMemory", + "type": "buttonedge", + "id": "motorheadMemory_0-motorheadMemory_0-output-motorheadMemory-MotorheadMemory|BaseChatMemory|BaseMemory-conversationalRetrievalQAChain_0-conversationalRetrievalQAChain_0-input-memory-BaseMemory", + "data": { + "label": "" + } + } + ] +} diff --git a/packages/server/marketplaces/Zapier NLA.json b/packages/server/marketplaces/chatflows/Zapier NLA.json similarity index 90% rename from packages/server/marketplaces/Zapier NLA.json rename to packages/server/marketplaces/chatflows/Zapier NLA.json index eafd8f235..60258b466 100644 --- a/packages/server/marketplaces/Zapier NLA.json +++ b/packages/server/marketplaces/chatflows/Zapier NLA.json @@ -14,6 +14,7 @@ "id": "zapierNLA_0", "label": "Zapier NLA", "name": "zapierNLA", + "version": 1, "type": "ZapierNLA", "baseClasses": ["ZapierNLA", "Tool"], "category": "Tools", @@ -48,27 +49,84 @@ }, { "width": 300, - "height": 524, + "height": 280, + "id": "mrklAgentLLM_0", + "position": { + "x": 1002.5779315680477, + "y": 329.9701389591812 + }, + "type": "customNode", + "data": { + "id": "mrklAgentLLM_0", + "label": "MRKL Agent for LLMs", + "name": "mrklAgentLLM", + "version": 1, + "type": "AgentExecutor", + "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], + "category": "Agents", + "description": "Agent that uses the ReAct Framework to decide what action to take, 
optimized to be used with LLMs", + "inputParams": [], + "inputAnchors": [ + { + "label": "Allowed Tools", + "name": "tools", + "type": "Tool", + "list": true, + "id": "mrklAgentLLM_0-input-tools-Tool" + }, + { + "label": "Language Model", + "name": "model", + "type": "BaseLanguageModel", + "id": "mrklAgentLLM_0-input-model-BaseLanguageModel" + } + ], + "inputs": { + "tools": ["{{zapierNLA_0.data.instance}}"], + "model": "{{openAI_0.data.instance}}" + }, + "outputAnchors": [ + { + "id": "mrklAgentLLM_0-output-mrklAgentLLM-AgentExecutor|BaseChain|BaseLangChain", + "name": "mrklAgentLLM", + "label": "AgentExecutor", + "type": "AgentExecutor | BaseChain | BaseLangChain" + } + ], + "outputs": {}, + "selected": false + }, + "positionAbsolute": { + "x": 1002.5779315680477, + "y": 329.9701389591812 + }, + "selected": false + }, + { + "width": 300, + "height": 523, "id": "openAI_0", "position": { - "x": 547.3867724775708, - "y": 394.1919189424442 + "x": 550.5957793208096, + "y": 378.30370661617934 }, "type": "customNode", "data": { "id": "openAI_0", "label": "OpenAI", "name": "openAI", + "version": 1, "type": "OpenAI", - "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel", "BaseLangChain"], + "baseClasses": ["OpenAI", "BaseLLM", "BaseLanguageModel"], "category": "LLMs", "description": "Wrapper around OpenAI large language models", "inputParams": [ { - "label": "OpenAI Api Key", - "name": "openAIApiKey", - "type": "password", - "id": "openAI_0-input-openAIApiKey-password" + "label": "Connect Credential", + "name": "credential", + "type": "credential", + "credentialNames": ["openAIApi"], + "id": "openAI_0-input-credential-credential" }, { "label": "Model Name", @@ -159,6 +217,14 @@ "optional": true, "additionalParams": true, "id": "openAI_0-input-timeout-number" + }, + { + "label": "BasePath", + "name": "basepath", + "type": "string", + "optional": true, + "additionalParams": true, + "id": "openAI_0-input-basepath-string" } ], "inputAnchors": [], @@ -171,14 +237,15 @@ 
"frequencyPenalty": "", "presencePenalty": "", "batchSize": "", - "timeout": "" + "timeout": "", + "basepath": "" }, "outputAnchors": [ { - "id": "openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "id": "openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "name": "openAI", "label": "OpenAI", - "type": "OpenAI | BaseLLM | BaseLanguageModel | BaseLangChain" + "type": "OpenAI | BaseLLM | BaseLanguageModel" } ], "outputs": {}, @@ -186,64 +253,10 @@ }, "selected": false, "positionAbsolute": { - "x": 547.3867724775708, - "y": 394.1919189424442 + "x": 550.5957793208096, + "y": 378.30370661617934 }, "dragging": false - }, - { - "width": 300, - "height": 280, - "id": "mrklAgentLLM_0", - "position": { - "x": 1002.5779315680477, - "y": 329.9701389591812 - }, - "type": "customNode", - "data": { - "id": "mrklAgentLLM_0", - "label": "MRKL Agent for LLMs", - "name": "mrklAgentLLM", - "type": "AgentExecutor", - "baseClasses": ["AgentExecutor", "BaseChain", "BaseLangChain"], - "category": "Agents", - "description": "Agent that uses the ReAct Framework to decide what action to take, optimized to be used with LLMs", - "inputParams": [], - "inputAnchors": [ - { - "label": "Allowed Tools", - "name": "tools", - "type": "Tool", - "list": true, - "id": "mrklAgentLLM_0-input-tools-Tool" - }, - { - "label": "Language Model", - "name": "model", - "type": "BaseLanguageModel", - "id": "mrklAgentLLM_0-input-model-BaseLanguageModel" - } - ], - "inputs": { - "tools": ["{{zapierNLA_0.data.instance}}"], - "model": "{{openAI_0.data.instance}}" - }, - "outputAnchors": [ - { - "id": "mrklAgentLLM_0-output-mrklAgentLLM-AgentExecutor|BaseChain|BaseLangChain", - "name": "mrklAgentLLM", - "label": "AgentExecutor", - "type": "AgentExecutor | BaseChain | BaseLangChain" - } - ], - "outputs": {}, - "selected": false - }, - "positionAbsolute": { - "x": 1002.5779315680477, - "y": 329.9701389591812 - }, - "selected": false } ], "edges": [ @@ -260,11 +273,11 @@ }, { "source": 
"openAI_0", - "sourceHandle": "openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain", + "sourceHandle": "openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel", "target": "mrklAgentLLM_0", "targetHandle": "mrklAgentLLM_0-input-model-BaseLanguageModel", "type": "buttonedge", - "id": "openAI_0-openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel|BaseLangChain-mrklAgentLLM_0-mrklAgentLLM_0-input-model-BaseLanguageModel", + "id": "openAI_0-openAI_0-output-openAI-OpenAI|BaseLLM|BaseLanguageModel-mrklAgentLLM_0-mrklAgentLLM_0-input-model-BaseLanguageModel", "data": { "label": "" } diff --git a/packages/server/marketplaces/tools/Add Hubspot Contact.json b/packages/server/marketplaces/tools/Add Hubspot Contact.json new file mode 100644 index 000000000..584df4c33 --- /dev/null +++ b/packages/server/marketplaces/tools/Add Hubspot Contact.json @@ -0,0 +1,8 @@ +{ + "name": "add_contact_hubspot", + "description": "Add new contact to Hubspot", + "color": "linear-gradient(rgb(85,198,123), rgb(0,230,99))", + "iconSrc": "https://cdn.worldvectorlogo.com/logos/hubspot-1.svg", + "schema": "[{\"id\":1,\"property\":\"email\",\"description\":\"email address of contact\",\"type\":\"string\",\"required\":true},{\"id\":2,\"property\":\"firstname\",\"description\":\"first name of contact\",\"type\":\"string\",\"required\":false},{\"id\":3,\"property\":\"lastname\",\"description\":\"last name of contact\",\"type\":\"string\",\"required\":false}]", + "func": "const fetch = require('node-fetch');\nconst url = 'https://api.hubapi.com/crm/v3/objects/contacts'\nconst token = 'YOUR-TOKEN';\n\nconst body = {\n\t\"properties\": {\n\t \"email\": $email\n\t}\n};\n\nif ($firstname) body.properties.firstname = $firstname;\nif ($lastname) body.properties.lastname = $lastname;\n\nconst options = {\n\tmethod: 'POST',\n\theaders: {\n\t 'Authorization': `Bearer ${token}`,\n\t\t'Content-Type': 'application/json'\n\t},\n\tbody: JSON.stringify(body)\n};\n\ntry {\n\tconst response = await 
fetch(url, options);\n\tconst text = await response.text();\n\treturn text;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git a/packages/server/marketplaces/tools/Create Airtable Record.json b/packages/server/marketplaces/tools/Create Airtable Record.json new file mode 100644 index 000000000..c52c9199c --- /dev/null +++ b/packages/server/marketplaces/tools/Create Airtable Record.json @@ -0,0 +1,8 @@ +{ + "name": "add_airtable", + "description": "Add column1, column2 to Airtable", + "color": "linear-gradient(rgb(125,71,222), rgb(128,102,23))", + "iconSrc": "https://raw.githubusercontent.com/gilbarbara/logos/main/logos/airtable.svg", + "schema": "[{\"id\":0,\"property\":\"column1\",\"description\":\"this is column1\",\"type\":\"string\",\"required\":true},{\"id\":1,\"property\":\"column2\",\"description\":\"this is column2\",\"type\":\"string\",\"required\":true}]", + "func": "const fetch = require('node-fetch');\nconst baseId = 'YOUR-BASE-ID';\nconst tableId = 'YOUR-TABLE-ID';\nconst token = 'YOUR-TOKEN';\n\nconst body = {\n\t\"records\": [\n\t\t{\n\t\t\t\"fields\": {\n\t\t\t\t\"column1\": $column1,\n\t\t\t\t\"column2\": $column2,\n\t\t\t}\n\t\t}\n\t]\n};\n\nconst options = {\n\tmethod: 'POST',\n\theaders: {\n\t\t'Authorization': `Bearer ${token}`,\n\t\t'Content-Type': 'application/json'\n\t},\n\tbody: JSON.stringify(body)\n};\n\nconst url = `https://api.airtable.com/v0/${baseId}/${tableId}`\n\ntry {\n\tconst response = await fetch(url, options);\n\tconst text = await response.text();\n\treturn text;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git a/packages/server/marketplaces/tools/Get Stock Mover.json b/packages/server/marketplaces/tools/Get Stock Mover.json new file mode 100644 index 000000000..9108cc503 --- /dev/null +++ b/packages/server/marketplaces/tools/Get Stock Mover.json @@ -0,0 +1,8 @@ +{ + "name": "get_stock_movers", + "description": "Get the stocks that have the biggest price/volume moves, e.g.
actives, gainers, losers, etc.", + "iconSrc": "https://rapidapi.com/cdn/images?url=https://rapidapi-prod-apis.s3.amazonaws.com/9c/e743343bdd41edad39a3fdffd5b974/016c33699f51603ae6fe4420c439124b.png", + "color": "linear-gradient(rgb(191,202,167), rgb(143,202,246))", + "schema": "[]", + "func": "const fetch = require('node-fetch');\nconst url = 'https://morning-star.p.rapidapi.com/market/v2/get-movers';\nconst options = {\n\tmethod: 'GET',\n\theaders: {\n\t\t'X-RapidAPI-Key': 'YOUR-API-KEY',\n\t\t'X-RapidAPI-Host': 'morning-star.p.rapidapi.com'\n\t}\n};\n\ntry {\n\tconst response = await fetch(url, options);\n\tconst result = await response.text();\n\tconsole.log(result);\n\treturn result;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git a/packages/server/marketplaces/tools/Make Webhook.json b/packages/server/marketplaces/tools/Make Webhook.json new file mode 100644 index 000000000..24d00900f --- /dev/null +++ b/packages/server/marketplaces/tools/Make Webhook.json @@ -0,0 +1,8 @@ +{ + "name": "make_webhook", + "description": "Useful when you need to send message to Discord", + "color": "linear-gradient(rgb(19,94,2), rgb(19,124,59))", + "iconSrc": "https://github.com/FlowiseAI/Flowise/assets/26460777/517fdab2-8a6e-4781-b3c8-fb92cc78aa0b", + "schema": "[{\"id\":0,\"property\":\"message\",\"description\":\"Message to send\",\"type\":\"string\",\"required\":true}]", + "func": "/*\n* You can use any libraries imported in Flowise\n* You can use properties specified in Output Schema as variables. 
Ex: Property = userid, Variable = $userid\n* Must return a string value at the end of function\n*/\n\nconst fetch = require('node-fetch');\nconst webhookUrl = 'https://hook.eu1.make.com/abcdefg';\nconst body = {\n\t\"message\": $message\n};\nconst options = {\n method: 'POST',\n headers: {\n 'Content-Type': 'application/json'\n },\n body: JSON.stringify(body)\n};\ntry {\n const response = await fetch(webhookUrl, options);\n const text = await response.text();\n return text;\n} catch (error) {\n console.error(error);\n return '';\n}" +} diff --git a/packages/server/marketplaces/tools/Send Discord Message.json b/packages/server/marketplaces/tools/Send Discord Message.json new file mode 100644 index 000000000..bbfaaa905 --- /dev/null +++ b/packages/server/marketplaces/tools/Send Discord Message.json @@ -0,0 +1,8 @@ +{ + "name": "send_message_to_discord_channel", + "description": "Send message to Discord channel", + "color": "linear-gradient(rgb(155,190,84), rgb(176,69,245))", + "iconSrc": "https://raw.githubusercontent.com/gilbarbara/logos/main/logos/discord-icon.svg", + "schema": "[{\"id\":1,\"property\":\"content\",\"description\":\"message to send\",\"type\":\"string\",\"required\":true}]", + "func": "const fetch = require('node-fetch');\nconst webhookUrl = 'YOUR-WEBHOOK-URL'\n\nconst body = {\n\t\"content\": $content\n};\n\nconst options = {\n\tmethod: 'POST',\n\theaders: {\n\t\t'Content-Type': 'application/json'\n\t},\n\tbody: JSON.stringify(body)\n};\n\nconst url = `${webhookUrl}?wait=true`\n\ntry {\n\tconst response = await fetch(url, options);\n\tconst text = await response.text();\n\treturn text;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git a/packages/server/marketplaces/tools/Send Slack Message.json b/packages/server/marketplaces/tools/Send Slack Message.json new file mode 100644 index 000000000..f15d40505 --- /dev/null +++ b/packages/server/marketplaces/tools/Send Slack Message.json @@ -0,0 +1,8 @@ +{ + "name": 
"send_message_to_slack_channel", + "description": "Send message to Slack channel", + "color": "linear-gradient(rgb(155,190,84), rgb(176,69,245))", + "iconSrc": "https://raw.githubusercontent.com/gilbarbara/logos/main/logos/slack-icon.svg", + "schema": "[{\"id\":1,\"property\":\"text\",\"description\":\"message to send\",\"type\":\"string\",\"required\":true}]", + "func": "const fetch = require('node-fetch');\nconst webhookUrl = 'YOUR-WEBHOOK-URL'\n\nconst body = {\n\t\"text\": $text\n};\n\nconst options = {\n\tmethod: 'POST',\n\theaders: {\n\t\t'Content-Type': 'application/json'\n\t},\n\tbody: JSON.stringify(body)\n};\n\nconst url = `${webhookUrl}`\n\ntry {\n\tconst response = await fetch(url, options);\n\tconst text = await response.text();\n\treturn text;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git a/packages/server/marketplaces/tools/Send Teams Message.json b/packages/server/marketplaces/tools/Send Teams Message.json new file mode 100644 index 000000000..1af8111b5 --- /dev/null +++ b/packages/server/marketplaces/tools/Send Teams Message.json @@ -0,0 +1,8 @@ +{ + "name": "send_message_to_teams_channel", + "description": "Send message to Teams channel", + "color": "linear-gradient(rgb(155,190,84), rgb(176,69,245))", + "iconSrc": "https://raw.githubusercontent.com/gilbarbara/logos/main/logos/microsoft-teams.svg", + "schema": "[{\"id\":1,\"property\":\"content\",\"description\":\"message to send\",\"type\":\"string\",\"required\":true}]", + "func": "const fetch = require('node-fetch');\nconst webhookUrl = 'YOUR-WEBHOOK-URL'\n\nconst body = {\n\t\"content\": $content\n};\n\nconst options = {\n\tmethod: 'POST',\n\theaders: {\n\t\t'Content-Type': 'application/json'\n\t},\n\tbody: JSON.stringify(body)\n};\n\nconst url = `${webhookUrl}?wait=true`\n\ntry {\n\tconst response = await fetch(url, options);\n\tconst text = await response.text();\n\treturn text;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git 
a/packages/server/marketplaces/tools/SendGrid Email.json b/packages/server/marketplaces/tools/SendGrid Email.json new file mode 100644 index 000000000..8a6bf9930 --- /dev/null +++ b/packages/server/marketplaces/tools/SendGrid Email.json @@ -0,0 +1,8 @@ +{ + "name": "sendgrid_email", + "description": "Send email using SendGrid", + "color": "linear-gradient(rgb(230,108,70), rgb(222,4,98))", + "iconSrc": "https://raw.githubusercontent.com/gilbarbara/logos/main/logos/sendgrid-icon.svg", + "schema": "[{\"id\":0,\"property\":\"fromEmail\",\"description\":\"Email address used to send the message\",\"type\":\"string\",\"required\":true},{\"id\":1,\"property\":\"toEmail\",\"description\":\"The intended recipient's email address\",\"type\":\"string\",\"required\":true},{\"id\":2,\"property\":\"subject\",\"description\":\"The subject of email\",\"type\":\"string\",\"required\":true},{\"id\":3,\"property\":\"content\",\"description\":\"Content of email\",\"type\":\"string\",\"required\":true}]", + "func": "const fetch = require('node-fetch');\nconst url = 'https://api.sendgrid.com/v3/mail/send';\nconst api_key = 'YOUR-API-KEY';\n\nconst body = {\n \"personalizations\": [\n {\n \"to\": [{ \"email\": $toEmail }]\n }\n ],\n\t\"from\": {\n\t \"email\": $fromEmail\n\t},\n\t\"subject\": $subject,\n\t\"content\": [\n\t {\n\t \"type\": 'text/plain',\n\t \"value\": $content\n\t }\n\t]\n};\n\nconst options = {\n\tmethod: 'POST',\n\theaders: {\n\t 'Authorization': `Bearer ${api_key}`,\n\t\t'Content-Type': 'application/json'\n\t},\n\tbody: JSON.stringify(body)\n};\n\ntry {\n\tconst response = await fetch(url, options);\n\tconst text = await response.text();\n\treturn text;\n} catch (error) {\n\tconsole.error(error);\n\treturn '';\n}" +} diff --git a/packages/server/nodemon.json b/packages/server/nodemon.json index d896b48b0..bf25eb004 100644 --- a/packages/server/nodemon.json +++ b/packages/server/nodemon.json @@ -1,6 +1,6 @@ { "ignore": ["**/*.spec.ts", ".git", "node_modules"], - 
"watch": ["commands", "index.ts", "src"], + "watch": ["commands", "index.ts", "src", "../components/nodes", "../components/src"], "exec": "yarn oclif-dev", "ext": "ts" } diff --git a/packages/server/package.json b/packages/server/package.json index b8777a21d..f2388b018 100644 --- a/packages/server/package.json +++ b/packages/server/package.json @@ -1,6 +1,6 @@ { "name": "flowise", - "version": "1.2.6", + "version": "1.3.3", "description": "Flowiseai Server", "main": "dist/index", "types": "dist/index.d.ts", @@ -13,8 +13,7 @@ "dist", "npm-shrinkwrap.json", "oclif.manifest.json", - "oauth2.html", - ".env" + "oauth2.html" ], "oclif": { "bin": "flowise", @@ -49,6 +48,7 @@ "@oclif/core": "^1.13.10", "axios": "^0.27.2", "cors": "^2.8.5", + "crypto-js": "^4.1.1", "dotenv": "^16.0.0", "express": "^4.17.3", "express-basic-auth": "^1.2.1", @@ -56,12 +56,17 @@ "flowise-ui": "*", "moment-timezone": "^0.5.34", "multer": "^1.4.5-lts.1", + "mysql": "^2.18.1", + "pg": "^8.11.1", "reflect-metadata": "^0.1.13", + "socket.io": "^4.6.1", "sqlite3": "^5.1.6", - "typeorm": "^0.3.6" + "typeorm": "^0.3.6", + "winston": "^3.9.0" }, "devDependencies": { "@types/cors": "^2.8.12", + "@types/crypto-js": "^4.1.1", "@types/multer": "^1.4.7", "concurrently": "^7.1.0", "nodemon": "^2.0.15", diff --git a/packages/server/src/ChatflowPool.ts b/packages/server/src/ChatflowPool.ts index 35b0d9478..d296dcfed 100644 --- a/packages/server/src/ChatflowPool.ts +++ b/packages/server/src/ChatflowPool.ts @@ -1,5 +1,6 @@ import { ICommonObject } from 'flowise-components' import { IActiveChatflows, INodeData, IReactFlowNode } from './Interface' +import logger from './utils/logger' /** * This pool is to keep track of active chatflow pools @@ -22,6 +23,7 @@ export class ChatflowPool { inSync: true } if (overrideConfig) this.activeChatflows[chatflowid].overrideConfig = overrideConfig + logger.info(`[server]: Chatflow ${chatflowid} added into ChatflowPool`) } /** @@ -32,6 +34,7 @@ export class ChatflowPool { 
updateInSync(chatflowid: string, inSync: boolean) { if (Object.prototype.hasOwnProperty.call(this.activeChatflows, chatflowid)) { this.activeChatflows[chatflowid].inSync = inSync + logger.info(`[server]: Chatflow ${chatflowid} updated inSync=${inSync} in ChatflowPool`) } } @@ -42,6 +45,7 @@ async remove(chatflowid: string) { if (Object.prototype.hasOwnProperty.call(this.activeChatflows, chatflowid)) { delete this.activeChatflows[chatflowid] + logger.info(`[server]: Chatflow ${chatflowid} removed from ChatflowPool`) } } } diff --git a/packages/server/src/ChildProcess.ts b/packages/server/src/ChildProcess.ts deleted file mode 100644 index 483379d08..000000000 --- a/packages/server/src/ChildProcess.ts +++ /dev/null @@ -1,148 +0,0 @@ -import { IChildProcessMessage, IReactFlowNode, IReactFlowObject, IRunChatflowMessageValue, INodeData } from './Interface' -import { buildLangchain, constructGraphs, getEndingNode, getStartingNodes, resolveVariables } from './utils' - -export class ChildProcess { - /** - * Stop child process when app is killed - */ - static async stopChildProcess() { - setTimeout(() => { - process.exit(0) - }, 50000) - } - - /** - * Process prediction - * @param {IRunChatflowMessageValue} messageValue - * @return {Promise<void>} - */ - async runChildProcess(messageValue: IRunChatflowMessageValue): Promise<void> { - process.on('SIGTERM', ChildProcess.stopChildProcess) - process.on('SIGINT', ChildProcess.stopChildProcess) - - await sendToParentProcess('start', '_') - - // Create a Queue and add our initial node in it - const { endingNodeData, chatflow, incomingInput, componentNodes } = messageValue - - let nodeToExecuteData: INodeData - let addToChatFlowPool: any = {} - - /* Don't rebuild the flow (to avoid duplicated upsert, recomputation) when all these conditions met: - * - Node Data already exists in pool - * - Still in sync (i.e the flow has not been modified since) - * - Existing overrideConfig and new overrideConfig are the same - * - 
Flow doesn't start with nodes that depend on incomingInput.question - ***/ - if (endingNodeData) { - nodeToExecuteData = endingNodeData - } else { - /*** Get chatflows and prepare data ***/ - const flowData = chatflow.flowData - const parsedFlowData: IReactFlowObject = JSON.parse(flowData) - const nodes = parsedFlowData.nodes - const edges = parsedFlowData.edges - - /*** Get Ending Node with Directed Graph ***/ - const { graph, nodeDependencies } = constructGraphs(nodes, edges) - const directedGraph = graph - const endingNodeId = getEndingNode(nodeDependencies, directedGraph) - if (!endingNodeId) { - await sendToParentProcess('error', `Ending node must be either a Chain or Agent`) - return - } - - const endingNodeData = nodes.find((nd) => nd.id === endingNodeId)?.data - if (!endingNodeData) { - await sendToParentProcess('error', `Ending node must be either a Chain or Agent`) - return - } - - if ( - endingNodeData.outputs && - Object.keys(endingNodeData.outputs).length && - !Object.values(endingNodeData.outputs).includes(endingNodeData.name) - ) { - await sendToParentProcess( - 'error', - `Output of ${endingNodeData.label} (${endingNodeData.id}) must be ${endingNodeData.label}, can't be an Output Prediction` - ) - return - } - - /*** Get Starting Nodes with Non-Directed Graph ***/ - const constructedObj = constructGraphs(nodes, edges, true) - const nonDirectedGraph = constructedObj.graph - const { startingNodeIds, depthQueue } = getStartingNodes(nonDirectedGraph, endingNodeId) - - /*** BFS to traverse from Starting Nodes to Ending Node ***/ - const reactFlowNodes = await buildLangchain( - startingNodeIds, - nodes, - graph, - depthQueue, - componentNodes, - incomingInput.question, - incomingInput?.overrideConfig - ) - - const nodeToExecute = reactFlowNodes.find((node: IReactFlowNode) => node.id === endingNodeId) - if (!nodeToExecute) { - await sendToParentProcess('error', `Node ${endingNodeId} not found`) - return - } - - const reactFlowNodeData: INodeData = 
resolveVariables(nodeToExecute.data, reactFlowNodes, incomingInput.question) - nodeToExecuteData = reactFlowNodeData - - const startingNodes = nodes.filter((nd) => startingNodeIds.includes(nd.id)) - addToChatFlowPool = { - chatflowid: chatflow.id, - nodeToExecuteData, - startingNodes, - overrideConfig: incomingInput?.overrideConfig - } - } - - const nodeInstanceFilePath = componentNodes[nodeToExecuteData.name].filePath as string - const nodeModule = await import(nodeInstanceFilePath) - const nodeInstance = new nodeModule.nodeClass() - - const result = await nodeInstance.run(nodeToExecuteData, incomingInput.question, { chatHistory: incomingInput.history }) - - await sendToParentProcess('finish', { result, addToChatFlowPool }) - } -} - -/** - * Send data back to parent process - * @param {string} key Key of message - * @param {*} value Value of message - * @returns {Promise<void>} - */ -async function sendToParentProcess(key: string, value: any): Promise<void> { - // tslint:disable-line:no-any - return new Promise((resolve, reject) => { - process.send!( - { - key, - value - }, - (error: Error) => { - if (error) { - return reject(error) - } - resolve() - } - ) - }) -} - -const childProcess = new ChildProcess() - -process.on('message', async (message: IChildProcessMessage) => { - if (message.key === 'start') { - await childProcess.runChildProcess(message.value) - process.exit() - } -}) diff --git a/packages/server/src/DataSource.ts b/packages/server/src/DataSource.ts index 76c8e1445..b0d564779 100644 --- a/packages/server/src/DataSource.ts +++ b/packages/server/src/DataSource.ts @@ -3,20 +3,64 @@ import path from 'path' import { DataSource } from 'typeorm' import { ChatFlow } from './entity/ChatFlow' import { ChatMessage } from './entity/ChatMessage' +import { Credential } from './entity/Credential' +import { Tool } from './entity/Tool' import { getUserHome } from './utils' let appDataSource: DataSource export const init = async (): Promise<void> => { - const homePath = 
path.join(getUserHome(), '.flowise') - - appDataSource = new DataSource({ - type: 'sqlite', - database: path.resolve(homePath, 'database.sqlite'), - synchronize: true, - entities: [ChatFlow, ChatMessage], - migrations: [] - }) + let homePath + const synchronize = process.env.OVERRIDE_DATABASE === 'false' ? false : true + switch (process.env.DATABASE_TYPE) { + case 'sqlite': + homePath = process.env.DATABASE_PATH ?? path.join(getUserHome(), '.flowise') + appDataSource = new DataSource({ + type: 'sqlite', + database: path.resolve(homePath, 'database.sqlite'), + synchronize, + entities: [ChatFlow, ChatMessage, Tool, Credential], + migrations: [] + }) + break + case 'mysql': + appDataSource = new DataSource({ + type: 'mysql', + host: process.env.DATABASE_HOST, + port: parseInt(process.env.DATABASE_PORT || '3306'), + username: process.env.DATABASE_USER, + password: process.env.DATABASE_PASSWORD, + database: process.env.DATABASE_NAME, + charset: 'utf8mb4', + synchronize, + entities: [ChatFlow, ChatMessage, Tool, Credential], + migrations: [] + }) + break + case 'postgres': + appDataSource = new DataSource({ + type: 'postgres', + host: process.env.DATABASE_HOST, + port: parseInt(process.env.DATABASE_PORT || '5432'), + username: process.env.DATABASE_USER, + password: process.env.DATABASE_PASSWORD, + database: process.env.DATABASE_NAME, + synchronize, + entities: [ChatFlow, ChatMessage, Tool, Credential], + migrations: [] + }) + break + default: + homePath = process.env.DATABASE_PATH ?? 
path.join(getUserHome(), '.flowise') + appDataSource = new DataSource({ + type: 'sqlite', + database: path.resolve(homePath, 'database.sqlite'), + synchronize, + entities: [ChatFlow, ChatMessage, Tool, Credential], + migrations: [] + }) + break + } } export function getDataSource(): DataSource { diff --git a/packages/server/src/Interface.ts b/packages/server/src/Interface.ts index 30f9fb292..92e3054d5 100644 --- a/packages/server/src/Interface.ts +++ b/packages/server/src/Interface.ts @@ -9,10 +9,12 @@ export interface IChatFlow { id: string name: string flowData: string - apikeyid: string - deployed: boolean updatedDate: Date createdDate: Date + deployed?: boolean + isPublic?: boolean + apikeyid?: string + chatbotConfig?: string } export interface IChatMessage { @@ -21,12 +23,38 @@ export interface IChatMessage { content: string chatflowid: string createdDate: Date + sourceDocuments?: string +} + +export interface ITool { + id: string + name: string + description: string + color: string + iconSrc?: string + schema?: string + func?: string + updatedDate: Date + createdDate: Date +} + +export interface ICredential { + id: string + name: string + credentialName: string + encryptedData: string + updatedDate: Date + createdDate: Date } export interface IComponentNodes { [key: string]: INode } +export interface IComponentCredentials { + [key: string]: INode +} + export interface IVariableDict { [key: string]: string } @@ -115,6 +143,7 @@ export interface IncomingInput { question: string history: IMessage[] overrideConfig?: ICommonObject + socketIOClientId?: string } export interface IActiveChatflows { @@ -128,6 +157,7 @@ export interface IActiveChatflows { export interface IOverrideConfig { node: string + nodeId: string label: string name: string type: string @@ -139,14 +169,16 @@ export interface IDatabaseExport { apikeys: ICommonObject[] } -export interface IRunChatflowMessageValue { - chatflow: IChatFlow - incomingInput: IncomingInput - componentNodes: 
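The rewritten `DataSource.ts` above selects a TypeORM configuration from `DATABASE_TYPE`, falling back to SQLite, and disables schema synchronization when `OVERRIDE_DATABASE=false`. A hedged sketch of just that selection logic (a hypothetical helper that only assembles a plain options object; the real code feeds the result into `new DataSource(...)`, and the `localhost`/`flowise` defaults here are illustrative, not from the diff):

```typescript
// Hypothetical helper mirroring the DATABASE_TYPE switch in DataSource.ts.
type DbOptions = Record<string, string | number | boolean>

function buildDataSourceOptions(env: Record<string, string | undefined>): DbOptions {
    // OVERRIDE_DATABASE=false turns off TypeORM's synchronize (schema auto-migration)
    const synchronize = env.OVERRIDE_DATABASE === 'false' ? false : true
    switch (env.DATABASE_TYPE) {
        case 'mysql':
            return {
                type: 'mysql',
                host: env.DATABASE_HOST ?? 'localhost',
                port: parseInt(env.DATABASE_PORT || '3306'),
                database: env.DATABASE_NAME ?? 'flowise',
                synchronize
            }
        case 'postgres':
            return {
                type: 'postgres',
                host: env.DATABASE_HOST ?? 'localhost',
                port: parseInt(env.DATABASE_PORT || '5432'),
                database: env.DATABASE_NAME ?? 'flowise',
                synchronize
            }
        case 'sqlite':
        default:
            // DATABASE_PATH overrides the default ~/.flowise location
            return {
                type: 'sqlite',
                database: `${env.DATABASE_PATH ?? '~/.flowise'}/database.sqlite`,
                synchronize
            }
    }
}

const opts = buildDataSourceOptions({ DATABASE_TYPE: 'postgres', OVERRIDE_DATABASE: 'false' })
```

Note that the sketch, like the diff, treats an unset or unrecognized `DATABASE_TYPE` identically to `'sqlite'`.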
IComponentNodes - endingNodeData?: INodeData +export type ICredentialDataDecrypted = ICommonObject + +// Plain credential object sent to server +export interface ICredentialReqBody { + name: string + credentialName: string + plainDataObj: ICredentialDataDecrypted } -export interface IChildProcessMessage { - key: string - value?: any +// Decrypted credential object sent back to client +export interface ICredentialReturnResponse extends ICredential { + plainDataObj: ICredentialDataDecrypted } diff --git a/packages/server/src/NodesPool.ts b/packages/server/src/NodesPool.ts index b17164e05..62db41baf 100644 --- a/packages/server/src/NodesPool.ts +++ b/packages/server/src/NodesPool.ts @@ -1,17 +1,27 @@ -import { IComponentNodes } from './Interface' - +import { IComponentNodes, IComponentCredentials } from './Interface' import path from 'path' import { Dirent } from 'fs' import { getNodeModulesPackagePath } from './utils' import { promises } from 'fs' +import { ICommonObject } from 'flowise-components' export class NodesPool { componentNodes: IComponentNodes = {} + componentCredentials: IComponentCredentials = {} + private credentialIconPath: ICommonObject = {} /** - * Initialize to get all nodes + * Initialize to get all nodes & credentials */ async initialize() { + await this.initializeNodes() + await this.initializeCredentials() + } + + /** + * Initialize nodes + */ + private async initializeNodes() { const packagePath = getNodeModulesPackagePath('flowise-components') const nodesPath = path.join(packagePath, 'dist', 'nodes') const nodeFiles = await this.getFiles(nodesPath) @@ -19,7 +29,8 @@ export class NodesPool { nodeFiles.map(async (file) => { if (file.endsWith('.js')) { const nodeModule = await require(file) - try { + + if (nodeModule.nodeClass) { const newNodeInstance = new nodeModule.nodeClass() newNodeInstance.filePath = file @@ -36,9 +47,35 @@ export class NodesPool { filePath.pop() const nodeIconAbsolutePath = `${filePath.join('/')}/${newNodeInstance.icon}` 
this.componentNodes[newNodeInstance.name].icon = nodeIconAbsolutePath + + // Store icon path for componentCredentials + if (newNodeInstance.credential) { + for (const credName of newNodeInstance.credential.credentialNames) { + this.credentialIconPath[credName] = nodeIconAbsolutePath + } + } } - } catch (e) { - // console.error(e); + } + } + }) + ) + } + + /** + * Initialize credentials + */ + private async initializeCredentials() { + const packagePath = getNodeModulesPackagePath('flowise-components') + const nodesPath = path.join(packagePath, 'dist', 'credentials') + const nodeFiles = await this.getFiles(nodesPath) + return Promise.all( + nodeFiles.map(async (file) => { + if (file.endsWith('.credential.js')) { + const credentialModule = await require(file) + if (credentialModule.credClass) { + const newCredInstance = new credentialModule.credClass() + newCredInstance.icon = this.credentialIconPath[newCredInstance.name] ?? '' + this.componentCredentials[newCredInstance.name] = newCredInstance } } }) @@ -50,7 +87,7 @@ export class NodesPool { * @param {string} dir * @returns {string[]} */ - async getFiles(dir: string): Promise<string[]> { + private async getFiles(dir: string): Promise<string[]> { const dirents = await promises.readdir(dir, { withFileTypes: true }) const files = await Promise.all( dirents.map((dirent: Dirent) => { diff --git a/packages/server/src/commands/start.ts b/packages/server/src/commands/start.ts index 9c9e5591d..4b58ae7cf 100644 --- a/packages/server/src/commands/start.ts +++ b/packages/server/src/commands/start.ts @@ -1,10 +1,11 @@ -import { Command } from '@oclif/core' +import { Command, Flags } from '@oclif/core' import path from 'path' import * as Server from '../index' import * as DataSource from '../DataSource' import dotenv from 'dotenv' +import logger from '../utils/logger' -dotenv.config({ path: path.join(__dirname, '..', '..', '.env') }) +dotenv.config({ path: path.join(__dirname, '..', '..', '.env'), override: true }) enum EXIT_CODE { SUCCESS = 0, @@ 
-14,13 +15,38 @@ let processExitCode = EXIT_CODE.SUCCESS export default class Start extends Command { static args = [] + static flags = { + FLOWISE_USERNAME: Flags.string(), + FLOWISE_PASSWORD: Flags.string(), + PORT: Flags.string(), + PASSPHRASE: Flags.string(), + DEBUG: Flags.string(), + APIKEY_PATH: Flags.string(), + SECRETKEY_PATH: Flags.string(), + LOG_PATH: Flags.string(), + LOG_LEVEL: Flags.string(), + TOOL_FUNCTION_BUILTIN_DEP: Flags.string(), + TOOL_FUNCTION_EXTERNAL_DEP: Flags.string(), + OVERRIDE_DATABASE: Flags.string(), + DATABASE_TYPE: Flags.string(), + DATABASE_PATH: Flags.string(), + DATABASE_PORT: Flags.string(), + DATABASE_HOST: Flags.string(), + DATABASE_NAME: Flags.string(), + DATABASE_USER: Flags.string(), + DATABASE_PASSWORD: Flags.string(), + LANGCHAIN_TRACING_V2: Flags.string(), + LANGCHAIN_ENDPOINT: Flags.string(), + LANGCHAIN_API_KEY: Flags.string(), + LANGCHAIN_PROJECT: Flags.string() + } async stopProcess() { - console.info('Shutting down Flowise...') + logger.info('Shutting down Flowise...') try { // Shut down the app after timeout if it ever stuck removing pools setTimeout(() => { - console.info('Flowise was forced to shut down after 30 secs') + logger.info('Flowise was forced to shut down after 30 secs') process.exit(processExitCode) }, 30000) @@ -28,7 +54,7 @@ export default class Start extends Command { const serverApp = Server.getInstance() if (serverApp) await serverApp.stopApp() } catch (error) { - console.error('There was an error shutting down Flowise...', error) + logger.error('There was an error shutting down Flowise...', error) } process.exit(processExitCode) } @@ -40,16 +66,54 @@ export default class Start extends Command { // Prevent throw new Error from crashing the app // TODO: Get rid of this and send proper error message to ui process.on('uncaughtException', (err) => { - console.error('uncaughtException: ', err) + logger.error('uncaughtException: ', err) }) + const { flags } = await this.parse(Start) + + if 
(flags.PORT) process.env.PORT = flags.PORT + if (flags.DEBUG) process.env.DEBUG = flags.DEBUG + + // Authorization + if (flags.FLOWISE_USERNAME) process.env.FLOWISE_USERNAME = flags.FLOWISE_USERNAME + if (flags.FLOWISE_PASSWORD) process.env.FLOWISE_PASSWORD = flags.FLOWISE_PASSWORD + if (flags.APIKEY_PATH) process.env.APIKEY_PATH = flags.APIKEY_PATH + + // Credentials + if (flags.PASSPHRASE) process.env.PASSPHRASE = flags.PASSPHRASE + if (flags.SECRETKEY_PATH) process.env.SECRETKEY_PATH = flags.SECRETKEY_PATH + + // Logs + if (flags.LOG_PATH) process.env.LOG_PATH = flags.LOG_PATH + if (flags.LOG_LEVEL) process.env.LOG_LEVEL = flags.LOG_LEVEL + + // Tool functions + if (flags.TOOL_FUNCTION_BUILTIN_DEP) process.env.TOOL_FUNCTION_BUILTIN_DEP = flags.TOOL_FUNCTION_BUILTIN_DEP + if (flags.TOOL_FUNCTION_EXTERNAL_DEP) process.env.TOOL_FUNCTION_EXTERNAL_DEP = flags.TOOL_FUNCTION_EXTERNAL_DEP + + // Database config + if (flags.OVERRIDE_DATABASE) process.env.OVERRIDE_DATABASE = flags.OVERRIDE_DATABASE + if (flags.DATABASE_TYPE) process.env.DATABASE_TYPE = flags.DATABASE_TYPE + if (flags.DATABASE_PATH) process.env.DATABASE_PATH = flags.DATABASE_PATH + if (flags.DATABASE_PORT) process.env.DATABASE_PORT = flags.DATABASE_PORT + if (flags.DATABASE_HOST) process.env.DATABASE_HOST = flags.DATABASE_HOST + if (flags.DATABASE_NAME) process.env.DATABASE_NAME = flags.DATABASE_NAME + if (flags.DATABASE_USER) process.env.DATABASE_USER = flags.DATABASE_USER + if (flags.DATABASE_PASSWORD) process.env.DATABASE_PASSWORD = flags.DATABASE_PASSWORD + + // Langsmith tracing + if (flags.LANGCHAIN_TRACING_V2) process.env.LANGCHAIN_TRACING_V2 = flags.LANGCHAIN_TRACING_V2 + if (flags.LANGCHAIN_ENDPOINT) process.env.LANGCHAIN_ENDPOINT = flags.LANGCHAIN_ENDPOINT + if (flags.LANGCHAIN_API_KEY) process.env.LANGCHAIN_API_KEY = flags.LANGCHAIN_API_KEY + if (flags.LANGCHAIN_PROJECT) process.env.LANGCHAIN_PROJECT = flags.LANGCHAIN_PROJECT + await (async () => { try { - this.log('Starting Flowise...') + 
logger.info('Starting Flowise...') await DataSource.init() await Server.start() } catch (error) { - console.error('There was an error starting Flowise...', error) + logger.error('There was an error starting Flowise...', error) processExitCode = EXIT_CODE.FAILED // @ts-ignore process.emit('SIGINT') diff --git a/packages/server/src/entity/ChatFlow.ts b/packages/server/src/entity/ChatFlow.ts index d9b129294..4c37e083a 100644 --- a/packages/server/src/entity/ChatFlow.ts +++ b/packages/server/src/entity/ChatFlow.ts @@ -10,14 +10,20 @@ export class ChatFlow implements IChatFlow { @Column() name: string - @Column() + @Column({ type: 'text' }) flowData: string @Column({ nullable: true }) - apikeyid: string + deployed?: boolean - @Column() - deployed: boolean + @Column({ nullable: true }) + isPublic?: boolean + + @Column({ nullable: true }) + apikeyid?: string + + @Column({ nullable: true }) + chatbotConfig?: string @CreateDateColumn() createdDate: Date diff --git a/packages/server/src/entity/ChatMessage.ts b/packages/server/src/entity/ChatMessage.ts index 3380c86cd..8123020cb 100644 --- a/packages/server/src/entity/ChatMessage.ts +++ b/packages/server/src/entity/ChatMessage.ts @@ -14,9 +14,12 @@ export class ChatMessage implements IChatMessage { @Column() chatflowid: string - @Column() + @Column({ type: 'text' }) content: string + @Column({ nullable: true }) + sourceDocuments?: string + @CreateDateColumn() createdDate: Date } diff --git a/packages/server/src/entity/Credential.ts b/packages/server/src/entity/Credential.ts new file mode 100644 index 000000000..b724eed6f --- /dev/null +++ b/packages/server/src/entity/Credential.ts @@ -0,0 +1,24 @@ +/* eslint-disable */ +import { Entity, Column, PrimaryGeneratedColumn, Index, CreateDateColumn, UpdateDateColumn } from 'typeorm' +import { ICredential } from '../Interface' + +@Entity() +export class Credential implements ICredential { + @PrimaryGeneratedColumn('uuid') + id: string + + @Column() + name: string + + @Column() + 
credentialName: string + + @Column() + encryptedData: string + + @CreateDateColumn() + createdDate: Date + + @UpdateDateColumn() + updatedDate: Date +} diff --git a/packages/server/src/entity/Tool.ts b/packages/server/src/entity/Tool.ts new file mode 100644 index 000000000..011bf957d --- /dev/null +++ b/packages/server/src/entity/Tool.ts @@ -0,0 +1,33 @@ +/* eslint-disable */ +import { Entity, Column, CreateDateColumn, UpdateDateColumn, PrimaryGeneratedColumn } from 'typeorm' +import { ITool } from '../Interface' + +@Entity() +export class Tool implements ITool { + @PrimaryGeneratedColumn('uuid') + id: string + + @Column() + name: string + + @Column({ type: 'text' }) + description: string + + @Column() + color: string + + @Column({ nullable: true }) + iconSrc?: string + + @Column({ nullable: true }) + schema?: string + + @Column({ nullable: true }) + func?: string + + @CreateDateColumn() + createdDate: Date + + @UpdateDateColumn() + updatedDate: Date +} diff --git a/packages/server/src/index.ts b/packages/server/src/index.ts index 1ce661178..fbb61b00a 100644 --- a/packages/server/src/index.ts +++ b/packages/server/src/index.ts @@ -5,6 +5,9 @@ import cors from 'cors' import http from 'http' import * as fs from 'fs' import basicAuth from 'express-basic-auth' +import { Server } from 'socket.io' +import logger from './utils/logger' +import { expressRequestLogger } from './utils/logger' import { IChatFlow, @@ -13,8 +16,7 @@ import { IReactFlowObject, INodeData, IDatabaseExport, - IRunChatflowMessageValue, - IChildProcessMessage + ICredentialReturnResponse } from './Interface' import { getNodeModulesPackagePath, @@ -32,16 +34,27 @@ import { mapMimeTypeToInputField, findAvailableConfigs, isSameOverrideConfig, - replaceAllAPIKeys + replaceAllAPIKeys, + isFlowValidForStream, + isVectorStoreFaiss, + databaseEntities, + getApiKey, + transformToCredentialEntity, + decryptCredentialData, + clearSessionMemory, + replaceInputsWithConfig, + getEncryptionKey, + checkMemorySessionId 
} from './utils' -import { cloneDeep } from 'lodash' +import { cloneDeep, omit } from 'lodash' import { getDataSource } from './DataSource' import { NodesPool } from './NodesPool' import { ChatFlow } from './entity/ChatFlow' import { ChatMessage } from './entity/ChatMessage' +import { Credential } from './entity/Credential' +import { Tool } from './entity/Tool' import { ChatflowPool } from './ChatflowPool' -import { ICommonObject } from 'flowise-components' -import { fork } from 'child_process' +import { ICommonObject, INodeOptionsValue } from 'flowise-components' export class App { app: express.Application @@ -57,23 +70,27 @@ export class App { // Initialize database this.AppDataSource.initialize() .then(async () => { - console.info('📦[server]: Data Source has been initialized!') + logger.info('📦 [server]: Data Source has been initialized!') - // Initialize pools + // Initialize nodes pool this.nodesPool = new NodesPool() await this.nodesPool.initialize() + // Initialize chatflow pool this.chatflowPool = new ChatflowPool() // Initialize API keys await getAPIKeys() + + // Initialize encryption key + await getEncryptionKey() }) .catch((err) => { - console.error('❌[server]: Error during Data Source initialization:', err) + logger.error('❌ [server]: Error during Data Source initialization:', err) }) } - async config() { + async config(socketIO?: Server) { // Limit is needed to allow sending/receiving base64 encoded string this.app.use(express.json({ limit: '50mb' })) this.app.use(express.urlencoded({ limit: '50mb', extended: true })) @@ -81,13 +98,24 @@ export class App { // Allow access from * this.app.use(cors()) - if (process.env.USERNAME && process.env.PASSWORD) { - const username = process.env.USERNAME.toLocaleLowerCase() - const password = process.env.PASSWORD.toLocaleLowerCase() + // Add the expressRequestLogger middleware to log all requests + this.app.use(expressRequestLogger) + + if (process.env.FLOWISE_USERNAME && process.env.FLOWISE_PASSWORD) { + const 
username = process.env.FLOWISE_USERNAME + const password = process.env.FLOWISE_PASSWORD const basicAuthMiddleware = basicAuth({ users: { [username]: password } }) - const whitelistURLs = ['/api/v1/prediction/', '/api/v1/node-icon/'] + const whitelistURLs = [ + '/api/v1/verify/apikey/', + '/api/v1/chatflows/apikey/', + '/api/v1/public-chatflows', + '/api/v1/prediction/', + '/api/v1/node-icon/', + '/api/v1/components-credentials-icon/', + '/api/v1/chatflows-streaming' + ] this.app.use((req, res, next) => { if (req.url.includes('/api/v1/')) { whitelistURLs.some((url) => req.url.includes(url)) ? next() : basicAuthMiddleware(req, res, next) @@ -98,7 +126,7 @@ export class App { const upload = multer({ dest: `${path.join(__dirname, '..', 'uploads')}/` }) // ---------------------------------------- - // Nodes + // Components // ---------------------------------------- // Get all component nodes @@ -111,6 +139,16 @@ export class App { return res.json(returnData) }) + // Get all component credentials + this.app.get('/api/v1/components-credentials', async (req: Request, res: Response) => { + const returnData = [] + for (const credName in this.nodesPool.componentCredentials) { + const clonedCred = cloneDeep(this.nodesPool.componentCredentials[credName]) + returnData.push(clonedCred) + } + return res.json(returnData) + }) + // Get specific component node via name this.app.get('/api/v1/nodes/:name', (req: Request, res: Response) => { if (Object.prototype.hasOwnProperty.call(this.nodesPool.componentNodes, req.params.name)) { @@ -120,6 +158,27 @@ export class App { } }) + // Get component credential via name + this.app.get('/api/v1/components-credentials/:name', (req: Request, res: Response) => { + if (!req.params.name.includes('&')) { + if (Object.prototype.hasOwnProperty.call(this.nodesPool.componentCredentials, req.params.name)) { + return res.json(this.nodesPool.componentCredentials[req.params.name]) + } else { + throw new Error(`Credential ${req.params.name} not found`) + } 
+ } else { + const returnResponse = [] + for (const name of req.params.name.split('&')) { + if (Object.prototype.hasOwnProperty.call(this.nodesPool.componentCredentials, name)) { + returnResponse.push(this.nodesPool.componentCredentials[name]) + } else { + throw new Error(`Credential ${name} not found`) + } + } + return res.json(returnResponse) + } + }) + // Returns specific component node icon via name this.app.get('/api/v1/node-icon/:name', (req: Request, res: Response) => { if (Object.prototype.hasOwnProperty.call(this.nodesPool.componentNodes, req.params.name)) { @@ -139,6 +198,48 @@ } }) + // Returns specific component credential icon via name + this.app.get('/api/v1/components-credentials-icon/:name', (req: Request, res: Response) => { + if (Object.prototype.hasOwnProperty.call(this.nodesPool.componentCredentials, req.params.name)) { + const credInstance = this.nodesPool.componentCredentials[req.params.name] + if (credInstance.icon === undefined) { + throw new Error(`Credential ${req.params.name} icon not found`) + } + + if (credInstance.icon.endsWith('.svg') || credInstance.icon.endsWith('.png') || credInstance.icon.endsWith('.jpg')) { + const filepath = credInstance.icon + res.sendFile(filepath) + } else { + throw new Error(`Credential ${req.params.name} icon is not a supported file type`) + } + } else { + throw new Error(`Credential ${req.params.name} not found`) + } + }) + + // Load async options for a node + this.app.post('/api/v1/node-load-method/:name', async (req: Request, res: Response) => { + const nodeData: INodeData = req.body + if (Object.prototype.hasOwnProperty.call(this.nodesPool.componentNodes, req.params.name)) { + try { + const nodeInstance = this.nodesPool.componentNodes[req.params.name] + const methodName = nodeData.loadMethod || '' + + const returnOptions: INodeOptionsValue[] = await nodeInstance.loadMethods![methodName]!.call(nodeInstance, nodeData, { + appDataSource: this.AppDataSource, + databaseEntities: databaseEntities + }) + + return 
res.json(returnOptions) + } catch (error) { + return res.json([]) + } + } else { + res.status(404).send(`Node ${req.params.name} not found`) + return + } + }) + // ---------------------------------------- // Chatflows // ---------------------------------------- @@ -149,6 +250,25 @@ export class App { return res.json(chatflows) }) + // Get specific chatflow via api key + this.app.get('/api/v1/chatflows/apikey/:apiKey', async (req: Request, res: Response) => { + try { + const apiKey = await getApiKey(req.params.apiKey) + if (!apiKey) return res.status(401).send('Unauthorized') + const chatflows = await this.AppDataSource.getRepository(ChatFlow) + .createQueryBuilder('cf') + .where('cf.apikeyid = :apikeyid', { apikeyid: apiKey.id }) + .orWhere('cf.apikeyid IS NULL') + .orWhere('cf.apikeyid = ""') + .orderBy('cf.name', 'ASC') + .getMany() + if (chatflows.length >= 1) return res.status(200).send(chatflows) + return res.status(404).send('Chatflow not found') + } catch (err: any) { + return res.status(500).send(err?.message) + } + }) + // Get specific chatflow via id this.app.get('/api/v1/chatflows/:id', async (req: Request, res: Response) => { const chatflow = await this.AppDataSource.getRepository(ChatFlow).findOneBy({ @@ -158,6 +278,16 @@ export class App { return res.status(404).send(`Chatflow ${req.params.id} not found`) }) + // Get specific chatflow via id (PUBLIC endpoint, used when sharing chatbot link) + this.app.get('/api/v1/public-chatflows/:id', async (req: Request, res: Response) => { + const chatflow = await this.AppDataSource.getRepository(ChatFlow).findOneBy({ + id: req.params.id + }) + if (chatflow && chatflow.isPublic) return res.json(chatflow) + else if (chatflow && !chatflow.isPublic) return res.status(401).send(`Unauthorized`) + return res.status(404).send(`Chatflow ${req.params.id} not found`) + }) + // Save chatflow this.app.post('/api/v1/chatflows', async (req: Request, res: Response) => { const body = req.body @@ -200,14 +330,49 @@ export class 
App { return res.json(results) }) + // Check if chatflow valid for streaming + this.app.get('/api/v1/chatflows-streaming/:id', async (req: Request, res: Response) => { + const chatflow = await this.AppDataSource.getRepository(ChatFlow).findOneBy({ + id: req.params.id + }) + if (!chatflow) return res.status(404).send(`Chatflow ${req.params.id} not found`) + + /*** Get Ending Node with Directed Graph ***/ + const flowData = chatflow.flowData + const parsedFlowData: IReactFlowObject = JSON.parse(flowData) + const nodes = parsedFlowData.nodes + const edges = parsedFlowData.edges + const { graph, nodeDependencies } = constructGraphs(nodes, edges) + + const endingNodeId = getEndingNode(nodeDependencies, graph) + if (!endingNodeId) return res.status(500).send(`Ending node ${endingNodeId} not found`) + + const endingNodeData = nodes.find((nd) => nd.id === endingNodeId)?.data + if (!endingNodeData) return res.status(500).send(`Ending node ${endingNodeId} data not found`) + + if (endingNodeData && endingNodeData.category !== 'Chains' && endingNodeData.category !== 'Agents') { + return res.status(500).send(`Ending node must be either a Chain or Agent`) + } + + const obj = { + isStreaming: isFlowValidForStream(nodes, endingNodeData) + } + return res.json(obj) + }) + // ---------------------------------------- // ChatMessage // ---------------------------------------- // Get all chatmessages from chatflowid this.app.get('/api/v1/chatmessage/:id', async (req: Request, res: Response) => { - const chatmessages = await this.AppDataSource.getRepository(ChatMessage).findBy({ - chatflowid: req.params.id + const chatmessages = await this.AppDataSource.getRepository(ChatMessage).find({ + where: { + chatflowid: req.params.id + }, + order: { + createdDate: 'ASC' + } }) return res.json(chatmessages) }) @@ -226,10 +391,165 @@ export class App { // Delete all chatmessages from chatflowid this.app.delete('/api/v1/chatmessage/:id', async (req: Request, res: Response) => { + const chatflow = 
await this.AppDataSource.getRepository(ChatFlow).findOneBy({ + id: req.params.id + }) + if (!chatflow) { + res.status(404).send(`Chatflow ${req.params.id} not found`) + return + } + const flowData = chatflow.flowData + const parsedFlowData: IReactFlowObject = JSON.parse(flowData) + const nodes = parsedFlowData.nodes + let chatId = await getChatId(chatflow.id) + if (!chatId) chatId = chatflow.id + clearSessionMemory(nodes, this.nodesPool.componentNodes, chatId, this.AppDataSource, req.query.sessionId as string) const results = await this.AppDataSource.getRepository(ChatMessage).delete({ chatflowid: req.params.id }) return res.json(results) }) + // ---------------------------------------- + // Credentials + // ---------------------------------------- + + // Create new credential + this.app.post('/api/v1/credentials', async (req: Request, res: Response) => { + const body = req.body + const newCredential = await transformToCredentialEntity(body) + const credential = this.AppDataSource.getRepository(Credential).create(newCredential) + const results = await this.AppDataSource.getRepository(Credential).save(credential) + return res.json(results) + }) + + // Get all credentials + this.app.get('/api/v1/credentials', async (req: Request, res: Response) => { + if (req.query.credentialName) { + let returnCredentials = [] + if (Array.isArray(req.query.credentialName)) { + for (let i = 0; i < req.query.credentialName.length; i += 1) { + const name = req.query.credentialName[i] as string + const credentials = await this.AppDataSource.getRepository(Credential).findBy({ + credentialName: name + }) + returnCredentials.push(...credentials) + } + } else { + const credentials = await this.AppDataSource.getRepository(Credential).findBy({ + credentialName: req.query.credentialName as string + }) + returnCredentials = [...credentials] + } + return res.json(returnCredentials) + } else { + const credentials = await this.AppDataSource.getRepository(Credential).find() + const 
returnCredentials = [] + for (const credential of credentials) { + returnCredentials.push(omit(credential, ['encryptedData'])) + } + return res.json(returnCredentials) + } + }) + + // Get specific credential + this.app.get('/api/v1/credentials/:id', async (req: Request, res: Response) => { + const credential = await this.AppDataSource.getRepository(Credential).findOneBy({ + id: req.params.id + }) + + if (!credential) return res.status(404).send(`Credential ${req.params.id} not found`) + + // Decrypt credentialData + const decryptedCredentialData = await decryptCredentialData( + credential.encryptedData, + credential.credentialName, + this.nodesPool.componentCredentials + ) + const returnCredential: ICredentialReturnResponse = { + ...credential, + plainDataObj: decryptedCredentialData + } + return res.json(omit(returnCredential, ['encryptedData'])) + }) + + // Update credential + this.app.put('/api/v1/credentials/:id', async (req: Request, res: Response) => { + const credential = await this.AppDataSource.getRepository(Credential).findOneBy({ + id: req.params.id + }) + + if (!credential) return res.status(404).send(`Credential ${req.params.id} not found`) + + const body = req.body + const updateCredential = await transformToCredentialEntity(body) + this.AppDataSource.getRepository(Credential).merge(credential, updateCredential) + const result = await this.AppDataSource.getRepository(Credential).save(credential) + + return res.json(result) + }) + + // Delete credential + this.app.delete('/api/v1/credentials/:id', async (req: Request, res: Response) => { + const results = await this.AppDataSource.getRepository(Credential).delete({ id: req.params.id }) + return res.json(results) + }) + + // ---------------------------------------- + // Tools + // ---------------------------------------- + + // Get all tools + this.app.get('/api/v1/tools', async (req: Request, res: Response) => { + const tools = await this.AppDataSource.getRepository(Tool).find() +
return res.json(tools) + }) + + // Get specific tool + this.app.get('/api/v1/tools/:id', async (req: Request, res: Response) => { + const tool = await this.AppDataSource.getRepository(Tool).findOneBy({ + id: req.params.id + }) + return res.json(tool) + }) + + // Add tool + this.app.post('/api/v1/tools', async (req: Request, res: Response) => { + const body = req.body + const newTool = new Tool() + Object.assign(newTool, body) + + const tool = this.AppDataSource.getRepository(Tool).create(newTool) + const results = await this.AppDataSource.getRepository(Tool).save(tool) + + return res.json(results) + }) + + // Update tool + this.app.put('/api/v1/tools/:id', async (req: Request, res: Response) => { + const tool = await this.AppDataSource.getRepository(Tool).findOneBy({ + id: req.params.id + }) + + if (!tool) { + res.status(404).send(`Tool ${req.params.id} not found`) + return + } + + const body = req.body + const updateTool = new Tool() + Object.assign(updateTool, body) + + this.AppDataSource.getRepository(Tool).merge(tool, updateTool) + const result = await this.AppDataSource.getRepository(Tool).save(tool) + + return res.json(result) + }) + + // Delete tool + this.app.delete('/api/v1/tools/:id', async (req: Request, res: Response) => { + const results = await this.AppDataSource.getRepository(Tool).delete({ id: req.params.id }) + return res.json(results) + }) + // ---------------------------------------- // Configuration // ---------------------------------------- @@ -242,7 +562,13 @@ export class App { const flowData = chatflow.flowData const parsedFlowData: IReactFlowObject = JSON.parse(flowData) const nodes = parsedFlowData.nodes - const availableConfigs = findAvailableConfigs(nodes) + const availableConfigs = findAvailableConfigs(nodes, this.nodesPool.componentCredentials) + return res.json(availableConfigs) + }) + + this.app.post('/api/v1/node-config', async (req: Request, res: Response) => { + const nodes = [{ data: req.body }] as IReactFlowNode[] + const 
availableConfigs = findAvailableConfigs(nodes, this.nodesPool.componentCredentials) return res.json(availableConfigs) }) @@ -303,12 +629,12 @@ export class App { // Send input message and get prediction result (External) this.app.post('/api/v1/prediction/:id', upload.array('files'), async (req: Request, res: Response) => { - await this.processPrediction(req, res) + await this.processPrediction(req, res, socketIO) }) // Send input message and get prediction result (Internal) this.app.post('/api/v1/internal-prediction/:id', async (req: Request, res: Response) => { - await this.processPrediction(req, res, true) + await this.processPrediction(req, res, socketIO, true) }) // ---------------------------------------- @@ -316,12 +642,12 @@ export class App { // ---------------------------------------- // Get all chatflows for marketplaces - this.app.get('/api/v1/marketplaces', async (req: Request, res: Response) => { - const marketplaceDir = path.join(__dirname, '..', 'marketplaces') + this.app.get('/api/v1/marketplaces/chatflows', async (req: Request, res: Response) => { + const marketplaceDir = path.join(__dirname, '..', 'marketplaces', 'chatflows') const jsonsInDir = fs.readdirSync(marketplaceDir).filter((file) => path.extname(file) === '.json') const templates: any[] = [] jsonsInDir.forEach((file, index) => { - const filePath = path.join(__dirname, '..', 'marketplaces', file) + const filePath = path.join(__dirname, '..', 'marketplaces', 'chatflows', file) const fileData = fs.readFileSync(filePath) const fileDataObj = JSON.parse(fileData.toString()) const template = { @@ -332,6 +658,31 @@ export class App { } templates.push(template) }) + const FlowiseDocsQnA = templates.find((tmp) => tmp.name === 'Flowise Docs QnA') + const FlowiseDocsQnAIndex = templates.findIndex((tmp) => tmp.name === 'Flowise Docs QnA') + if (FlowiseDocsQnA && FlowiseDocsQnAIndex > 0) { + templates.splice(FlowiseDocsQnAIndex, 1) + templates.unshift(FlowiseDocsQnA) + } + return res.json(templates) + 
}) + + // Get all tools for marketplaces + this.app.get('/api/v1/marketplaces/tools', async (req: Request, res: Response) => { + const marketplaceDir = path.join(__dirname, '..', 'marketplaces', 'tools') + const jsonsInDir = fs.readdirSync(marketplaceDir).filter((file) => path.extname(file) === '.json') + const templates: any[] = [] + jsonsInDir.forEach((file, index) => { + const filePath = path.join(__dirname, '..', 'marketplaces', 'tools', file) + const fileData = fs.readFileSync(filePath) + const fileDataObj = JSON.parse(fileData.toString()) + const template = { + ...fileDataObj, + id: index, + templateName: file.split('.json')[0] + } + templates.push(template) + }) return res.json(templates) }) @@ -363,6 +714,17 @@ export class App { return res.json(keys) }) + // Verify api key + this.app.get('/api/v1/verify/apikey/:apiKey', async (req: Request, res: Response) => { + try { + const apiKey = await getApiKey(req.params.apiKey) + if (!apiKey) return res.status(401).send('Unauthorized') + return res.status(200).send('OK') + } catch (err: any) { + return res.status(500).send(err?.message) + } + }) + // ---------------------------------------- // Serve UI static // ---------------------------------------- @@ -399,74 +761,14 @@ export class App { } } - /** - * Start child process - * @param {ChatFlow} chatflow - * @param {IncomingInput} incomingInput - * @param {INodeData} endingNodeData - */ - async startChildProcess(chatflow: ChatFlow, incomingInput: IncomingInput, endingNodeData?: INodeData) { - try { - const controller = new AbortController() - const { signal } = controller - - let childpath = path.join(__dirname, '..', 'dist', 'ChildProcess.js') - if (!fs.existsSync(childpath)) childpath = 'ChildProcess.ts' - - const childProcess = fork(childpath, [], { signal }) - - const value = { - chatflow, - incomingInput, - componentNodes: cloneDeep(this.nodesPool.componentNodes), - endingNodeData - } as IRunChatflowMessageValue - childProcess.send({ key: 'start', value } as 
IChildProcessMessage) - - let childProcessTimeout: NodeJS.Timeout - - return new Promise((resolve, reject) => { - childProcess.on('message', async (message: IChildProcessMessage) => { - if (message.key === 'finish') { - const { result, addToChatFlowPool } = message.value as ICommonObject - if (childProcessTimeout) { - clearTimeout(childProcessTimeout) - } - if (Object.keys(addToChatFlowPool).length) { - const { chatflowid, nodeToExecuteData, startingNodes, overrideConfig } = addToChatFlowPool - this.chatflowPool.add(chatflowid, nodeToExecuteData, startingNodes, overrideConfig) - } - resolve(result) - } - if (message.key === 'start') { - if (process.env.EXECUTION_TIMEOUT) { - childProcessTimeout = setTimeout(async () => { - childProcess.kill() - resolve(undefined) - }, parseInt(process.env.EXECUTION_TIMEOUT, 10)) - } - } - if (message.key === 'error') { - let errMessage = message.value as string - if (childProcessTimeout) { - clearTimeout(childProcessTimeout) - } - reject(errMessage) - } - }) - }) - } catch (err) { - console.error(err) - } - } - /** * Process Prediction * @param {Request} req * @param {Response} res + * @param {Server} socketIO * @param {boolean} isInternal */ - async processPrediction(req: Request, res: Response, isInternal = false) { + async processPrediction(req: Request, res: Response, socketIO?: Server, isInternal = false) { try { const chatflowid = req.params.id let incomingInput: IncomingInput = req.body @@ -478,10 +780,15 @@ export class App { }) if (!chatflow) return res.status(404).send(`Chatflow ${chatflowid} not found`) + let chatId = await getChatId(chatflow.id) + if (!chatId) chatId = chatflowid + if (!isInternal) { await this.validateKey(req, res, chatflow) } + let isStreamValid = false + const files = (req.files as any[]) || [] if (files.length) { @@ -504,13 +811,19 @@ export class App { } } - /* Don't rebuild the flow (to avoid duplicated upsert, recomputation) when all these conditions met: + /*** Get chatflows and prepare data 
***/ + const flowData = chatflow.flowData + const parsedFlowData: IReactFlowObject = JSON.parse(flowData) + const nodes = parsedFlowData.nodes + const edges = parsedFlowData.edges + + /* Reuse the flow without having to rebuild (to avoid duplicated upsert, recomputation) when all these conditions met: * - Node Data already exists in pool * - Still in sync (i.e the flow has not been modified since) * - Existing overrideConfig and new overrideConfig are the same - * - Flow doesn't start with nodes that depend on incomingInput.question + * - Flow doesn't start with/contain nodes that depend on incomingInput.question ***/ - const isRebuildNeeded = () => { + const isFlowReusable = () => { return ( Object.prototype.hasOwnProperty.call(this.chatflowPool.activeChatflows, chatflowid) && this.chatflowPool.activeChatflows[chatflowid].inSync && @@ -519,94 +832,110 @@ export class App { this.chatflowPool.activeChatflows[chatflowid].overrideConfig, incomingInput.overrideConfig ) && - !isStartNodeDependOnInput(this.chatflowPool.activeChatflows[chatflowid].startingNodes) + !isStartNodeDependOnInput(this.chatflowPool.activeChatflows[chatflowid].startingNodes, nodes) ) } - if (process.env.EXECUTION_MODE === 'child') { - if (isRebuildNeeded()) { - nodeToExecuteData = this.chatflowPool.activeChatflows[chatflowid].endingNodeData - try { - const result = await this.startChildProcess(chatflow, incomingInput, nodeToExecuteData) - - return res.json(result) - } catch (error) { - return res.status(500).send(error) - } - } else { - try { - const result = await this.startChildProcess(chatflow, incomingInput) - return res.json(result) - } catch (error) { - return res.status(500).send(error) - } - } + if (isFlowReusable()) { + nodeToExecuteData = this.chatflowPool.activeChatflows[chatflowid].endingNodeData + isStreamValid = isFlowValidForStream(nodes, nodeToExecuteData) + logger.debug( + `[server]: Reuse existing chatflow ${chatflowid} with ending node ${nodeToExecuteData.label} 
(${nodeToExecuteData.id})` + ) } else { - if (isRebuildNeeded()) { - nodeToExecuteData = this.chatflowPool.activeChatflows[chatflowid].endingNodeData - } else { - /*** Get chatflows and prepare data ***/ - const flowData = chatflow.flowData - const parsedFlowData: IReactFlowObject = JSON.parse(flowData) - const nodes = parsedFlowData.nodes - const edges = parsedFlowData.edges + /*** Get Ending Node with Directed Graph ***/ + const { graph, nodeDependencies } = constructGraphs(nodes, edges) + const directedGraph = graph + const endingNodeId = getEndingNode(nodeDependencies, directedGraph) + if (!endingNodeId) return res.status(500).send(`Ending node ${endingNodeId} not found`) - /*** Get Ending Node with Directed Graph ***/ - const { graph, nodeDependencies } = constructGraphs(nodes, edges) - const directedGraph = graph - const endingNodeId = getEndingNode(nodeDependencies, directedGraph) - if (!endingNodeId) return res.status(500).send(`Ending node must be either a Chain or Agent`) + const endingNodeData = nodes.find((nd) => nd.id === endingNodeId)?.data + if (!endingNodeData) return res.status(500).send(`Ending node ${endingNodeId} data not found`) - const endingNodeData = nodes.find((nd) => nd.id === endingNodeId)?.data - if (!endingNodeData) return res.status(500).send(`Ending node must be either a Chain or Agent`) - - if ( - endingNodeData.outputs && - Object.keys(endingNodeData.outputs).length && - !Object.values(endingNodeData.outputs).includes(endingNodeData.name) - ) { - return res - .status(500) - .send( - `Output of ${endingNodeData.label} (${endingNodeData.id}) must be ${endingNodeData.label}, can't be an Output Prediction` - ) - } - - /*** Get Starting Nodes with Non-Directed Graph ***/ - const constructedObj = constructGraphs(nodes, edges, true) - const nonDirectedGraph = constructedObj.graph - const { startingNodeIds, depthQueue } = getStartingNodes(nonDirectedGraph, endingNodeId) - - /*** BFS to traverse from Starting Nodes to Ending Node ***/ - 
const reactFlowNodes = await buildLangchain( - startingNodeIds, - nodes, - graph, - depthQueue, - this.nodesPool.componentNodes, - incomingInput.question, - incomingInput?.overrideConfig - ) - - const nodeToExecute = reactFlowNodes.find((node: IReactFlowNode) => node.id === endingNodeId) - if (!nodeToExecute) return res.status(404).send(`Node ${endingNodeId} not found`) - - const reactFlowNodeData: INodeData = resolveVariables(nodeToExecute.data, reactFlowNodes, incomingInput.question) - nodeToExecuteData = reactFlowNodeData - - const startingNodes = nodes.filter((nd) => startingNodeIds.includes(nd.id)) - this.chatflowPool.add(chatflowid, nodeToExecuteData, startingNodes, incomingInput?.overrideConfig) + if (endingNodeData && endingNodeData.category !== 'Chains' && endingNodeData.category !== 'Agents') { + return res.status(500).send(`Ending node must be either a Chain or Agent`) } - const nodeInstanceFilePath = this.nodesPool.componentNodes[nodeToExecuteData.name].filePath as string - const nodeModule = await import(nodeInstanceFilePath) - const nodeInstance = new nodeModule.nodeClass() + if ( + endingNodeData.outputs && + Object.keys(endingNodeData.outputs).length && + !Object.values(endingNodeData.outputs).includes(endingNodeData.name) + ) { + return res + .status(500) + .send( + `Output of ${endingNodeData.label} (${endingNodeData.id}) must be ${endingNodeData.label}, can't be an Output Prediction` + ) + } - const result = await nodeInstance.run(nodeToExecuteData, incomingInput.question, { chatHistory: incomingInput.history }) + isStreamValid = isFlowValidForStream(nodes, endingNodeData) - return res.json(result) + /*** Get Starting Nodes with Non-Directed Graph ***/ + const constructedObj = constructGraphs(nodes, edges, true) + const nonDirectedGraph = constructedObj.graph + const { startingNodeIds, depthQueue } = getStartingNodes(nonDirectedGraph, endingNodeId) + + logger.debug(`[server]: Start building chatflow ${chatflowid}`) + /*** BFS to traverse from 
Starting Nodes to Ending Node ***/ + const reactFlowNodes = await buildLangchain( + startingNodeIds, + nodes, + graph, + depthQueue, + this.nodesPool.componentNodes, + incomingInput.question, + incomingInput.history, + chatId, + this.AppDataSource, + incomingInput?.overrideConfig + ) + + const nodeToExecute = reactFlowNodes.find((node: IReactFlowNode) => node.id === endingNodeId) + if (!nodeToExecute) return res.status(404).send(`Node ${endingNodeId} not found`) + + if (incomingInput.overrideConfig) + nodeToExecute.data = replaceInputsWithConfig(nodeToExecute.data, incomingInput.overrideConfig) + const reactFlowNodeData: INodeData = resolveVariables( + nodeToExecute.data, + reactFlowNodes, + incomingInput.question, + incomingInput.history + ) + nodeToExecuteData = reactFlowNodeData + + const startingNodes = nodes.filter((nd) => startingNodeIds.includes(nd.id)) + this.chatflowPool.add(chatflowid, nodeToExecuteData, startingNodes, incomingInput?.overrideConfig) } + + const nodeInstanceFilePath = this.nodesPool.componentNodes[nodeToExecuteData.name].filePath as string + const nodeModule = await import(nodeInstanceFilePath) + const nodeInstance = new nodeModule.nodeClass() + + isStreamValid = isStreamValid && !isVectorStoreFaiss(nodeToExecuteData) + logger.debug(`[server]: Running ${nodeToExecuteData.label} (${nodeToExecuteData.id})`) + + if (nodeToExecuteData.instance) checkMemorySessionId(nodeToExecuteData.instance, chatId) + + const result = isStreamValid + ? 
await nodeInstance.run(nodeToExecuteData, incomingInput.question, { + chatHistory: incomingInput.history, + socketIO, + socketIOClientId: incomingInput.socketIOClientId, + logger, + appDataSource: this.AppDataSource, + databaseEntities + }) + : await nodeInstance.run(nodeToExecuteData, incomingInput.question, { + chatHistory: incomingInput.history, + logger, + appDataSource: this.AppDataSource, + databaseEntities + }) + + logger.debug(`[server]: Finished running ${nodeToExecuteData.label} (${nodeToExecuteData.id})`) + return res.json(result) } catch (e: any) { + logger.error('[server]: Error:', e) return res.status(500).send(e.message) } } @@ -616,11 +945,28 @@ export class App { const removePromises: any[] = [] await Promise.all(removePromises) } catch (e) { - console.error(`❌[server]: Flowise Server shut down error: ${e}`) + logger.error(`❌[server]: Flowise Server shut down error: ${e}`) } } } +/** + * Get first chat message id + * @param {string} chatflowid + * @returns {string} + */ +export async function getChatId(chatflowid: string) { + // first chatmessage id as the unique chat id + const firstChatMessage = await getDataSource() + .getRepository(ChatMessage) + .createQueryBuilder('cm') + .select('cm.id') + .where('chatflowid = :chatflowid', { chatflowid }) + .orderBy('cm.createdDate', 'ASC') + .getOne() + return firstChatMessage ? 
firstChatMessage.id : '' +} + let serverApp: App | undefined export async function start(): Promise<void> { @@ -629,11 +975,17 @@ export async function start(): Promise<void> { const port = parseInt(process.env.PORT || '', 10) || 3000 const server = http.createServer(serverApp.app) + const io = new Server(server, { + cors: { + origin: '*' + } + }) + await serverApp.initDatabase() - await serverApp.config() + await serverApp.config(io) server.listen(port, () => { - console.info(`⚡️[server]: Flowise Server is listening at ${port}`) + logger.info(`⚡️ [server]: Flowise Server is listening at ${port}`) }) } diff --git a/packages/server/src/utils/config.ts b/packages/server/src/utils/config.ts new file mode 100644 index 000000000..b5f5884eb --- /dev/null +++ b/packages/server/src/utils/config.ts @@ -0,0 +1,25 @@ +// BEWARE: This file is an interim solution until we have a proper config strategy + +import path from 'path' +import dotenv from 'dotenv' + +dotenv.config({ path: path.join(__dirname, '..', '..', '.env'), override: true }) + +// default config +const loggingConfig = { + dir: process.env.LOG_PATH ?? path.join(__dirname, '..', '..', 'logs'), + server: { + level: process.env.LOG_LEVEL ?? 'info', + filename: 'server.log', + errorFilename: 'server-error.log' + }, + express: { + level: process.env.LOG_LEVEL ??
'info', + format: 'jsonl', // can't be changed currently + filename: 'server-requests.log.jsonl' // should end with .jsonl + } +} + +export default { + logging: loggingConfig +} diff --git a/packages/server/src/utils/index.ts b/packages/server/src/utils/index.ts index 787612844..788b7c0ec 100644 --- a/packages/server/src/utils/index.ts +++ b/packages/server/src/utils/index.ts @@ -1,6 +1,7 @@ import path from 'path' import fs from 'fs' import moment from 'moment' +import logger from './logger' import { IComponentNodes, IDepthQueue, @@ -12,13 +13,34 @@ import { IReactFlowNode, IVariableDict, INodeData, - IOverrideConfig + IOverrideConfig, + ICredentialDataDecrypted, + IComponentCredentials, + ICredentialReqBody } from '../Interface' -import { cloneDeep, get } from 'lodash' -import { ICommonObject, getInputVariables } from 'flowise-components' +import { cloneDeep, get, omit, merge, isEqual } from 'lodash' +import { + ICommonObject, + getInputVariables, + IDatabaseEntity, + handleEscapeCharacters, + IMessage, + convertChatHistoryToText +} from 'flowise-components' import { scryptSync, randomBytes, timingSafeEqual } from 'crypto' +import { lib, PBKDF2, AES, enc } from 'crypto-js' + +import { ChatFlow } from '../entity/ChatFlow' +import { ChatMessage } from '../entity/ChatMessage' +import { Credential } from '../entity/Credential' +import { Tool } from '../entity/Tool' +import { DataSource } from 'typeorm' const QUESTION_VAR_PREFIX = 'question' +const CHAT_HISTORY_VAR_PREFIX = 'chat_history' +const REDACTED_CREDENTIAL_VALUE = '_FLOWISE_BLANK_07167752-1a71-43b1-bf8f-4f32252165db' + +export const databaseEntities: IDatabaseEntity = { ChatFlow: ChatFlow, ChatMessage: ChatMessage, Tool: Tool, Credential: Credential } /** * Returns the home folder path of the user if @@ -168,12 +190,15 @@ export const getEndingNode = (nodeDependencies: INodeDependencies, graph: INodeD /** * Build langchain from start to end - * @param {string} startingNodeId + * @param {string[]} 
startingNodeIds * @param {IReactFlowNode[]} reactFlowNodes * @param {INodeDirectedGraph} graph * @param {IDepthQueue} depthQueue * @param {IComponentNodes} componentNodes * @param {string} question + * @param {string} chatId + * @param {DataSource} appDataSource + * @param {ICommonObject} overrideConfig */ export const buildLangchain = async ( startingNodeIds: string[], @@ -182,6 +207,9 @@ export const buildLangchain = async ( depthQueue: IDepthQueue, componentNodes: IComponentNodes, question: string, + chatHistory: IMessage[], + chatId: string, + appDataSource: DataSource, overrideConfig?: ICommonObject ) => { const flowNodes = cloneDeep(reactFlowNodes) @@ -212,11 +240,18 @@ export const buildLangchain = async ( let flowNodeData = cloneDeep(reactFlowNode.data) if (overrideConfig) flowNodeData = replaceInputsWithConfig(flowNodeData, overrideConfig) - const reactFlowNodeData: INodeData = resolveVariables(flowNodeData, flowNodes, question) + const reactFlowNodeData: INodeData = resolveVariables(flowNodeData, flowNodes, question, chatHistory) - flowNodes[nodeIndex].data.instance = await newNodeInstance.init(reactFlowNodeData, question) + logger.debug(`[server]: Initializing ${reactFlowNode.data.label} (${reactFlowNode.data.id})`) + flowNodes[nodeIndex].data.instance = await newNodeInstance.init(reactFlowNodeData, question, { + chatId, + appDataSource, + databaseEntities, + logger + }) + logger.debug(`[server]: Finished initializing ${reactFlowNode.data.label} (${reactFlowNode.data.id})`) } catch (e: any) { - console.error(e) + logger.error(e) throw new Error(e) } @@ -255,6 +290,32 @@ export const buildLangchain = async ( return flowNodes } +/** + * Clear memory + * @param {IReactFlowNode[]} reactFlowNodes + * @param {IComponentNodes} componentNodes + * @param {string} chatId + * @param {DataSource} appDataSource + * @param {string} sessionId + */ +export const clearSessionMemory = async ( + reactFlowNodes: IReactFlowNode[], + componentNodes: IComponentNodes, + chatId: 
string, + appDataSource: DataSource, + sessionId?: string +) => { + for (const node of reactFlowNodes) { + if (node.data.category !== 'Memory') continue + const nodeInstanceFilePath = componentNodes[node.data.name].filePath as string + const nodeModule = await import(nodeInstanceFilePath) + const newNodeInstance = new nodeModule.nodeClass() + if (sessionId && node.data.inputs) node.data.inputs.sessionId = sessionId + if (newNodeInstance.clearSessionMemory) + await newNodeInstance?.clearSessionMemory(node.data, { chatId, appDataSource, databaseEntities, logger }) + } +} + /** * Get variable value from outputResponses.output * @param {string} paramValue @@ -263,7 +324,13 @@ export const buildLangchain = async ( * @param {boolean} isAcceptVariable * @returns {string} */ -export const getVariableValue = (paramValue: string, reactFlowNodes: IReactFlowNode[], question: string, isAcceptVariable = false) => { +export const getVariableValue = ( + paramValue: string, + reactFlowNodes: IReactFlowNode[], + question: string, + chatHistory: IMessage[], + isAcceptVariable = false +) => { let returnVal = paramValue const variableStack = [] const variableDict = {} as IVariableDict @@ -284,8 +351,17 @@ export const getVariableValue = (paramValue: string, reactFlowNodes: IReactFlowN const variableEndIdx = startIdx const variableFullPath = returnVal.substring(variableStartIdx, variableEndIdx) + /** + * Apply string transformation to convert special chars: + * FROM: hello i am ben\n\n\thow are you? + * TO: hello i am benFLOWISE_NEWLINEFLOWISE_NEWLINEFLOWISE_TABhow are you? 
+ */ if (isAcceptVariable && variableFullPath === QUESTION_VAR_PREFIX) { - variableDict[`{{${variableFullPath}}}`] = question + variableDict[`{{${variableFullPath}}}`] = handleEscapeCharacters(question, false) + } + + if (isAcceptVariable && variableFullPath === CHAT_HISTORY_VAR_PREFIX) { + variableDict[`{{${variableFullPath}}}`] = handleEscapeCharacters(convertChatHistoryToText(chatHistory), false) } // Split by first occurrence of '.' to get just nodeId @@ -317,6 +393,25 @@ export const getVariableValue = (paramValue: string, reactFlowNodes: IReactFlowN return returnVal } +/** + * Temporarily disable streaming if vectorStore is Faiss + * @param {INodeData} flowNodeData + * @returns {boolean} + */ +export const isVectorStoreFaiss = (flowNodeData: INodeData) => { + if (flowNodeData.inputs && flowNodeData.inputs.vectorStoreRetriever) { + const vectorStoreRetriever = flowNodeData.inputs.vectorStoreRetriever + if (typeof vectorStoreRetriever === 'string' && vectorStoreRetriever.includes('faiss')) return true + if ( + typeof vectorStoreRetriever === 'object' && + vectorStoreRetriever.vectorStore && + vectorStoreRetriever.vectorStore.constructor.name === 'FaissStore' + ) + return true + } + return false +} + /** * Loop through each inputs and resolve variable if neccessary * @param {INodeData} reactFlowNodeData @@ -324,8 +419,18 @@ export const getVariableValue = (paramValue: string, reactFlowNodes: IReactFlowN * @param {string} question * @returns {INodeData} */ -export const resolveVariables = (reactFlowNodeData: INodeData, reactFlowNodes: IReactFlowNode[], question: string): INodeData => { - const flowNodeData = cloneDeep(reactFlowNodeData) +export const resolveVariables = ( + reactFlowNodeData: INodeData, + reactFlowNodes: IReactFlowNode[], + question: string, + chatHistory: IMessage[] +): INodeData => { + let flowNodeData = cloneDeep(reactFlowNodeData) + if (reactFlowNodeData.instance && isVectorStoreFaiss(reactFlowNodeData)) { + // omit and merge because cloneDeep 
of instance gives "Illegal invocation" Exception + const flowNodeDataWithoutInstance = cloneDeep(omit(reactFlowNodeData, ['instance'])) + flowNodeData = merge(flowNodeDataWithoutInstance, { instance: reactFlowNodeData.instance }) + } const types = 'inputs' const getParamValues = (paramsObj: ICommonObject) => { @@ -334,13 +439,13 @@ export const resolveVariables = (reactFlowNodeData: INodeData, reactFlowNodes: I if (Array.isArray(paramValue)) { const resolvedInstances = [] for (const param of paramValue) { - const resolvedInstance = getVariableValue(param, reactFlowNodes, question) + const resolvedInstance = getVariableValue(param, reactFlowNodes, question, chatHistory) resolvedInstances.push(resolvedInstance) } paramsObj[key] = resolvedInstances } else { const isAcceptVariable = reactFlowNodeData.inputParams.find((param) => param.name === key)?.acceptVariable ?? false - const resolvedInstance = getVariableValue(paramValue, reactFlowNodes, question, isAcceptVariable) + const resolvedInstance = getVariableValue(paramValue, reactFlowNodes, question, chatHistory, isAcceptVariable) paramsObj[key] = resolvedInstance } } @@ -363,9 +468,20 @@ export const replaceInputsWithConfig = (flowNodeData: INodeData, overrideConfig: const types = 'inputs' const getParamValues = (paramsObj: ICommonObject) => { - for (const key in paramsObj) { - const paramValue: string = paramsObj[key] - paramsObj[key] = overrideConfig[key] ?? paramValue + for (const config in overrideConfig) { + // If overrideConfig[key] is object + if (overrideConfig[config] && typeof overrideConfig[config] === 'object') { + const nodeIds = Object.keys(overrideConfig[config]) + if (!nodeIds.includes(flowNodeData.id)) continue + else paramsObj[config] = overrideConfig[config][flowNodeData.id] + continue + } + + let paramValue = overrideConfig[config] ?? 
paramsObj[config] + // Check if boolean + if (paramValue === 'true') paramValue = true + else if (paramValue === 'false') paramValue = false + paramsObj[config] = paramValue } } @@ -382,13 +498,17 @@ export const replaceInputsWithConfig = (flowNodeData: INodeData, overrideConfig: * @param {IReactFlowNode[]} startingNodes * @returns {boolean} */ -export const isStartNodeDependOnInput = (startingNodes: IReactFlowNode[]): boolean => { +export const isStartNodeDependOnInput = (startingNodes: IReactFlowNode[], nodes: IReactFlowNode[]): boolean => { for (const node of startingNodes) { for (const inputName in node.data.inputs) { const inputVariables = getInputVariables(node.data.inputs[inputName]) if (inputVariables.length > 0) return true } } + const whitelistNodeNames = ['vectorStoreToDocument'] + for (const node of nodes) { + if (whitelistNodeNames.includes(node.data.name)) return true + } return false } @@ -414,7 +534,7 @@ export const isSameOverrideConfig = ( Object.keys(existingOverrideConfig).length && newOverrideConfig && Object.keys(newOverrideConfig).length && - JSON.stringify(existingOverrideConfig) === JSON.stringify(newOverrideConfig) + isEqual(existingOverrideConfig, newOverrideConfig) ) { return true } @@ -428,7 +548,7 @@ export const isSameOverrideConfig = ( * @returns {string} */ export const getAPIKeyPath = (): string => { - return path.join(__dirname, '..', '..', 'api.json') + return process.env.APIKEY_PATH ? 
path.join(process.env.APIKEY_PATH, 'api.json') : path.join(__dirname, '..', '..', 'api.json') } /** @@ -512,6 +632,18 @@ export const addAPIKey = async (keyName: string): Promise => { return content } +/** + * Get API Key details + * @param {string} apiKey + * @returns {Promise} + */ +export const getApiKey = async (apiKey: string) => { + const existingAPIKeys = await getAPIKeys() + const keyIndex = existingAPIKeys.findIndex((key) => key.apiKey === apiKey) + if (keyIndex < 0) return undefined + return existingAPIKeys[keyIndex] +} + /** * Update existing API key * @param {string} keyIdToUpdate @@ -548,7 +680,7 @@ export const replaceAllAPIKeys = async (content: ICommonObject[]): Promise try { await fs.promises.writeFile(getAPIKeyPath(), JSON.stringify(content), 'utf8') } catch (error) { - console.error(error) + logger.error(error) } } @@ -567,39 +699,82 @@ export const mapMimeTypeToInputField = (mimeType: string) => { return 'jsonFile' case 'text/csv': return 'csvFile' + case 'application/json-lines': + case 'application/jsonl': + case 'text/jsonl': + return 'jsonlinesFile' case 'application/vnd.openxmlformats-officedocument.wordprocessingml.document': return 'docxFile' + case 'application/vnd.yaml': + case 'application/x-yaml': + case 'text/vnd.yaml': + case 'text/x-yaml': + case 'text/yaml': + return 'yamlFile' default: return '' } } /** - * Find all available inpur params config + * Find all available input params config * @param {IReactFlowNode[]} reactFlowNodes - * @returns {Promise} + * @param {IComponentCredentials} componentCredentials + * @returns {IOverrideConfig[]} */ -export const findAvailableConfigs = (reactFlowNodes: IReactFlowNode[]) => { +export const findAvailableConfigs = (reactFlowNodes: IReactFlowNode[], componentCredentials: IComponentCredentials) => { const configs: IOverrideConfig[] = [] for (const flowNode of reactFlowNodes) { for (const inputParam of flowNode.data.inputParams) { let obj: IOverrideConfig - if (inputParam.type === 'password' 
|| inputParam.type === 'options') { - continue - } else if (inputParam.type === 'file') { + if (inputParam.type === 'file') { obj = { node: flowNode.data.label, + nodeId: flowNode.data.id, label: inputParam.label, name: 'files', type: inputParam.fileType ?? inputParam.type } + } else if (inputParam.type === 'options') { + obj = { + node: flowNode.data.label, + nodeId: flowNode.data.id, + label: inputParam.label, + name: inputParam.name, + type: inputParam.options + ? inputParam.options + ?.map((option) => { + return option.name + }) + .join(', ') + : 'string' + } + } else if (inputParam.type === 'credential') { + // get component credential inputs + for (const name of inputParam.credentialNames ?? []) { + if (Object.prototype.hasOwnProperty.call(componentCredentials, name)) { + const inputs = componentCredentials[name]?.inputs ?? [] + for (const input of inputs) { + obj = { + node: flowNode.data.label, + nodeId: flowNode.data.id, + label: input.label, + name: input.name, + type: input.type === 'password' ? 'string' : input.type + } + configs.push(obj) + } + } + } + continue } else { obj = { node: flowNode.data.label, + nodeId: flowNode.data.id, label: inputParam.label, name: inputParam.name, - type: inputParam.type + type: inputParam.type === 'password' ? 
'string' : inputParam.type } } if (!configs.some((config) => JSON.stringify(config) === JSON.stringify(obj))) { @@ -610,3 +785,166 @@ export const findAvailableConfigs = (reactFlowNodes: IReactFlowNode[]) => { return configs } + +/** + * Check whether the flow is valid for streaming + * @param {IReactFlowNode[]} reactFlowNodes + * @param {INodeData} endingNodeData + * @returns {boolean} + */ +export const isFlowValidForStream = (reactFlowNodes: IReactFlowNode[], endingNodeData: INodeData) => { + const streamAvailableLLMs = { + 'Chat Models': ['azureChatOpenAI', 'chatOpenAI', 'chatAnthropic'], + LLMs: ['azureOpenAI', 'openAI'] + } + + let isChatOrLLMsExist = false + for (const flowNode of reactFlowNodes) { + const data = flowNode.data + if (data.category === 'Chat Models' || data.category === 'LLMs') { + isChatOrLLMsExist = true + const validLLMs = streamAvailableLLMs[data.category] + if (!validLLMs.includes(data.name)) return false + } + } + + let isValidChainOrAgent = false + if (endingNodeData.category === 'Chains') { + // Chains that are not available to stream + const blacklistChains = ['openApiChain'] + isValidChainOrAgent = !blacklistChains.includes(endingNodeData.name) + } else if (endingNodeData.category === 'Agents') { + // Agents that are available to stream + const whitelistAgents = ['openAIFunctionAgent', 'csvAgent', 'airtableAgent', 'conversationalRetrievalAgent'] + isValidChainOrAgent = whitelistAgents.includes(endingNodeData.name) + } + + return isChatOrLLMsExist && isValidChainOrAgent && !isVectorStoreFaiss(endingNodeData) +} + +/** + * Returns the path of encryption key + * @returns {string} + */ +export const getEncryptionKeyPath = (): string => { + return process.env.SECRETKEY_PATH + ?
path.join(process.env.SECRETKEY_PATH, 'encryption.key') + : path.join(__dirname, '..', '..', 'encryption.key') +} + +/** + * Generate an encryption key + * @returns {string} + */ +export const generateEncryptKey = (): string => { + const salt = lib.WordArray.random(128 / 8) + const key256Bits = PBKDF2(process.env.PASSPHRASE || 'MYPASSPHRASE', salt, { + keySize: 256 / 32, + iterations: 1000 + }) + return key256Bits.toString() +} + +/** + * Returns the encryption key + * @returns {Promise<string>} + */ +export const getEncryptionKey = async (): Promise<string> => { + try { + return await fs.promises.readFile(getEncryptionKeyPath(), 'utf8') + } catch (error) { + const encryptKey = generateEncryptKey() + await fs.promises.writeFile(getEncryptionKeyPath(), encryptKey) + return encryptKey + } +} + +/** + * Encrypt credential data + * @param {ICredentialDataDecrypted} plainDataObj + * @returns {Promise<string>} + */ +export const encryptCredentialData = async (plainDataObj: ICredentialDataDecrypted): Promise<string> => { + const encryptKey = await getEncryptionKey() + return AES.encrypt(JSON.stringify(plainDataObj), encryptKey).toString() +} + +/** + * Decrypt credential data + * @param {string} encryptedData + * @param {string} componentCredentialName + * @param {IComponentCredentials} componentCredentials + * @returns {Promise<ICredentialDataDecrypted>} + */ +export const decryptCredentialData = async ( + encryptedData: string, + componentCredentialName?: string, + componentCredentials?: IComponentCredentials +): Promise<ICredentialDataDecrypted> => { + const encryptKey = await getEncryptionKey() + const decryptedData = AES.decrypt(encryptedData, encryptKey) + try { + if (componentCredentialName && componentCredentials) { + const plainDataObj = JSON.parse(decryptedData.toString(enc.Utf8)) + return redactCredentialWithPasswordType(componentCredentialName, plainDataObj, componentCredentials) + } + return JSON.parse(decryptedData.toString(enc.Utf8)) + } catch (e) { + console.error(e) + throw new Error('Credentials could not be decrypted.') + } +} + +/**
* Transform ICredentialBody from req to Credential entity + * @param {ICredentialReqBody} body + * @returns {Promise<Credential>} + */ +export const transformToCredentialEntity = async (body: ICredentialReqBody): Promise<Credential> => { + const encryptedData = await encryptCredentialData(body.plainDataObj) + + const credentialBody = { + name: body.name, + credentialName: body.credentialName, + encryptedData + } + + const newCredential = new Credential() + Object.assign(newCredential, credentialBody) + + return newCredential +} + +/** + * Redact values that are of password type to avoid sending back to client + * @param {string} componentCredentialName + * @param {ICredentialDataDecrypted} decryptedCredentialObj + * @param {IComponentCredentials} componentCredentials + * @returns {ICredentialDataDecrypted} + */ +export const redactCredentialWithPasswordType = ( + componentCredentialName: string, + decryptedCredentialObj: ICredentialDataDecrypted, + componentCredentials: IComponentCredentials +): ICredentialDataDecrypted => { + const plainDataObj = cloneDeep(decryptedCredentialObj) + for (const cred in plainDataObj) { + const inputParam = componentCredentials[componentCredentialName].inputs?.find((inp) => inp.type === 'password' && inp.name === cred) + if (inputParam) { + plainDataObj[cred] = REDACTED_CREDENTIAL_VALUE + } + } + return plainDataObj +} + +/** + * Replace sessionId with new chatId + * Ex: after clearing chat history, use the new chatId as sessionId + * @param {any} instance + * @param {string} chatId + */ +export const checkMemorySessionId = (instance: any, chatId: string) => { + if (instance.memory && instance.memory.isSessionIdUsingChatMessageId && chatId) { + instance.memory.sessionId = chatId + } +} diff --git a/packages/server/src/utils/logger.ts b/packages/server/src/utils/logger.ts new file mode 100644 index 000000000..839f1ad74 --- /dev/null +++ b/packages/server/src/utils/logger.ts @@ -0,0 +1,106 @@ +import * as path from 'path' +import * as fs from 'fs' +import
config from './config' // should be replaced by node-config or similar +import { createLogger, transports, format } from 'winston' +import { NextFunction, Request, Response } from 'express' + +const { combine, timestamp, printf, errors } = format + +// expect the log dir to be relative to the project's root +const logDir = config.logging.dir + +// Create the log directory if it doesn't exist +if (!fs.existsSync(logDir)) { + fs.mkdirSync(logDir) +} + +const logger = createLogger({ + format: combine( + timestamp({ format: 'YYYY-MM-DD HH:mm:ss' }), + format.json(), + printf(({ level, message, timestamp, stack }) => { + const text = `${timestamp} [${level.toUpperCase()}]: ${message}` + return stack ? text + '\n' + stack : text + }), + errors({ stack: true }) + ), + defaultMeta: { + package: 'server' + }, + transports: [ + new transports.Console(), + new transports.File({ + filename: path.join(logDir, config.logging.server.filename ?? 'server.log'), + level: config.logging.server.level ?? 'info' + }), + new transports.File({ + filename: path.join(logDir, config.logging.server.errorFilename ?? 'server-error.log'), + level: 'error' // Log only errors to this file + }) + ], + exceptionHandlers: [ + new transports.File({ + filename: path.join(logDir, config.logging.server.errorFilename ?? 'server-error.log') + }) + ], + rejectionHandlers: [ + new transports.File({ + filename: path.join(logDir, config.logging.server.errorFilename ?? 'server-error.log') + }) + ] +}) + +/** + * This function is used by express as a middleware.
+ * @example + * this.app = express() + * this.app.use(expressRequestLogger) + */ +export function expressRequestLogger(req: Request, res: Response, next: NextFunction): void { + const unwantedLogURLs = ['/api/v1/node-icon/'] + if (req.url.includes('/api/v1/') && !unwantedLogURLs.some((url) => req.url.includes(url))) { + const fileLogger = createLogger({ + format: combine(timestamp({ format: 'YYYY-MM-DD HH:mm:ss' }), format.json(), errors({ stack: true })), + defaultMeta: { + package: 'server', + request: { + method: req.method, + url: req.url, + body: req.body, + query: req.query, + params: req.params, + headers: req.headers + } + }, + transports: [ + new transports.File({ + filename: path.join(logDir, config.logging.express.filename ?? 'server-requests.log.jsonl'), + level: config.logging.express.level ?? 'debug' + }) + ] + }) + + const getRequestEmoji = (method: string) => { + const requestEmojis: Record<string, string> = { + GET: '⬇️', + POST: '⬆️', + PUT: '🖊', + DELETE: '❌', + OPTIONS: '🔗' + } + + return requestEmojis[method] || '?'
+ } + + if (req.method !== 'GET') { + fileLogger.info(`${getRequestEmoji(req.method)} ${req.method} ${req.url}`) + logger.info(`${getRequestEmoji(req.method)} ${req.method} ${req.url}`) + } else { + fileLogger.http(`${getRequestEmoji(req.method)} ${req.method} ${req.url}`) + } + } + + next() +} + +export default logger diff --git a/packages/ui/README-ZH.md b/packages/ui/README-ZH.md new file mode 100644 index 000000000..21aaf482f --- /dev/null +++ b/packages/ui/README-ZH.md @@ -0,0 +1,19 @@ + + +# 流程界面 + +[English](./README.md) | 中文 + +Flowise 的 React 前端界面。 + +![Flowise](https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true) + +安装: + +```bash +npm i flowise-ui +``` + +## 许可证 + +本仓库中的源代码在[MIT 许可证](https://github.com/FlowiseAI/Flowise/blob/master/LICENSE.md)下提供。 diff --git a/packages/ui/README.md b/packages/ui/README.md index fff7f9ea7..257dc6f40 100644 --- a/packages/ui/README.md +++ b/packages/ui/README.md @@ -2,6 +2,8 @@ # Flowise UI +English | [中文](./README-ZH.md) + React frontend ui for Flowise. 
![Flowise](https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true) diff --git a/packages/ui/craco.config.js b/packages/ui/craco.config.js new file mode 100644 index 000000000..142305e01 --- /dev/null +++ b/packages/ui/craco.config.js @@ -0,0 +1,16 @@ +module.exports = { + webpack: { + configure: { + module: { + rules: [ + { + test: /\.m?js$/, + resolve: { + fullySpecified: false + } + } + ] + } + } + } +} diff --git a/packages/ui/package.json b/packages/ui/package.json index b367b1830..4468de917 100644 --- a/packages/ui/package.json +++ b/packages/ui/package.json @@ -1,6 +1,6 @@ { "name": "flowise-ui", - "version": "1.2.5", + "version": "1.3.1", "license": "SEE LICENSE IN LICENSE.md", "homepage": "https://flowiseai.com", "author": { @@ -13,8 +13,12 @@ "@emotion/styled": "^11.10.6", "@mui/icons-material": "^5.0.3", "@mui/material": "^5.11.12", + "@mui/x-data-grid": "^6.8.0", "@tabler/icons": "^1.39.1", "clsx": "^1.1.1", + "flowise-embed": "*", + "flowise-embed-react": "*", + "flowise-react-json-view": "*", "formik": "^2.2.6", "framer-motion": "^4.1.13", "history": "^5.0.0", @@ -26,26 +30,31 @@ "prop-types": "^15.7.2", "react": "^18.2.0", "react-code-blocks": "^0.0.9-0", + "react-color": "^2.19.3", "react-datepicker": "^4.8.0", "react-device-detect": "^1.17.0", "react-dom": "^18.2.0", - "react-json-view": "^1.21.3", "react-markdown": "^8.0.6", "react-perfect-scrollbar": "^1.5.8", "react-redux": "^8.0.5", "react-router": "~6.3.0", "react-router-dom": "~6.3.0", "react-simple-code-editor": "^0.11.2", + "react-syntax-highlighter": "^15.5.0", "reactflow": "^11.5.6", "redux": "^4.0.5", + "rehype-mathjax": "^4.0.2", + "remark-gfm": "^3.0.1", + "remark-math": "^5.1.1", + "socket.io-client": "^4.6.1", "yup": "^0.32.9" }, "scripts": { - "start": "react-scripts start", - "dev": "react-scripts start", - "build": "react-scripts build", - "test": "react-scripts test", - "eject": "react-scripts eject" + "start": "craco start", + "dev": "craco start", + "build": 
"craco build", + "test": "craco test", + "eject": "craco eject" }, "babel": { "presets": [ @@ -66,6 +75,8 @@ }, "devDependencies": { "@babel/eslint-parser": "^7.15.8", + "@babel/plugin-proposal-private-property-in-object": "^7.21.11", + "@craco/craco": "^7.1.0", "@testing-library/jest-dom": "^5.11.10", "@testing-library/react": "^14.0.0", "@testing-library/user-event": "^12.8.3", diff --git a/packages/ui/public/index.html b/packages/ui/public/index.html index 270cc8058..b4ec9ea10 100644 --- a/packages/ui/public/index.html +++ b/packages/ui/public/index.html @@ -1,13 +1,13 @@ - Flowise - LangchainJS UI + Flowise - Low-code LLM apps builder - + @@ -17,13 +17,13 @@ - + - + diff --git a/packages/ui/src/api/chatflows.js b/packages/ui/src/api/chatflows.js index eae010eda..8810b5a5a 100644 --- a/packages/ui/src/api/chatflows.js +++ b/packages/ui/src/api/chatflows.js @@ -4,16 +4,22 @@ const getAllChatflows = () => client.get('/chatflows') const getSpecificChatflow = (id) => client.get(`/chatflows/${id}`) +const getSpecificChatflowFromPublicEndpoint = (id) => client.get(`/public-chatflows/${id}`) + const createNewChatflow = (body) => client.post(`/chatflows`, body) const updateChatflow = (id, body) => client.put(`/chatflows/${id}`, body) const deleteChatflow = (id) => client.delete(`/chatflows/${id}`) +const getIsChatflowStreaming = (id) => client.get(`/chatflows-streaming/${id}`) + export default { getAllChatflows, getSpecificChatflow, + getSpecificChatflowFromPublicEndpoint, createNewChatflow, updateChatflow, - deleteChatflow + deleteChatflow, + getIsChatflowStreaming } diff --git a/packages/ui/src/api/client.js b/packages/ui/src/api/client.js index cafdf0b3a..8235bde4c 100644 --- a/packages/ui/src/api/client.js +++ b/packages/ui/src/api/client.js @@ -14,8 +14,8 @@ apiClient.interceptors.request.use(function (config) { if (username && password) { config.auth = { - username: username.toLocaleLowerCase(), - password: password.toLocaleLowerCase() + username, + password } } 
diff --git a/packages/ui/src/api/config.js b/packages/ui/src/api/config.js index 0fb8297df..47ee51a05 100644 --- a/packages/ui/src/api/config.js +++ b/packages/ui/src/api/config.js @@ -1,7 +1,9 @@ import client from './client' const getConfig = (id) => client.get(`/flow-config/${id}`) +const getNodeConfig = (body) => client.post(`/node-config`, body) export default { - getConfig + getConfig, + getNodeConfig } diff --git a/packages/ui/src/api/credentials.js b/packages/ui/src/api/credentials.js new file mode 100644 index 000000000..9dbdcf7ad --- /dev/null +++ b/packages/ui/src/api/credentials.js @@ -0,0 +1,28 @@ +import client from './client' + +const getAllCredentials = () => client.get('/credentials') + +const getCredentialsByName = (componentCredentialName) => client.get(`/credentials?credentialName=${componentCredentialName}`) + +const getAllComponentsCredentials = () => client.get('/components-credentials') + +const getSpecificCredential = (id) => client.get(`/credentials/${id}`) + +const getSpecificComponentCredential = (name) => client.get(`/components-credentials/${name}`) + +const createCredential = (body) => client.post(`/credentials`, body) + +const updateCredential = (id, body) => client.put(`/credentials/${id}`, body) + +const deleteCredential = (id) => client.delete(`/credentials/${id}`) + +export default { + getAllCredentials, + getCredentialsByName, + getAllComponentsCredentials, + getSpecificCredential, + getSpecificComponentCredential, + createCredential, + updateCredential, + deleteCredential +} diff --git a/packages/ui/src/api/marketplaces.js b/packages/ui/src/api/marketplaces.js index 6906fb4e4..3fd4ae872 100644 --- a/packages/ui/src/api/marketplaces.js +++ b/packages/ui/src/api/marketplaces.js @@ -1,7 +1,9 @@ import client from './client' -const getAllMarketplaces = () => client.get('/marketplaces') +const getAllChatflowsMarketplaces = () => client.get('/marketplaces/chatflows') +const getAllToolsMarketplaces = () => 
client.get('/marketplaces/tools') export default { - getAllMarketplaces + getAllChatflowsMarketplaces, + getAllToolsMarketplaces } diff --git a/packages/ui/src/api/tools.js b/packages/ui/src/api/tools.js new file mode 100644 index 000000000..77992a2ab --- /dev/null +++ b/packages/ui/src/api/tools.js @@ -0,0 +1,19 @@ +import client from './client' + +const getAllTools = () => client.get('/tools') + +const getSpecificTool = (id) => client.get(`/tools/${id}`) + +const createNewTool = (body) => client.post(`/tools`, body) + +const updateTool = (id, body) => client.put(`/tools/${id}`, body) + +const deleteTool = (id) => client.delete(`/tools/${id}`) + +export default { + getAllTools, + getSpecificTool, + createNewTool, + updateTool, + deleteTool +} diff --git a/packages/ui/src/assets/images/account.png b/packages/ui/src/assets/images/account.png new file mode 100644 index 000000000..a758a1db9 Binary files /dev/null and b/packages/ui/src/assets/images/account.png differ diff --git a/packages/ui/src/assets/images/chathistory.png b/packages/ui/src/assets/images/chathistory.png new file mode 100644 index 000000000..52f496a89 Binary files /dev/null and b/packages/ui/src/assets/images/chathistory.png differ diff --git a/packages/ui/src/assets/images/credential_empty.svg b/packages/ui/src/assets/images/credential_empty.svg new file mode 100644 index 000000000..0951ee07a --- /dev/null +++ b/packages/ui/src/assets/images/credential_empty.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/ui/src/assets/images/robot.png b/packages/ui/src/assets/images/robot.png new file mode 100644 index 000000000..d4fe920a9 Binary files /dev/null and b/packages/ui/src/assets/images/robot.png differ diff --git a/packages/ui/src/assets/images/sharing.png b/packages/ui/src/assets/images/sharing.png new file mode 100644 index 000000000..1e538f2ef Binary files /dev/null and b/packages/ui/src/assets/images/sharing.png differ diff --git a/packages/ui/src/assets/images/tools_empty.svg 
b/packages/ui/src/assets/images/tools_empty.svg new file mode 100644 index 000000000..9a2a2a77c --- /dev/null +++ b/packages/ui/src/assets/images/tools_empty.svg @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/packages/ui/src/layout/MainLayout/Header/ProfileSection/index.js b/packages/ui/src/layout/MainLayout/Header/ProfileSection/index.js index f6f6a7307..41de3dd44 100644 --- a/packages/ui/src/layout/MainLayout/Header/ProfileSection/index.js +++ b/packages/ui/src/layout/MainLayout/Header/ProfileSection/index.js @@ -27,9 +27,10 @@ import PerfectScrollbar from 'react-perfect-scrollbar' import MainCard from 'ui-component/cards/MainCard' import Transitions from 'ui-component/extended/Transitions' import { BackdropLoader } from 'ui-component/loading/BackdropLoader' +import AboutDialog from 'ui-component/dialog/AboutDialog' // assets -import { IconLogout, IconSettings, IconFileExport, IconFileDownload } from '@tabler/icons' +import { IconLogout, IconSettings, IconFileExport, IconFileDownload, IconInfoCircle } from '@tabler/icons' // API import databaseApi from 'api/database' @@ -49,6 +50,7 @@ const ProfileSection = ({ username, handleLogout }) => { const [open, setOpen] = useState(false) const [loading, setLoading] = useState(false) + const [aboutDialogOpen, setAboutDialogOpen] = useState(false) const anchorRef = useRef(null) const uploadRef = useRef(null) @@ -215,6 +217,18 @@ const ProfileSection = ({ username, handleLogout }) => { Export Database} /> + { + setOpen(false) + setAboutDialogOpen(true) + }} + > + + + + About Flowise} /> + {localStorage.getItem('username') && localStorage.getItem('password') && ( { handleFileUpload(e)} /> + setAboutDialogOpen(false)} /> ) } diff --git a/packages/ui/src/menu-items/dashboard.js b/packages/ui/src/menu-items/dashboard.js index f1cd5062e..87ef88f98 100644 --- a/packages/ui/src/menu-items/dashboard.js +++ b/packages/ui/src/menu-items/dashboard.js @@ -1,8 +1,8 @@ // assets -import { IconHierarchy, IconBuildingStore, 
IconKey } from '@tabler/icons' +import { IconHierarchy, IconBuildingStore, IconKey, IconTool, IconLock } from '@tabler/icons' // constant -const icons = { IconHierarchy, IconBuildingStore, IconKey } +const icons = { IconHierarchy, IconBuildingStore, IconKey, IconTool, IconLock } // ==============================|| DASHBOARD MENU ITEMS ||============================== // @@ -27,6 +27,22 @@ const dashboard = { icon: icons.IconBuildingStore, breadcrumbs: true }, + { + id: 'tools', + title: 'Tools', + type: 'item', + url: '/tools', + icon: icons.IconTool, + breadcrumbs: true + }, + { + id: 'credentials', + title: 'Credentials', + type: 'item', + url: '/credentials', + icon: icons.IconLock, + breadcrumbs: true + }, { id: 'apikey', title: 'API Keys', diff --git a/packages/ui/src/routes/ChatbotRoutes.js b/packages/ui/src/routes/ChatbotRoutes.js new file mode 100644 index 000000000..25d298d68 --- /dev/null +++ b/packages/ui/src/routes/ChatbotRoutes.js @@ -0,0 +1,23 @@ +import { lazy } from 'react' + +// project imports +import Loadable from 'ui-component/loading/Loadable' +import MinimalLayout from 'layout/MinimalLayout' + +// canvas routing +const ChatbotFull = Loadable(lazy(() => import('views/chatbot'))) + +// ==============================|| CANVAS ROUTING ||============================== // + +const ChatbotRoutes = { + path: '/', + element: , + children: [ + { + path: '/chatbot/:id', + element: + } + ] +} + +export default ChatbotRoutes diff --git a/packages/ui/src/routes/MainRoutes.js b/packages/ui/src/routes/MainRoutes.js index 5353e41a8..9a1c29aff 100644 --- a/packages/ui/src/routes/MainRoutes.js +++ b/packages/ui/src/routes/MainRoutes.js @@ -13,6 +13,12 @@ const Marketplaces = Loadable(lazy(() => import('views/marketplaces'))) // apikey routing const APIKey = Loadable(lazy(() => import('views/apikey'))) +// tools routing +const Tools = Loadable(lazy(() => import('views/tools'))) + +// credentials routing +const Credentials = Loadable(lazy(() => 
import('views/credentials'))) + // ==============================|| MAIN ROUTING ||============================== // const MainRoutes = { @@ -34,6 +40,14 @@ const MainRoutes = { { path: '/apikey', element: + }, + { + path: '/tools', + element: + }, + { + path: '/credentials', + element: } ] } diff --git a/packages/ui/src/routes/index.js b/packages/ui/src/routes/index.js index 15fe4dcab..ff8c19200 100644 --- a/packages/ui/src/routes/index.js +++ b/packages/ui/src/routes/index.js @@ -3,10 +3,11 @@ import { useRoutes } from 'react-router-dom' // routes import MainRoutes from './MainRoutes' import CanvasRoutes from './CanvasRoutes' +import ChatbotRoutes from './ChatbotRoutes' import config from 'config' // ==============================|| ROUTING RENDER ||============================== // export default function ThemeRoutes() { - return useRoutes([MainRoutes, CanvasRoutes], config.basename) + return useRoutes([MainRoutes, CanvasRoutes, ChatbotRoutes], config.basename) } diff --git a/packages/ui/src/store/actions.js b/packages/ui/src/store/actions.js index 306c5cb0a..0c68f8f23 100644 --- a/packages/ui/src/store/actions.js +++ b/packages/ui/src/store/actions.js @@ -11,6 +11,10 @@ export const SET_DARKMODE = '@customization/SET_DARKMODE' export const SET_DIRTY = '@canvas/SET_DIRTY' export const REMOVE_DIRTY = '@canvas/REMOVE_DIRTY' export const SET_CHATFLOW = '@canvas/SET_CHATFLOW' +export const SHOW_CANVAS_DIALOG = '@canvas/SHOW_CANVAS_DIALOG' +export const HIDE_CANVAS_DIALOG = '@canvas/HIDE_CANVAS_DIALOG' +export const SET_COMPONENT_NODES = '@canvas/SET_COMPONENT_NODES' +export const SET_COMPONENT_CREDENTIALS = '@canvas/SET_COMPONENT_CREDENTIALS' // action - notifier reducer export const ENQUEUE_SNACKBAR = 'ENQUEUE_SNACKBAR' diff --git a/packages/ui/src/store/constant.js b/packages/ui/src/store/constant.js index c3138257e..c0fce49db 100644 --- a/packages/ui/src/store/constant.js +++ b/packages/ui/src/store/constant.js @@ -5,3 +5,4 @@ export const appDrawerWidth = 320 
export const maxScroll = 100000 export const baseURL = process.env.NODE_ENV === 'production' ? window.location.origin : window.location.origin.replace(':8080', ':3000') export const uiBaseURL = window.location.origin +export const FLOWISE_CREDENTIAL_ID = 'FLOWISE_CREDENTIAL_ID' diff --git a/packages/ui/src/store/context/ReactFlowContext.js b/packages/ui/src/store/context/ReactFlowContext.js index 4c35d7020..055cb8bc9 100644 --- a/packages/ui/src/store/context/ReactFlowContext.js +++ b/packages/ui/src/store/context/ReactFlowContext.js @@ -1,7 +1,9 @@ import { createContext, useState } from 'react' +import { useDispatch } from 'react-redux' import PropTypes from 'prop-types' import { getUniqueNodeId } from 'utils/genericHelper' import { cloneDeep } from 'lodash' +import { SET_DIRTY } from 'store/actions' const initialValue = { reactFlowInstance: null, @@ -14,17 +16,20 @@ const initialValue = { export const flowContext = createContext(initialValue) export const ReactFlowContext = ({ children }) => { + const dispatch = useDispatch() const [reactFlowInstance, setReactFlowInstance] = useState(null) const deleteNode = (nodeid) => { deleteConnectedInput(nodeid, 'node') reactFlowInstance.setNodes(reactFlowInstance.getNodes().filter((n) => n.id !== nodeid)) reactFlowInstance.setEdges(reactFlowInstance.getEdges().filter((ns) => ns.source !== nodeid && ns.target !== nodeid)) + dispatch({ type: SET_DIRTY }) } const deleteEdge = (edgeid) => { deleteConnectedInput(edgeid, 'edge') reactFlowInstance.setEdges(reactFlowInstance.getEdges().filter((edge) => edge.id !== edgeid)) + dispatch({ type: SET_DIRTY }) } const deleteConnectedInput = (id, type) => { @@ -103,6 +108,7 @@ export const ReactFlowContext = ({ children }) => { } reactFlowInstance.setNodes([...nodes, duplicatedNode]) + dispatch({ type: SET_DIRTY }) } } diff --git a/packages/ui/src/store/reducers/canvasReducer.js b/packages/ui/src/store/reducers/canvasReducer.js index e98805bb1..1c5e486f7 100644 --- 
a/packages/ui/src/store/reducers/canvasReducer.js +++ b/packages/ui/src/store/reducers/canvasReducer.js @@ -3,7 +3,10 @@ import * as actionTypes from '../actions' export const initialState = { isDirty: false, - chatflow: null + chatflow: null, + canvasDialogShow: false, + componentNodes: [], + componentCredentials: [] } // ==============================|| CANVAS REDUCER ||============================== // @@ -25,6 +28,26 @@ const canvasReducer = (state = initialState, action) => { ...state, chatflow: action.chatflow } + case actionTypes.SHOW_CANVAS_DIALOG: + return { + ...state, + canvasDialogShow: true + } + case actionTypes.HIDE_CANVAS_DIALOG: + return { + ...state, + canvasDialogShow: false + } + case actionTypes.SET_COMPONENT_NODES: + return { + ...state, + componentNodes: action.componentNodes + } + case actionTypes.SET_COMPONENT_CREDENTIALS: + return { + ...state, + componentCredentials: action.componentCredentials + } default: return state } diff --git a/packages/ui/src/themes/compStyleOverride.js b/packages/ui/src/themes/compStyleOverride.js index eb6f6de94..c04cc3f17 100644 --- a/packages/ui/src/themes/compStyleOverride.js +++ b/packages/ui/src/themes/compStyleOverride.js @@ -1,6 +1,39 @@ export default function componentStyleOverrides(theme) { const bgColor = theme.colors?.grey50 return { + MuiCssBaseline: { + styleOverrides: { + body: { + scrollbarWidth: 'thin', + scrollbarColor: theme?.customization?.isDarkMode + ? `${theme.colors?.grey500} ${theme.colors?.darkPrimaryMain}` + : `${theme.colors?.grey300} ${theme.paper}`, + '&::-webkit-scrollbar, & *::-webkit-scrollbar': { + width: 12, + height: 12, + backgroundColor: theme?.customization?.isDarkMode ? theme.colors?.darkPrimaryMain : theme.paper + }, + '&::-webkit-scrollbar-thumb, & *::-webkit-scrollbar-thumb': { + borderRadius: 8, + backgroundColor: theme?.customization?.isDarkMode ? theme.colors?.grey500 : theme.colors?.grey300, + minHeight: 24, + border: `3px solid ${theme?.customization?.isDarkMode ? 
theme.colors?.darkPrimaryMain : theme.paper}` + }, + '&::-webkit-scrollbar-thumb:focus, & *::-webkit-scrollbar-thumb:focus': { + backgroundColor: theme?.customization?.isDarkMode ? theme.colors?.darkPrimary200 : theme.colors?.grey500 + }, + '&::-webkit-scrollbar-thumb:active, & *::-webkit-scrollbar-thumb:active': { + backgroundColor: theme?.customization?.isDarkMode ? theme.colors?.darkPrimary200 : theme.colors?.grey500 + }, + '&::-webkit-scrollbar-thumb:hover, & *::-webkit-scrollbar-thumb:hover': { + backgroundColor: theme?.customization?.isDarkMode ? theme.colors?.darkPrimary200 : theme.colors?.grey500 + }, + '&::-webkit-scrollbar-corner, & *::-webkit-scrollbar-corner': { + backgroundColor: theme?.customization?.isDarkMode ? theme.colors?.darkPrimaryMain : theme.paper + } + } + } + }, MuiButton: { styleOverrides: { root: { @@ -103,6 +136,9 @@ export default function componentStyleOverrides(theme) { '&::placeholder': { color: theme.darkTextSecondary, fontSize: '0.875rem' + }, + '&.Mui-disabled': { + WebkitTextFillColor: theme?.customization?.isDarkMode ? theme.colors?.grey500 : theme.darkTextSecondary } } } diff --git a/packages/ui/src/themes/palette.js b/packages/ui/src/themes/palette.js index a4a5104dd..19a7df119 100644 --- a/packages/ui/src/themes/palette.js +++ b/packages/ui/src/themes/palette.js @@ -7,7 +7,8 @@ export default function themePalette(theme) { return { mode: theme?.customization?.navType, common: { - black: theme.colors?.darkPaper + black: theme.colors?.darkPaper, + dark: theme.colors?.darkPrimaryMain }, primary: { light: theme.customization.isDarkMode ? theme.colors?.darkPrimaryLight : theme.colors?.primaryLight, @@ -89,6 +90,10 @@ export default function themePalette(theme) { }, codeEditor: { main: theme.customization.isDarkMode ? theme.colors?.darkPrimary800 : theme.colors?.primaryLight + }, + nodeToolTip: { + background: theme.customization.isDarkMode ? 
theme.colors?.darkPrimary800 : theme.colors?.paper, + color: theme.customization.isDarkMode ? theme.colors?.paper : 'rgba(0, 0, 0, 0.87)' } } } diff --git a/packages/ui/src/ui-component/cards/ItemCard.js b/packages/ui/src/ui-component/cards/ItemCard.js index 506947ce6..1e8789d76 100644 --- a/packages/ui/src/ui-component/cards/ItemCard.js +++ b/packages/ui/src/ui-component/cards/ItemCard.js @@ -1,8 +1,8 @@ import PropTypes from 'prop-types' // material-ui -import { styled, useTheme } from '@mui/material/styles' -import { Box, Grid, Chip, Typography } from '@mui/material' +import { styled } from '@mui/material/styles' +import { Box, Grid, Typography } from '@mui/material' // project imports import MainCard from 'ui-component/cards/MainCard' @@ -28,19 +28,6 @@ const CardWrapper = styled(MainCard)(({ theme }) => ({ // ===========================|| CONTRACT CARD ||=========================== // const ItemCard = ({ isLoading, data, images, onClick }) => { - const theme = useTheme() - - const chipSX = { - height: 24, - padding: '0 6px' - } - - const activeChatflowSX = { - ...chipSX, - color: 'white', - backgroundColor: theme.palette.success.dark - } - return ( <> {isLoading ? ( @@ -49,11 +36,42 @@ const ItemCard = ({ isLoading, data, images, onClick }) => { -
+
+ {data.iconSrc && ( +
+ )} + {!data.iconSrc && data.color && ( +
+ )} - {data.name} + {data.templateName || data.name}
{data.description && ( @@ -61,13 +79,6 @@ const ItemCard = ({ isLoading, data, images, onClick }) => { {data.description} )} - - {data.deployed && ( - - - - )} - {images && (
{ + let apiReturn = await axios + .get(api) + .then(async function (response) { + return response.data + }) + .catch(function (error) { + console.error(error) + }) + return apiReturn +} + +const AboutDialog = ({ show, onCancel }) => { + const portalElement = document.getElementById('portal') + + const [data, setData] = useState({}) + + useEffect(() => { + if (show) { + const fetchData = async (api) => { + let response = await fetchLatestVer({ api }) + setData(response) + } + + fetchData('https://api.github.com/repos/FlowiseAI/Flowise/releases/latest') + } + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [show]) + + const component = show ? ( + + + Flowise Version + + + {data && ( + + + + + Latest Version + Published At + + + + + + + {data.name} + + + {moment(data.published_at).fromNow()} + + +
+
+ )} +
+
+ ) : null + + return createPortal(component, portalElement) +} + +AboutDialog.propTypes = { + show: PropTypes.bool, + onCancel: PropTypes.func +} + +export default AboutDialog diff --git a/packages/ui/src/ui-component/dialog/AdditionalParamsDialog.js b/packages/ui/src/ui-component/dialog/AdditionalParamsDialog.js index 66a1eaf64..7cf9b3b77 100644 --- a/packages/ui/src/ui-component/dialog/AdditionalParamsDialog.js +++ b/packages/ui/src/ui-component/dialog/AdditionalParamsDialog.js @@ -1,12 +1,15 @@ import { createPortal } from 'react-dom' +import { useDispatch } from 'react-redux' import { useState, useEffect } from 'react' import PropTypes from 'prop-types' import { Dialog, DialogContent } from '@mui/material' import PerfectScrollbar from 'react-perfect-scrollbar' import NodeInputHandler from 'views/canvas/NodeInputHandler' +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' const AdditionalParamsDialog = ({ show, dialogProps, onCancel }) => { const portalElement = document.getElementById('portal') + const dispatch = useDispatch() const [inputParams, setInputParams] = useState([]) const [data, setData] = useState({}) @@ -21,6 +24,12 @@ const AdditionalParamsDialog = ({ show, dialogProps, onCancel }) => { } }, [dialogProps]) + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + const component = show ? 
( { {confirmState.title} - - {confirmState.description} - + {confirmState.description} diff --git a/packages/ui/src/ui-component/dialog/EditPromptValuesDialog.js b/packages/ui/src/ui-component/dialog/EditPromptValuesDialog.js deleted file mode 100644 index 199b13067..000000000 --- a/packages/ui/src/ui-component/dialog/EditPromptValuesDialog.js +++ /dev/null @@ -1,256 +0,0 @@ -import { createPortal } from 'react-dom' -import { useState, useEffect } from 'react' -import { useSelector } from 'react-redux' -import PropTypes from 'prop-types' -import { - Button, - Dialog, - DialogActions, - DialogContent, - Box, - List, - ListItemButton, - ListItem, - ListItemAvatar, - ListItemText, - Typography, - Stack -} from '@mui/material' -import { useTheme } from '@mui/material/styles' -import PerfectScrollbar from 'react-perfect-scrollbar' -import { StyledButton } from 'ui-component/button/StyledButton' -import { DarkCodeEditor } from 'ui-component/editor/DarkCodeEditor' -import { LightCodeEditor } from 'ui-component/editor/LightCodeEditor' - -import './EditPromptValuesDialog.css' -import { baseURL } from 'store/constant' - -const EditPromptValuesDialog = ({ show, dialogProps, onCancel, onConfirm }) => { - const portalElement = document.getElementById('portal') - - const theme = useTheme() - const customization = useSelector((state) => state.customization) - const languageType = 'json' - - const [inputValue, setInputValue] = useState('') - const [inputParam, setInputParam] = useState(null) - const [textCursorPosition, setTextCursorPosition] = useState({}) - - useEffect(() => { - if (dialogProps.value) setInputValue(dialogProps.value) - if (dialogProps.inputParam) setInputParam(dialogProps.inputParam) - - return () => { - setInputValue('') - setInputParam(null) - setTextCursorPosition({}) - } - }, [dialogProps]) - - const onMouseUp = (e) => { - if (e.target && e.target.selectionEnd && e.target.value) { - const cursorPosition = e.target.selectionEnd - const 
textBeforeCursorPosition = e.target.value.substring(0, cursorPosition) - const textAfterCursorPosition = e.target.value.substring(cursorPosition, e.target.value.length) - const body = { - textBeforeCursorPosition, - textAfterCursorPosition - } - setTextCursorPosition(body) - } else { - setTextCursorPosition({}) - } - } - - const onSelectOutputResponseClick = (node, isUserQuestion = false) => { - let variablePath = isUserQuestion ? `question` : `${node.id}.data.instance` - if (textCursorPosition) { - let newInput = '' - if (textCursorPosition.textBeforeCursorPosition === undefined && textCursorPosition.textAfterCursorPosition === undefined) - newInput = `${inputValue}${`{{${variablePath}}}`}` - else newInput = `${textCursorPosition.textBeforeCursorPosition}{{${variablePath}}}${textCursorPosition.textAfterCursorPosition}` - setInputValue(newInput) - } - } - - const component = show ? ( - - -
- {inputParam && inputParam.type === 'string' && ( -
- - {inputParam.label} - - - {customization.isDarkMode ? ( - setInputValue(code)} - placeholder={inputParam.placeholder} - type={languageType} - onMouseUp={(e) => onMouseUp(e)} - onBlur={(e) => onMouseUp(e)} - style={{ - fontSize: '0.875rem', - minHeight: 'calc(100vh - 220px)', - width: '100%' - }} - /> - ) : ( - setInputValue(code)} - placeholder={inputParam.placeholder} - type={languageType} - onMouseUp={(e) => onMouseUp(e)} - onBlur={(e) => onMouseUp(e)} - style={{ - fontSize: '0.875rem', - minHeight: 'calc(100vh - 220px)', - width: '100%' - }} - /> - )} - -
- )} - {!dialogProps.disabled && inputParam && inputParam.acceptVariable && ( -
- - Select Variable - - - - - onSelectOutputResponseClick(null, true)} - > - - -
- AI -
-
- -
-
- {dialogProps.availableNodesForVariable && - dialogProps.availableNodesForVariable.length > 0 && - dialogProps.availableNodesForVariable.map((node, index) => { - const selectedOutputAnchor = node.data.outputAnchors[0].options.find( - (ancr) => ancr.name === node.data.outputs['output'] - ) - return ( - onSelectOutputResponseClick(node)} - > - - -
- {node.data.name} -
-
- -
-
- ) - })} -
-
-
-
- )} -
-
- - - onConfirm(inputValue, inputParam.name)}> - {dialogProps.confirmButtonName} - - -
- ) : null - - return createPortal(component, portalElement) -} - -EditPromptValuesDialog.propTypes = { - show: PropTypes.bool, - dialogProps: PropTypes.object, - onCancel: PropTypes.func, - onConfirm: PropTypes.func -} - -export default EditPromptValuesDialog diff --git a/packages/ui/src/ui-component/dialog/EditPromptValuesDialog.css b/packages/ui/src/ui-component/dialog/ExpandTextDialog.css similarity index 100% rename from packages/ui/src/ui-component/dialog/EditPromptValuesDialog.css rename to packages/ui/src/ui-component/dialog/ExpandTextDialog.css diff --git a/packages/ui/src/ui-component/dialog/ExpandTextDialog.js b/packages/ui/src/ui-component/dialog/ExpandTextDialog.js new file mode 100644 index 000000000..2a4ec4f5a --- /dev/null +++ b/packages/ui/src/ui-component/dialog/ExpandTextDialog.js @@ -0,0 +1,113 @@ +import { createPortal } from 'react-dom' +import { useState, useEffect } from 'react' +import { useSelector, useDispatch } from 'react-redux' +import PropTypes from 'prop-types' +import { Button, Dialog, DialogActions, DialogContent, Typography } from '@mui/material' +import { useTheme } from '@mui/material/styles' +import PerfectScrollbar from 'react-perfect-scrollbar' +import { StyledButton } from 'ui-component/button/StyledButton' +import { DarkCodeEditor } from 'ui-component/editor/DarkCodeEditor' +import { LightCodeEditor } from 'ui-component/editor/LightCodeEditor' +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' + +import './ExpandTextDialog.css' + +const ExpandTextDialog = ({ show, dialogProps, onCancel, onConfirm }) => { + const portalElement = document.getElementById('portal') + + const theme = useTheme() + const dispatch = useDispatch() + const customization = useSelector((state) => state.customization) + const languageType = 'json' + + const [inputValue, setInputValue] = useState('') + const [inputParam, setInputParam] = useState(null) + + useEffect(() => { + if (dialogProps.value) setInputValue(dialogProps.value) + 
if (dialogProps.inputParam) setInputParam(dialogProps.inputParam) + + return () => { + setInputValue('') + setInputParam(null) + } + }, [dialogProps]) + + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + + const component = show ? ( + + +
+ {inputParam && inputParam.type === 'string' && ( +
+ + {inputParam.label} + + + {customization.isDarkMode ? ( + setInputValue(code)} + placeholder={inputParam.placeholder} + type={languageType} + style={{ + fontSize: '0.875rem', + minHeight: 'calc(100vh - 220px)', + width: '100%' + }} + /> + ) : ( + setInputValue(code)} + placeholder={inputParam.placeholder} + type={languageType} + style={{ + fontSize: '0.875rem', + minHeight: 'calc(100vh - 220px)', + width: '100%' + }} + /> + )} + +
+ )} +
+
+ + + onConfirm(inputValue, inputParam.name)}> + {dialogProps.confirmButtonName} + + +
+ ) : null + + return createPortal(component, portalElement) +} + +ExpandTextDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onCancel: PropTypes.func, + onConfirm: PropTypes.func +} + +export default ExpandTextDialog diff --git a/packages/ui/src/ui-component/dialog/FormatPromptValuesDialog.js b/packages/ui/src/ui-component/dialog/FormatPromptValuesDialog.js new file mode 100644 index 000000000..233f0762a --- /dev/null +++ b/packages/ui/src/ui-component/dialog/FormatPromptValuesDialog.js @@ -0,0 +1,65 @@ +import { useEffect } from 'react' +import { createPortal } from 'react-dom' +import { useSelector, useDispatch } from 'react-redux' +import PropTypes from 'prop-types' +import { Dialog, DialogContent, DialogTitle } from '@mui/material' +import PerfectScrollbar from 'react-perfect-scrollbar' +import { JsonEditorInput } from 'ui-component/json/JsonEditor' +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' + +const FormatPromptValuesDialog = ({ show, dialogProps, onChange, onCancel }) => { + const portalElement = document.getElementById('portal') + const customization = useSelector((state) => state.customization) + const dispatch = useDispatch() + + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + + const component = show ? 
( + + + Format Prompt Values + + + + onChange(newValue)} + value={dialogProps.value} + isDarkMode={customization.isDarkMode} + inputParam={dialogProps.inputParam} + nodes={dialogProps.nodes} + edges={dialogProps.edges} + nodeId={dialogProps.nodeId} + /> + + + + ) : null + + return createPortal(component, portalElement) +} + +FormatPromptValuesDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onChange: PropTypes.func, + onCancel: PropTypes.func +} + +export default FormatPromptValuesDialog diff --git a/packages/ui/src/ui-component/dialog/NodeInfoDialog.js b/packages/ui/src/ui-component/dialog/NodeInfoDialog.js new file mode 100644 index 000000000..74c45a1a8 --- /dev/null +++ b/packages/ui/src/ui-component/dialog/NodeInfoDialog.js @@ -0,0 +1,141 @@ +import { createPortal } from 'react-dom' +import { useDispatch } from 'react-redux' +import { useEffect } from 'react' +import PropTypes from 'prop-types' + +// Material +import { Dialog, DialogContent, DialogTitle } from '@mui/material' +import { TableViewOnly } from 'ui-component/table/Table' + +// Store +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' +import { baseURL } from 'store/constant' + +// API +import configApi from 'api/config' +import useApi from 'hooks/useApi' + +const NodeInfoDialog = ({ show, dialogProps, onCancel }) => { + const portalElement = document.getElementById('portal') + const dispatch = useDispatch() + + const getNodeConfigApi = useApi(configApi.getNodeConfig) + + useEffect(() => { + if (dialogProps.data) { + getNodeConfigApi.request(dialogProps.data) + } + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [dialogProps]) + + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + + const component = show ? ( + + + {dialogProps.data && dialogProps.data.name && dialogProps.data.label && ( +
+
+ {dialogProps.data.name} +
+
+ {dialogProps.data.label} +
+
+ {dialogProps.data.id} +
+ {dialogProps.data.version && ( +
+ version {dialogProps.data.version} +
+ )} +
+
+
+ )} +
+ + {dialogProps.data?.description && ( +
+ {dialogProps.data.description} +
+ )} + {getNodeConfigApi.data && getNodeConfigApi.data.length > 0 && ( + + )} +
+
+ ) : null + + return createPortal(component, portalElement) +} + +NodeInfoDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onCancel: PropTypes.func +} + +export default NodeInfoDialog diff --git a/packages/ui/src/ui-component/dialog/SourceDocDialog.js b/packages/ui/src/ui-component/dialog/SourceDocDialog.js new file mode 100644 index 000000000..6bf8692fb --- /dev/null +++ b/packages/ui/src/ui-component/dialog/SourceDocDialog.js @@ -0,0 +1,57 @@ +import { createPortal } from 'react-dom' +import { useState, useEffect } from 'react' +import { useSelector } from 'react-redux' +import PropTypes from 'prop-types' +import { Dialog, DialogContent, DialogTitle } from '@mui/material' +import ReactJson from 'flowise-react-json-view' + +const SourceDocDialog = ({ show, dialogProps, onCancel }) => { + const portalElement = document.getElementById('portal') + const customization = useSelector((state) => state.customization) + + const [data, setData] = useState({}) + + useEffect(() => { + if (dialogProps.data) setData(dialogProps.data) + + return () => { + setData({}) + } + }, [dialogProps]) + + const component = show ? 
( + + + Source Document + + + + + + ) : null + + return createPortal(component, portalElement) +} + +SourceDocDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onCancel: PropTypes.func +} + +export default SourceDocDialog diff --git a/packages/ui/src/ui-component/dropdown/AsyncDropdown.js b/packages/ui/src/ui-component/dropdown/AsyncDropdown.js new file mode 100644 index 000000000..b24fa02b5 --- /dev/null +++ b/packages/ui/src/ui-component/dropdown/AsyncDropdown.js @@ -0,0 +1,178 @@ +import { useState, useEffect, Fragment } from 'react' +import { useSelector } from 'react-redux' +import PropTypes from 'prop-types' +import axios from 'axios' + +// Material +import Autocomplete, { autocompleteClasses } from '@mui/material/Autocomplete' +import { Popper, CircularProgress, TextField, Box, Typography } from '@mui/material' +import { styled } from '@mui/material/styles' + +// API +import credentialsApi from 'api/credentials' + +// const +import { baseURL } from 'store/constant' + +const StyledPopper = styled(Popper)({ + boxShadow: '0px 8px 10px -5px rgb(0 0 0 / 20%), 0px 16px 24px 2px rgb(0 0 0 / 14%), 0px 6px 30px 5px rgb(0 0 0 / 12%)', + borderRadius: '10px', + [`& .${autocompleteClasses.listbox}`]: { + boxSizing: 'border-box', + '& ul': { + padding: 10, + margin: 10 + } + } +}) + +const fetchList = async ({ name, nodeData }) => { + const loadMethod = nodeData.inputParams.find((param) => param.name === name)?.loadMethod + const username = localStorage.getItem('username') + const password = localStorage.getItem('password') + + let lists = await axios + .post( + `${baseURL}/api/v1/node-load-method/${nodeData.name}`, + { ...nodeData, loadMethod }, + { auth: username && password ? 
{ username, password } : undefined } + ) + .then(async function (response) { + return response.data + }) + .catch(function (error) { + console.error(error) + }) + return lists +} + +export const AsyncDropdown = ({ + name, + nodeData, + value, + onSelect, + isCreateNewOption, + onCreateNew, + credentialNames = [], + disabled = false, + disableClearable = false +}) => { + const customization = useSelector((state) => state.customization) + + const [open, setOpen] = useState(false) + const [options, setOptions] = useState([]) + const [loading, setLoading] = useState(false) + const findMatchingOptions = (options = [], value) => options.find((option) => option.name === value) + const getDefaultOptionValue = () => '' + const addNewOption = [{ label: '- Create New -', name: '-create-' }] + let [internalValue, setInternalValue] = useState(value ?? 'choose an option') + + const fetchCredentialList = async () => { + try { + let names = '' + if (credentialNames.length > 1) { + names = credentialNames.join('&credentialName=') + } else { + names = credentialNames[0] + } + const resp = await credentialsApi.getCredentialsByName(names) + if (resp.data) { + const returnList = [] + for (let i = 0; i < resp.data.length; i += 1) { + const data = { + label: resp.data[i].name, + name: resp.data[i].id + } + returnList.push(data) + } + return returnList + } + } catch (error) { + console.error(error) + } + } + + useEffect(() => { + setLoading(true) + ;(async () => { + const fetchData = async () => { + let response = credentialNames.length ? 
await fetchCredentialList() : await fetchList({ name, nodeData }) + if (isCreateNewOption) setOptions([...response, ...addNewOption]) + else setOptions([...response]) + setLoading(false) + } + fetchData() + })() + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, []) + + return ( + <> + { + setOpen(true) + }} + onClose={() => { + setOpen(false) + }} + options={options} + value={findMatchingOptions(options, internalValue) || getDefaultOptionValue()} + onChange={(e, selection) => { + const value = selection ? selection.name : '' + if (isCreateNewOption && value === '-create-') { + onCreateNew() + } else { + setInternalValue(value) + onSelect(value) + } + }} + PopperComponent={StyledPopper} + loading={loading} + renderInput={(params) => ( + + {loading ? : null} + {params.InputProps.endAdornment} + + ) + }} + /> + )} + renderOption={(props, option) => ( + +
+ {option.label} + {option.description && ( + {option.description} + )} +
+
+ )} + /> + + ) +} + +AsyncDropdown.propTypes = { + name: PropTypes.string, + nodeData: PropTypes.object, + value: PropTypes.string, + onSelect: PropTypes.func, + onCreateNew: PropTypes.func, + disabled: PropTypes.bool, + credentialNames: PropTypes.array, + disableClearable: PropTypes.bool, + isCreateNewOption: PropTypes.bool +} diff --git a/packages/ui/src/ui-component/editor/DarkCodeEditor.js b/packages/ui/src/ui-component/editor/DarkCodeEditor.js index 3925f4a66..bf0719dd9 100644 --- a/packages/ui/src/ui-component/editor/DarkCodeEditor.js +++ b/packages/ui/src/ui-component/editor/DarkCodeEditor.js @@ -21,6 +21,7 @@ export const DarkCodeEditor = ({ value, placeholder, disabled = false, type, sty onValueChange={onValueChange} onMouseUp={onMouseUp} onBlur={onBlur} + tabSize={4} style={{ ...style, background: theme.palette.codeEditor.main diff --git a/packages/ui/src/ui-component/editor/LightCodeEditor.js b/packages/ui/src/ui-component/editor/LightCodeEditor.js index 86f7057df..14dcbf29a 100644 --- a/packages/ui/src/ui-component/editor/LightCodeEditor.js +++ b/packages/ui/src/ui-component/editor/LightCodeEditor.js @@ -21,6 +21,7 @@ export const LightCodeEditor = ({ value, placeholder, disabled = false, type, st onValueChange={onValueChange} onMouseUp={onMouseUp} onBlur={onBlur} + tabSize={4} style={{ ...style, background: theme.palette.card.main diff --git a/packages/ui/src/ui-component/grid/Grid.js b/packages/ui/src/ui-component/grid/Grid.js new file mode 100644 index 000000000..0670d69bd --- /dev/null +++ b/packages/ui/src/ui-component/grid/Grid.js @@ -0,0 +1,43 @@ +import PropTypes from 'prop-types' +import { DataGrid } from '@mui/x-data-grid' +import { IconPlus } from '@tabler/icons' +import { Button } from '@mui/material' + +export const Grid = ({ columns, rows, style, disabled = false, onRowUpdate, addNewRow }) => { + const handleProcessRowUpdate = (newRow) => { + onRowUpdate(newRow) + return newRow + } + + return ( + <> + {!disabled && ( + + )} + {rows && 
columns && ( +
+ { + return !disabled + }} + onProcessRowUpdateError={(error) => console.error(error)} + rows={rows} + columns={columns} + /> +
+ )} + + ) +} + +Grid.propTypes = { + rows: PropTypes.array, + columns: PropTypes.array, + style: PropTypes.any, + disabled: PropTypes.bool, + addNewRow: PropTypes.func, + onRowUpdate: PropTypes.func +} diff --git a/packages/ui/src/ui-component/input/Input.js b/packages/ui/src/ui-component/input/Input.js index 1861bf655..95bf968d0 100644 --- a/packages/ui/src/ui-component/input/Input.js +++ b/packages/ui/src/ui-component/input/Input.js @@ -1,7 +1,7 @@ import { useState } from 'react' import PropTypes from 'prop-types' import { FormControl, OutlinedInput } from '@mui/material' -import EditPromptValuesDialog from 'ui-component/dialog/EditPromptValuesDialog' +import ExpandTextDialog from 'ui-component/dialog/ExpandTextDialog' export const Input = ({ inputParam, value, onChange, disabled = false, showDialog, dialogProps, onDialogCancel, onDialogConfirm }) => { const [myValue, setMyValue] = useState(value ?? '') @@ -37,6 +37,7 @@ export const Input = ({ inputParam, value, onChange, disabled = false, showDialo onChange(e.target.value) }} inputProps={{ + step: inputParam.step ?? 1, style: { height: inputParam.rows ? 
'90px' : 'inherit' } @@ -44,7 +45,7 @@ export const Input = ({ inputParam, value, onChange, disabled = false, showDialo /> {showDialog && ( - + > )} ) @@ -60,7 +61,7 @@ export const Input = ({ inputParam, value, onChange, disabled = false, showDialo Input.propTypes = { inputParam: PropTypes.object, - value: PropTypes.string, + value: PropTypes.oneOfType([PropTypes.string, PropTypes.number]), onChange: PropTypes.func, disabled: PropTypes.bool, showDialog: PropTypes.bool, diff --git a/packages/ui/src/ui-component/json/JsonEditor.js b/packages/ui/src/ui-component/json/JsonEditor.js index 06442df27..4bf8f306d 100644 --- a/packages/ui/src/ui-component/json/JsonEditor.js +++ b/packages/ui/src/ui-component/json/JsonEditor.js @@ -1,10 +1,32 @@ -import { useState } from 'react' +import { useEffect, useState } from 'react' import PropTypes from 'prop-types' -import { FormControl } from '@mui/material' -import ReactJson from 'react-json-view' +import { FormControl, Popover } from '@mui/material' +import ReactJson from 'flowise-react-json-view' +import SelectVariable from './SelectVariable' +import { cloneDeep } from 'lodash' +import { getAvailableNodesForVariable } from 'utils/genericHelper' -export const JsonEditorInput = ({ value, onChange, disabled = false, isDarkMode = false }) => { +export const JsonEditorInput = ({ value, onChange, inputParam, nodes, edges, nodeId, disabled = false, isDarkMode = false }) => { const [myValue, setMyValue] = useState(value ? 
JSON.parse(value) : {}) + const [availableNodesForVariable, setAvailableNodesForVariable] = useState([]) + const [mouseUpKey, setMouseUpKey] = useState('') + + const [anchorEl, setAnchorEl] = useState(null) + const openPopOver = Boolean(anchorEl) + + const handleClosePopOver = () => { + setAnchorEl(null) + } + + const setNewVal = (val) => { + const newVal = cloneDeep(myValue) + newVal[mouseUpKey] = val + onChange(JSON.stringify(newVal)) + setMyValue((params) => ({ + ...params, + [mouseUpKey]: val + })) + } const onClipboardCopy = (e) => { const src = e.src @@ -15,6 +37,13 @@ export const JsonEditorInput = ({ value, onChange, disabled = false, isDarkMode } } + useEffect(() => { + if (!disabled && nodes && edges && nodeId && inputParam) { + const nodesForVariable = inputParam?.acceptVariable ? getAvailableNodesForVariable(nodes, edges, nodeId, inputParam.id) : [] + setAvailableNodesForVariable(nodesForVariable) + } + }, [disabled, inputParam, nodes, edges, nodeId]) + return ( <> @@ -30,28 +59,60 @@ export const JsonEditorInput = ({ value, onChange, disabled = false, isDarkMode /> )} {!disabled && ( - onClipboardCopy(e)} - onEdit={(edit) => { - setMyValue(edit.updated_src) - onChange(JSON.stringify(edit.updated_src)) - }} - onAdd={() => { - //console.log(add) - }} - onDelete={(deleteobj) => { - setMyValue(deleteobj.updated_src) - onChange(JSON.stringify(deleteobj.updated_src)) - }} - /> +
+ onClipboardCopy(e)} + onMouseUp={(event) => { + if (inputParam?.acceptVariable) { + setMouseUpKey(event.name) + setAnchorEl(event.currentTarget) + } + }} + onEdit={(edit) => { + setMyValue(edit.updated_src) + onChange(JSON.stringify(edit.updated_src)) + }} + onAdd={() => { + //console.log(add) + }} + onDelete={(deleteobj) => { + setMyValue(deleteobj.updated_src) + onChange(JSON.stringify(deleteobj.updated_src)) + }} + /> +
)}
+ {inputParam?.acceptVariable && ( + + { + setNewVal(val) + handleClosePopOver() + }} + /> + + )} ) } @@ -60,5 +121,9 @@ JsonEditorInput.propTypes = { value: PropTypes.string, onChange: PropTypes.func, disabled: PropTypes.bool, - isDarkMode: PropTypes.bool + isDarkMode: PropTypes.bool, + inputParam: PropTypes.object, + nodes: PropTypes.array, + edges: PropTypes.array, + nodeId: PropTypes.string } diff --git a/packages/ui/src/ui-component/json/SelectVariable.js b/packages/ui/src/ui-component/json/SelectVariable.js new file mode 100644 index 000000000..7a482baef --- /dev/null +++ b/packages/ui/src/ui-component/json/SelectVariable.js @@ -0,0 +1,166 @@ +import { useSelector } from 'react-redux' +import PropTypes from 'prop-types' +import { Box, List, ListItemButton, ListItem, ListItemAvatar, ListItemText, Typography, Stack } from '@mui/material' +import PerfectScrollbar from 'react-perfect-scrollbar' +import robotPNG from 'assets/images/robot.png' +import chatPNG from 'assets/images/chathistory.png' +import { baseURL } from 'store/constant' + +const SelectVariable = ({ availableNodesForVariable, disabled = false, onSelectAndReturnVal }) => { + const customization = useSelector((state) => state.customization) + + const onSelectOutputResponseClick = (node, prefix) => { + let variablePath = node ? `${node.id}.data.instance` : prefix + const newInput = `{{${variablePath}}}` + onSelectAndReturnVal(newInput) + } + + return ( + <> + {!disabled && ( +
+ + Select Variable + + + + + onSelectOutputResponseClick(null, 'question')} + > + + +
+ AI +
+
+ +
+
+ onSelectOutputResponseClick(null, 'chat_history')} + > + + +
+ chatHistory +
+
+ +
+
+ {availableNodesForVariable && + availableNodesForVariable.length > 0 && + availableNodesForVariable.map((node, index) => { + const selectedOutputAnchor = node.data.outputAnchors[0].options.find( + (ancr) => ancr.name === node.data.outputs['output'] + ) + return ( + onSelectOutputResponseClick(node)} + > + + +
+ {node.data.name} +
+
+ +
+
+ ) + })} +
+
+
+
+ )} + + ) +} + +SelectVariable.propTypes = { + availableNodesForVariable: PropTypes.array, + disabled: PropTypes.bool, + onSelectAndReturnVal: PropTypes.func +} + +export default SelectVariable diff --git a/packages/ui/src/ui-component/markdown/CodeBlock.js b/packages/ui/src/ui-component/markdown/CodeBlock.js new file mode 100644 index 000000000..77caa346c --- /dev/null +++ b/packages/ui/src/ui-component/markdown/CodeBlock.js @@ -0,0 +1,123 @@ +import { IconClipboard, IconDownload } from '@tabler/icons' +import { memo, useState } from 'react' +import { Prism as SyntaxHighlighter } from 'react-syntax-highlighter' +import { oneDark } from 'react-syntax-highlighter/dist/esm/styles/prism' +import PropTypes from 'prop-types' +import { Box, IconButton, Popover, Typography } from '@mui/material' +import { useTheme } from '@mui/material/styles' + +const programmingLanguages = { + javascript: '.js', + python: '.py', + java: '.java', + c: '.c', + cpp: '.cpp', + 'c++': '.cpp', + 'c#': '.cs', + ruby: '.rb', + php: '.php', + swift: '.swift', + 'objective-c': '.m', + kotlin: '.kt', + typescript: '.ts', + go: '.go', + perl: '.pl', + rust: '.rs', + scala: '.scala', + haskell: '.hs', + lua: '.lua', + shell: '.sh', + sql: '.sql', + html: '.html', + css: '.css' +} + +export const CodeBlock = memo(({ language, chatflowid, isDialog, value }) => { + const theme = useTheme() + const [anchorEl, setAnchorEl] = useState(null) + const openPopOver = Boolean(anchorEl) + + const handleClosePopOver = () => { + setAnchorEl(null) + } + + const copyToClipboard = (event) => { + if (!navigator.clipboard || !navigator.clipboard.writeText) { + return + } + + navigator.clipboard.writeText(value) + setAnchorEl(event.currentTarget) + setTimeout(() => { + handleClosePopOver() + }, 1500) + } + + const downloadAsFile = () => { + const fileExtension = programmingLanguages[language] || '.file' + const suggestedFileName = `file-${chatflowid}${fileExtension}` + const fileName = suggestedFileName + + if 
(!fileName) {
+ // defensive guard: fileName is always derived from suggestedFileName above
+ return
+ }
+
+ const blob = new Blob([value], { type: 'text/plain' })
+ const url = URL.createObjectURL(blob)
+ const link = document.createElement('a')
+ link.download = fileName
+ link.href = url
+ link.style.display = 'none'
+ document.body.appendChild(link)
+ link.click()
+ document.body.removeChild(link)
+ URL.revokeObjectURL(url)
+ }
+
+ return (
+
+ +
+ {language} +
+ + + + + + Copied! + + + + + +
+
+ + + {value} + +
+ ) +}) +CodeBlock.displayName = 'CodeBlock' + +CodeBlock.propTypes = { + language: PropTypes.string, + chatflowid: PropTypes.string, + isDialog: PropTypes.bool, + value: PropTypes.string +} diff --git a/packages/ui/src/ui-component/markdown/MemoizedReactMarkdown.js b/packages/ui/src/ui-component/markdown/MemoizedReactMarkdown.js new file mode 100644 index 000000000..f9770a9f3 --- /dev/null +++ b/packages/ui/src/ui-component/markdown/MemoizedReactMarkdown.js @@ -0,0 +1,4 @@ +import { memo } from 'react' +import ReactMarkdown from 'react-markdown' + +export const MemoizedReactMarkdown = memo(ReactMarkdown, (prevProps, nextProps) => prevProps.children === nextProps.children) diff --git a/packages/ui/src/ui-component/table/Table.js b/packages/ui/src/ui-component/table/Table.js index a6ab312e1..2cf391827 100644 --- a/packages/ui/src/ui-component/table/Table.js +++ b/packages/ui/src/ui-component/table/Table.js @@ -16,9 +16,11 @@ export const TableViewOnly = ({ columns, rows }) => { {rows.map((row, index) => ( - {Object.keys(row).map((key, index) => ( - {row[key]} - ))} + {Object.keys(row) + .slice(-3) + .map((key, index) => ( + {row[key]} + ))} ))} diff --git a/packages/ui/src/utils/genericHelper.js b/packages/ui/src/utils/genericHelper.js index c1dcb1086..324cc1121 100644 --- a/packages/ui/src/utils/genericHelper.js +++ b/packages/ui/src/utils/genericHelper.js @@ -39,8 +39,9 @@ export const initNode = (nodeData, newNodeId) => { const incoming = nodeData.inputs ? 
nodeData.inputs.length : 0 const outgoing = 1 - const whitelistTypes = ['options', 'string', 'number', 'boolean', 'password', 'json', 'code', 'date', 'file', 'folder'] + const whitelistTypes = ['asyncOptions', 'options', 'string', 'number', 'boolean', 'password', 'json', 'code', 'date', 'file', 'folder'] + // Inputs for (let i = 0; i < incoming; i += 1) { const newInput = { ...nodeData.inputs[i], @@ -53,6 +54,16 @@ export const initNode = (nodeData, newNodeId) => { } } + // Credential + if (nodeData.credential) { + const newInput = { + ...nodeData.credential, + id: `${newNodeId}-input-${nodeData.credential.name}-${nodeData.credential.type}` + } + inputParams.unshift(newInput) + } + + // Outputs const outputAnchors = [] for (let i = 0; i < outgoing; i += 1) { if (nodeData.outputs && nodeData.outputs.length) { @@ -129,6 +140,8 @@ export const initNode = (nodeData, newNodeId) => { } ] */ + + // Inputs if (nodeData.inputs) { nodeData.inputAnchors = inputAnchors nodeData.inputParams = inputParams @@ -139,13 +152,17 @@ export const initNode = (nodeData, newNodeId) => { nodeData.inputs = {} } + // Outputs if (nodeData.outputs) { nodeData.outputs = initializeDefaultNodeData(outputAnchors) } else { nodeData.outputs = {} } - nodeData.outputAnchors = outputAnchors + + // Credential + if (nodeData.credential) nodeData.credential = '' + nodeData.id = newNodeId return nodeData @@ -168,8 +185,10 @@ export const isValidConnection = (connection, reactFlowInstance) => { //sourceHandle: "llmChain_0-output-llmChain-BaseChain" //targetHandle: "mrlkAgentLLM_0-input-model-BaseLanguageModel" - const sourceTypes = sourceHandle.split('-')[sourceHandle.split('-').length - 1].split('|') - const targetTypes = targetHandle.split('-')[targetHandle.split('-').length - 1].split('|') + let sourceTypes = sourceHandle.split('-')[sourceHandle.split('-').length - 1].split('|') + sourceTypes = sourceTypes.map((s) => s.trim()) + let targetTypes = targetHandle.split('-')[targetHandle.split('-').length - 
1].split('|') + targetTypes = targetTypes.map((t) => t.trim()) if (targetTypes.some((t) => sourceTypes.includes(t))) { let targetNode = reactFlowInstance.getNode(target) @@ -249,6 +268,7 @@ export const generateExportFlowData = (flowData) => { const newNodeData = { id: node.data.id, label: node.data.label, + version: node.data.version, name: node.data.name, type: node.data.type, baseClasses: node.data.baseClasses, @@ -285,7 +305,7 @@ export const generateExportFlowData = (flowData) => { } export const getAvailableNodesForVariable = (nodes, edges, target, targetHandle) => { - // example edge id = "llmChain_0-llmChain_0-output-outputPrediction-string-llmChain_1-llmChain_1-input-promptValues-string" + // example edge id = "llmChain_0-llmChain_0-output-outputPrediction-string|json-llmChain_1-llmChain_1-input-promptValues-string" // {source} -{sourceHandle} -{target} -{targetHandle} const parentNodes = [] const inputEdges = edges.filter((edg) => edg.target === target && edg.targetHandle === targetHandle) @@ -314,3 +334,78 @@ export const rearrangeToolsOrdering = (newValues, sourceNodeId) => { newValues.sort((a, b) => sortKey(a) - sortKey(b)) } + +export const throttle = (func, limit) => { + let lastFunc + let lastRan + + return (...args) => { + if (!lastRan) { + func(...args) + lastRan = Date.now() + } else { + clearTimeout(lastFunc) + lastFunc = setTimeout(() => { + if (Date.now() - lastRan >= limit) { + func(...args) + lastRan = Date.now() + } + }, limit - (Date.now() - lastRan)) + } + } +} + +export const generateRandomGradient = () => { + function randomColor() { + var color = 'rgb(' + for (var i = 0; i < 3; i++) { + var random = Math.floor(Math.random() * 256) + color += random + if (i < 2) { + color += ',' + } + } + color += ')' + return color + } + + var gradient = 'linear-gradient(' + randomColor() + ', ' + randomColor() + ')' + + return gradient +} + +export const getInputVariables = (paramValue) => { + let returnVal = paramValue + const variableStack = [] + 
const inputVariables = [] + let startIdx = 0 + const endIdx = returnVal.length + + while (startIdx < endIdx) { + const substr = returnVal.substring(startIdx, startIdx + 1) + + // Store the opening double curly bracket + if (substr === '{') { + variableStack.push({ substr, startIdx: startIdx + 1 }) + } + + // Found the complete variable + if (substr === '}' && variableStack.length > 0 && variableStack[variableStack.length - 1].substr === '{') { + const variableStartIdx = variableStack[variableStack.length - 1].startIdx + const variableEndIdx = startIdx + const variableFullPath = returnVal.substring(variableStartIdx, variableEndIdx) + inputVariables.push(variableFullPath) + variableStack.pop() + } + startIdx += 1 + } + return inputVariables +} + +export const isValidURL = (url) => { + try { + return new URL(url) + } catch (err) { + return undefined + } +} diff --git a/packages/ui/src/views/canvas/AddNodes.js b/packages/ui/src/views/canvas/AddNodes.js index 968b2c061..c6134cb9a 100644 --- a/packages/ui/src/views/canvas/AddNodes.js +++ b/packages/ui/src/views/canvas/AddNodes.js @@ -1,5 +1,5 @@ import { useState, useRef, useEffect } from 'react' -import { useSelector } from 'react-redux' +import { useSelector, useDispatch } from 'react-redux' import PropTypes from 'prop-types' // material-ui @@ -34,16 +34,18 @@ import Transitions from 'ui-component/extended/Transitions' import { StyledFab } from 'ui-component/button/StyledFab' // icons -import { IconPlus, IconSearch, IconMinus } from '@tabler/icons' +import { IconPlus, IconSearch, IconMinus, IconX } from '@tabler/icons' // const import { baseURL } from 'store/constant' +import { SET_COMPONENT_NODES } from 'store/actions' // ==============================|| ADD NODES||============================== // const AddNodes = ({ nodesData, node }) => { const theme = useTheme() const customization = useSelector((state) => state.customization) + const dispatch = useDispatch() const [searchValue, setSearchValue] = useState('') 
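The `getInputVariables` helper added above is a small stack-based scanner that collects `{variable}` placeholder names from a prompt template — the same names `FormatPromptValuesDialog` later uses to pre-seed `promptValues`. It is pure JavaScript, so a lightly condensed standalone copy (functionally equivalent to the version in the diff) can be run directly:

```javascript
// Stack-based scan for {variable} placeholders in a prompt template;
// condensed standalone copy of getInputVariables from genericHelper.js.
const getInputVariables = (paramValue) => {
    const variableStack = []
    const inputVariables = []
    let startIdx = 0
    const endIdx = paramValue.length

    while (startIdx < endIdx) {
        const substr = paramValue.substring(startIdx, startIdx + 1)

        // Remember where an opening curly bracket starts
        if (substr === '{') {
            variableStack.push({ substr, startIdx: startIdx + 1 })
        }

        // A closing bracket completes the most recently opened variable
        if (substr === '}' && variableStack.length > 0 && variableStack[variableStack.length - 1].substr === '{') {
            const { startIdx: variableStartIdx } = variableStack.pop()
            inputVariables.push(paramValue.substring(variableStartIdx, startIdx))
        }
        startIdx += 1
    }
    return inputVariables
}

console.log(getInputVariables('Answer {question} using {context}')) // → [ 'question', 'context' ]
```

Templates with no placeholders simply yield an empty array, which is why the dialog only pre-seeds `promptValues` when at least one variable is found.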
const [nodes, setNodes] = useState({}) @@ -61,11 +63,20 @@ const AddNodes = ({ nodesData, node }) => { } } + const getSearchedNodes = (value) => { + const passed = nodesData.filter((nd) => { + const passesQuery = nd.name.toLowerCase().includes(value.toLowerCase()) + const passesCategory = nd.category.toLowerCase().includes(value.toLowerCase()) + return passesQuery || passesCategory + }) + return passed + } + const filterSearch = (value) => { setSearchValue(value) setTimeout(() => { if (value) { - const returnData = nodesData.filter((nd) => nd.name.toLowerCase().includes(value.toLowerCase())) + const returnData = getSearchedNodes(value) groupByCategory(returnData, true) scrollTop() } else if (value === '') { @@ -122,8 +133,11 @@ const AddNodes = ({ nodesData, node }) => { }, [node]) useEffect(() => { - if (nodesData) groupByCategory(nodesData) - }, [nodesData]) + if (nodesData) { + groupByCategory(nodesData) + dispatch({ type: SET_COMPONENT_NODES, componentNodes: nodesData }) + } + }, [nodesData, dispatch]) return ( <> @@ -167,7 +181,7 @@ const AddNodes = ({ nodesData, node }) => { Add Nodes filterSearch(e.target.value)} @@ -177,6 +191,28 @@ const AddNodes = ({ nodesData, node }) => { } + endAdornment={ + + filterSearch('')} + style={{ + cursor: 'pointer' + }} + /> + + } aria-describedby='search-helper-text' inputProps={{ 'aria-label': 'weight' @@ -218,6 +254,7 @@ const AddNodes = ({ nodesData, node }) => { expanded={categoryExpanded[category] || false} onChange={handleAccordionChange(category)} key={category} + disableGutters > } diff --git a/packages/ui/src/views/canvas/CanvasHeader.js b/packages/ui/src/views/canvas/CanvasHeader.js index 1f4a1f93b..d25635325 100644 --- a/packages/ui/src/views/canvas/CanvasHeader.js +++ b/packages/ui/src/views/canvas/CanvasHeader.js @@ -1,6 +1,6 @@ import PropTypes from 'prop-types' import { useNavigate } from 'react-router-dom' -import { useSelector } from 'react-redux' +import { useSelector, useDispatch } from 'react-redux' 
import { useEffect, useRef, useState } from 'react' // material-ui @@ -13,7 +13,7 @@ import { IconSettings, IconChevronLeft, IconDeviceFloppy, IconPencil, IconCheck, // project imports import Settings from 'views/settings' import SaveChatflowDialog from 'ui-component/dialog/SaveChatflowDialog' -import APICodeDialog from 'ui-component/dialog/APICodeDialog' +import APICodeDialog from 'views/chatflows/APICodeDialog' // API import chatflowsApi from 'api/chatflows' @@ -24,11 +24,13 @@ import useApi from 'hooks/useApi' // utils import { generateExportFlowData } from 'utils/genericHelper' import { uiBaseURL } from 'store/constant' +import { SET_CHATFLOW } from 'store/actions' // ==============================|| CANVAS HEADER ||============================== // const CanvasHeader = ({ chatflow, handleSaveFlow, handleDeleteFlow, handleLoadFlow }) => { const theme = useTheme() + const dispatch = useDispatch() const navigate = useNavigate() const flowNameRef = useRef() const settingsRef = useRef() @@ -88,8 +90,8 @@ const CanvasHeader = ({ chatflow, handleSaveFlow, handleDeleteFlow, handleLoadFl } const onAPIDialogClick = () => { + // If file type is file, isFormDataRequired = true let isFormDataRequired = false - try { const flowData = JSON.parse(chatflow.flowData) const nodes = flowData.nodes @@ -103,11 +105,27 @@ const CanvasHeader = ({ chatflow, handleSaveFlow, handleDeleteFlow, handleLoadFl console.error(e) } + // If sessionId memory, isSessionMemory = true + let isSessionMemory = false + try { + const flowData = JSON.parse(chatflow.flowData) + const nodes = flowData.nodes + for (const node of nodes) { + if (node.data.inputParams.find((param) => param.name === 'sessionId')) { + isSessionMemory = true + break + } + } + } catch (e) { + console.error(e) + } + setAPIDialogProps({ title: 'Embed in website or use as API', chatflowid: chatflow.id, chatflowApiKeyId: chatflow.apikeyid, - isFormDataRequired + isFormDataRequired, + isSessionMemory }) setAPIDialogOpen(true) } @@ 
-125,6 +143,7 @@ const CanvasHeader = ({ chatflow, handleSaveFlow, handleDeleteFlow, handleLoadFl useEffect(() => { if (updateChatflowApi.data) { setFlowName(updateChatflowApi.data.name) + dispatch({ type: SET_CHATFLOW, chatflow: updateChatflowApi.data }) } setEditingFlowName(false) diff --git a/packages/ui/src/views/canvas/CanvasNode.js b/packages/ui/src/views/canvas/CanvasNode.js index 9263d4b6a..cabe23291 100644 --- a/packages/ui/src/views/canvas/CanvasNode.js +++ b/packages/ui/src/views/canvas/CanvasNode.js @@ -1,19 +1,22 @@ import PropTypes from 'prop-types' -import { useContext, useState } from 'react' +import { useContext, useState, useEffect } from 'react' +import { useSelector } from 'react-redux' // material-ui import { styled, useTheme } from '@mui/material/styles' import { IconButton, Box, Typography, Divider, Button } from '@mui/material' +import Tooltip, { tooltipClasses } from '@mui/material/Tooltip' // project imports import MainCard from 'ui-component/cards/MainCard' import NodeInputHandler from './NodeInputHandler' import NodeOutputHandler from './NodeOutputHandler' import AdditionalParamsDialog from 'ui-component/dialog/AdditionalParamsDialog' +import NodeInfoDialog from 'ui-component/dialog/NodeInfoDialog' // const import { baseURL } from 'store/constant' -import { IconTrash, IconCopy } from '@tabler/icons' +import { IconTrash, IconCopy, IconInfoCircle, IconAlertTriangle } from '@tabler/icons' import { flowContext } from 'store/context/ReactFlowContext' const CardWrapper = styled(MainCard)(({ theme }) => ({ @@ -30,14 +33,39 @@ const CardWrapper = styled(MainCard)(({ theme }) => ({ } })) +const LightTooltip = styled(({ className, ...props }) => )(({ theme }) => ({ + [`& .${tooltipClasses.tooltip}`]: { + backgroundColor: theme.palette.nodeToolTip.background, + color: theme.palette.nodeToolTip.color, + boxShadow: theme.shadows[1] + } +})) + // ===========================|| CANVAS NODE ||=========================== // const CanvasNode = ({ data }) 
=> { const theme = useTheme() + const canvas = useSelector((state) => state.canvas) const { deleteNode, duplicateNode } = useContext(flowContext) const [showDialog, setShowDialog] = useState(false) const [dialogProps, setDialogProps] = useState({}) + const [showInfoDialog, setShowInfoDialog] = useState(false) + const [infoDialogProps, setInfoDialogProps] = useState({}) + const [warningMessage, setWarningMessage] = useState('') + const [open, setOpen] = useState(false) + + const handleClose = () => { + setOpen(false) + } + + const handleOpen = () => { + setOpen(true) + } + + const nodeOutdatedMessage = (oldVersion, newVersion) => `Node version ${oldVersion} outdated\nUpdate to latest version ${newVersion}` + + const nodeVersionEmptyMessage = (newVersion) => `Node outdated\nUpdate to latest version ${newVersion}` const onDialogClicked = () => { const dialogProps = { @@ -50,6 +78,17 @@ const CanvasNode = ({ data }) => { setShowDialog(true) } + useEffect(() => { + const componentNode = canvas.componentNodes.find((nd) => nd.name === data.name) + if (componentNode) { + if (!data.version) { + setWarningMessage(nodeVersionEmptyMessage(componentNode.version)) + } else { + if (componentNode.version > data.version) setWarningMessage(nodeOutdatedMessage(data.version, componentNode.version)) + } + } + }, [canvas.componentNodes, data.name, data.version]) + return ( <> { }} border={false} > - -
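The version check wired into the `useEffect` above reduces to a small pure function. A standalone sketch (the message builders match the diff; `getVersionWarning` itself is an illustrative wrapper, not a name from the source):

```javascript
// Message builders copied from CanvasNode.js
const nodeOutdatedMessage = (oldVersion, newVersion) =>
    `Node version ${oldVersion} outdated\nUpdate to latest version ${newVersion}`
const nodeVersionEmptyMessage = (newVersion) => `Node outdated\nUpdate to latest version ${newVersion}`

// Illustrative wrapper: compare a canvas node against the latest registered
// component node and return the warning to show (empty string = no warning).
function getVersionWarning(componentNodes, data) {
    const componentNode = componentNodes.find((nd) => nd.name === data.name)
    if (!componentNode) return ''
    if (!data.version) return nodeVersionEmptyMessage(componentNode.version)
    if (componentNode.version > data.version) return nodeOutdatedMessage(data.version, componentNode.version)
    return ''
}
```

This is why the effect depends on `canvas.componentNodes`, `data.name`, and `data.version`: any of the three changing can flip the warning on or off.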
- -
- Notification -
-
- - - {data.label} - - -
- { - duplicateNode(data.id) - }} - sx={{ height: 35, width: 35, '&:hover': { color: theme?.palette.primary.main } }} - color={theme?.customization?.isDarkMode ? theme.colors?.paper : 'inherit'} - > - - - { - deleteNode(data.id) - }} - sx={{ height: 35, width: 35, mr: 1, '&:hover': { color: 'red' } }} - color={theme?.customization?.isDarkMode ? theme.colors?.paper : 'inherit'} - > - - -
- {(data.inputAnchors.length > 0 || data.inputParams.length > 0) && ( - <> - - - - Inputs - - - - - )} - {data.inputAnchors.map((inputAnchor, index) => ( - - ))} - {data.inputParams.map((inputParam, index) => ( - - ))} - {data.inputParams.find((param) => param.additionalParams) && ( + param.additionalParams).length === - data.inputParams.length + data.inputAnchors.length - ? 20 - : 0 + background: 'transparent', + display: 'flex', + flexDirection: 'column' }} > - + { + duplicateNode(data.id) + }} + sx={{ height: '35px', width: '35px', '&:hover': { color: theme?.palette.primary.main } }} + color={theme?.customization?.isDarkMode ? theme.colors?.paper : 'inherit'} + > + + + { + deleteNode(data.id) + }} + sx={{ height: '35px', width: '35px', '&:hover': { color: 'red' } }} + color={theme?.customization?.isDarkMode ? theme.colors?.paper : 'inherit'} + > + + + { + setInfoDialogProps({ data }) + setShowInfoDialog(true) + }} + sx={{ height: '35px', width: '35px', '&:hover': { color: theme?.palette.secondary.main } }} + color={theme?.customization?.isDarkMode ? theme.colors?.paper : 'inherit'} + > + +
- )} - - - - Output - - - + } + placement='right-start' + > + +
+ +
+ Notification +
+
+ + + {data.label} + + + {warningMessage && ( + <> +
+ {warningMessage}} placement='top'> + + + + + + )} +
+ {(data.inputAnchors.length > 0 || data.inputParams.length > 0) && ( + <> + + + + Inputs + + + + + )} + {data.inputAnchors.map((inputAnchor, index) => ( + + ))} + {data.inputParams.map((inputParam, index) => ( + + ))} + {data.inputParams.find((param) => param.additionalParams) && ( +
param.additionalParams).length === + data.inputParams.length + data.inputAnchors.length + ? 20 + : 0 + }} + > + +
+ )} + + + + Output + + + - {data.outputAnchors.map((outputAnchor, index) => ( - - ))} -
+ {data.outputAnchors.map((outputAnchor, index) => ( + + ))} + + setShowDialog(false)} > + setShowInfoDialog(false)}> ) } diff --git a/packages/ui/src/views/canvas/CredentialInputHandler.js b/packages/ui/src/views/canvas/CredentialInputHandler.js new file mode 100644 index 000000000..4f8747191 --- /dev/null +++ b/packages/ui/src/views/canvas/CredentialInputHandler.js @@ -0,0 +1,149 @@ +import PropTypes from 'prop-types' +import { useRef, useState } from 'react' + +// material-ui +import { IconButton } from '@mui/material' +import { IconEdit } from '@tabler/icons' + +// project import +import { AsyncDropdown } from 'ui-component/dropdown/AsyncDropdown' +import AddEditCredentialDialog from 'views/credentials/AddEditCredentialDialog' +import CredentialListDialog from 'views/credentials/CredentialListDialog' + +// API +import credentialsApi from 'api/credentials' + +// ===========================|| CredentialInputHandler ||=========================== // + +const CredentialInputHandler = ({ inputParam, data, onSelect, disabled = false }) => { + const ref = useRef(null) + const [credentialId, setCredentialId] = useState(data?.credential ?? 
'') + const [showCredentialListDialog, setShowCredentialListDialog] = useState(false) + const [credentialListDialogProps, setCredentialListDialogProps] = useState({}) + const [showSpecificCredentialDialog, setShowSpecificCredentialDialog] = useState(false) + const [specificCredentialDialogProps, setSpecificCredentialDialogProps] = useState({}) + const [reloadTimestamp, setReloadTimestamp] = useState(Date.now().toString()) + + const editCredential = (credentialId) => { + const dialogProp = { + type: 'EDIT', + cancelButtonName: 'Cancel', + confirmButtonName: 'Save', + credentialId + } + setSpecificCredentialDialogProps(dialogProp) + setShowSpecificCredentialDialog(true) + } + + const addAsyncOption = async () => { + try { + let names = '' + if (inputParam.credentialNames.length > 1) { + names = inputParam.credentialNames.join('&') + } else { + names = inputParam.credentialNames[0] + } + const componentCredentialsResp = await credentialsApi.getSpecificComponentCredential(names) + if (componentCredentialsResp.data) { + if (Array.isArray(componentCredentialsResp.data)) { + const dialogProp = { + title: 'Add New Credential', + componentsCredentials: componentCredentialsResp.data + } + setCredentialListDialogProps(dialogProp) + setShowCredentialListDialog(true) + } else { + const dialogProp = { + type: 'ADD', + cancelButtonName: 'Cancel', + confirmButtonName: 'Add', + credentialComponent: componentCredentialsResp.data + } + setSpecificCredentialDialogProps(dialogProp) + setShowSpecificCredentialDialog(true) + } + } + } catch (error) { + console.error(error) + } + } + + const onConfirmAsyncOption = (selectedCredentialId = '') => { + setCredentialId(selectedCredentialId) + setReloadTimestamp(Date.now().toString()) + setSpecificCredentialDialogProps({}) + setShowSpecificCredentialDialog(false) + onSelect(selectedCredentialId) + } + + const onCredentialSelected = (credentialComponent) => { + setShowCredentialListDialog(false) + const dialogProp = { + type: 'ADD', + 
cancelButtonName: 'Cancel', + confirmButtonName: 'Add', + credentialComponent + } + setSpecificCredentialDialogProps(dialogProp) + setShowSpecificCredentialDialog(true) + } + + return ( +
+ {inputParam && ( + <> + {inputParam.type === 'credential' && ( + <> +
+
+ { + setCredentialId(newValue) + onSelect(newValue) + }} + onCreateNew={() => addAsyncOption(inputParam.name)} + /> + {credentialId && ( + editCredential(credentialId)}> + + + )} +
+ + )} + + )} + {showSpecificCredentialDialog && ( + setShowSpecificCredentialDialog(false)} + onConfirm={onConfirmAsyncOption} + > + )} + {showCredentialListDialog && ( + setShowCredentialListDialog(false)} + onCredentialSelected={onCredentialSelected} + > + )} +
+ ) +} + +CredentialInputHandler.propTypes = { + inputParam: PropTypes.object, + data: PropTypes.object, + onSelect: PropTypes.func, + disabled: PropTypes.bool +} + +export default CredentialInputHandler diff --git a/packages/ui/src/views/canvas/NodeInputHandler.js b/packages/ui/src/views/canvas/NodeInputHandler.js index 1dc656e8f..176df52f8 100644 --- a/packages/ui/src/views/canvas/NodeInputHandler.js +++ b/packages/ui/src/views/canvas/NodeInputHandler.js @@ -5,19 +5,31 @@ import { useSelector } from 'react-redux' // material-ui import { useTheme, styled } from '@mui/material/styles' -import { Box, Typography, Tooltip, IconButton } from '@mui/material' +import { Box, Typography, Tooltip, IconButton, Button } from '@mui/material' import { tooltipClasses } from '@mui/material/Tooltip' -import { IconArrowsMaximize } from '@tabler/icons' +import { IconArrowsMaximize, IconEdit, IconAlertTriangle } from '@tabler/icons' // project import import { Dropdown } from 'ui-component/dropdown/Dropdown' +import { AsyncDropdown } from 'ui-component/dropdown/AsyncDropdown' import { Input } from 'ui-component/input/Input' import { File } from 'ui-component/file/File' import { SwitchInput } from 'ui-component/switch/Switch' import { flowContext } from 'store/context/ReactFlowContext' -import { isValidConnection, getAvailableNodesForVariable } from 'utils/genericHelper' +import { isValidConnection } from 'utils/genericHelper' import { JsonEditorInput } from 'ui-component/json/JsonEditor' import { TooltipWithParser } from 'ui-component/tooltip/TooltipWithParser' +import ToolDialog from 'views/tools/ToolDialog' +import FormatPromptValuesDialog from 'ui-component/dialog/FormatPromptValuesDialog' +import CredentialInputHandler from './CredentialInputHandler' + +// utils +import { getInputVariables } from 'utils/genericHelper' + +// const +import { FLOWISE_CREDENTIAL_ID } from 'store/constant' + +const EDITABLE_TOOLS = ['selectedTool'] const CustomWidthTooltip = styled(({ className, 
...props }) => )({ [`& .${tooltipClasses.tooltip}`]: { @@ -36,6 +48,11 @@ const NodeInputHandler = ({ inputAnchor, inputParam, data, disabled = false, isA const [position, setPosition] = useState(0) const [showExpandDialog, setShowExpandDialog] = useState(false) const [expandDialogProps, setExpandDialogProps] = useState({}) + const [showAsyncOptionDialog, setAsyncOptionEditDialog] = useState('') + const [asyncOptionEditDialogProps, setAsyncOptionEditDialogProps] = useState({}) + const [reloadTimestamp, setReloadTimestamp] = useState(Date.now().toString()) + const [showFormatPromptValuesDialog, setShowFormatPromptValuesDialog] = useState(false) + const [formatPromptValuesDialogProps, setFormatPromptValuesDialogProps] = useState({}) const onExpandDialogClicked = (value, inputParam) => { const dialogProp = { @@ -45,22 +62,75 @@ const NodeInputHandler = ({ inputAnchor, inputParam, data, disabled = false, isA confirmButtonName: 'Save', cancelButtonName: 'Cancel' } - - if (!disabled) { - const nodes = reactFlowInstance.getNodes() - const edges = reactFlowInstance.getEdges() - const nodesForVariable = inputParam.acceptVariable ? getAvailableNodesForVariable(nodes, edges, data.id, inputParam.id) : [] - dialogProp.availableNodesForVariable = nodesForVariable - } setExpandDialogProps(dialogProp) setShowExpandDialog(true) } + const onFormatPromptValuesClicked = (value, inputParam) => { + // Preset values if the field is format prompt values + let inputValue = value + if (inputParam.name === 'promptValues' && !value) { + const obj = {} + const templateValue = + (data.inputs['template'] ?? '') + (data.inputs['systemMessagePrompt'] ?? '') + (data.inputs['humanMessagePrompt'] ?? 
'') + const inputVariables = getInputVariables(templateValue) + for (const inputVariable of inputVariables) { + obj[inputVariable] = '' + } + if (Object.keys(obj).length) inputValue = JSON.stringify(obj) + } + const dialogProp = { + value: inputValue, + inputParam, + nodes: reactFlowInstance.getNodes(), + edges: reactFlowInstance.getEdges(), + nodeId: data.id + } + setFormatPromptValuesDialogProps(dialogProp) + setShowFormatPromptValuesDialog(true) + } + const onExpandDialogSave = (newValue, inputParamName) => { setShowExpandDialog(false) data.inputs[inputParamName] = newValue } + const editAsyncOption = (inputParamName, inputValue) => { + if (inputParamName === 'selectedTool') { + setAsyncOptionEditDialogProps({ + title: 'Edit Tool', + type: 'EDIT', + cancelButtonName: 'Cancel', + confirmButtonName: 'Save', + toolId: inputValue + }) + } + setAsyncOptionEditDialog(inputParamName) + } + + const addAsyncOption = (inputParamName) => { + if (inputParamName === 'selectedTool') { + setAsyncOptionEditDialogProps({ + title: 'Add New Tool', + type: 'ADD', + cancelButtonName: 'Cancel', + confirmButtonName: 'Add' + }) + } + setAsyncOptionEditDialog(inputParamName) + } + + const onConfirmAsyncOption = (selectedOptionId = '') => { + if (!selectedOptionId) { + data.inputs[showAsyncOptionDialog] = '' + } else { + data.inputs[showAsyncOptionDialog] = selectedOptionId + setReloadTimestamp(Date.now().toString()) + } + setAsyncOptionEditDialogProps({}) + setAsyncOptionEditDialog('') + } + useEffect(() => { if (ref.current && ref.current.offsetTop && ref.current.clientHeight) { setPosition(ref.current.offsetTop + ref.current.clientHeight / 2) @@ -95,6 +165,7 @@ const NodeInputHandler = ({ inputAnchor, inputParam, data, disabled = false, isA {inputAnchor.label} {!inputAnchor.optional &&  *} + {inputAnchor.description && } @@ -144,6 +215,33 @@ const NodeInputHandler = ({ inputAnchor, inputParam, data, disabled = false, isA )}
+ {inputParam.warning && ( +
+ + {inputParam.warning} +
+ )} + {inputParam.type === 'credential' && ( + { + data.credential = newValue + data.inputs[FLOWISE_CREDENTIAL_ID] = newValue // in case data.credential is not updated + }} + /> + )} {inputParam.type === 'file' && ( (data.inputs[inputParam.name] = newValue)} @@ -172,12 +271,33 @@ const NodeInputHandler = ({ inputAnchor, inputParam, data, disabled = false, isA /> )} {inputParam.type === 'json' && ( - (data.inputs[inputParam.name] = newValue)} - value={data.inputs[inputParam.name] ?? inputParam.default ?? ''} - isDarkMode={customization.isDarkMode} - /> + <> + {!inputParam?.acceptVariable && ( + (data.inputs[inputParam.name] = newValue)} + value={data.inputs[inputParam.name] ?? inputParam.default ?? ''} + isDarkMode={customization.isDarkMode} + /> + )} + {inputParam?.acceptVariable && ( + <> + + setShowFormatPromptValuesDialog(false)} + onChange={(newValue) => (data.inputs[inputParam.name] = newValue)} + > + + )} + )} {inputParam.type === 'options' && ( (data.inputs[inputParam.name] = newValue)} - value={data.inputs[inputParam.name] ?? inputParam.default ?? 'chose an option'} + value={data.inputs[inputParam.name] ?? inputParam.default ?? 'choose an option'} /> )} + {inputParam.type === 'asyncOptions' && ( + <> + {data.inputParams.length === 1 &&
} +
+ (data.inputs[inputParam.name] = newValue)} + onCreateNew={() => addAsyncOption(inputParam.name)} + /> + {EDITABLE_TOOLS.includes(inputParam.name) && data.inputs[inputParam.name] && ( + editAsyncOption(inputParam.name, data.inputs[inputParam.name])} + > + + + )} +
+ + )} )} + setAsyncOptionEditDialog('')} + onConfirm={onConfirmAsyncOption} + >
) } diff --git a/packages/ui/src/views/canvas/index.js b/packages/ui/src/views/canvas/index.js index f71acbdfe..c0206a9e0 100644 --- a/packages/ui/src/views/canvas/index.js +++ b/packages/ui/src/views/canvas/index.js @@ -12,6 +12,7 @@ import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from 'store/actions' +import { omit, cloneDeep } from 'lodash' // material-ui import { Toolbar, Box, AppBar, Button } from '@mui/material' @@ -23,7 +24,7 @@ import ButtonEdge from './ButtonEdge' import CanvasHeader from './CanvasHeader' import AddNodes from './AddNodes' import ConfirmDialog from 'ui-component/dialog/ConfirmDialog' -import { ChatMessage } from 'views/chatmessage/ChatMessage' +import { ChatPopUp } from 'views/chatmessage/ChatPopUp' import { flowContext } from 'store/context/ReactFlowContext' // API @@ -41,6 +42,9 @@ import { IconX } from '@tabler/icons' import { getUniqueNodeId, initNode, getEdgeLabelName, rearrangeToolsOrdering } from 'utils/genericHelper' import useNotifier from 'utils/useNotifier' +// const +import { FLOWISE_CREDENTIAL_ID } from 'store/constant' + const nodeTypes = { customNode: CanvasNode } const edgeTypes = { buttonedge: ButtonEdge } @@ -185,23 +189,28 @@ const Canvas = () => { const handleSaveFlow = (chatflowName) => { if (reactFlowInstance) { - setNodes((nds) => - nds.map((node) => { - node.data = { - ...node.data, - selected: false - } - return node - }) - ) + const nodes = reactFlowInstance.getNodes().map((node) => { + const nodeData = cloneDeep(node.data) + if (Object.prototype.hasOwnProperty.call(nodeData.inputs, FLOWISE_CREDENTIAL_ID)) { + nodeData.credential = nodeData.inputs[FLOWISE_CREDENTIAL_ID] + nodeData.inputs = omit(nodeData.inputs, [FLOWISE_CREDENTIAL_ID]) + } + node.data = { + ...nodeData, + selected: false + } + return node + }) const rfInstanceObject = reactFlowInstance.toObject() + rfInstanceObject.nodes = nodes const flowData = JSON.stringify(rfInstanceObject) if (!chatflow.id) { const 
newChatflowBody = { name: chatflowName, deployed: false, + isPublic: false, flowData } createNewChatflowApi.request(newChatflowBody) @@ -502,6 +511,7 @@ const Canvas = () => { onConnect={onConnect} onInit={setReactFlowInstance} fitView + deleteKeyCode={canvas.canvasDialogShow ? null : ['Backspace', 'Delete']} minZoom={0.1} > { /> - +
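Before serializing the flow, the reworked `handleSaveFlow` above moves any credential id stashed under the `FLOWISE_CREDENTIAL_ID` key out of `node.data.inputs` and into `node.data.credential`, so credentials are never persisted among ordinary inputs. A dependency-free sketch of that sanitization — the diff uses lodash `cloneDeep`/`omit`, and the constant's string value is assumed here for illustration:

```javascript
// String value assumed for illustration; the diff imports it from store/constant.
const FLOWISE_CREDENTIAL_ID = 'FLOWISE_CREDENTIAL_ID'

// Plain-JS stand-in for the lodash cloneDeep/omit dance in handleSaveFlow:
// lift the credential id out of inputs, drop the key, and clear selection.
function sanitizeNode(node) {
    const nodeData = structuredClone(node.data)
    if (Object.prototype.hasOwnProperty.call(nodeData.inputs, FLOWISE_CREDENTIAL_ID)) {
        nodeData.credential = nodeData.inputs[FLOWISE_CREDENTIAL_ID]
        const { [FLOWISE_CREDENTIAL_ID]: _omitted, ...rest } = nodeData.inputs
        nodeData.inputs = rest
    }
    return { ...node, data: { ...nodeData, selected: false } }
}
```

Cloning before mutation matters: the node objects come from `reactFlowInstance.getNodes()`, and stripping the key in place would also erase the credential from the live canvas state.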
diff --git a/packages/ui/src/views/chatbot/index.js b/packages/ui/src/views/chatbot/index.js new file mode 100644 index 000000000..2b5721e79 --- /dev/null +++ b/packages/ui/src/views/chatbot/index.js @@ -0,0 +1,124 @@ +import { useEffect, useState } from 'react' +import { FullPageChat } from 'flowise-embed-react' +import { useNavigate } from 'react-router-dom' + +// Project import +import LoginDialog from 'ui-component/dialog/LoginDialog' + +// API +import chatflowsApi from 'api/chatflows' + +// Hooks +import useApi from 'hooks/useApi' + +//Const +import { baseURL } from 'store/constant' + +// ==============================|| Chatbot ||============================== // + +const ChatbotFull = () => { + const URLpath = document.location.pathname.toString().split('/') + const chatflowId = URLpath[URLpath.length - 1] === 'chatbot' ? '' : URLpath[URLpath.length - 1] + const navigate = useNavigate() + + const [chatflow, setChatflow] = useState(null) + const [chatbotTheme, setChatbotTheme] = useState({}) + const [loginDialogOpen, setLoginDialogOpen] = useState(false) + const [loginDialogProps, setLoginDialogProps] = useState({}) + const [isLoading, setLoading] = useState(true) + const [chatbotOverrideConfig, setChatbotOverrideConfig] = useState({}) + + const getSpecificChatflowFromPublicApi = useApi(chatflowsApi.getSpecificChatflowFromPublicEndpoint) + const getSpecificChatflowApi = useApi(chatflowsApi.getSpecificChatflow) + + const onLoginClick = (username, password) => { + localStorage.setItem('username', username) + localStorage.setItem('password', password) + navigate(0) + } + + useEffect(() => { + getSpecificChatflowFromPublicApi.request(chatflowId) + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, []) + + useEffect(() => { + if (getSpecificChatflowFromPublicApi.error) { + if (getSpecificChatflowFromPublicApi.error?.response?.status === 401) { + if (localStorage.getItem('username') && localStorage.getItem('password')) { + 
getSpecificChatflowApi.request(chatflowId) + } else { + setLoginDialogProps({ + title: 'Login', + confirmButtonName: 'Login' + }) + setLoginDialogOpen(true) + } + } + } + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [getSpecificChatflowFromPublicApi.error]) + + useEffect(() => { + if (getSpecificChatflowApi.error) { + if (getSpecificChatflowApi.error?.response?.status === 401) { + setLoginDialogProps({ + title: 'Login', + confirmButtonName: 'Login' + }) + setLoginDialogOpen(true) + } + } + }, [getSpecificChatflowApi.error]) + + useEffect(() => { + if (getSpecificChatflowFromPublicApi.data || getSpecificChatflowApi.data) { + const chatflowData = getSpecificChatflowFromPublicApi.data || getSpecificChatflowApi.data + setChatflow(chatflowData) + if (chatflowData.chatbotConfig) { + try { + const parsedConfig = JSON.parse(chatflowData.chatbotConfig) + setChatbotTheme(parsedConfig) + if (parsedConfig.overrideConfig) { + // Generate new sessionId + if (parsedConfig.overrideConfig.generateNewSession) { + parsedConfig.overrideConfig.sessionId = Date.now().toString() + } + setChatbotOverrideConfig(parsedConfig.overrideConfig) + } + } catch (e) { + console.error(e) + setChatbotTheme({}) + setChatbotOverrideConfig({}) + } + } + } + }, [getSpecificChatflowFromPublicApi.data, getSpecificChatflowApi.data]) + + useEffect(() => { + setLoading(getSpecificChatflowFromPublicApi.loading || getSpecificChatflowApi.loading) + }, [getSpecificChatflowFromPublicApi.loading, getSpecificChatflowApi.loading]) + + return ( + <> + {!isLoading ? ( + <> + {!chatflow || chatflow.apikeyid ? ( +

<p>Invalid Chatbot</p>

+ ) : ( + + )} + + + ) : null} + + ) +} + +export default ChatbotFull diff --git a/packages/ui/src/ui-component/dialog/APICodeDialog.js b/packages/ui/src/views/chatflows/APICodeDialog.js similarity index 55% rename from packages/ui/src/ui-component/dialog/APICodeDialog.js rename to packages/ui/src/views/chatflows/APICodeDialog.js index 940015281..0ae8a8f4e 100644 --- a/packages/ui/src/ui-component/dialog/APICodeDialog.js +++ b/packages/ui/src/views/chatflows/APICodeDialog.js @@ -4,11 +4,25 @@ import { useState, useEffect } from 'react' import { useDispatch } from 'react-redux' import PropTypes from 'prop-types' -import { Tabs, Tab, Dialog, DialogContent, DialogTitle, Box } from '@mui/material' +import { + Tabs, + Tab, + Dialog, + DialogContent, + DialogTitle, + Box, + Accordion, + AccordionSummary, + AccordionDetails, + Typography +} from '@mui/material' import { CopyBlock, atomOneDark } from 'react-code-blocks' +import ExpandMoreIcon from '@mui/icons-material/ExpandMore' // Project import import { Dropdown } from 'ui-component/dropdown/Dropdown' +import ShareChatbot from './ShareChatbot' +import EmbedChat from './EmbedChat' // Const import { baseURL } from 'store/constant' @@ -19,6 +33,7 @@ import pythonSVG from 'assets/images/python.svg' import javascriptSVG from 'assets/images/javascript.svg' import cURLSVG from 'assets/images/cURL.svg' import EmbedSVG from 'assets/images/embed.svg' +import ShareChatbotSVG from 'assets/images/sharing.png' // API import apiKeyApi from 'api/apikey' @@ -30,6 +45,8 @@ import useApi from 'hooks/useApi' import { CheckboxInput } from 'ui-component/checkbox/Checkbox' import { TableViewOnly } from 'ui-component/table/Table' +import { IconBulb } from '@tabler/icons' + function TabPanel(props) { const { children, value, index, ...other } = props return ( @@ -79,7 +96,7 @@ const getConfigExamplesForJS = (configData, bodyType) => { else if (config.type === 'number') exampleVal = `1` else if (config.name === 'files') exampleVal = 
`input.files[0]` finalStr += bodyType === 'json' ? `\n "${config.name}": ${exampleVal},` : `formData.append("${config.name}", ${exampleVal})\n` - if (i === loop - 1 && bodyType !== 'json') `formData.append("question", "Hey, how are you?")\n` + if (i === loop - 1 && bodyType !== 'json') finalStr += `formData.append("question", "Hey, how are you?")\n` } return finalStr } @@ -94,7 +111,7 @@ const getConfigExamplesForPython = (configData, bodyType) => { if (config.type === 'string') exampleVal = `"example"` else if (config.type === 'boolean') exampleVal = `true` else if (config.type === 'number') exampleVal = `1` - else if (config.name === 'files') exampleVal = `('example${config.type}', open('example${config.type}', 'rb'))` + else if (config.name === 'files') continue finalStr += bodyType === 'json' ? `\n "${config.name}": ${exampleVal},` : `\n "${config.name}": ${exampleVal},` if (i === loop - 1 && bodyType !== 'json') finalStr += `\n "question": "Hey, how are you?"\n` } @@ -119,30 +136,24 @@ const getConfigExamplesForCurl = (configData, bodyType) => { return finalStr } -const embedCode = (chatflowid) => { - return `` -} - const APICodeDialog = ({ show, dialogProps, onCancel }) => { const portalElement = document.getElementById('portal') const navigate = useNavigate() const dispatch = useDispatch() - const codes = ['Embed', 'Python', 'JavaScript', 'cURL'] + + const codes = ['Embed', 'Python', 'JavaScript', 'cURL', 'Share Chatbot'] const [value, setValue] = useState(0) const [keyOptions, setKeyOptions] = useState([]) const [apiKeys, setAPIKeys] = useState([]) const [chatflowApiKeyId, setChatflowApiKeyId] = useState('') const [selectedApiKey, setSelectedApiKey] = useState({}) const [checkboxVal, setCheckbox] = useState(false) + const [nodeConfig, setNodeConfig] = useState({}) + const [nodeConfigExpanded, setNodeConfigExpanded] = useState({}) const getAllAPIKeysApi = useApi(apiKeyApi.getAllAPIKeys) const updateChatflowApi = useApi(chatflowsApi.updateChatflow) + const 
getIsChatflowStreamingApi = useApi(chatflowsApi.getIsChatflowStreaming) const getConfigApi = useApi(configApi.getConfig) const onCheckBoxChanged = (newVal) => { @@ -165,12 +176,36 @@ const APICodeDialog = ({ show, dialogProps, onCancel }) => { updateChatflowApi.request(dialogProps.chatflowid, updateBody) } + const groupByNodeLabel = (nodes, isFilter = false) => { + const accordianNodes = {} + const result = nodes.reduce(function (r, a) { + r[a.node] = r[a.node] || [] + r[a.node].push(a) + accordianNodes[a.node] = isFilter ? true : false + return r + }, Object.create(null)) + setNodeConfig(result) + setNodeConfigExpanded(accordianNodes) + } + + const handleAccordionChange = (nodeLabel) => (event, isExpanded) => { + const accordianNodes = { ...nodeConfigExpanded } + accordianNodes[nodeLabel] = isExpanded + setNodeConfigExpanded(accordianNodes) + } + useEffect(() => { if (updateChatflowApi.data) { dispatch({ type: SET_CHATFLOW, chatflow: updateChatflowApi.data }) } }, [updateChatflowApi.data, dispatch]) + useEffect(() => { + if (getConfigApi.data) { + groupByNodeLabel(getConfigApi.data) + } + }, [getConfigApi.data]) + const handleChange = (event, newValue) => { setValue(newValue) } @@ -195,7 +230,10 @@ output = query({ "${baseURL}/api/v1/prediction/${dialogProps.chatflowid}", { method: "POST", - body: data + headers: { + "Content-Type": "application/json" + }, + body: JSON.stringify(data) } ); const result = await response.json(); @@ -209,9 +247,8 @@ query({"question": "Hey, how are you?"}).then((response) => { } else if (codeLang === 'cURL') { return `curl ${baseURL}/api/v1/prediction/${dialogProps.chatflowid} \\ -X POST \\ - -d '{"question": "Hey, how are you?"}'` - } else if (codeLang === 'Embed') { - return embedCode(dialogProps.chatflowid) + -d '{"question": "Hey, how are you?"}' \\ + -H "Content-Type: application/json"` } return '' } @@ -236,9 +273,12 @@ output = query({ const response = await fetch( "${baseURL}/api/v1/prediction/${dialogProps.chatflowid}", { - 
headers: { Authorization: "Bearer ${selectedApiKey?.apiKey}" }, + headers: { + Authorization: "Bearer ${selectedApiKey?.apiKey}", + "Content-Type": "application/json" + }, method: "POST", - body: data + body: JSON.stringify(data) } ); const result = await response.json(); @@ -253,9 +293,8 @@ query({"question": "Hey, how are you?"}).then((response) => { return `curl ${baseURL}/api/v1/prediction/${dialogProps.chatflowid} \\ -X POST \\ -d '{"question": "Hey, how are you?"}' \\ + -H "Content-Type: application/json" \\ -H "Authorization: Bearer ${selectedApiKey?.apiKey}"` - } else if (codeLang === 'Embed') { - return embedCode(dialogProps.chatflowid) } return '' } @@ -263,7 +302,7 @@ query({"question": "Hey, how are you?"}).then((response) => { const getLang = (codeLang) => { if (codeLang === 'Python') { return 'python' - } else if (codeLang === 'JavaScript' || codeLang === 'Embed') { + } else if (codeLang === 'JavaScript') { return 'javascript' } else if (codeLang === 'cURL') { return 'bash' @@ -280,6 +319,8 @@ query({"question": "Hey, how are you?"}).then((response) => { return EmbedSVG } else if (codeLang === 'cURL') { return cURLSVG + } else if (codeLang === 'Share Chatbot') { + return ShareChatbotSVG } return pythonSVG } @@ -288,15 +329,20 @@ query({"question": "Hey, how are you?"}).then((response) => { const getConfigCodeWithFormData = (codeLang, configData) => { if (codeLang === 'Python') { + configData = unshiftFiles(configData) + const fileType = configData[0].type return `import requests API_URL = "${baseURL}/api/v1/prediction/${dialogProps.chatflowid}" # use form data to upload files -form_data = {${getConfigExamplesForPython(configData, 'formData')}} +form_data = { + "files": ${`('example${fileType}', open('example${fileType}', 'rb'))`} +} +body_data = {${getConfigExamplesForPython(configData, 'formData')}} def query(form_data): - response = requests.post(API_URL, files=form_data) + response = requests.post(API_URL, files=form_data, data=body_data) return 
response.json() output = query(form_data) @@ -323,7 +369,8 @@ query(formData).then((response) => { ` } else if (codeLang === 'cURL') { return `curl ${baseURL}/api/v1/prediction/${dialogProps.chatflowid} \\ - -X POST \\${getConfigExamplesForCurl(configData, 'formData')}` + -X POST \\${getConfigExamplesForCurl(configData, 'formData')} \\ + -H "Content-Type: multipart/form-data"` } return '' } @@ -332,16 +379,21 @@ query(formData).then((response) => { const getConfigCodeWithFormDataWithAuth = (codeLang, configData) => { if (codeLang === 'Python') { + configData = unshiftFiles(configData) + const fileType = configData[0].type return `import requests API_URL = "${baseURL}/api/v1/prediction/${dialogProps.chatflowid}" headers = {"Authorization": "Bearer ${selectedApiKey?.apiKey}"} # use form data to upload files -form_data = {${getConfigExamplesForPython(configData, 'formData')}} +form_data = { + "files": ${`('example${fileType}', open('example${fileType}', 'rb'))`} +} +body_data = {${getConfigExamplesForPython(configData, 'formData')}} def query(form_data): - response = requests.post(API_URL, headers=headers, files=form_data) + response = requests.post(API_URL, headers=headers, files=form_data, data=body_data) return response.json() output = query(form_data) @@ -370,6 +422,7 @@ query(formData).then((response) => { } else if (codeLang === 'cURL') { return `curl ${baseURL}/api/v1/prediction/${dialogProps.chatflowid} \\ -X POST \\${getConfigExamplesForCurl(configData, 'formData')} \\ + -H "Content-Type: multipart/form-data" \\ -H "Authorization: Bearer ${selectedApiKey?.apiKey}"` } return '' @@ -399,7 +452,10 @@ output = query({ "${baseURL}/api/v1/prediction/${dialogProps.chatflowid}", { method: "POST", - body: data + headers: { + "Content-Type": "application/json" + }, + body: JSON.stringify(data) } ); const result = await response.json(); @@ -417,7 +473,8 @@ query({ } else if (codeLang === 'cURL') { return `curl ${baseURL}/api/v1/prediction/${dialogProps.chatflowid} \\ -X 
POST \\ - -d '{"question": "Hey, how are you?", "overrideConfig": {${getConfigExamplesForCurl(configData, 'json')}}'` + -d '{"question": "Hey, how are you?", "overrideConfig": {${getConfigExamplesForCurl(configData, 'json')}}' \\ + -H "Content-Type: application/json"` } return '' } @@ -446,9 +503,12 @@ output = query({ const response = await fetch( "${baseURL}/api/v1/prediction/${dialogProps.chatflowid}", { - headers: { Authorization: "Bearer ${selectedApiKey?.apiKey}" }, + headers: { + Authorization: "Bearer ${selectedApiKey?.apiKey}", + "Content-Type": "application/json" + }, method: "POST", - body: data + body: JSON.stringify(data) } ); const result = await response.json(); @@ -467,11 +527,38 @@ query({ return `curl ${baseURL}/api/v1/prediction/${dialogProps.chatflowid} \\ -X POST \\ -d '{"question": "Hey, how are you?", "overrideConfig": {${getConfigExamplesForCurl(configData, 'json')}}' \\ + -H "Content-Type: application/json" \\ -H "Authorization: Bearer ${selectedApiKey?.apiKey}"` } return '' } + const getMultiConfigCodeWithFormData = (codeLang) => { + if (codeLang === 'Python') { + return `body_data = { + "openAIApiKey[chatOpenAI_0]": "sk-my-openai-1st-key", + "openAIApiKey[openAIEmbeddings_0]": "sk-my-openai-2nd-key" +}` + } else if (codeLang === 'JavaScript') { + return `formData.append("openAIApiKey[chatOpenAI_0]", "sk-my-openai-1st-key") +formData.append("openAIApiKey[openAIEmbeddings_0]", "sk-my-openai-2nd-key")` + } else if (codeLang === 'cURL') { + return `-F "openAIApiKey[chatOpenAI_0]=sk-my-openai-1st-key" \\ +-F "openAIApiKey[openAIEmbeddings_0]=sk-my-openai-2nd-key" \\` + } + } + + const getMultiConfigCode = () => { + return `{ + "overrideConfig": { + "openAIApiKey": { + "chatOpenAI_0": "sk-my-openai-1st-key", + "openAIEmbeddings_0": "sk-my-openai-2nd-key" + } + } +}` + } + useEffect(() => { if (getAllAPIKeysApi.data) { const options = [ @@ -503,6 +590,7 @@ query({ useEffect(() => { if (show) { getAllAPIKeysApi.request() + 
getIsChatflowStreamingApi.request(dialogProps.chatflowid) } // eslint-disable-next-line react-hooks/exhaustive-deps @@ -537,57 +625,151 @@ query({ ))} - {value !== 0 && ( -
-                    <Dropdown
-                        options={keyOptions}
-                        onSelect={(newValue) => onApiKeySelected(newValue)}
-                        value={dialogProps.chatflowApiKeyId ?? chatflowApiKeyId ?? 'Choose an API key'}
-                    />
-                )}
+                <Dropdown
+                    options={keyOptions}
+                    onSelect={(newValue) => onApiKeySelected(newValue)}
+                    value={dialogProps.chatflowApiKeyId ?? chatflowApiKeyId ?? 'Choose an API key'}
+                />
{codes.map((codeLang, index) => ( - {value === 0 && ( + {(codeLang === 'Embed' || codeLang === 'Share Chatbot') && chatflowApiKeyId && ( <> - - Paste this anywhere in the {``} tag of your html file - -
+                        <p>You cannot use API key while embedding/sharing chatbot.</p>
+                        <p>Please select "No Authorization" from the dropdown at the top right corner.</p>
)} - - {value !== 0 && } - {value !== 0 && checkboxVal && getConfigApi.data && getConfigApi.data.length > 0 && ( + {codeLang === 'Embed' && !chatflowApiKeyId && } + {codeLang !== 'Embed' && codeLang !== 'Share Chatbot' && ( <> - + + {checkboxVal && getConfigApi.data && getConfigApi.data.length > 0 && ( + <> + {Object.keys(nodeConfig) + .sort() + .map((nodeLabel) => ( + + } + aria-controls={`nodes-accordian-${nodeLabel}`} + id={`nodes-accordian-header-${nodeLabel}`} + > +
+                                        <Typography>{nodeLabel}</Typography>
+                                        <Typography>{nodeConfig[nodeLabel][0].nodeId}</Typography>
+                                    </AccordionSummary>
+                                    <AccordionDetails>
+                                        <TableViewOnly rows={nodeConfig[nodeLabel]} />
+                                    </AccordionDetails>
+                                </Accordion>
+ ))} + +
+                            <IconBulb size={30} color='#2d6a4f' />
+                            <span>You can also specify multiple values for a config parameter by specifying the node id</span>
+                            <CopyBlock theme={atomOneDark} text={getMultiConfigCode()} language='json' showLineNumbers={false} wrapLines />
+                        </>
+                    )}
+                    {getIsChatflowStreamingApi.data?.isStreaming && (
+                        <p>
+                            Read <a>here</a> on how to stream response back to application
+                        </p>
+ )} )} + {codeLang === 'Share Chatbot' && !chatflowApiKeyId && ( + + )}
))} diff --git a/packages/ui/src/views/chatflows/EmbedChat.js b/packages/ui/src/views/chatflows/EmbedChat.js new file mode 100644 index 000000000..c6385efba --- /dev/null +++ b/packages/ui/src/views/chatflows/EmbedChat.js @@ -0,0 +1,324 @@ +import { useState } from 'react' +import PropTypes from 'prop-types' + +import { Tabs, Tab, Box } from '@mui/material' +import { CopyBlock, atomOneDark } from 'react-code-blocks' + +// Project import +import { CheckboxInput } from 'ui-component/checkbox/Checkbox' + +// Const +import { baseURL } from 'store/constant' + +function TabPanel(props) { + const { children, value, index, ...other } = props + return ( + + ) +} + +TabPanel.propTypes = { + children: PropTypes.node, + index: PropTypes.number.isRequired, + value: PropTypes.number.isRequired +} + +function a11yProps(index) { + return { + id: `attachment-tab-${index}`, + 'aria-controls': `attachment-tabpanel-${index}` + } +} + +const embedPopupHtmlCode = (chatflowid) => { + return `` +} + +const embedPopupReactCode = (chatflowid) => { + return `import { BubbleChat } from 'flowise-embed-react' + +const App = () => { + return ( + + ); +};` +} + +const embedFullpageHtmlCode = (chatflowid) => { + return ` +` +} + +const embedFullpageReactCode = (chatflowid) => { + return `import { FullPageChat } from "flowise-embed-react" + +const App = () => { + return ( + + ); +};` +} + +const buttonConfig = (isReact = false) => { + return isReact + ? `button: { + backgroundColor: "#3B81F6", + right: 20, + bottom: 20, + size: "medium", + iconColor: "white", + customIconSrc: "https://raw.githubusercontent.com/walkxcode/dashboard-icons/main/svg/google-messages.svg", + }` + : `button: { + backgroundColor: "#3B81F6", + right: 20, + bottom: 20, + size: "medium", + iconColor: "white", + customIconSrc: "https://raw.githubusercontent.com/walkxcode/dashboard-icons/main/svg/google-messages.svg", + }` +} + +const chatwindowConfig = (isReact = false) => { + return isReact + ? 
`chatWindow: { + welcomeMessage: "Hello! This is custom welcome message", + backgroundColor: "#ffffff", + height: 700, + width: 400, + fontSize: 16, + poweredByTextColor: "#303235", + botMessage: { + backgroundColor: "#f7f8ff", + textColor: "#303235", + showAvatar: true, + avatarSrc: "https://raw.githubusercontent.com/zahidkhawaja/langchain-chat-nextjs/main/public/parroticon.png", + }, + userMessage: { + backgroundColor: "#3B81F6", + textColor: "#ffffff", + showAvatar: true, + avatarSrc: "https://raw.githubusercontent.com/zahidkhawaja/langchain-chat-nextjs/main/public/usericon.png", + }, + textInput: { + placeholder: "Type your question", + backgroundColor: "#ffffff", + textColor: "#303235", + sendButtonColor: "#3B81F6", + } + }` + : `chatWindow: { + welcomeMessage: "Hello! This is custom welcome message", + backgroundColor: "#ffffff", + height: 700, + width: 400, + fontSize: 16, + poweredByTextColor: "#303235", + botMessage: { + backgroundColor: "#f7f8ff", + textColor: "#303235", + showAvatar: true, + avatarSrc: "https://raw.githubusercontent.com/zahidkhawaja/langchain-chat-nextjs/main/public/parroticon.png", + }, + userMessage: { + backgroundColor: "#3B81F6", + textColor: "#ffffff", + showAvatar: true, + avatarSrc: "https://raw.githubusercontent.com/zahidkhawaja/langchain-chat-nextjs/main/public/usericon.png", + }, + textInput: { + placeholder: "Type your question", + backgroundColor: "#ffffff", + textColor: "#303235", + sendButtonColor: "#3B81F6", + } + }` +} + +const embedPopupHtmlCodeCustomization = (chatflowid) => { + return `` +} + +const embedPopupReactCodeCustomization = (chatflowid) => { + return `import { BubbleChat } from 'flowise-embed-react' + +const App = () => { + return ( + + ); +};` +} + +const embedFullpageHtmlCodeCustomization = (chatflowid) => { + return ` +` +} + +const embedFullpageReactCodeCustomization = (chatflowid) => { + return `import { FullPageChat } from "flowise-embed-react" + +const App = () => { + return ( + + ); +};` +} + +const 
EmbedChat = ({ chatflowid }) => { + const codes = ['Popup Html', 'Fullpage Html', 'Popup React', 'Fullpage React'] + const [value, setValue] = useState(0) + const [embedChatCheckboxVal, setEmbedChatCheckbox] = useState(false) + + const onCheckBoxEmbedChatChanged = (newVal) => { + setEmbedChatCheckbox(newVal) + } + + const handleChange = (event, newValue) => { + setValue(newValue) + } + + const getCode = (codeLang) => { + switch (codeLang) { + case 'Popup Html': + return embedPopupHtmlCode(chatflowid) + case 'Fullpage Html': + return embedFullpageHtmlCode(chatflowid) + case 'Popup React': + return embedPopupReactCode(chatflowid) + case 'Fullpage React': + return embedFullpageReactCode(chatflowid) + default: + return '' + } + } + + const getCodeCustomization = (codeLang) => { + switch (codeLang) { + case 'Popup Html': + return embedPopupHtmlCodeCustomization(chatflowid) + case 'Fullpage Html': + return embedFullpageHtmlCodeCustomization(chatflowid) + case 'Popup React': + return embedPopupReactCodeCustomization(chatflowid) + case 'Fullpage React': + return embedFullpageReactCodeCustomization(chatflowid) + default: + return '' + } + } + + return ( + <> +
+            <Tabs value={value} onChange={handleChange} aria-label='tabs'>
+                {codes.map((codeLang, index) => (
+                    <Tab key={index} label={codeLang} {...a11yProps(index)} />
+                ))}
+            </Tabs>
{codes.map((codeLang, index) => (
+                <TabPanel key={index} value={value} index={index}>
+                    {(value === 0 || value === 1) && (
+                        <>
+                            <span>Paste this anywhere in the {`<body>`} tag of your html file.</span>

+                            <p>
+                                You can also specify a <a>version</a>: {`https://cdn.jsdelivr.net/npm/flowise-embed@<version>/dist/web.js`}
+                            </p>

+                        </>
+                    )}
+                    <CopyBlock theme={atomOneDark} text={getCode(codeLang)} language='javascript' showLineNumbers={false} wrapLines />
+                    <CheckboxInput label='Show Embed Chat Config' value={embedChatCheckboxVal} onChange={onCheckBoxEmbedChatChanged} />
+                    {embedChatCheckboxVal && (
+                        <CopyBlock theme={atomOneDark} text={getCodeCustomization(codeLang)} language='javascript' showLineNumbers={false} wrapLines />
+                    )}
+                </TabPanel>
+ ))} + + ) +} + +EmbedChat.propTypes = { + chatflowid: PropTypes.string +} + +export default EmbedChat diff --git a/packages/ui/src/views/chatflows/ShareChatbot.js b/packages/ui/src/views/chatflows/ShareChatbot.js new file mode 100644 index 000000000..0f05c28ab --- /dev/null +++ b/packages/ui/src/views/chatflows/ShareChatbot.js @@ -0,0 +1,495 @@ +import { useState } from 'react' +import { useDispatch, useSelector } from 'react-redux' +import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction, SET_CHATFLOW } from 'store/actions' +import { SketchPicker } from 'react-color' +import PropTypes from 'prop-types' + +import { Box, Typography, Button, Switch, OutlinedInput, Popover, Stack, IconButton } from '@mui/material' +import { useTheme } from '@mui/material/styles' + +// Project import +import { StyledButton } from 'ui-component/button/StyledButton' +import { TooltipWithParser } from 'ui-component/tooltip/TooltipWithParser' + +// Icons +import { IconX, IconCopy, IconArrowUpRightCircle } from '@tabler/icons' + +// API +import chatflowsApi from 'api/chatflows' + +// utils +import useNotifier from 'utils/useNotifier' + +// Const +import { baseURL } from 'store/constant' + +const defaultConfig = { + backgroundColor: '#ffffff', + fontSize: 16, + poweredByTextColor: '#303235', + botMessage: { + backgroundColor: '#f7f8ff', + textColor: '#303235' + }, + userMessage: { + backgroundColor: '#3B81F6', + textColor: '#ffffff' + }, + textInput: { + backgroundColor: '#ffffff', + textColor: '#303235', + sendButtonColor: '#3B81F6' + } +} + +const ShareChatbot = ({ isSessionMemory }) => { + const dispatch = useDispatch() + const theme = useTheme() + const chatflow = useSelector((state) => state.canvas.chatflow) + const chatflowid = chatflow.id + const chatbotConfig = chatflow.chatbotConfig ? 
JSON.parse(chatflow.chatbotConfig) : {} + + useNotifier() + + const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args)) + const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args)) + + const [isPublicChatflow, setChatflowIsPublic] = useState(chatflow.isPublic ?? false) + const [generateNewSession, setGenerateNewSession] = useState(chatbotConfig?.generateNewSession ?? false) + + const [welcomeMessage, setWelcomeMessage] = useState(chatbotConfig?.welcomeMessage ?? '') + const [backgroundColor, setBackgroundColor] = useState(chatbotConfig?.backgroundColor ?? defaultConfig.backgroundColor) + const [fontSize, setFontSize] = useState(chatbotConfig?.fontSize ?? defaultConfig.fontSize) + const [poweredByTextColor, setPoweredByTextColor] = useState(chatbotConfig?.poweredByTextColor ?? defaultConfig.poweredByTextColor) + + const [botMessageBackgroundColor, setBotMessageBackgroundColor] = useState( + chatbotConfig?.botMessage?.backgroundColor ?? defaultConfig.botMessage.backgroundColor + ) + const [botMessageTextColor, setBotMessageTextColor] = useState( + chatbotConfig?.botMessage?.textColor ?? defaultConfig.botMessage.textColor + ) + const [botMessageAvatarSrc, setBotMessageAvatarSrc] = useState(chatbotConfig?.botMessage?.avatarSrc ?? '') + const [botMessageShowAvatar, setBotMessageShowAvatar] = useState(chatbotConfig?.botMessage?.showAvatar ?? false) + + const [userMessageBackgroundColor, setUserMessageBackgroundColor] = useState( + chatbotConfig?.userMessage?.backgroundColor ?? defaultConfig.userMessage.backgroundColor + ) + const [userMessageTextColor, setUserMessageTextColor] = useState( + chatbotConfig?.userMessage?.textColor ?? defaultConfig.userMessage.textColor + ) + const [userMessageAvatarSrc, setUserMessageAvatarSrc] = useState(chatbotConfig?.userMessage?.avatarSrc ?? '') + const [userMessageShowAvatar, setUserMessageShowAvatar] = useState(chatbotConfig?.userMessage?.showAvatar ?? 
false) + + const [textInputBackgroundColor, setTextInputBackgroundColor] = useState( + chatbotConfig?.textInput?.backgroundColor ?? defaultConfig.textInput.backgroundColor + ) + const [textInputTextColor, setTextInputTextColor] = useState(chatbotConfig?.textInput?.textColor ?? defaultConfig.textInput.textColor) + const [textInputPlaceholder, setTextInputPlaceholder] = useState(chatbotConfig?.textInput?.placeholder ?? '') + const [textInputSendButtonColor, setTextInputSendButtonColor] = useState( + chatbotConfig?.textInput?.sendButtonColor ?? defaultConfig.textInput.sendButtonColor + ) + + const [colorAnchorEl, setColorAnchorEl] = useState(null) + const [selectedColorConfig, setSelectedColorConfig] = useState('') + const [sketchPickerColor, setSketchPickerColor] = useState('') + const openColorPopOver = Boolean(colorAnchorEl) + + const [copyAnchorEl, setCopyAnchorEl] = useState(null) + const openCopyPopOver = Boolean(copyAnchorEl) + + const formatObj = () => { + const obj = { + botMessage: { + showAvatar: false + }, + userMessage: { + showAvatar: false + }, + textInput: {}, + overrideConfig: {} + } + if (welcomeMessage) obj.welcomeMessage = welcomeMessage + if (backgroundColor) obj.backgroundColor = backgroundColor + if (fontSize) obj.fontSize = fontSize + if (poweredByTextColor) obj.poweredByTextColor = poweredByTextColor + + if (botMessageBackgroundColor) obj.botMessage.backgroundColor = botMessageBackgroundColor + if (botMessageTextColor) obj.botMessage.textColor = botMessageTextColor + if (botMessageAvatarSrc) obj.botMessage.avatarSrc = botMessageAvatarSrc + if (botMessageShowAvatar) obj.botMessage.showAvatar = botMessageShowAvatar + + if (userMessageBackgroundColor) obj.userMessage.backgroundColor = userMessageBackgroundColor + if (userMessageTextColor) obj.userMessage.textColor = userMessageTextColor + if (userMessageAvatarSrc) obj.userMessage.avatarSrc = userMessageAvatarSrc + if (userMessageShowAvatar) obj.userMessage.showAvatar = userMessageShowAvatar + + 
if (textInputBackgroundColor) obj.textInput.backgroundColor = textInputBackgroundColor + if (textInputTextColor) obj.textInput.textColor = textInputTextColor + if (textInputPlaceholder) obj.textInput.placeholder = textInputPlaceholder + if (textInputSendButtonColor) obj.textInput.sendButtonColor = textInputSendButtonColor + + if (isSessionMemory) obj.overrideConfig.generateNewSession = generateNewSession + + return obj + } + + const onSave = async () => { + try { + const saveResp = await chatflowsApi.updateChatflow(chatflowid, { + chatbotConfig: JSON.stringify(formatObj()) + }) + if (saveResp.data) { + enqueueSnackbar({ + message: 'Chatbot Configuration Saved', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + dispatch({ type: SET_CHATFLOW, chatflow: saveResp.data }) + } + } catch (error) { + console.error(error) + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to save Chatbot Configuration: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + } + } + + const onSwitchChange = async (checked) => { + try { + const saveResp = await chatflowsApi.updateChatflow(chatflowid, { isPublic: checked }) + if (saveResp.data) { + enqueueSnackbar({ + message: 'Chatbot Configuration Saved', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + dispatch({ type: SET_CHATFLOW, chatflow: saveResp.data }) + } + } catch (error) { + console.error(error) + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to save Chatbot Configuration: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + } + } + + 
const handleClosePopOver = () => { + setColorAnchorEl(null) + } + + const handleCloseCopyPopOver = () => { + setCopyAnchorEl(null) + } + + const onColorSelected = (hexColor) => { + switch (selectedColorConfig) { + case 'backgroundColor': + setBackgroundColor(hexColor) + break + case 'poweredByTextColor': + setPoweredByTextColor(hexColor) + break + case 'botMessageBackgroundColor': + setBotMessageBackgroundColor(hexColor) + break + case 'botMessageTextColor': + setBotMessageTextColor(hexColor) + break + case 'userMessageBackgroundColor': + setUserMessageBackgroundColor(hexColor) + break + case 'userMessageTextColor': + setUserMessageTextColor(hexColor) + break + case 'textInputBackgroundColor': + setTextInputBackgroundColor(hexColor) + break + case 'textInputTextColor': + setTextInputTextColor(hexColor) + break + case 'textInputSendButtonColor': + setTextInputSendButtonColor(hexColor) + break + } + setSketchPickerColor(hexColor) + } + + const onTextChanged = (value, fieldName) => { + switch (fieldName) { + case 'welcomeMessage': + setWelcomeMessage(value) + break + case 'fontSize': + setFontSize(value) + break + case 'botMessageAvatarSrc': + setBotMessageAvatarSrc(value) + break + case 'userMessageAvatarSrc': + setUserMessageAvatarSrc(value) + break + case 'textInputPlaceholder': + setTextInputPlaceholder(value) + break + } + } + + const onBooleanChanged = (value, fieldName) => { + switch (fieldName) { + case 'botMessageShowAvatar': + setBotMessageShowAvatar(value) + break + case 'userMessageShowAvatar': + setUserMessageShowAvatar(value) + break + case 'generateNewSession': + setGenerateNewSession(value) + break + } + } + + const colorField = (color, fieldName, fieldLabel) => { + return ( + +
+            <Typography>{fieldLabel}</Typography>
+            <Box
+                sx={{ backgroundColor: color, cursor: 'pointer' }}
+                onClick={(event) => {
+                    setSelectedColorConfig(fieldName)
+                    setSketchPickerColor(color ?? '#ffffff')
+                    setColorAnchorEl(event.currentTarget)
+                }}
+            ></Box>
+ ) + } + + const booleanField = (value, fieldName, fieldLabel) => { + return ( + +
+            <Typography>{fieldLabel}</Typography>
+            <Switch
+                checked={value}
+                onChange={(event) => {
+                    onBooleanChanged(event.target.checked, fieldName)
+                }}
+            />
+ ) + } + + const textField = (message, fieldName, fieldLabel, fieldType = 'string', placeholder = '') => { + return ( + +
+            <Typography>{fieldLabel}</Typography>
+            <OutlinedInput
+                type={fieldType}
+                fullWidth
+                value={message}
+                placeholder={placeholder}
+                onChange={(e) => {
+                    onTextChanged(e.target.value, fieldName)
+                }}
+            />
+ ) + } + + return ( + <> + + + {`${baseURL}/chatbot/${chatflowid}`} + + { + navigator.clipboard.writeText(`${baseURL}/chatbot/${chatflowid}`) + setCopyAnchorEl(event.currentTarget) + setTimeout(() => { + handleCloseCopyPopOver() + }, 1500) + }} + > + + + window.open(`${baseURL}/chatbot/${chatflowid}`, '_blank')}> + + +
+                <Switch
+                    checked={isPublicChatflow}
+                    onChange={(event) => {
+                        setChatflowIsPublic(event.target.checked)
+                        onSwitchChange(event.target.checked)
+                    }}
+                />
+                <Typography>Make Public</Typography>
+ + {textField(welcomeMessage, 'welcomeMessage', 'Welcome Message', 'string', 'Hello! This is custom welcome message')} + {colorField(backgroundColor, 'backgroundColor', 'Background Color')} + {textField(fontSize, 'fontSize', 'Font Size', 'number')} + {colorField(poweredByTextColor, 'poweredByTextColor', 'PoweredBy TextColor')} + + {/*BOT Message*/} + + Bot Message + + {colorField(botMessageBackgroundColor, 'botMessageBackgroundColor', 'Background Color')} + {colorField(botMessageTextColor, 'botMessageTextColor', 'Text Color')} + {textField( + botMessageAvatarSrc, + 'botMessageAvatarSrc', + 'Avatar Link', + 'string', + `https://raw.githubusercontent.com/zahidkhawaja/langchain-chat-nextjs/main/public/parroticon.png` + )} + {booleanField(botMessageShowAvatar, 'botMessageShowAvatar', 'Show Avatar')} + + {/*USER Message*/} + + User Message + + {colorField(userMessageBackgroundColor, 'userMessageBackgroundColor', 'Background Color')} + {colorField(userMessageTextColor, 'userMessageTextColor', 'Text Color')} + {textField( + userMessageAvatarSrc, + 'userMessageAvatarSrc', + 'Avatar Link', + 'string', + `https://raw.githubusercontent.com/zahidkhawaja/langchain-chat-nextjs/main/public/usericon.png` + )} + {booleanField(userMessageShowAvatar, 'userMessageShowAvatar', 'Show Avatar')} + + {/*TEXT Input*/} + + Text Input + + {colorField(textInputBackgroundColor, 'textInputBackgroundColor', 'Background Color')} + {colorField(textInputTextColor, 'textInputTextColor', 'Text Color')} + {textField(textInputPlaceholder, 'textInputPlaceholder', 'TextInput Placeholder', 'string', `Type question..`)} + {colorField(textInputSendButtonColor, 'textInputSendButtonColor', 'TextIntput Send Button Color')} + + {/*Session Memory Input*/} + {isSessionMemory && ( + <> + + Session Memory + + {booleanField(generateNewSession, 'generateNewSession', 'Start new session when chatbot link is opened or refreshed')} + + )} + + onSave()}> + Save Changes + + + onColorSelected(color.hex)} /> + + + + Copied! 
+ + + + ) +} + +ShareChatbot.propTypes = { + isSessionMemory: PropTypes.bool +} + +export default ShareChatbot diff --git a/packages/ui/src/views/chatmessage/ChatExpandDialog.js b/packages/ui/src/views/chatmessage/ChatExpandDialog.js new file mode 100644 index 000000000..1b2037a87 --- /dev/null +++ b/packages/ui/src/views/chatmessage/ChatExpandDialog.js @@ -0,0 +1,62 @@ +import { createPortal } from 'react-dom' +import PropTypes from 'prop-types' +import { useSelector } from 'react-redux' + +import { Dialog, DialogContent, DialogTitle, Button } from '@mui/material' +import { ChatMessage } from './ChatMessage' +import { StyledButton } from 'ui-component/button/StyledButton' +import { IconEraser } from '@tabler/icons' + +const ChatExpandDialog = ({ show, dialogProps, onClear, onCancel }) => { + const portalElement = document.getElementById('portal') + const customization = useSelector((state) => state.customization) + + const component = show ? ( + + +
+ {dialogProps.title} +
+ {customization.isDarkMode && ( + } + > + Clear Chat + + )} + {!customization.isDarkMode && ( + + )} +
+
+ + + +
+ ) : null + + return createPortal(component, portalElement) +} + +ChatExpandDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onClear: PropTypes.func, + onCancel: PropTypes.func +} + +export default ChatExpandDialog diff --git a/packages/ui/src/views/chatmessage/ChatMessage.css b/packages/ui/src/views/chatmessage/ChatMessage.css index a29e49ffd..3b006c1d4 100644 --- a/packages/ui/src/views/chatmessage/ChatMessage.css +++ b/packages/ui/src/views/chatmessage/ChatMessage.css @@ -2,6 +2,7 @@ width: 100%; height: 100%; overflow-y: scroll; + overflow-x: hidden; border-radius: 0.5rem; } @@ -75,12 +76,15 @@ } .markdownanswer a { + display: block; + margin-right: 2.5rem; + word-wrap: break-word; color: #16bed7; font-weight: 500; } .markdownanswer code { - color: #15cb19; + color: #0ab126; font-weight: 500; white-space: pre-wrap !important; } @@ -92,6 +96,7 @@ .boticon, .usericon { + margin-top: 1rem; margin-right: 1rem; border-radius: 1rem; } @@ -119,3 +124,13 @@ justify-content: center; align-items: center; } + +.cloud-dialog { + width: 100%; + height: 100vh; + overflow-y: scroll; + border-radius: 0.5rem; + display: flex; + justify-content: center; + align-items: center; +} diff --git a/packages/ui/src/views/chatmessage/ChatMessage.js b/packages/ui/src/views/chatmessage/ChatMessage.js index e50f5bd5e..506f02da6 100644 --- a/packages/ui/src/views/chatmessage/ChatMessage.js +++ b/packages/ui/src/views/chatmessage/ChatMessage.js @@ -1,53 +1,43 @@ -import { useState, useRef, useEffect } from 'react' -import { useDispatch, useSelector } from 'react-redux' -import ReactMarkdown from 'react-markdown' +import { useState, useRef, useEffect, useCallback } from 'react' +import { useSelector } from 'react-redux' import PropTypes from 'prop-types' -import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from 'store/actions' +import socketIOClient from 'socket.io-client' +import { cloneDeep } from 'lodash' +import 
rehypeMathjax from 'rehype-mathjax' +import remarkGfm from 'remark-gfm' +import remarkMath from 'remark-math' -import { - ClickAwayListener, - Paper, - Popper, - CircularProgress, - OutlinedInput, - Divider, - InputAdornment, - IconButton, - Box, - Button -} from '@mui/material' +import { CircularProgress, OutlinedInput, Divider, InputAdornment, IconButton, Box, Chip } from '@mui/material' import { useTheme } from '@mui/material/styles' -import { IconMessage, IconX, IconSend, IconEraser } from '@tabler/icons' +import { IconSend } from '@tabler/icons' // project import -import { StyledFab } from 'ui-component/button/StyledFab' -import MainCard from 'ui-component/cards/MainCard' -import Transitions from 'ui-component/extended/Transitions' +import { CodeBlock } from 'ui-component/markdown/CodeBlock' +import { MemoizedReactMarkdown } from 'ui-component/markdown/MemoizedReactMarkdown' +import SourceDocDialog from 'ui-component/dialog/SourceDocDialog' import './ChatMessage.css' // api import chatmessageApi from 'api/chatmessage' +import chatflowsApi from 'api/chatflows' import predictionApi from 'api/prediction' // Hooks import useApi from 'hooks/useApi' -import useConfirm from 'hooks/useConfirm' -import useNotifier from 'utils/useNotifier' -import { maxScroll } from 'store/constant' +// Const +import { baseURL, maxScroll } from 'store/constant' -export const ChatMessage = ({ chatflowid }) => { +import robotPNG from 'assets/images/robot.png' +import userPNG from 'assets/images/account.png' +import { isValidURL } from 'utils/genericHelper' + +export const ChatMessage = ({ open, chatflowid, isDialog }) => { const theme = useTheme() const customization = useSelector((state) => state.customization) - const { confirm } = useConfirm() - const dispatch = useDispatch() + const ps = useRef() - useNotifier() - const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args)) - const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args)) - - const [open, 
setOpen] = useState(false) const [userInput, setUserInput] = useState('') const [loading, setLoading] = useState(false) const [messages, setMessages] = useState([ @@ -56,85 +46,78 @@ export const ChatMessage = ({ chatflowid }) => { type: 'apiMessage' } ]) + const [socketIOClientId, setSocketIOClientId] = useState('') + const [isChatFlowAvailableToStream, setIsChatFlowAvailableToStream] = useState(false) + const [sourceDialogOpen, setSourceDialogOpen] = useState(false) + const [sourceDialogProps, setSourceDialogProps] = useState({}) const inputRef = useRef(null) - const anchorRef = useRef(null) - const prevOpen = useRef(open) const getChatmessageApi = useApi(chatmessageApi.getChatmessageFromChatflow) + const getIsChatflowStreamingApi = useApi(chatflowsApi.getIsChatflowStreaming) - const handleClose = (event) => { - if (anchorRef.current && anchorRef.current.contains(event.target)) { - return - } - setOpen(false) + const onSourceDialogClick = (data) => { + setSourceDialogProps({ data }) + setSourceDialogOpen(true) } - const handleToggle = () => { - setOpen((prevOpen) => !prevOpen) + const onURLClick = (data) => { + window.open(data, '_blank') } - const clearChat = async () => { - const confirmPayload = { - title: `Clear Chat History`, - description: `Are you sure you want to clear all chat history?`, - confirmButtonName: 'Clear', - cancelButtonName: 'Cancel' - } - const isConfirmed = await confirm(confirmPayload) - - if (isConfirmed) { - try { - await chatmessageApi.deleteChatmessage(chatflowid) - enqueueSnackbar({ - message: 'Succesfully cleared all chat history', - options: { - key: new Date().getTime() + Math.random(), - variant: 'success', - action: (key) => ( - - ) - } - }) - } catch (error) { - const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` - enqueueSnackbar({ - message: errorData, - options: { - key: new Date().getTime() + Math.random(), - variant: 'error', - persist: true, - action: (key) => ( - - ) - } - }) 
+ const removeDuplicateURL = (message) => { + const visitedURLs = [] + const newSourceDocuments = [] + message.sourceDocuments.forEach((source) => { + if (isValidURL(source.metadata.source) && !visitedURLs.includes(source.metadata.source)) { + visitedURLs.push(source.metadata.source) + newSourceDocuments.push(source) + } else if (!isValidURL(source.metadata.source)) { + newSourceDocuments.push(source) } - } + }) + return newSourceDocuments } const scrollToBottom = () => { if (ps.current) { - ps.current.scrollTo({ top: maxScroll, behavior: 'smooth' }) + ps.current.scrollTo({ top: maxScroll }) } } - const addChatMessage = async (message, type) => { + const onChange = useCallback((e) => setUserInput(e.target.value), [setUserInput]) + + const addChatMessage = async (message, type, sourceDocuments) => { try { const newChatMessageBody = { role: type, content: message, chatflowid: chatflowid } + if (sourceDocuments) newChatMessageBody.sourceDocuments = JSON.stringify(sourceDocuments) await chatmessageApi.createNewChatmessage(chatflowid, newChatMessageBody) } catch (error) { console.error(error) } } + const updateLastMessage = (text) => { + setMessages((prevMessages) => { + let allMessages = [...cloneDeep(prevMessages)] + if (allMessages[allMessages.length - 1].type === 'userMessage') return allMessages + allMessages[allMessages.length - 1].message += text + return allMessages + }) + } + + const updateLastMessageSourceDocuments = (sourceDocuments) => { + setMessages((prevMessages) => { + let allMessages = [...cloneDeep(prevMessages)] + if (allMessages[allMessages.length - 1].type === 'userMessage') return allMessages + allMessages[allMessages.length - 1].sourceDocuments = sourceDocuments + return allMessages + }) + } + // Handle errors const handleError = (message = 'Oops! There seems to be an error. 
Please try again.') => { message = message.replace(`Unable to parse JSON response from chat agent.\n\n`, '') @@ -143,7 +126,7 @@ export const ChatMessage = ({ chatflowid }) => { setLoading(false) setUserInput('') setTimeout(() => { - inputRef.current.focus() + inputRef.current?.focus() }, 100) } @@ -157,22 +140,39 @@ export const ChatMessage = ({ chatflowid }) => { setLoading(true) setMessages((prevMessages) => [...prevMessages, { message: userInput, type: 'userMessage' }]) - addChatMessage(userInput, 'userMessage') + // Wait for the first chat message to be saved; it is reused in sendMessageAndGetPrediction + await addChatMessage(userInput, 'userMessage') // Send user question and history to API try { - const response = await predictionApi.sendMessageAndGetPrediction(chatflowid, { + const params = { question: userInput, history: messages.filter((msg) => msg.message !== 'Hi there! How can I help?') - }) + } + if (isChatFlowAvailableToStream) params.socketIOClientId = socketIOClientId + + const response = await predictionApi.sendMessageAndGetPrediction(chatflowid, params) + if (response.data) { const data = response.data - setMessages((prevMessages) => [...prevMessages, { message: data, type: 'apiMessage' }]) - addChatMessage(data, 'apiMessage') + if (typeof data === 'object' && data.text && data.sourceDocuments) { + if (!isChatFlowAvailableToStream) { + setMessages((prevMessages) => [ + ...prevMessages, + { message: data.text, sourceDocuments: data.sourceDocuments, type: 'apiMessage' } + ]) + } + addChatMessage(data.text, 'apiMessage', data.sourceDocuments) + } else { + if (!isChatFlowAvailableToStream) { + setMessages((prevMessages) => [...prevMessages, { message: data, type: 'apiMessage' }]) + } + addChatMessage(data, 'apiMessage') + } setLoading(false) setUserInput('') setTimeout(() => { - inputRef.current.focus() + inputRef.current?.focus() scrollToBottom() }, 100) } @@ -185,7 +185,9 @@ export const ChatMessage = ({ chatflowid }) => { // Prevent 
blank submissions and allow for multiline input const handleEnter = (e) => { - if (e.key === 'Enter' && userInput) { + // Check if IME composition is in progress + const isIMEComposition = e.isComposing || e.keyCode === 229 + if (e.key === 'Enter' && userInput && !isIMEComposition) { if (!e.shiftKey && userInput) { handleSubmit(e) } @@ -199,10 +201,12 @@ export const ChatMessage = ({ chatflowid }) => { if (getChatmessageApi.data) { const loadedMessages = [] for (const message of getChatmessageApi.data) { - loadedMessages.push({ + const obj = { message: message.content, type: message.role - }) + } + if (message.sourceDocuments) obj.sourceDocuments = JSON.parse(message.sourceDocuments) + loadedMessages.push(obj) } setMessages((prevMessages) => [...prevMessages, ...loadedMessages]) } @@ -210,22 +214,49 @@ export const ChatMessage = ({ chatflowid }) => { // eslint-disable-next-line react-hooks/exhaustive-deps }, [getChatmessageApi.data]) + // Get chatflow streaming capability + useEffect(() => { + if (getIsChatflowStreamingApi.data) { + setIsChatFlowAvailableToStream(getIsChatflowStreamingApi.data?.isStreaming ?? 
false) + } + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [getIsChatflowStreamingApi.data]) + // Auto scroll chat to bottom useEffect(() => { scrollToBottom() }, [messages]) useEffect(() => { - if (prevOpen.current === true && open === false) { - anchorRef.current.focus() + if (isDialog && inputRef) { + setTimeout(() => { + inputRef.current?.focus() + }, 100) } + }, [isDialog, inputRef]) + useEffect(() => { + let socket if (open && chatflowid) { getChatmessageApi.request(chatflowid) + getIsChatflowStreamingApi.request(chatflowid) scrollToBottom() - } - prevOpen.current = open + socket = socketIOClient(baseURL) + + socket.on('connect', () => { + setSocketIOClientId(socket.id) + }) + + socket.on('start', () => { + setMessages((prevMessages) => [...prevMessages, { message: '', type: 'apiMessage' }]) + }) + + socket.on('sourceDocuments', updateLastMessageSourceDocuments) + + socket.on('token', updateLastMessage) + } return () => { setUserInput('') @@ -236,6 +267,10 @@ export const ChatMessage = ({ chatflowid }) => { type: 'apiMessage' } ]) + if (socket) { + socket.disconnect() + setSocketIOClientId('') + } } // eslint-disable-next-line react-hooks/exhaustive-deps @@ -243,151 +278,143 @@ export const ChatMessage = ({ chatflowid }) => { return ( <> - - {open ? : } - - {open && ( - - - - )} - - {({ TransitionProps }) => ( - - - - -
-
- {messages.map((message, index) => { - return ( - // The latest message sent by the user will be animated while waiting for a response - +
+ {messages && + messages.map((message, index) => { + return ( + // The latest message sent by the user will be animated while waiting for a response + <> + + {/* Display the correct icon depending on the message type */} + {message.type === 'apiMessage' ? ( + AI + ) : ( + Me + )} +
+
+ {/* Messages are being rendered in Markdown format */} + + ) : ( + + {children} + + ) } - > - {/* Display the correct icon depending on the message type */} - {message.type === 'apiMessage' ? ( - AI + {message.message} + +
+ {message.sourceDocuments && ( +
+ {removeDuplicateURL(message).map((source, index) => { + const URL = isValidURL(source.metadata.source) + return ( + + URL ? onURLClick(source.metadata.source) : onSourceDialogClick(source) + } /> - ) : ( - Me - )} -
- {/* Messages are being rendered in Markdown format */} - {message.message} -
- - ) - })} + ) + })} +
+ )}
-
- -
-
-
- setUserInput(e.target.value)} - endAdornment={ - - - {loading ? ( -
- -
- ) : ( - // Send icon SVG in input field - - )} -
-
- } - /> - -
-
- - - - - )} - +
+ + ) + })} +
+
+ +
+
+
+ + + {loading ? ( +
+ +
+ ) : ( + // Send icon SVG in input field + + )} +
+ + } + /> + +
+
+ setSourceDialogOpen(false)} /> ) } -ChatMessage.propTypes = { chatflowid: PropTypes.string } +ChatMessage.propTypes = { + open: PropTypes.bool, + chatflowid: PropTypes.string, + isDialog: PropTypes.bool +} diff --git a/packages/ui/src/views/chatmessage/ChatPopUp.js b/packages/ui/src/views/chatmessage/ChatPopUp.js new file mode 100644 index 000000000..93050c3a8 --- /dev/null +++ b/packages/ui/src/views/chatmessage/ChatPopUp.js @@ -0,0 +1,208 @@ +import { useState, useRef, useEffect } from 'react' +import { useDispatch } from 'react-redux' +import PropTypes from 'prop-types' + +import { ClickAwayListener, Paper, Popper, Button } from '@mui/material' +import { useTheme } from '@mui/material/styles' +import { IconMessage, IconX, IconEraser, IconArrowsMaximize } from '@tabler/icons' + +// project import +import { StyledFab } from 'ui-component/button/StyledFab' +import MainCard from 'ui-component/cards/MainCard' +import Transitions from 'ui-component/extended/Transitions' +import { ChatMessage } from './ChatMessage' +import ChatExpandDialog from './ChatExpandDialog' + +// api +import chatmessageApi from 'api/chatmessage' + +// Hooks +import useConfirm from 'hooks/useConfirm' +import useNotifier from 'utils/useNotifier' + +// Const +import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from 'store/actions' + +export const ChatPopUp = ({ chatflowid }) => { + const theme = useTheme() + const { confirm } = useConfirm() + const dispatch = useDispatch() + + useNotifier() + const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args)) + const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args)) + + const [open, setOpen] = useState(false) + const [showExpandDialog, setShowExpandDialog] = useState(false) + const [expandDialogProps, setExpandDialogProps] = useState({}) + + const anchorRef = useRef(null) + const prevOpen = useRef(open) + + const handleClose = (event) => { + if (anchorRef.current && 
anchorRef.current.contains(event.target)) { + return + } + setOpen(false) + } + + const handleToggle = () => { + setOpen((prevOpen) => !prevOpen) + } + + const expandChat = () => { + const props = { + open: true, + chatflowid: chatflowid + } + setExpandDialogProps(props) + setShowExpandDialog(true) + } + + const resetChatDialog = () => { + const props = { + ...expandDialogProps, + open: false + } + setExpandDialogProps(props) + setTimeout(() => { + const resetProps = { + ...expandDialogProps, + open: true + } + setExpandDialogProps(resetProps) + }, 500) + } + + const clearChat = async () => { + const confirmPayload = { + title: `Clear Chat History`, + description: `Are you sure you want to clear all chat history?`, + confirmButtonName: 'Clear', + cancelButtonName: 'Cancel' + } + const isConfirmed = await confirm(confirmPayload) + + if (isConfirmed) { + try { + await chatmessageApi.deleteChatmessage(chatflowid) + resetChatDialog() + enqueueSnackbar({ + message: 'Successfully cleared all chat history', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: errorData, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + } + } + } + + useEffect(() => { + if (prevOpen.current === true && open === false) { + anchorRef.current.focus() + } + prevOpen.current = open + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [open, chatflowid]) + + return ( + <> + + {open ? 
: } + + {open && ( + + + + )} + {open && ( + + + + )} + + {({ TransitionProps }) => ( + + + + + + + + + + )} + + setShowExpandDialog(false)} + > + + ) +} + +ChatPopUp.propTypes = { chatflowid: PropTypes.string } diff --git a/packages/ui/src/views/credentials/AddEditCredentialDialog.js b/packages/ui/src/views/credentials/AddEditCredentialDialog.js new file mode 100644 index 000000000..65b72a5fa --- /dev/null +++ b/packages/ui/src/views/credentials/AddEditCredentialDialog.js @@ -0,0 +1,283 @@ +import { createPortal } from 'react-dom' +import PropTypes from 'prop-types' +import { useState, useEffect } from 'react' +import { useDispatch } from 'react-redux' +import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from 'store/actions' +import parser from 'html-react-parser' + +// Material +import { Button, Dialog, DialogActions, DialogContent, DialogTitle, Box, Stack, OutlinedInput, Typography } from '@mui/material' + +// Project imports +import { StyledButton } from 'ui-component/button/StyledButton' +import ConfirmDialog from 'ui-component/dialog/ConfirmDialog' +import CredentialInputHandler from './CredentialInputHandler' + +// Icons +import { IconX } from '@tabler/icons' + +// API +import credentialsApi from 'api/credentials' + +// Hooks +import useApi from 'hooks/useApi' + +// utils +import useNotifier from 'utils/useNotifier' + +// const +import { baseURL } from 'store/constant' +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' + +const AddEditCredentialDialog = ({ show, dialogProps, onCancel, onConfirm }) => { + const portalElement = document.getElementById('portal') + + const dispatch = useDispatch() + + // ==============================|| Snackbar ||============================== // + + useNotifier() + + const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args)) + const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args)) + + const getSpecificCredentialApi = 
useApi(credentialsApi.getSpecificCredential) + const getSpecificComponentCredentialApi = useApi(credentialsApi.getSpecificComponentCredential) + + const [credential, setCredential] = useState({}) + const [name, setName] = useState('') + const [credentialData, setCredentialData] = useState({}) + const [componentCredential, setComponentCredential] = useState({}) + + useEffect(() => { + if (getSpecificCredentialApi.data) { + setCredential(getSpecificCredentialApi.data) + if (getSpecificCredentialApi.data.name) { + setName(getSpecificCredentialApi.data.name) + } + if (getSpecificCredentialApi.data.plainDataObj) { + setCredentialData(getSpecificCredentialApi.data.plainDataObj) + } + getSpecificComponentCredentialApi.request(getSpecificCredentialApi.data.credentialName) + } + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [getSpecificCredentialApi.data]) + + useEffect(() => { + if (getSpecificComponentCredentialApi.data) { + setComponentCredential(getSpecificComponentCredentialApi.data) + } + }, [getSpecificComponentCredentialApi.data]) + + useEffect(() => { + if (dialogProps.type === 'EDIT' && dialogProps.data) { + // When credential dialog is opened from Credentials dashboard + getSpecificCredentialApi.request(dialogProps.data.id) + } else if (dialogProps.type === 'EDIT' && dialogProps.credentialId) { + // When credential dialog is opened from node in canvas + getSpecificCredentialApi.request(dialogProps.credentialId) + } else if (dialogProps.type === 'ADD' && dialogProps.credentialComponent) { + // When credential dialog is to add a new credential + setName('') + setCredential({}) + setCredentialData({}) + setComponentCredential(dialogProps.credentialComponent) + } + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [dialogProps]) + + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + + const 
addNewCredential = async () => { + try { + const obj = { + name, + credentialName: componentCredential.name, + plainDataObj: credentialData + } + const createResp = await credentialsApi.createCredential(obj) + if (createResp.data) { + enqueueSnackbar({ + message: 'New Credential added', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + onConfirm(createResp.data.id) + } + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to add new Credential: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + + const saveCredential = async () => { + try { + const saveResp = await credentialsApi.updateCredential(credential.id, { + name, + credentialName: componentCredential.name, + plainDataObj: credentialData + }) + if (saveResp.data) { + enqueueSnackbar({ + message: 'Credential saved', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + onConfirm(saveResp.data.id) + } + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to save Credential: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + + const component = show ? ( + + + {componentCredential && componentCredential.label && ( +
+
+ {componentCredential.name} +
+ {componentCredential.label} +
+ )} +
+ + {componentCredential && componentCredential.description && ( + +
+ {parser(componentCredential.description)} +
+
+ )} + {componentCredential && componentCredential.label && ( + + + + Credential Name +  * + + + setName(e.target.value)} + /> + + )} + {componentCredential && + componentCredential.inputs && + componentCredential.inputs.map((inputParam, index) => ( + + ))} +
+ + (dialogProps.type === 'ADD' ? addNewCredential() : saveCredential())} + > + {dialogProps.confirmButtonName} + + + +
+ ) : null + + return createPortal(component, portalElement) +} + +AddEditCredentialDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onCancel: PropTypes.func, + onConfirm: PropTypes.func +} + +export default AddEditCredentialDialog diff --git a/packages/ui/src/views/credentials/CredentialInputHandler.js b/packages/ui/src/views/credentials/CredentialInputHandler.js new file mode 100644 index 000000000..30cc57466 --- /dev/null +++ b/packages/ui/src/views/credentials/CredentialInputHandler.js @@ -0,0 +1,137 @@ +import PropTypes from 'prop-types' +import { useRef, useState } from 'react' +import { useSelector } from 'react-redux' + +// material-ui +import { Box, Typography, IconButton } from '@mui/material' +import { IconArrowsMaximize, IconAlertTriangle } from '@tabler/icons' + +// project import +import { Dropdown } from 'ui-component/dropdown/Dropdown' +import { Input } from 'ui-component/input/Input' +import { SwitchInput } from 'ui-component/switch/Switch' +import { JsonEditorInput } from 'ui-component/json/JsonEditor' +import { TooltipWithParser } from 'ui-component/tooltip/TooltipWithParser' + +// ===========================|| NodeInputHandler ||=========================== // + +const CredentialInputHandler = ({ inputParam, data, disabled = false }) => { + const customization = useSelector((state) => state.customization) + const ref = useRef(null) + + const [showExpandDialog, setShowExpandDialog] = useState(false) + const [expandDialogProps, setExpandDialogProps] = useState({}) + + const onExpandDialogClicked = (value, inputParam) => { + const dialogProp = { + value, + inputParam, + disabled, + confirmButtonName: 'Save', + cancelButtonName: 'Cancel' + } + setExpandDialogProps(dialogProp) + setShowExpandDialog(true) + } + + const onExpandDialogSave = (newValue, inputParamName) => { + setShowExpandDialog(false) + data[inputParamName] = newValue + } + + return ( +
+ {inputParam && ( + <> + +
+ + {inputParam.label} + {!inputParam.optional &&  *} + {inputParam.description && } + +
+ {inputParam.type === 'string' && inputParam.rows && ( + onExpandDialogClicked(data[inputParam.name] ?? inputParam.default ?? '', inputParam)} + > + + + )} +
+ {inputParam.warning && ( +
+ + {inputParam.warning} +
+ )} + + {inputParam.type === 'boolean' && ( + (data[inputParam.name] = newValue)} + value={data[inputParam.name] ?? inputParam.default ?? false} + /> + )} + {(inputParam.type === 'string' || inputParam.type === 'password' || inputParam.type === 'number') && ( + (data[inputParam.name] = newValue)} + value={data[inputParam.name] ?? inputParam.default ?? ''} + showDialog={showExpandDialog} + dialogProps={expandDialogProps} + onDialogCancel={() => setShowExpandDialog(false)} + onDialogConfirm={(newValue, inputParamName) => onExpandDialogSave(newValue, inputParamName)} + /> + )} + {inputParam.type === 'json' && ( + (data[inputParam.name] = newValue)} + value={data[inputParam.name] ?? inputParam.default ?? ''} + isDarkMode={customization.isDarkMode} + /> + )} + {inputParam.type === 'options' && ( + (data[inputParam.name] = newValue)} + value={data[inputParam.name] ?? inputParam.default ?? 'choose an option'} + /> + )} +
+ + )} +
+ ) +} + +CredentialInputHandler.propTypes = { + inputAnchor: PropTypes.object, + inputParam: PropTypes.object, + data: PropTypes.object, + disabled: PropTypes.bool +} + +export default CredentialInputHandler diff --git a/packages/ui/src/views/credentials/CredentialListDialog.js b/packages/ui/src/views/credentials/CredentialListDialog.js new file mode 100644 index 000000000..e0a3e08de --- /dev/null +++ b/packages/ui/src/views/credentials/CredentialListDialog.js @@ -0,0 +1,179 @@ +import { useState, useEffect } from 'react' +import { createPortal } from 'react-dom' +import { useSelector, useDispatch } from 'react-redux' +import PropTypes from 'prop-types' +import { + List, + ListItemButton, + ListItem, + ListItemAvatar, + ListItemText, + Dialog, + DialogContent, + DialogTitle, + Box, + OutlinedInput, + InputAdornment +} from '@mui/material' +import { useTheme } from '@mui/material/styles' +import { IconSearch, IconX } from '@tabler/icons' + +// const +import { baseURL } from 'store/constant' +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' + +const CredentialListDialog = ({ show, dialogProps, onCancel, onCredentialSelected }) => { + const portalElement = document.getElementById('portal') + const customization = useSelector((state) => state.customization) + const dispatch = useDispatch() + const theme = useTheme() + const [searchValue, setSearchValue] = useState('') + const [componentsCredentials, setComponentsCredentials] = useState([]) + + const filterSearch = (value) => { + setSearchValue(value) + setTimeout(() => { + if (value) { + const searchData = dialogProps.componentsCredentials.filter((crd) => crd.name.toLowerCase().includes(value.toLowerCase())) + setComponentsCredentials(searchData) + } else if (value === '') { + setComponentsCredentials(dialogProps.componentsCredentials) + } + // scrollTop() + }, 500) + } + + useEffect(() => { + if (dialogProps.componentsCredentials) { + setComponentsCredentials(dialogProps.componentsCredentials) + 
} + }, [dialogProps]) + + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + + const component = show ? ( + + + {dialogProps.title} + + filterSearch(e.target.value)} + placeholder='Search credential' + startAdornment={ + + + + } + endAdornment={ + + filterSearch('')} + style={{ + cursor: 'pointer' + }} + /> + + } + aria-describedby='search-helper-text' + inputProps={{ + 'aria-label': 'weight' + }} + /> + + + + + {[...componentsCredentials].map((componentCredential) => ( +
+ onCredentialSelected(componentCredential)} + sx={{ p: 0, borderRadius: `${customization.borderRadius}px` }} + > + + +
+ {componentCredential.name} +
+
+ +
+
+
+ ))} +
+
+
+ ) : null + + return createPortal(component, portalElement) +} + +CredentialListDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onCancel: PropTypes.func, + onCredentialSelected: PropTypes.func +} + +export default CredentialListDialog diff --git a/packages/ui/src/views/credentials/index.js b/packages/ui/src/views/credentials/index.js new file mode 100644 index 000000000..9db990a7c --- /dev/null +++ b/packages/ui/src/views/credentials/index.js @@ -0,0 +1,276 @@ +import { useEffect, useState } from 'react' +import { useDispatch, useSelector } from 'react-redux' +import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from 'store/actions' +import moment from 'moment' + +// material-ui +import { Button, Box, Stack, Table, TableBody, TableCell, TableContainer, TableHead, TableRow, Paper, IconButton } from '@mui/material' +import { useTheme } from '@mui/material/styles' + +// project imports +import MainCard from 'ui-component/cards/MainCard' +import { StyledButton } from 'ui-component/button/StyledButton' +import CredentialListDialog from './CredentialListDialog' +import ConfirmDialog from 'ui-component/dialog/ConfirmDialog' +import AddEditCredentialDialog from './AddEditCredentialDialog' + +// API +import credentialsApi from 'api/credentials' + +// Hooks +import useApi from 'hooks/useApi' +import useConfirm from 'hooks/useConfirm' + +// utils +import useNotifier from 'utils/useNotifier' + +// Icons +import { IconTrash, IconEdit, IconX, IconPlus } from '@tabler/icons' +import CredentialEmptySVG from 'assets/images/credential_empty.svg' + +// const +import { baseURL } from 'store/constant' +import { SET_COMPONENT_CREDENTIALS } from 'store/actions' + +// ==============================|| Credentials ||============================== // + +const Credentials = () => { + const theme = useTheme() + const customization = useSelector((state) => state.customization) + + const dispatch = useDispatch() + 
useNotifier() + + const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args)) + const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args)) + + const [showCredentialListDialog, setShowCredentialListDialog] = useState(false) + const [credentialListDialogProps, setCredentialListDialogProps] = useState({}) + const [showSpecificCredentialDialog, setShowSpecificCredentialDialog] = useState(false) + const [specificCredentialDialogProps, setSpecificCredentialDialogProps] = useState({}) + const [credentials, setCredentials] = useState([]) + const [componentsCredentials, setComponentsCredentials] = useState([]) + + const { confirm } = useConfirm() + + const getAllCredentialsApi = useApi(credentialsApi.getAllCredentials) + const getAllComponentsCredentialsApi = useApi(credentialsApi.getAllComponentsCredentials) + + const listCredential = () => { + const dialogProp = { + title: 'Add New Credential', + componentsCredentials + } + setCredentialListDialogProps(dialogProp) + setShowCredentialListDialog(true) + } + + const addNew = (credentialComponent) => { + const dialogProp = { + type: 'ADD', + cancelButtonName: 'Cancel', + confirmButtonName: 'Add', + credentialComponent + } + setSpecificCredentialDialogProps(dialogProp) + setShowSpecificCredentialDialog(true) + } + + const edit = (credential) => { + const dialogProp = { + type: 'EDIT', + cancelButtonName: 'Cancel', + confirmButtonName: 'Save', + data: credential + } + setSpecificCredentialDialogProps(dialogProp) + setShowSpecificCredentialDialog(true) + } + + const deleteCredential = async (credential) => { + const confirmPayload = { + title: `Delete`, + description: `Delete credential ${credential.name}?`, + confirmButtonName: 'Delete', + cancelButtonName: 'Cancel' + } + const isConfirmed = await confirm(confirmPayload) + + if (isConfirmed) { + try { + const deleteResp = await credentialsApi.deleteCredential(credential.id) + if (deleteResp.data) { + enqueueSnackbar({ + message: 'Credential 
deleted', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + onConfirm() + } + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to delete Credential: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + } + + const onCredentialSelected = (credentialComponent) => { + setShowCredentialListDialog(false) + addNew(credentialComponent) + } + + const onConfirm = () => { + setShowCredentialListDialog(false) + setShowSpecificCredentialDialog(false) + getAllCredentialsApi.request() + } + + useEffect(() => { + getAllCredentialsApi.request() + getAllComponentsCredentialsApi.request() + // eslint-disable-next-line react-hooks/exhaustive-deps + }, []) + + useEffect(() => { + if (getAllCredentialsApi.data) { + setCredentials(getAllCredentialsApi.data) + } + }, [getAllCredentialsApi.data]) + + useEffect(() => { + if (getAllComponentsCredentialsApi.data) { + setComponentsCredentials(getAllComponentsCredentialsApi.data) + dispatch({ type: SET_COMPONENT_CREDENTIALS, componentsCredentials: getAllComponentsCredentialsApi.data }) + } + }, [getAllComponentsCredentialsApi.data, dispatch]) + + return ( + <> + + +

Credentials 

+ + + } + > + Add Credential + +
+ {credentials.length <= 0 && ( + + + CredentialEmptySVG + +
No Credentials Yet
+
+ )} + {credentials.length > 0 && ( + + + + + Name + Last Updated + Created + + + + + + {credentials.map((credential, index) => ( + + +
+
+ {credential.credentialName} +
+ {credential.name} +
+
+ {moment(credential.updatedDate).format('DD-MMM-YY')} + {moment(credential.createdDate).format('DD-MMM-YY')} + + edit(credential)}> + + + + + deleteCredential(credential)}> + + + +
+ ))} +
+
+
+ )} +
+ setShowCredentialListDialog(false)} + onCredentialSelected={onCredentialSelected} + > + setShowSpecificCredentialDialog(false)} + onConfirm={onConfirm} + > + + + ) +} + +export default Credentials diff --git a/packages/ui/src/views/marketplaces/index.js b/packages/ui/src/views/marketplaces/index.js index ba9eb3d63..a78361616 100644 --- a/packages/ui/src/views/marketplaces/index.js +++ b/packages/ui/src/views/marketplaces/index.js @@ -1,16 +1,19 @@ import { useEffect, useState } from 'react' import { useNavigate } from 'react-router-dom' import { useSelector } from 'react-redux' +import PropTypes from 'prop-types' // material-ui -import { Grid, Box, Stack } from '@mui/material' +import { Grid, Box, Stack, Tabs, Tab } from '@mui/material' import { useTheme } from '@mui/material/styles' +import { IconHierarchy, IconTool } from '@tabler/icons' // project imports import MainCard from 'ui-component/cards/MainCard' import ItemCard from 'ui-component/cards/ItemCard' import { gridSpacing } from 'store/constant' import WorkflowEmptySVG from 'assets/images/workflow_empty.svg' +import ToolDialog from 'views/tools/ToolDialog' // API import marketplacesApi from 'api/marketplaces' @@ -21,6 +24,27 @@ import useApi from 'hooks/useApi' // const import { baseURL } from 'store/constant' +function TabPanel(props) { + const { children, value, index, ...other } = props + return ( + + ) +} + +TabPanel.propTypes = { + children: PropTypes.node, + index: PropTypes.number.isRequired, + value: PropTypes.number.isRequired +} + // ==============================|| Marketplace ||============================== // const Marketplace = () => { @@ -29,29 +53,66 @@ const Marketplace = () => { const theme = useTheme() const customization = useSelector((state) => state.customization) - const [isLoading, setLoading] = useState(true) + const [isChatflowsLoading, setChatflowsLoading] = useState(true) + const [isToolsLoading, setToolsLoading] = useState(true) const [images, setImages] = useState({}) + const 
tabItems = ['Chatflows', 'Tools'] + const [value, setValue] = useState(0) + const [showToolDialog, setShowToolDialog] = useState(false) + const [toolDialogProps, setToolDialogProps] = useState({}) - const getAllMarketplacesApi = useApi(marketplacesApi.getAllMarketplaces) + const getAllChatflowsMarketplacesApi = useApi(marketplacesApi.getAllChatflowsMarketplaces) + const getAllToolsMarketplacesApi = useApi(marketplacesApi.getAllToolsMarketplaces) + + const onUseTemplate = (selectedTool) => { + const dialogProp = { + title: 'Add New Tool', + type: 'IMPORT', + cancelButtonName: 'Cancel', + confirmButtonName: 'Add', + data: selectedTool + } + setToolDialogProps(dialogProp) + setShowToolDialog(true) + } + + const goToTool = (selectedTool) => { + const dialogProp = { + title: selectedTool.templateName, + type: 'TEMPLATE', + data: selectedTool + } + setToolDialogProps(dialogProp) + setShowToolDialog(true) + } const goToCanvas = (selectedChatflow) => { navigate(`/marketplace/${selectedChatflow.id}`, { state: selectedChatflow }) } + const handleChange = (event, newValue) => { + setValue(newValue) + } + useEffect(() => { - getAllMarketplacesApi.request() + getAllChatflowsMarketplacesApi.request() + getAllToolsMarketplacesApi.request() // eslint-disable-next-line react-hooks/exhaustive-deps }, []) useEffect(() => { - setLoading(getAllMarketplacesApi.loading) - }, [getAllMarketplacesApi.loading]) + setChatflowsLoading(getAllChatflowsMarketplacesApi.loading) + }, [getAllChatflowsMarketplacesApi.loading]) useEffect(() => { - if (getAllMarketplacesApi.data) { + setToolsLoading(getAllToolsMarketplacesApi.loading) + }, [getAllToolsMarketplacesApi.loading]) + + useEffect(() => { + if (getAllChatflowsMarketplacesApi.data) { try { - const chatflows = getAllMarketplacesApi.data + const chatflows = getAllChatflowsMarketplacesApi.data const images = {} for (let i = 0; i < chatflows.length; i += 1) { const flowDataStr = chatflows[i].flowData @@ -70,31 +131,83 @@ const Marketplace = () => 
{ console.error(e) } } - }, [getAllMarketplacesApi.data]) + }, [getAllChatflowsMarketplacesApi.data]) return ( - - -

Marketplace

-
- - {!isLoading && - getAllMarketplacesApi.data && - getAllMarketplacesApi.data.map((data, index) => ( - - goToCanvas(data)} data={data} images={images[data.id]} /> - - ))} - - {!isLoading && (!getAllMarketplacesApi.data || getAllMarketplacesApi.data.length === 0) && ( - - - WorkflowEmptySVG - -
No Marketplace Yet
+ <> + + +

Marketplace

- )} -
+ + {tabItems.map((item, index) => ( + : } + iconPosition='start' + label={{item}} + /> + ))} + + {tabItems.map((item, index) => ( + + {item === 'Chatflows' && ( + + {!isChatflowsLoading && + getAllChatflowsMarketplacesApi.data && + getAllChatflowsMarketplacesApi.data.map((data, index) => ( + + goToCanvas(data)} data={data} images={images[data.id]} /> + + ))} + + )} + {item === 'Tools' && ( + + {!isToolsLoading && + getAllToolsMarketplacesApi.data && + getAllToolsMarketplacesApi.data.map((data, index) => ( + + goToTool(data)} /> + + ))} + + )} + + ))} + {!isChatflowsLoading && (!getAllChatflowsMarketplacesApi.data || getAllChatflowsMarketplacesApi.data.length === 0) && ( + + + WorkflowEmptySVG + +
No Marketplace Yet
+
+ )} + {!isToolsLoading && (!getAllToolsMarketplacesApi.data || getAllToolsMarketplacesApi.data.length === 0) && ( + + + WorkflowEmptySVG + +
No Marketplace Yet
+
+ )} +
+ setShowToolDialog(false)} + onConfirm={() => setShowToolDialog(false)} + onUseTemplate={(tool) => onUseTemplate(tool)} + > + ) } diff --git a/packages/ui/src/views/tools/ToolDialog.js b/packages/ui/src/views/tools/ToolDialog.js new file mode 100644 index 000000000..2b67f6d4d --- /dev/null +++ b/packages/ui/src/views/tools/ToolDialog.js @@ -0,0 +1,571 @@ +import { createPortal } from 'react-dom' +import PropTypes from 'prop-types' +import { useState, useEffect, useCallback, useMemo } from 'react' +import { useDispatch, useSelector } from 'react-redux' +import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from 'store/actions' +import { cloneDeep } from 'lodash' + +import { Box, Typography, Button, Dialog, DialogActions, DialogContent, DialogTitle, Stack, OutlinedInput } from '@mui/material' +import { StyledButton } from 'ui-component/button/StyledButton' +import { Grid } from 'ui-component/grid/Grid' +import { TooltipWithParser } from 'ui-component/tooltip/TooltipWithParser' +import { GridActionsCellItem } from '@mui/x-data-grid' +import DeleteIcon from '@mui/icons-material/Delete' +import ConfirmDialog from 'ui-component/dialog/ConfirmDialog' +import { DarkCodeEditor } from 'ui-component/editor/DarkCodeEditor' +import { LightCodeEditor } from 'ui-component/editor/LightCodeEditor' +import { useTheme } from '@mui/material/styles' + +// Icons +import { IconX, IconFileExport } from '@tabler/icons' + +// API +import toolsApi from 'api/tools' + +// Hooks +import useConfirm from 'hooks/useConfirm' +import useApi from 'hooks/useApi' + +// utils +import useNotifier from 'utils/useNotifier' +import { generateRandomGradient } from 'utils/genericHelper' +import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from 'store/actions' + +const exampleAPIFunc = `/* +* You can use any libraries imported in Flowise +* You can use properties specified in Output Schema as variables. 
Ex: Property = userid, Variable = $userid +* Must return a string value at the end of function +*/ + +const fetch = require('node-fetch'); +const url = 'https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&current_weather=true'; +const options = { + method: 'GET', + headers: { + 'Content-Type': 'application/json' + } +}; +try { + const response = await fetch(url, options); + const text = await response.text(); + return text; +} catch (error) { + console.error(error); + return ''; +}` + +const ToolDialog = ({ show, dialogProps, onUseTemplate, onCancel, onConfirm }) => { + const portalElement = document.getElementById('portal') + const theme = useTheme() + + const customization = useSelector((state) => state.customization) + const dispatch = useDispatch() + + // ==============================|| Snackbar ||============================== // + + useNotifier() + const { confirm } = useConfirm() + + const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args)) + const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args)) + + const getSpecificToolApi = useApi(toolsApi.getSpecificTool) + + const [toolId, setToolId] = useState('') + const [toolName, setToolName] = useState('') + const [toolDesc, setToolDesc] = useState('') + const [toolIcon, setToolIcon] = useState('') + const [toolSchema, setToolSchema] = useState([]) + const [toolFunc, setToolFunc] = useState('') + + const deleteItem = useCallback( + (id) => () => { + setTimeout(() => { + setToolSchema((prevRows) => prevRows.filter((row) => row.id !== id)) + }) + }, + [] + ) + + const addNewRow = () => { + setTimeout(() => { + setToolSchema((prevRows) => { + let allRows = [...cloneDeep(prevRows)] + const lastRowId = allRows.length ?
allRows[allRows.length - 1].id + 1 : 1 + allRows.push({ + id: lastRowId, + property: '', + description: '', + type: '', + required: false + }) + return allRows + }) + }) + } + + const onRowUpdate = (newRow) => { + setTimeout(() => { + setToolSchema((prevRows) => { + let allRows = [...cloneDeep(prevRows)] + const indexToUpdate = allRows.findIndex((row) => row.id === newRow.id) + if (indexToUpdate >= 0) { + allRows[indexToUpdate] = { ...newRow } + } + return allRows + }) + }) + } + + const columns = useMemo( + () => [ + { field: 'property', headerName: 'Property', editable: true, flex: 1 }, + { + field: 'type', + headerName: 'Type', + type: 'singleSelect', + valueOptions: ['string', 'number', 'boolean', 'date'], + editable: true, + width: 120 + }, + { field: 'description', headerName: 'Description', editable: true, flex: 1 }, + { field: 'required', headerName: 'Required', type: 'boolean', editable: true, width: 80 }, + { + field: 'actions', + type: 'actions', + width: 80, + getActions: (params) => [ + } label='Delete' onClick={deleteItem(params.id)} /> + ] + } + ], + [deleteItem] + ) + + const formatSchema = (schema) => { + try { + const parsedSchema = JSON.parse(schema) + return parsedSchema.map((sch, index) => { + return { + ...sch, + id: index + } + }) + } catch (e) { + return [] + } + } + + useEffect(() => { + if (show) dispatch({ type: SHOW_CANVAS_DIALOG }) + else dispatch({ type: HIDE_CANVAS_DIALOG }) + return () => dispatch({ type: HIDE_CANVAS_DIALOG }) + }, [show, dispatch]) + + useEffect(() => { + if (getSpecificToolApi.data) { + setToolId(getSpecificToolApi.data.id) + setToolName(getSpecificToolApi.data.name) + setToolDesc(getSpecificToolApi.data.description) + setToolSchema(formatSchema(getSpecificToolApi.data.schema)) + if (getSpecificToolApi.data.func) setToolFunc(getSpecificToolApi.data.func) + else setToolFunc('') + } + }, [getSpecificToolApi.data]) + + useEffect(() => { + if (dialogProps.type === 'EDIT' && dialogProps.data) { + // When tool dialog is 
opened from Tools dashboard + setToolId(dialogProps.data.id) + setToolName(dialogProps.data.name) + setToolDesc(dialogProps.data.description) + setToolIcon(dialogProps.data.iconSrc) + setToolSchema(formatSchema(dialogProps.data.schema)) + if (dialogProps.data.func) setToolFunc(dialogProps.data.func) + else setToolFunc('') + } else if (dialogProps.type === 'EDIT' && dialogProps.toolId) { + // When tool dialog is opened from CustomTool node in canvas + getSpecificToolApi.request(dialogProps.toolId) + } else if (dialogProps.type === 'IMPORT' && dialogProps.data) { + // When tool dialog is to import existing tool + setToolName(dialogProps.data.name) + setToolDesc(dialogProps.data.description) + setToolIcon(dialogProps.data.iconSrc) + setToolSchema(formatSchema(dialogProps.data.schema)) + if (dialogProps.data.func) setToolFunc(dialogProps.data.func) + else setToolFunc('') + } else if (dialogProps.type === 'TEMPLATE' && dialogProps.data) { + // When tool dialog is a template + setToolName(dialogProps.data.name) + setToolDesc(dialogProps.data.description) + setToolIcon(dialogProps.data.iconSrc) + setToolSchema(formatSchema(dialogProps.data.schema)) + if (dialogProps.data.func) setToolFunc(dialogProps.data.func) + else setToolFunc('') + } else if (dialogProps.type === 'ADD') { + // When tool dialog is to add a new tool + setToolId('') + setToolName('') + setToolDesc('') + setToolIcon('') + setToolSchema([]) + setToolFunc('') + } + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, [dialogProps]) + + const useToolTemplate = () => { + onUseTemplate(dialogProps.data) + } + + const exportTool = async () => { + try { + const toolResp = await toolsApi.getSpecificTool(toolId) + if (toolResp.data) { + const toolData = toolResp.data + delete toolData.id + delete toolData.createdDate + delete toolData.updatedDate + let dataStr = JSON.stringify(toolData) + let dataUri = 'data:application/json;charset=utf-8,' + encodeURIComponent(dataStr) + + let exportFileDefaultName = 
`${toolName}-CustomTool.json` + + let linkElement = document.createElement('a') + linkElement.setAttribute('href', dataUri) + linkElement.setAttribute('download', exportFileDefaultName) + linkElement.click() + } + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to export Tool: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + + const addNewTool = async () => { + try { + const obj = { + name: toolName, + description: toolDesc, + color: generateRandomGradient(), + schema: JSON.stringify(toolSchema), + func: toolFunc, + iconSrc: toolIcon + } + const createResp = await toolsApi.createNewTool(obj) + if (createResp.data) { + enqueueSnackbar({ + message: 'New Tool added', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + onConfirm(createResp.data.id) + } + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to add new Tool: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + + const saveTool = async () => { + try { + const saveResp = await toolsApi.updateTool(toolId, { + name: toolName, + description: toolDesc, + schema: JSON.stringify(toolSchema), + func: toolFunc, + iconSrc: toolIcon + }) + if (saveResp.data) { + enqueueSnackbar({ + message: 'Tool saved', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + onConfirm(saveResp.data.id) + } + } catch (error) { + console.error(error) + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ 
+ message: `Failed to save Tool: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + + const deleteTool = async () => { + const confirmPayload = { + title: `Delete Tool`, + description: `Delete tool ${toolName}?`, + confirmButtonName: 'Delete', + cancelButtonName: 'Cancel' + } + const isConfirmed = await confirm(confirmPayload) + + if (isConfirmed) { + try { + const delResp = await toolsApi.deleteTool(toolId) + if (delResp.data) { + enqueueSnackbar({ + message: 'Tool deleted', + options: { + key: new Date().getTime() + Math.random(), + variant: 'success', + action: (key) => ( + + ) + } + }) + onConfirm() + } + } catch (error) { + const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}` + enqueueSnackbar({ + message: `Failed to delete Tool: ${errorData}`, + options: { + key: new Date().getTime() + Math.random(), + variant: 'error', + persist: true, + action: (key) => ( + + ) + } + }) + onCancel() + } + } + } + + const component = show ? ( + + +
+ {dialogProps.title} +
+ {dialogProps.type === 'EDIT' && ( + + )} +
+ + + + + + Tool Name +  * + + + + setToolName(e.target.value)} + /> + + + + + Tool description +  * + + + + setToolDesc(e.target.value)} + /> + + + + Tool Icon Src + + setToolIcon(e.target.value)} + /> + + + + + Output Schema + + + + + + + + + Javascript Function + + + + {dialogProps.type !== 'TEMPLATE' && ( + + )} + {customization.isDarkMode ? ( + setToolFunc(code)} + style={{ + fontSize: '0.875rem', + minHeight: 'calc(100vh - 220px)', + width: '100%', + borderRadius: 5 + }} + /> + ) : ( + setToolFunc(code)} + style={{ + fontSize: '0.875rem', + minHeight: 'calc(100vh - 220px)', + width: '100%', + border: `1px solid ${theme.palette.grey[300]}`, + borderRadius: 5 + }} + /> + )} + + + + {dialogProps.type === 'EDIT' && ( + deleteTool()}> + Delete + + )} + {dialogProps.type === 'TEMPLATE' && ( + + Use Template + + )} + {dialogProps.type !== 'TEMPLATE' && ( + (dialogProps.type === 'ADD' || dialogProps.type === 'IMPORT' ? addNewTool() : saveTool())} + > + {dialogProps.confirmButtonName} + + )} + + +
+ ) : null + + return createPortal(component, portalElement) +} + +ToolDialog.propTypes = { + show: PropTypes.bool, + dialogProps: PropTypes.object, + onUseTemplate: PropTypes.func, + onCancel: PropTypes.func, + onConfirm: PropTypes.func +} + +export default ToolDialog diff --git a/packages/ui/src/views/tools/index.js b/packages/ui/src/views/tools/index.js new file mode 100644 index 000000000..c97ec6609 --- /dev/null +++ b/packages/ui/src/views/tools/index.js @@ -0,0 +1,155 @@ +import { useEffect, useState, useRef } from 'react' +import { useSelector } from 'react-redux' + +// material-ui +import { Grid, Box, Stack, Button } from '@mui/material' +import { useTheme } from '@mui/material/styles' + +// project imports +import MainCard from 'ui-component/cards/MainCard' +import ItemCard from 'ui-component/cards/ItemCard' +import { gridSpacing } from 'store/constant' +import ToolEmptySVG from 'assets/images/tools_empty.svg' +import { StyledButton } from 'ui-component/button/StyledButton' +import ToolDialog from './ToolDialog' + +// API +import toolsApi from 'api/tools' + +// Hooks +import useApi from 'hooks/useApi' + +// icons +import { IconPlus, IconFileImport } from '@tabler/icons' + +// ==============================|| TOOLS ||============================== // + +const Tools = () => { + const theme = useTheme() + const customization = useSelector((state) => state.customization) + + const getAllToolsApi = useApi(toolsApi.getAllTools) + + const [showDialog, setShowDialog] = useState(false) + const [dialogProps, setDialogProps] = useState({}) + + const inputRef = useRef(null) + + const onUploadFile = (file) => { + try { + const dialogProp = { + title: 'Add New Tool', + type: 'IMPORT', + cancelButtonName: 'Cancel', + confirmButtonName: 'Save', + data: JSON.parse(file) + } + setDialogProps(dialogProp) + setShowDialog(true) + } catch (e) { + console.error(e) + } + } + + const handleFileUpload = (e) => { + if (!e.target.files) return + + const file = e.target.files[0] + 
+ const reader = new FileReader() + reader.onload = (evt) => { + if (!evt?.target?.result) { + return + } + const { result } = evt.target + onUploadFile(result) + } + reader.readAsText(file) + } + + const addNew = () => { + const dialogProp = { + title: 'Add New Tool', + type: 'ADD', + cancelButtonName: 'Cancel', + confirmButtonName: 'Add' + } + setDialogProps(dialogProp) + setShowDialog(true) + } + + const edit = (selectedTool) => { + const dialogProp = { + title: 'Edit Tool', + type: 'EDIT', + cancelButtonName: 'Cancel', + confirmButtonName: 'Save', + data: selectedTool + } + setDialogProps(dialogProp) + setShowDialog(true) + } + + const onConfirm = () => { + setShowDialog(false) + getAllToolsApi.request() + } + + useEffect(() => { + getAllToolsApi.request() + + // eslint-disable-next-line react-hooks/exhaustive-deps + }, []) + + return ( + <> + + +

Tools

+ + + + + handleFileUpload(e)} /> + }> + Create + + + +
+ + {!getAllToolsApi.loading && + getAllToolsApi.data && + getAllToolsApi.data.map((data, index) => ( + + edit(data)} /> + + ))} + + {!getAllToolsApi.loading && (!getAllToolsApi.data || getAllToolsApi.data.length === 0) && ( + + + ToolEmptySVG + +
No Tools Created Yet
+
+ )} +
+ setShowDialog(false)} + onConfirm={onConfirm} + > + + ) +} + +export default Tools