Feature/DocumentStore (#2106)
* datasource: initial commit
* datasource: datasource details and chunks
* datasource: Document Store Node
* more changes
* Document Store - Base functionality
* Document Store Loader Component
* Document Store Loader Component
* before merging the modularity PR
* after merging the modularity PR
* preview mode
* initial draft PR
* fixes
* minor updates and fixes
* preview with loader and splitter
* preview with credential
* show stored chunks
* preview update...
* edit config
* save, preview and other changes
* save, preview and other changes
* save, process and other changes
* save, process and other changes
* alpha1 - for internal testing
* rerouting urls
* bug fix on new leader create
* pagination support for chunks
* delete document store
* Update pnpm-lock.yaml
* doc store card view
* Update store files to use updated storage functions, Document Store Table View and other changes
* ui changes
* add expanded chunk dialog, improve ui
* change throw Error to InternalError
* Bug Fixes and removal of subFolder, adding of view chunks for store
* lint fixes
* merge changes
* DocumentStoreStatus component
* ui changes for doc store
* add remove metadata key field, add custom document loader
* add chatflows used doc store chips
* add types/interfaces to DocumentStore Services
* document loader list dialog title bar color change
* update interfaces
* Whereused Chatflow Name and Added chunkNo to retain order of created chunks.
* use typeorm order chunkNo, ui changes

---------

Co-authored-by: Henry <hzj94@hotmail.com>
Co-authored-by: Henry Heng <henryheng@flowiseai.com>
parent af4e28aa91
commit 40e36d1b39
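This PR threads a shared IDocument shape through every document loader and persists processed chunks with a chunk number so their original order survives a round trip through the store (per the "Added chunkNo to retain order of created chunks" and "use typeorm order chunkNo" entries above). The sketch below is an illustrative reconstruction of those shapes inferred from the diff, not the actual Interface or entity definitions.

```typescript
// Illustrative reconstruction only — inferred from this diff, not copied from it.
interface IDocument<Metadata extends Record<string, any> = Record<string, any>> {
    pageContent: string
    metadata: Metadata
}

// Rough shape of a stored chunk as the new Document Store node reads it back:
// chunks are looked up by storeId, ordered by chunkNo elsewhere in the services
// (via a TypeORM `order` clause, per the commit message), and metadata is kept
// as a JSON string until it is parsed back into a Document.
interface DocumentStoreFileChunkShape {
    storeId: string
    chunkNo: number
    pageContent: string
    metadata: string
}
```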
@@ -16,7 +16,7 @@
 <a href="https://github.com/FlowiseAI/Flowise">
 <img width="100%" src="https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true"></a>

-## ⚡クイックスタート
+## ⚡ クイックスタート

 [NodeJS](https://nodejs.org/en/download) >= 18.15.0 をダウンロードしてインストール

@@ -67,7 +67,7 @@

 ## 👨💻 開発者向け

-Flowise には、3つの異なるモジュールが1つの mono リポジトリにあります。
+Flowise には、3 つの異なるモジュールが 1 つの mono リポジトリにあります。

 - `server`: API ロジックを提供する Node バックエンド
 - `ui`: React フロントエンド
README-KR.md (28 changed lines)
@@ -16,7 +16,7 @@ English | [中文](./README-ZH.md) | [日本語](./README-JA.md) | 한국어
 <a href="https://github.com/FlowiseAI/Flowise">
 <img width="100%" src="https://github.com/FlowiseAI/Flowise/blob/main/images/flowise.gif?raw=true"></a>

-## ⚡빠른 시작 가이드
+## ⚡빠른 시작 가이드

 18.15.0 버전 이상의 [NodeJS](https://nodejs.org/en/download) 다운로드 및 설치

@@ -43,10 +43,10 @@ English | [中文](./README-ZH.md) | [日本語](./README-JA.md) | 한국어
 ### 도커 컴포즈 활용

 1. 프로젝트의 최상위(root) 디렉토리에 있는 `docker` 폴더로 이동하세요.
-2. `.env.example` 파일을 복사한 후, 같은 경로에 붙여넣기 한 다음, `.env`로 이름을 변경합니다.
+2. `.env.example` 파일을 복사한 후, 같은 경로에 붙여넣기 한 다음, `.env`로 이름을 변경합니다.
 3. `docker-compose up -d` 실행
 4. [http://localhost:3000](http://localhost:3000) URL 열기
-5. `docker-compose stop` 명령어를 통해 컨테이너를 종료시킬 수 있습니다.
+5. `docker-compose stop` 명령어를 통해 컨테이너를 종료시킬 수 있습니다.

 ### 도커 이미지 활용

@@ -70,7 +70,7 @@ English | [中文](./README-ZH.md) | [日本語](./README-JA.md) | 한국어
 Flowise는 단일 리포지토리에 3개의 서로 다른 모듈이 있습니다.

 - `server`: API 로직을 제공하는 노드 백엔드
-- `ui`: 리액트 프론트엔드
+- `ui`: 리액트 프론트엔드
 - `components`: 서드파티 노드 통합을 위한 컴포넌트

 ### 사전 설치 요건

@@ -112,11 +112,11 @@ Flowise는 단일 리포지토리에 3개의 서로 다른 모듈이 있습니
 pnpm start
 ```

-이제 [http://localhost:3000](http://localhost:3000)에서 애플리케이션에 접속할 수 있습니다.
+이제 [http://localhost:3000](http://localhost:3000)에서 애플리케이션에 접속할 수 있습니다.

 6. 개발 환경에서 빌드할 경우:

-- `packages/ui`경로에 `.env` 파일을 생성하고 `VITE_PORT`(`.env.example` 참조)를 지정합니다.
+- `packages/ui`경로에 `.env` 파일을 생성하고 `VITE_PORT`(`.env.example` 참조)를 지정합니다.
 - `packages/server`경로에 `.env` 파일을 생성하고 `PORT`(`.env.example` 참조)를 지정합니다.
 - 실행하기

@@ -143,7 +143,7 @@ Flowise는 인스턴스 구성을 위한 다양한 환경 변수를 지원합니

 [Flowise 문서](https://docs.flowiseai.com/)

-## 🌐 자체 호스팅 하기
+## 🌐 자체 호스팅 하기

 기존 인프라 환경에서 Flowise를 자체 호스팅으로 배포하세요. 다양한 배포 [deployments](https://docs.flowiseai.com/configuration/deployment) 방법을 지원합니다.

@@ -182,23 +182,23 @@ Flowise는 인스턴스 구성을 위한 다양한 환경 변수를 지원합니

 ## 💻 클라우드 호스팅 서비스

-곧 출시될 예정입니다.
+곧 출시될 예정입니다.

-## 🙋 기술 지원
+## 🙋 기술 지원

-질문, 버그 리포팅, 새로운 기능 요청 등은 [discussion](https://github.com/FlowiseAI/Flowise/discussions) 섹션에서 자유롭게 이야기 해주세요.
+질문, 버그 리포팅, 새로운 기능 요청 등은 [discussion](https://github.com/FlowiseAI/Flowise/discussions) 섹션에서 자유롭게 이야기 해주세요.

 ## 🙌 오픈소스 활동에 기여하기

-다음과 같은 멋진 기여자들(contributors)에게 감사드립니다.
+다음과 같은 멋진 기여자들(contributors)에게 감사드립니다.

 <a href="https://github.com/FlowiseAI/Flowise/graphs/contributors">
 <img src="https://contrib.rocks/image?repo=FlowiseAI/Flowise" />
 </a>

-[contributing guide](CONTRIBUTING.md)를 살펴보세요. 디스코드 [Discord](https://discord.gg/jbaHfsRVBW) 채널에서도 이슈나 질의응답을 진행하실 수 있습니다.
+[contributing guide](CONTRIBUTING.md)를 살펴보세요. 디스코드 [Discord](https://discord.gg/jbaHfsRVBW) 채널에서도 이슈나 질의응답을 진행하실 수 있습니다.
 [](https://star-history.com/#FlowiseAI/Flowise&Date)

-## 📄 라이센스
+## 📄 라이센스

-본 리포지토리의 소스코드는 [Apache License Version 2.0](LICENSE.md) 라이센스가 적용됩니다.
+본 리포지토리의 소스코드는 [Apache License Version 2.0](LICENSE.md) 라이센스가 적용됩니다.
@@ -1,8 +1,9 @@
 import axios, { AxiosRequestConfig } from 'axios'
+import { omit } from 'lodash'
 import { Document } from '@langchain/core/documents'
 import { TextSplitter } from 'langchain/text_splitter'
 import { BaseDocumentLoader } from 'langchain/document_loaders/base'
-import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
+import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'

 class API_DocumentLoaders implements INode {
     label: string

@@ -66,6 +67,25 @@ class API_DocumentLoaders implements INode {
                     'JSON body for the POST request. If not specified, agent will try to figure out itself from AIPlugin if provided',
                 additionalParams: true,
                 optional: true
             },
+            {
+                label: 'Additional Metadata',
+                name: 'metadata',
+                type: 'json',
+                description: 'Additional metadata to be added to the extracted documents',
+                optional: true,
+                additionalParams: true
+            },
+            {
+                label: 'Omit Metadata Keys',
+                name: 'omitMetadataKeys',
+                type: 'string',
+                rows: 4,
+                description:
+                    'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
+                placeholder: 'key1, key2, key3.nestedKey1',
+                optional: true,
+                additionalParams: true
+            }
         ]
     }

@@ -76,6 +96,12 @@ class API_DocumentLoaders implements INode {
         const method = nodeData.inputs?.method as string
         const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
         const metadata = nodeData.inputs?.metadata
+        const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
+
+        let omitMetadataKeys: string[] = []
+        if (_omitMetadataKeys) {
+            omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
+        }

         const options: ApiLoaderParams = {
             url,

@@ -94,7 +120,7 @@ class API_DocumentLoaders implements INode {

         const loader = new ApiLoader(options)

-        let docs = []
+        let docs: IDocument[] = []

         if (textSplitter) {
             docs = await loader.loadAndSplit(textSplitter)

@@ -104,18 +130,26 @@ class API_DocumentLoaders implements INode {

         if (metadata) {
             const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
-            let finaldocs = []
-            for (const doc of docs) {
-                const newdoc = {
-                    ...doc,
-                    metadata: {
-                        ...doc.metadata,
-                        ...parsedMetadata
-                    }
-                }
-                finaldocs.push(newdoc)
-            }
-            return finaldocs
+            docs = docs.map((doc) => ({
+                ...doc,
+                metadata: omit(
+                    {
+                        ...doc.metadata,
+                        ...parsedMetadata
+                    },
+                    omitMetadataKeys
+                )
+            }))
+        } else {
+            docs = docs.map((doc) => ({
+                ...doc,
+                metadata: omit(
+                    {
+                        ...doc.metadata
+                    },
+                    omitMetadataKeys
+                )
+            }))
         }

         return docs

@@ -146,7 +180,7 @@ class ApiLoader extends BaseDocumentLoader {
         this.method = method
     }

-    public async load(): Promise<Document[]> {
+    public async load(): Promise<IDocument[]> {
         if (this.method === 'POST') {
             return this.executePostRequest(this.url, this.headers, this.body)
         } else {

@@ -154,7 +188,7 @@ class ApiLoader extends BaseDocumentLoader {
         }
     }

-    protected async executeGetRequest(url: string, headers?: ICommonObject): Promise<Document[]> {
+    protected async executeGetRequest(url: string, headers?: ICommonObject): Promise<IDocument[]> {
         try {
             const config: AxiosRequestConfig = {}
             if (headers) {

@@ -174,7 +208,7 @@ class ApiLoader extends BaseDocumentLoader {
         }
     }

-    protected async executePostRequest(url: string, headers?: ICommonObject, body?: ICommonObject): Promise<Document[]> {
+    protected async executePostRequest(url: string, headers?: ICommonObject, body?: ICommonObject): Promise<IDocument[]> {
         try {
             const config: AxiosRequestConfig = {}
             if (headers) {
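Every loader touched by this PR gains the same two inputs (Additional Metadata, Omit Metadata Keys) and the same lodash `omit` post-processing shown in the hunks above. As a standalone illustration of that behaviour, the sketch below factors the repeated inline logic into a helper; `applyMetadataTransforms` and the local `Doc` type are illustrative names, not identifiers from the codebase.

```typescript
import { omit } from 'lodash'

// Mirrors the IDocument shape the loaders return.
interface Doc {
    pageContent: string
    metadata: Record<string, any>
}

// Merge user-supplied metadata into each document, then drop the
// comma-separated keys listed in "Omit Metadata Keys".
const applyMetadataTransforms = (docs: Doc[], metadata: unknown, _omitMetadataKeys: string): Doc[] => {
    const omitMetadataKeys = _omitMetadataKeys ? _omitMetadataKeys.split(',').map((key) => key.trim()) : []
    const parsedMetadata = metadata ? (typeof metadata === 'object' ? metadata : JSON.parse(metadata as string)) : {}

    return docs.map((doc) => ({
        ...doc,
        // lodash omit also accepts nested paths such as 'key3.nestedKey1'
        metadata: omit({ ...doc.metadata, ...parsedMetadata }, omitMetadataKeys)
    }))
}

// Example: add a `source` field to every document and drop the default `url` key.
// applyMetadataTransforms(docs, { source: 'api' }, 'url')
```

In the committed code this logic is repeated inline in each loader's init rather than shared, so the behaviour is identical across loaders but duplicated.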
@ -1,9 +1,10 @@
|
|||
import axios from 'axios'
|
||||
import { omit } from 'lodash'
|
||||
import { Document } from '@langchain/core/documents'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { BaseDocumentLoader } from 'langchain/document_loaders/base'
|
||||
import { getCredentialData, getCredentialParam } from '../../../src/utils'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { IDocument, ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
|
||||
class Airtable_DocumentLoaders implements INode {
|
||||
label: string
|
||||
|
|
@ -93,9 +94,21 @@ class Airtable_DocumentLoaders implements INode {
|
|||
description: 'Number of results to return. Ignored when Return All is enabled.'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -111,6 +124,12 @@ class Airtable_DocumentLoaders implements INode {
|
|||
const limit = nodeData.inputs?.limit as string
|
||||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const accessToken = getCredentialParam('accessToken', credentialData, nodeData)
|
||||
|
|
@ -131,7 +150,7 @@ class Airtable_DocumentLoaders implements INode {
|
|||
throw new Error('Base ID and Table ID must be provided.')
|
||||
}
|
||||
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
|
||||
if (textSplitter) {
|
||||
docs = await loader.loadAndSplit(textSplitter)
|
||||
|
|
@ -141,18 +160,26 @@ class Airtable_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
@ -213,7 +240,7 @@ class AirtableLoader extends BaseDocumentLoader {
|
|||
this.returnAll = returnAll
|
||||
}
|
||||
|
||||
public async load(): Promise<Document[]> {
|
||||
public async load(): Promise<IDocument[]> {
|
||||
if (this.returnAll) {
|
||||
return this.loadAll()
|
||||
}
|
||||
|
|
@ -238,7 +265,7 @@ class AirtableLoader extends BaseDocumentLoader {
|
|||
}
|
||||
}
|
||||
|
||||
private createDocumentFromPage(page: AirtableLoaderPage): Document {
|
||||
private createDocumentFromPage(page: AirtableLoaderPage): IDocument {
|
||||
// Generate the URL
|
||||
const pageUrl = `https://api.airtable.com/v0/${this.baseId}/${this.tableId}/${page.id}`
|
||||
|
||||
|
|
@ -251,7 +278,7 @@ class AirtableLoader extends BaseDocumentLoader {
|
|||
})
|
||||
}
|
||||
|
||||
private async loadLimit(): Promise<Document[]> {
|
||||
private async loadLimit(): Promise<IDocument[]> {
|
||||
let data: AirtableLoaderRequest = {
|
||||
maxRecords: this.limit,
|
||||
view: this.viewId
|
||||
|
|
@ -282,7 +309,7 @@ class AirtableLoader extends BaseDocumentLoader {
|
|||
return returnPages.map((page) => this.createDocumentFromPage(page))
|
||||
}
|
||||
|
||||
private async loadAll(): Promise<Document[]> {
|
||||
private async loadAll(): Promise<IDocument[]> {
|
||||
let data: AirtableLoaderRequest = {
|
||||
view: this.viewId
|
||||
}
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { INode, INodeData, INodeParams, ICommonObject } from '../../../src/Interface'
|
||||
import { getCredentialData, getCredentialParam } from '../../../src/utils'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
|
|
@ -92,9 +93,21 @@ class ApifyWebsiteContentCrawler_DocumentLoaders implements INode {
|
|||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -110,6 +123,12 @@ class ApifyWebsiteContentCrawler_DocumentLoaders implements INode {
|
|||
async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> {
|
||||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
// Get input options and merge with additional input
|
||||
const urls = nodeData.inputs?.urls as string
|
||||
|
|
@ -153,18 +172,26 @@ class ApifyWebsiteContentCrawler_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,10 +1,11 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { omit } from 'lodash'
|
||||
import { CheerioWebBaseLoader, WebBaseLoaderParams } from 'langchain/document_loaders/web/cheerio'
|
||||
import { test } from 'linkifyjs'
|
||||
import { parse } from 'css-what'
|
||||
import { webCrawl, xmlScrape } from '../../../src'
|
||||
import { SelectorType } from 'cheerio'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
|
||||
class Cheerio_DocumentLoaders implements INode {
|
||||
label: string
|
||||
|
|
@ -55,6 +56,7 @@ class Cheerio_DocumentLoaders implements INode {
|
|||
description: 'Scrape relative links from XML sitemap URL'
|
||||
}
|
||||
],
|
||||
default: 'webCrawl',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
|
|
@ -78,9 +80,21 @@ class Cheerio_DocumentLoaders implements INode {
|
|||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -94,6 +108,13 @@ class Cheerio_DocumentLoaders implements INode {
|
|||
const selectedLinks = nodeData.inputs?.selectedLinks as string[]
|
||||
let limit = parseInt(nodeData.inputs?.limit as string)
|
||||
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let url = nodeData.inputs?.url as string
|
||||
url = url.trim()
|
||||
if (!test(url)) {
|
||||
|
|
@ -123,7 +144,8 @@ class Cheerio_DocumentLoaders implements INode {
|
|||
}
|
||||
}
|
||||
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
|
||||
if (relativeLinksMethod) {
|
||||
if (process.env.DEBUG === 'true') options.logger.info(`Start ${relativeLinksMethod}`)
|
||||
// if limit is 0 we don't want it to default to 10 so we check explicitly for null or undefined
|
||||
|
|
@ -154,18 +176,26 @@ class Cheerio_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { ConfluencePagesLoader, ConfluencePagesLoaderParams } from 'langchain/document_loaders/web/confluence'
|
||||
|
|
@ -59,9 +60,21 @@ class Confluence_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -74,6 +87,12 @@ class Confluence_DocumentLoaders implements INode {
|
|||
const limit = nodeData.inputs?.limit as number
|
||||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const accessToken = getCredentialParam('accessToken', credentialData, nodeData)
|
||||
|
|
@ -107,18 +126,26 @@ class Confluence_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { CSVLoader } from 'langchain/document_loaders/fs/csv'
|
||||
import { getFileFromStorage } from '../../../src'
|
||||
|
|
@ -45,9 +46,21 @@ class Csv_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -59,8 +72,14 @@ class Csv_DocumentLoaders implements INode {
|
|||
const csvFileBase64 = nodeData.inputs?.csvFile as string
|
||||
const columnName = nodeData.inputs?.columnName as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let alldocs = []
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
if (csvFileBase64.startsWith('FILE-STORAGE::')) {
|
||||
|
|
@ -78,11 +97,9 @@ class Csv_DocumentLoaders implements INode {
|
|||
const loader = new CSVLoader(blob, columnName.trim().length === 0 ? undefined : columnName.trim())
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
|
@ -100,32 +117,38 @@ class Csv_DocumentLoaders implements INode {
|
|||
const loader = new CSVLoader(blob, columnName.trim().length === 0 ? undefined : columnName.trim())
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return alldocs
|
||||
return docs
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -0,0 +1,163 @@
|
|||
import { ICommonObject, IDatabaseEntity, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface'
|
||||
import { NodeVM } from 'vm2'
|
||||
import { DataSource } from 'typeorm'
|
||||
import { availableDependencies, defaultAllowBuiltInDep, getVars, handleEscapeCharacters, prepareSandboxVars } from '../../../src/utils'
|
||||
|
||||
class CustomDocumentLoader_DocumentLoaders implements INode {
|
||||
label: string
|
||||
name: string
|
||||
version: number
|
||||
description: string
|
||||
type: string
|
||||
icon: string
|
||||
category: string
|
||||
badge: string
|
||||
baseClasses: string[]
|
||||
inputs: INodeParams[]
|
||||
outputs: INodeOutputsValue[]
|
||||
|
||||
constructor() {
|
||||
this.label = 'Custom Document Loader'
|
||||
this.name = 'customDocumentLoader'
|
||||
this.version = 1.0
|
||||
this.type = 'Document'
|
||||
this.icon = 'customDocLoader.svg'
|
||||
this.category = 'Document Loaders'
|
||||
this.badge = 'NEW'
|
||||
this.description = `Custom function for loading documents`
|
||||
this.baseClasses = [this.type]
|
||||
this.inputs = [
|
||||
{
|
||||
label: 'Input Variables',
|
||||
name: 'functionInputVariables',
|
||||
description: 'Input variables can be used in the function with prefix $. For example: $var',
|
||||
type: 'json',
|
||||
optional: true,
|
||||
acceptVariable: true,
|
||||
list: true
|
||||
},
|
||||
{
|
||||
label: 'Javascript Function',
|
||||
name: 'javascriptFunction',
|
||||
type: 'code',
|
||||
description: `Must return an array of document objects containing metadata and pageContent if "Document" is selected in the output. If "Text" is selected in the output, it must return a string.`,
|
||||
placeholder: `return [
|
||||
{
|
||||
pageContent: 'Document Content',
|
||||
metadata: {
|
||||
title: 'Document Title',
|
||||
}
|
||||
}
|
||||
]`
|
||||
}
|
||||
]
|
||||
this.outputs = [
|
||||
{
|
||||
label: 'Document',
|
||||
name: 'document',
|
||||
description: 'Array of document objects containing metadata and pageContent',
|
||||
baseClasses: [...this.baseClasses, 'json']
|
||||
},
|
||||
{
|
||||
label: 'Text',
|
||||
name: 'text',
|
||||
description: 'Concatenated string from pageContent of documents',
|
||||
baseClasses: ['string', 'json']
|
||||
}
|
||||
]
|
||||
}
|
||||
|
||||
async init(nodeData: INodeData, input: string, options: ICommonObject): Promise<any> {
|
||||
const output = nodeData.outputs?.output as string
|
||||
const javascriptFunction = nodeData.inputs?.javascriptFunction as string
|
||||
const functionInputVariablesRaw = nodeData.inputs?.functionInputVariables
|
||||
const appDataSource = options.appDataSource as DataSource
|
||||
const databaseEntities = options.databaseEntities as IDatabaseEntity
|
||||
|
||||
const variables = await getVars(appDataSource, databaseEntities, nodeData)
|
||||
const flow = {
|
||||
chatflowId: options.chatflowid,
|
||||
sessionId: options.sessionId,
|
||||
chatId: options.chatId,
|
||||
input
|
||||
}
|
||||
|
||||
let inputVars: ICommonObject = {}
|
||||
if (functionInputVariablesRaw) {
|
||||
try {
|
||||
inputVars =
|
||||
typeof functionInputVariablesRaw === 'object' ? functionInputVariablesRaw : JSON.parse(functionInputVariablesRaw)
|
||||
} catch (exception) {
|
||||
throw new Error('Invalid JSON in the Custom Document Loader Input Variables: ' + exception)
|
||||
}
|
||||
}
|
||||
|
||||
// Some values might be a stringified JSON, parse it
|
||||
for (const key in inputVars) {
|
||||
let value = inputVars[key]
|
||||
if (typeof value === 'string') {
|
||||
value = handleEscapeCharacters(value, true)
|
||||
if (value.startsWith('{') && value.endsWith('}')) {
|
||||
try {
|
||||
value = JSON.parse(value)
|
||||
} catch (e) {
|
||||
// ignore
|
||||
}
|
||||
}
|
||||
inputVars[key] = value
|
||||
}
|
||||
}
|
||||
|
||||
let sandbox: any = { $input: input }
|
||||
sandbox['$vars'] = prepareSandboxVars(variables)
|
||||
sandbox['$flow'] = flow
|
||||
|
||||
if (Object.keys(inputVars).length) {
|
||||
for (const item in inputVars) {
|
||||
sandbox[`$${item}`] = inputVars[item]
|
||||
}
|
||||
}
|
||||
|
||||
const builtinDeps = process.env.TOOL_FUNCTION_BUILTIN_DEP
|
||||
? defaultAllowBuiltInDep.concat(process.env.TOOL_FUNCTION_BUILTIN_DEP.split(','))
|
||||
: defaultAllowBuiltInDep
|
||||
const externalDeps = process.env.TOOL_FUNCTION_EXTERNAL_DEP ? process.env.TOOL_FUNCTION_EXTERNAL_DEP.split(',') : []
|
||||
const deps = availableDependencies.concat(externalDeps)
|
||||
|
||||
const nodeVMOptions = {
|
||||
console: 'inherit',
|
||||
sandbox,
|
||||
require: {
|
||||
external: { modules: deps },
|
||||
builtin: builtinDeps
|
||||
}
|
||||
} as any
|
||||
|
||||
const vm = new NodeVM(nodeVMOptions)
|
||||
try {
|
||||
const response = await vm.run(`module.exports = async function() {${javascriptFunction}}()`, __dirname)
|
||||
|
||||
if (output === 'document' && Array.isArray(response)) {
|
||||
if (response.length === 0) return response
|
||||
if (
|
||||
response[0].pageContent &&
|
||||
typeof response[0].pageContent === 'string' &&
|
||||
response[0].metadata &&
|
||||
typeof response[0].metadata === 'object'
|
||||
)
|
||||
return response
|
||||
throw new Error('Document object must contain pageContent and metadata')
|
||||
}
|
||||
|
||||
if (output === 'text' && typeof response === 'string') {
|
||||
return handleEscapeCharacters(response, false)
|
||||
}
|
||||
|
||||
return response
|
||||
} catch (e) {
|
||||
throw new Error(e)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
module.exports = { nodeClass: CustomDocumentLoader_DocumentLoaders }
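The Custom Document Loader above runs the supplied code inside a vm2 NodeVM and, when its Document output is selected, expects the function to return an array of objects with a string pageContent and an object metadata (otherwise it throws 'Document object must contain pageContent and metadata'). Below is a minimal, hypothetical body for the Javascript Function field; it assumes an Input Variable named records (a JSON array) has been configured on the node, which the sandbox exposes as $records.

```typescript
// Hypothetical "Javascript Function" body — $records is an assumed Input Variable,
// injected by the sandbox, not something defined by the node itself.
return $records.map((record, index) => ({
    // pageContent must be a string and metadata an object for the Document output
    pageContent: String(record.text),
    metadata: {
        title: record.title,
        chunkIndex: index
    }
}))
```

Selecting the Text output instead requires the function to return a plain string.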
@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-writing"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M20 17v-12c0 -1.121 -.879 -2 -2 -2s-2 .879 -2 2v12l2 2l2 -2z" /><path d="M16 7h4" /><path d="M18 19h-13a2 2 0 1 1 0 -4h4a2 2 0 1 0 0 -4h-3" /></svg>
Size: 465 B
@@ -0,0 +1,95 @@
+import { ICommonObject, IDatabaseEntity, INode, INodeData, INodeOptionsValue, INodeOutputsValue, INodeParams } from '../../../src/Interface'
+import { DataSource } from 'typeorm'
+import { Document } from '@langchain/core/documents'
+
+class DocStore_DocumentLoaders implements INode {
+    label: string
+    name: string
+    version: number
+    description: string
+    type: string
+    icon: string
+    category: string
+    baseClasses: string[]
+    inputs: INodeParams[]
+    outputs: INodeOutputsValue[]
+    badge: string
+
+    constructor() {
+        this.label = 'Document Store'
+        this.name = 'documentStore'
+        this.version = 1.0
+        this.type = 'Document'
+        this.icon = 'dstore.svg'
+        this.badge = 'NEW'
+        this.category = 'Document Loaders'
+        this.description = `Load data from pre-configured document stores`
+        this.baseClasses = [this.type]
+        this.inputs = [
+            {
+                label: 'Select Store',
+                name: 'selectedStore',
+                type: 'asyncOptions',
+                loadMethod: 'listStores'
+            }
+        ]
+        this.outputs = [
+            {
+                label: 'Document',
+                name: 'document',
+                description: 'Array of document objects containing metadata and pageContent',
+                baseClasses: [...this.baseClasses, 'json']
+            },
+            {
+                label: 'Text',
+                name: 'text',
+                description: 'Concatenated string from pageContent of documents',
+                baseClasses: ['string', 'json']
+            }
+        ]
+    }
+
+    //@ts-ignore
+    loadMethods = {
+        async listStores(_: INodeData, options: ICommonObject): Promise<INodeOptionsValue[]> {
+            const returnData: INodeOptionsValue[] = []
+
+            const appDataSource = options.appDataSource as DataSource
+            const databaseEntities = options.databaseEntities as IDatabaseEntity
+
+            if (appDataSource === undefined || !appDataSource) {
+                return returnData
+            }
+
+            const stores = await appDataSource.getRepository(databaseEntities['DocumentStore']).find()
+            for (const store of stores) {
+                if (store.status === 'SYNC') {
+                    const obj = {
+                        name: store.id,
+                        label: store.name,
+                        description: store.description
+                    }
+                    returnData.push(obj)
+                }
+            }
+            return returnData
+        }
+    }
+
+    async init(nodeData: INodeData, _: string, options: ICommonObject): Promise<any> {
+        const selectedStore = nodeData.inputs?.selectedStore as string
+        const appDataSource = options.appDataSource as DataSource
+        const databaseEntities = options.databaseEntities as IDatabaseEntity
+        const chunks = await appDataSource
+            .getRepository(databaseEntities['DocumentStoreFileChunk'])
+            .find({ where: { storeId: selectedStore } })
+
+        const finalDocs = []
+        for (const chunk of chunks) {
+            finalDocs.push(new Document({ pageContent: chunk.pageContent, metadata: JSON.parse(chunk.metadata) }))
+        }
+        return finalDocs
+    }
+}
+
+module.exports = { nodeClass: DocStore_DocumentLoaders }
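Both new nodes expose a Document output and a Text output. The init above only assembles the Document array from the stored chunks; the Text output is described as 'Concatenated string from pageContent of documents', and the conversion itself happens outside this hunk. A rough illustration of that relationship:

```typescript
// Illustration only: how a "Text" output relates to the "Document" output.
const toTextOutput = (docs: { pageContent: string }[]): string =>
    docs.map((doc) => doc.pageContent).join('\n')
```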
@@ -0,0 +1,15 @@
+<svg
+    xmlns="http://www.w3.org/2000/svg"
+    width="24"
+    height="24"
+    viewBox="0 0 24 24"
+    fill="none"
+    stroke="currentColor"
+    stroke-width="2"
+    stroke-linecap="round"
+    stroke-linejoin="round"
+>
+    <path d="M12 4l-8 4l8 4l8 -4l-8 -4" />
+    <path d="M4 12l8 4l8 -4" />
+    <path d="M4 16l8 4l8 -4" />
+</svg>
Size: 305 B
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { DocxLoader } from 'langchain/document_loaders/fs/docx'
|
||||
import { getFileFromStorage } from '../../../src'
|
||||
|
|
@ -37,9 +38,21 @@ class Docx_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -50,8 +63,14 @@ class Docx_DocumentLoaders implements INode {
|
|||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const docxFileBase64 = nodeData.inputs?.docxFile as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let alldocs = []
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
if (docxFileBase64.startsWith('FILE-STORAGE::')) {
|
||||
|
|
@ -69,11 +88,9 @@ class Docx_DocumentLoaders implements INode {
|
|||
const loader = new DocxLoader(blob)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
|
@ -91,32 +108,38 @@ class Docx_DocumentLoaders implements INode {
|
|||
const loader = new DocxLoader(blob)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return alldocs
|
||||
return docs
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { getCredentialData, getCredentialParam } from '../../../src'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { FigmaFileLoader, FigmaLoaderParams } from 'langchain/document_loaders/web/figma'
|
||||
|
|
@ -60,9 +61,21 @@ class Figma_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -74,6 +87,12 @@ class Figma_DocumentLoaders implements INode {
|
|||
const fileKey = nodeData.inputs?.fileKey as string
|
||||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const accessToken = getCredentialParam('accessToken', credentialData, nodeData)
|
||||
|
|
@ -86,19 +105,30 @@ class Figma_DocumentLoaders implements INode {
|
|||
|
||||
const loader = new FigmaFileLoader(figmaOptions)
|
||||
|
||||
const docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
let docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
return docs.map((doc) => {
|
||||
return {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
})
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { TextLoader } from 'langchain/document_loaders/fs/text'
|
||||
|
|
@ -65,9 +66,21 @@ class Folder_DocumentLoaders implements INode {
|
|||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -80,6 +93,12 @@ class Folder_DocumentLoaders implements INode {
|
|||
const metadata = nodeData.inputs?.metadata
|
||||
const recursive = nodeData.inputs?.recursive as boolean
|
||||
const pdfUsage = nodeData.inputs?.pdfUsage
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const loader = new DirectoryLoader(
|
||||
folderPath,
|
||||
|
|
@ -141,18 +160,26 @@ class Folder_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { GitbookLoader } from 'langchain/document_loaders/web/gitbook'
|
||||
|
|
@ -44,9 +45,21 @@ class Gitbook_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -57,22 +70,39 @@ class Gitbook_DocumentLoaders implements INode {
|
|||
const shouldLoadAllPaths = nodeData.inputs?.shouldLoadAllPaths as boolean
|
||||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const loader = shouldLoadAllPaths ? new GitbookLoader(webPath, { shouldLoadAllPaths }) : new GitbookLoader(webPath)
|
||||
|
||||
const docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
let docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
return docs.map((doc) => {
|
||||
return {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
})
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { GithubRepoLoader, GithubRepoLoaderParams } from 'langchain/document_loaders/web/github'
|
||||
|
|
@ -86,9 +87,21 @@ class Github_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -104,6 +117,12 @@ class Github_DocumentLoaders implements INode {
|
|||
const maxConcurrency = nodeData.inputs?.maxConcurrency as string
|
||||
const maxRetries = nodeData.inputs?.maxRetries as string
|
||||
const ignorePath = nodeData.inputs?.ignorePath as string
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const accessToken = getCredentialParam('accessToken', credentialData, nodeData)
|
||||
|
|
@ -120,19 +139,30 @@ class Github_DocumentLoaders implements INode {
|
|||
if (ignorePath) githubOptions.ignorePaths = JSON.parse(ignorePath)
|
||||
|
||||
const loader = new GithubRepoLoader(repoLink, githubOptions)
|
||||
const docs = textSplitter ? await loader.loadAndSplit(textSplitter) : await loader.load()
|
||||
let docs = textSplitter ? await loader.loadAndSplit(textSplitter) : await loader.load()
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
return docs.map((doc) => {
|
||||
return {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
})
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
|
||||
|
|
|
|||
|
|
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { JSONLoader } from 'langchain/document_loaders/fs/json'
|
||||
import { getFileFromStorage } from '../../../src'
|
||||
|
|
@ -45,9 +46,21 @@ class Json_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -59,6 +72,12 @@ class Json_DocumentLoaders implements INode {
|
|||
const jsonFileBase64 = nodeData.inputs?.jsonFile as string
|
||||
const pointersName = nodeData.inputs?.pointersName as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let pointers: string[] = []
|
||||
if (pointersName) {
|
||||
|
|
@ -66,7 +85,7 @@ class Json_DocumentLoaders implements INode {
|
|||
pointers = outputString.split(',').map((pointer) => '/' + pointer.trim())
|
||||
}
|
||||
|
||||
let alldocs = []
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
//FILE-STORAGE::["CONTRIBUTING.md","LICENSE.md","README.md"]
|
||||
|
|
@ -85,11 +104,9 @@ class Json_DocumentLoaders implements INode {
|
|||
const loader = new JSONLoader(blob, pointers.length != 0 ? pointers : undefined)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
|
@ -107,32 +124,38 @@ class Json_DocumentLoaders implements INode {
|
|||
const loader = new JSONLoader(blob, pointers.length != 0 ? pointers : undefined)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return alldocs
|
||||
return docs
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
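The CSV, Docx and JSON loaders above all branch on a FILE-STORAGE:: prefix: when present, the input is not the uploaded base64 payload but a reference to files kept in Flowise storage (see the //FILE-STORAGE::["CONTRIBUTING.md","LICENSE.md","README.md"] comment in the JSON loader), which are then fetched with getFileFromStorage. The sketch below only illustrates how such a value could be parsed; the storage call is omitted because its exact signature is not shown in this diff, and parseStoredFileNames is an illustrative name.

```typescript
// Sketch of the FILE-STORAGE:: convention used by the file-based loaders.
// Example input value: 'FILE-STORAGE::["CONTRIBUTING.md","LICENSE.md","README.md"]'
const parseStoredFileNames = (fileInput: string): string[] => {
    if (!fileInput.startsWith('FILE-STORAGE::')) {
        return [] // not a storage reference; the value is the uploaded payload itself
    }
    const fileName = fileInput.replace('FILE-STORAGE::', '')
    // either a JSON array of stored file names or a single stored file name
    return fileName.startsWith('[') && fileName.endsWith(']') ? JSON.parse(fileName) : [fileName]
}
```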
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { JSONLinesLoader } from 'langchain/document_loaders/fs/json'
|
||||
import { getFileFromStorage } from '../../../src'
|
||||
|
|
@ -44,9 +45,21 @@ class Jsonlines_DocumentLoaders implements INode {
|
|||
optional: false
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, seperated by comma',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -58,8 +71,14 @@ class Jsonlines_DocumentLoaders implements INode {
|
|||
const jsonLinesFileBase64 = nodeData.inputs?.jsonlinesFile as string
|
||||
const pointerName = nodeData.inputs?.pointerName as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let alldocs = []
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
let pointer = '/' + pointerName.trim()
|
||||
|
|
@ -79,11 +98,9 @@ class Jsonlines_DocumentLoaders implements INode {
|
|||
const loader = new JSONLinesLoader(blob, pointer)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
|
@ -101,32 +118,38 @@ class Jsonlines_DocumentLoaders implements INode {
|
|||
const loader = new JSONLinesLoader(blob, pointer)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return alldocs
|
||||
return docs
|
||||
}
|
||||
}
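For the JSON Lines variant the single `pointerName` becomes one JSON pointer rather than a list. A small sketch of that step with LangChain's `JSONLinesLoader`, using inline `.jsonl` content as a stand-in for the uploaded file:

import { JSONLinesLoader } from 'langchain/document_loaders/fs/json'

const run = async () => {
    // Each line of a .jsonl file is one JSON object; the pointer selects a single field.
    const blob = new Blob(['{"text":"hello","lang":"en"}\n{"text":"world","lang":"en"}\n'])
    const pointer = '/' + 'text'.trim() // same shape as: let pointer = '/' + pointerName.trim()
    const loader = new JSONLinesLoader(blob, pointer)
    const docs = await loader.load() // two documents with pageContent "hello" and "world"
    return docs
}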
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { NotionAPILoader, NotionAPILoaderOptions } from 'langchain/document_loaders/web/notionapi'
|
||||
import { getCredentialData, getCredentialParam } from '../../../src'
|
||||
|
|
@ -44,9 +45,21 @@ class NotionDB_DocumentLoaders implements INode {
|
|||
description: 'If your URL looks like - https://www.notion.so/abcdefh?v=long_hash_2, then abcdefh is the database ID'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -57,6 +70,12 @@ class NotionDB_DocumentLoaders implements INode {
|
|||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const databaseId = nodeData.inputs?.databaseId as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const notionIntegrationToken = getCredentialParam('notionIntegrationToken', credentialData, nodeData)
|
||||
|
|
@ -74,7 +93,7 @@ class NotionDB_DocumentLoaders implements INode {
|
|||
}
|
||||
const loader = new NotionAPILoader(obj)
|
||||
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
if (textSplitter) {
|
||||
docs = await loader.loadAndSplit(textSplitter)
|
||||
} else {
|
||||
|
|
@ -83,18 +102,26 @@ class NotionDB_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,4 +1,5 @@
|
|||
import { INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { NotionLoader } from 'langchain/document_loaders/fs/notion'
|
||||
|
||||
|
|
@ -37,9 +38,21 @@ class NotionFolder_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -50,9 +63,15 @@ class NotionFolder_DocumentLoaders implements INode {
|
|||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const notionFolder = nodeData.inputs?.notionFolder as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const loader = new NotionLoader(notionFolder)
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
|
||||
if (textSplitter) {
|
||||
docs = await loader.loadAndSplit(textSplitter)
|
||||
|
|
@ -62,18 +81,26 @@ class NotionFolder_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { NotionAPILoader, NotionAPILoaderOptions } from 'langchain/document_loaders/web/notionapi'
|
||||
import { getCredentialData, getCredentialParam } from '../../../src'
|
||||
|
|
@ -45,9 +46,21 @@ class NotionPage_DocumentLoaders implements INode {
|
|||
'The last 32 char hex in the url path. For example: https://www.notion.so/skarard/LangChain-Notion-API-b34ca03f219c4420a6046fc4bdfdf7b4, b34ca03f219c4420a6046fc4bdfdf7b4 is the Page ID'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -58,6 +71,12 @@ class NotionPage_DocumentLoaders implements INode {
|
|||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const pageId = nodeData.inputs?.pageId as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const notionIntegrationToken = getCredentialParam('notionIntegrationToken', credentialData, nodeData)
|
||||
|
|
@ -71,7 +90,7 @@ class NotionPage_DocumentLoaders implements INode {
|
|||
}
|
||||
const loader = new NotionAPILoader(obj)
|
||||
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
if (textSplitter) {
|
||||
docs = await loader.loadAndSplit(textSplitter)
|
||||
} else {
|
||||
|
|
@ -80,18 +99,26 @@ class NotionPage_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { IDocument, ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { PDFLoader } from 'langchain/document_loaders/fs/pdf'
|
||||
import { getFileFromStorage } from '../../../src'
|
||||
|
|
@ -60,9 +61,21 @@ class Pdf_DocumentLoaders implements INode {
|
|||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -75,8 +88,14 @@ class Pdf_DocumentLoaders implements INode {
|
|||
const usage = nodeData.inputs?.usage as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const legacyBuild = nodeData.inputs?.legacyBuild as boolean
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let alldocs: any[] = []
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
//FILE-STORAGE::["CONTRIBUTING.md","LICENSE.md","README.md"]
|
||||
|
|
@ -92,7 +111,7 @@ class Pdf_DocumentLoaders implements INode {
|
|||
for (const file of files) {
|
||||
const fileData = await getFileFromStorage(file, chatflowid)
|
||||
const bf = Buffer.from(fileData)
|
||||
await this.extractDocs(usage, bf, legacyBuild, textSplitter, alldocs)
|
||||
await this.extractDocs(usage, bf, legacyBuild, textSplitter, docs)
|
||||
}
|
||||
} else {
|
||||
if (pdfFileBase64.startsWith('[') && pdfFileBase64.endsWith(']')) {
|
||||
|
|
@ -105,30 +124,38 @@ class Pdf_DocumentLoaders implements INode {
|
|||
const splitDataURI = file.split(',')
|
||||
splitDataURI.pop()
|
||||
const bf = Buffer.from(splitDataURI.pop() || '', 'base64')
|
||||
await this.extractDocs(usage, bf, legacyBuild, textSplitter, alldocs)
|
||||
await this.extractDocs(usage, bf, legacyBuild, textSplitter, docs)
|
||||
}
|
||||
}
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return alldocs
|
||||
return docs
|
||||
}
|
||||
|
||||
private async extractDocs(usage: string, bf: Buffer, legacyBuild: boolean, textSplitter: TextSplitter, alldocs: any[]) {
|
||||
private async extractDocs(usage: string, bf: Buffer, legacyBuild: boolean, textSplitter: TextSplitter, docs: IDocument[]) {
|
||||
if (usage === 'perFile') {
|
||||
const loader = new PDFLoader(new Blob([bf]), {
|
||||
splitPages: false,
|
||||
|
|
@ -137,11 +164,9 @@ class Pdf_DocumentLoaders implements INode {
|
|||
legacyBuild ? import('pdfjs-dist/legacy/build/pdf.js') : import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js')
|
||||
})
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
} else {
|
||||
const loader = new PDFLoader(new Blob([bf]), {
|
||||
|
|
@ -150,11 +175,9 @@ class Pdf_DocumentLoaders implements INode {
|
|||
legacyBuild ? import('pdfjs-dist/legacy/build/pdf.js') : import('pdf-parse/lib/pdf.js/v1.10.100/build/pdf.js')
|
||||
})
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
}
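The PDF loader's `extractDocs` now appends into the shared `docs` array instead of a local `alldocs`. A compact sketch of that accumulator shape, using LangChain's `PDFLoader` and leaving out the `legacyBuild` pdfjs switch shown above:

import { PDFLoader } from 'langchain/document_loaders/fs/pdf'
import { TextSplitter } from 'langchain/text_splitter'
import { Document } from '@langchain/core/documents'

// Each call loads one PDF buffer and pushes its documents into the caller's array.
async function extractIntoShared(bf: Buffer, textSplitter: TextSplitter | undefined, docs: Document[]) {
    const loader = new PDFLoader(new Blob([bf]), { splitPages: false })
    if (textSplitter) {
        docs.push(...(await loader.loadAndSplit(textSplitter)))
    } else {
        docs.push(...(await loader.load()))
    }
}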
@ -1,4 +1,5 @@
|
|||
import { INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { IDocument, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { Document } from '@langchain/core/documents'
|
||||
import { handleEscapeCharacters } from '../../../src'
|
||||
|
|
@ -40,9 +41,21 @@ class PlainText_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -68,42 +81,54 @@ class PlainText_DocumentLoaders implements INode {
|
|||
const text = nodeData.inputs?.text as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const output = nodeData.outputs?.output as string
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let alldocs: Document<Record<string, any>>[] = []
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let docs: IDocument[] = []
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await textSplitter.createDocuments([text])
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await textSplitter.createDocuments([text])))
|
||||
} else {
|
||||
alldocs.push(
|
||||
docs.push(
|
||||
new Document({
|
||||
pageContent: text
|
||||
})
|
||||
)
|
||||
}
|
||||
|
||||
let finaldocs: Document<Record<string, any>>[] = []
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
finaldocs = alldocs
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
if (output === 'document') {
|
||||
return finaldocs
|
||||
return docs
|
||||
} else {
|
||||
let finaltext = ''
|
||||
for (const doc of finaldocs) {
|
||||
for (const doc of docs) {
|
||||
finaltext += `${doc.pageContent}\n`
|
||||
}
|
||||
return handleEscapeCharacters(finaltext, false)
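The Plain Text loader's two outputs come down to returning either the document array or the concatenated page contents. A minimal sketch of that branch, with `handleEscapeCharacters` stubbed since it is a Flowise internal helper:

import { Document } from '@langchain/core/documents'

// Stand-in for the helper exported from packages/components/src; behaviour assumed for illustration.
const handleEscapeCharacters = (text: string, escape: boolean) => (escape ? JSON.stringify(text) : text)

function formatOutput(docs: Document[], output: string) {
    if (output === 'document') return docs
    let finaltext = ''
    for (const doc of docs) {
        finaltext += `${doc.pageContent}\n`
    }
    return handleEscapeCharacters(finaltext, false)
}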
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { Browser, Page, PlaywrightWebBaseLoader, PlaywrightWebBaseLoaderOptions } from 'langchain/document_loaders/web/playwright'
|
||||
import { test } from 'linkifyjs'
|
||||
|
|
@ -53,6 +54,7 @@ class Playwright_DocumentLoaders implements INode {
|
|||
description: 'Scrape relative links from XML sitemap URL'
|
||||
}
|
||||
],
|
||||
default: 'webCrawl',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
|
|
@ -106,9 +108,21 @@ class Playwright_DocumentLoaders implements INode {
|
|||
description: 'CSS selectors like .div or #div'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -123,6 +137,12 @@ class Playwright_DocumentLoaders implements INode {
|
|||
let limit = parseInt(nodeData.inputs?.limit as string)
|
||||
let waitUntilGoToOption = nodeData.inputs?.waitUntilGoToOption as 'load' | 'domcontentloaded' | 'networkidle' | 'commit' | undefined
|
||||
let waitForSelector = nodeData.inputs?.waitForSelector as string
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let url = nodeData.inputs?.url as string
|
||||
url = url.trim()
|
||||
|
|
@ -164,7 +184,7 @@ class Playwright_DocumentLoaders implements INode {
|
|||
}
|
||||
}
|
||||
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
if (relativeLinksMethod) {
|
||||
if (process.env.DEBUG === 'true') options.logger.info(`Start ${relativeLinksMethod}`)
|
||||
// if limit is 0 we don't want it to default to 10 so we check explicitly for null or undefined
|
||||
|
|
@ -195,18 +215,26 @@ class Playwright_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { Browser, Page, PuppeteerWebBaseLoader, PuppeteerWebBaseLoaderOptions } from 'langchain/document_loaders/web/puppeteer'
|
||||
import { test } from 'linkifyjs'
|
||||
|
|
@ -54,6 +55,7 @@ class Puppeteer_DocumentLoaders implements INode {
|
|||
description: 'Scrape relative links from XML sitemap URL'
|
||||
}
|
||||
],
|
||||
default: 'webCrawl',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
|
|
@ -107,9 +109,21 @@ class Puppeteer_DocumentLoaders implements INode {
|
|||
description: 'CSS selectors like .div or #div'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -124,6 +138,12 @@ class Puppeteer_DocumentLoaders implements INode {
|
|||
let limit = parseInt(nodeData.inputs?.limit as string)
|
||||
let waitUntilGoToOption = nodeData.inputs?.waitUntilGoToOption as PuppeteerLifeCycleEvent
|
||||
let waitForSelector = nodeData.inputs?.waitForSelector as string
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let url = nodeData.inputs?.url as string
|
||||
url = url.trim()
|
||||
|
|
@ -165,7 +185,7 @@ class Puppeteer_DocumentLoaders implements INode {
|
|||
}
|
||||
}
|
||||
|
||||
let docs = []
|
||||
let docs: IDocument[] = []
|
||||
if (relativeLinksMethod) {
|
||||
if (process.env.DEBUG === 'true') options.logger.info(`Start ${relativeLinksMethod}`)
|
||||
// if limit is 0 we don't want it to default to 10 so we check explicitly for null or undefined
|
||||
|
|
@ -196,18 +216,26 @@ class Puppeteer_DocumentLoaders implements INode {
|
|||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
let finaldocs = []
|
||||
for (const doc of docs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
return finaldocs
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { ICommonObject, INode, INodeData, INodeOptionsValue, INodeParams } from '../../../src/Interface'
|
||||
import { S3Loader } from 'langchain/document_loaders/web/s3'
|
||||
import {
|
||||
|
|
@ -413,9 +414,21 @@ class S3_DocumentLoaders implements INode {
|
|||
default: '500'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -451,6 +464,12 @@ class S3_DocumentLoaders implements INode {
|
|||
const combineUnderNChars = nodeData.inputs?.combineUnderNChars as number
|
||||
const newAfterNChars = nodeData.inputs?.newAfterNChars as number
|
||||
const maxCharacters = nodeData.inputs?.maxCharacters as number
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let credentials: S3ClientConfig['credentials'] | undefined
|
||||
|
||||
|
|
@ -542,19 +561,25 @@ class S3_DocumentLoaders implements INode {
|
|||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: {
|
||||
...doc.metadata,
|
||||
...parsedMetadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
}
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: {
|
||||
...doc.metadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
}
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
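The S3 loader layers a `sourceIdKey` fallback on top of the same merge-then-omit step that every loader above repeats. Read as one helper, the pattern looks like this (a sketch of the repeated mapping, not code from the PR):

import { omit } from 'lodash'
import { Document } from '@langchain/core/documents'

// parsedMetadata, omitMetadataKeys and sourceIdKey correspond to the values parsed in the loaders above.
function applyMetadata(
    docs: Document[],
    parsedMetadata: Record<string, any>,
    omitMetadataKeys: string[],
    sourceIdKey?: string
) {
    return docs.map((doc) => ({
        ...doc,
        metadata: omit(
            {
                ...doc.metadata,
                ...parsedMetadata,
                ...(sourceIdKey ? { [sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey } : {})
            },
            omitMetadataKeys
        )
    }))
}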
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { SearchApiLoader } from 'langchain/document_loaders/web/searchapi'
|
||||
|
|
@ -54,9 +55,21 @@ class SearchAPI_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -68,6 +81,12 @@ class SearchAPI_DocumentLoaders implements INode {
|
|||
const query = nodeData.inputs?.query as string
|
||||
const customParameters = nodeData.inputs?.customParameters
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
// Fetch the API credentials for this node
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
|
|
@ -87,19 +106,30 @@ class SearchAPI_DocumentLoaders implements INode {
|
|||
const loader = new SearchApiLoader(loaderConfig)
|
||||
|
||||
// Fetch documents, split if a text splitter is provided
|
||||
const docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
let docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
return docs.map((doc) => {
|
||||
return {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
})
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { SerpAPILoader } from 'langchain/document_loaders/web/serpapi'
|
||||
|
|
@ -44,9 +45,21 @@ class SerpAPI_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -57,23 +70,40 @@ class SerpAPI_DocumentLoaders implements INode {
|
|||
const textSplitter = nodeData.inputs?.textSplitter as TextSplitter
|
||||
const query = nodeData.inputs?.query as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const credentialData = await getCredentialData(nodeData.credential ?? '', options)
|
||||
const serpApiKey = getCredentialParam('serpApiKey', credentialData, nodeData)
|
||||
const loader = new SerpAPILoader({ q: query, apiKey: serpApiKey })
|
||||
const docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
let docs = textSplitter ? await loader.loadAndSplit() : await loader.load()
|
||||
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
return docs.map((doc) => {
|
||||
return {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
})
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
return docs
@ -1,7 +1,7 @@
|
|||
import { ICommonObject, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeOutputsValue, INodeParams } from '../../../src/Interface'
|
||||
import { TextSplitter } from 'langchain/text_splitter'
|
||||
import { TextLoader } from 'langchain/document_loaders/fs/text'
|
||||
import { Document } from '@langchain/core/documents'
|
||||
import { getFileFromStorage, handleEscapeCharacters } from '../../../src'
|
||||
|
||||
class Text_DocumentLoaders implements INode {
|
||||
|
|
@ -40,9 +40,21 @@ class Text_DocumentLoaders implements INode {
|
|||
optional: true
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -68,8 +80,14 @@ class Text_DocumentLoaders implements INode {
|
|||
const txtFileBase64 = nodeData.inputs?.txtFile as string
|
||||
const metadata = nodeData.inputs?.metadata
|
||||
const output = nodeData.outputs?.output as string
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let alldocs = []
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
//FILE-STORAGE::["CONTRIBUTING.md","LICENSE.md","README.md"]
|
||||
|
|
@ -88,11 +106,9 @@ class Text_DocumentLoaders implements INode {
|
|||
const loader = new TextLoader(blob)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
} else {
|
||||
|
|
@ -110,37 +126,42 @@ class Text_DocumentLoaders implements INode {
|
|||
const loader = new TextLoader(blob)
|
||||
|
||||
if (textSplitter) {
|
||||
const docs = await loader.loadAndSplit(textSplitter)
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.loadAndSplit(textSplitter)))
|
||||
} else {
|
||||
const docs = await loader.load()
|
||||
alldocs.push(...docs)
|
||||
docs.push(...(await loader.load()))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
let finaldocs: Document<Record<string, any>>[] = []
|
||||
if (metadata) {
|
||||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
for (const doc of alldocs) {
|
||||
const newdoc = {
|
||||
...doc,
|
||||
metadata: {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata
|
||||
}
|
||||
}
|
||||
finaldocs.push(newdoc)
|
||||
}
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
finaldocs = alldocs
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
|
||||
|
||||
if (output === 'document') {
|
||||
return finaldocs
|
||||
return docs
|
||||
} else {
|
||||
let finaltext = ''
|
||||
for (const doc of finaldocs) {
|
||||
for (const doc of docs) {
|
||||
finaltext += `${doc.pageContent}\n`
|
||||
}
|
||||
return handleEscapeCharacters(finaltext, false)
@ -1,4 +1,5 @@
|
|||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import { omit } from 'lodash'
|
||||
import { ICommonObject, IDocument, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import {
|
||||
UnstructuredLoaderOptions,
|
||||
UnstructuredLoaderStrategy,
|
||||
|
|
@ -400,9 +401,21 @@ class UnstructuredFile_DocumentLoaders implements INode {
|
|||
default: '500'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -429,6 +442,12 @@ class UnstructuredFile_DocumentLoaders implements INode {
|
|||
const combineUnderNChars = nodeData.inputs?.combineUnderNChars as number
|
||||
const newAfterNChars = nodeData.inputs?.newAfterNChars as number
|
||||
const maxCharacters = nodeData.inputs?.maxCharacters as number
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
const fileBase64 = nodeData.inputs?.fileObject as string
|
||||
|
||||
const obj: UnstructuredLoaderOptions = {
|
||||
|
|
@ -452,7 +471,7 @@ class UnstructuredFile_DocumentLoaders implements INode {
|
|||
const unstructuredAPIKey = getCredentialParam('unstructuredAPIKey', credentialData, nodeData)
|
||||
if (unstructuredAPIKey) obj.apiKey = unstructuredAPIKey
|
||||
|
||||
let docs: any[] = []
|
||||
let docs: IDocument[] = []
|
||||
let files: string[] = []
|
||||
|
||||
if (fileBase64) {
|
||||
|
|
@ -499,19 +518,25 @@ class UnstructuredFile_DocumentLoaders implements INode {
|
|||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: {
|
||||
...doc.metadata,
|
||||
...parsedMetadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
}
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: {
|
||||
...doc.metadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
}
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
@ -1,3 +1,4 @@
|
|||
import { omit } from 'lodash'
|
||||
import { ICommonObject, INode, INodeData, INodeParams } from '../../../src/Interface'
|
||||
import {
|
||||
UnstructuredDirectoryLoader,
|
||||
|
|
@ -379,9 +380,21 @@ class UnstructuredFolder_DocumentLoaders implements INode {
|
|||
default: '500'
|
||||
},
|
||||
{
|
||||
label: 'Metadata',
|
||||
label: 'Additional Metadata',
|
||||
name: 'metadata',
|
||||
type: 'json',
|
||||
description: 'Additional metadata to be added to the extracted documents',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
},
|
||||
{
|
||||
label: 'Omit Metadata Keys',
|
||||
name: 'omitMetadataKeys',
|
||||
type: 'string',
|
||||
rows: 4,
|
||||
description:
|
||||
'Each document loader comes with a default set of metadata keys that are extracted from the document. You can use this field to omit some of the default metadata keys. The value should be a list of keys, separated by commas',
|
||||
placeholder: 'key1, key2, key3.nestedKey1',
|
||||
optional: true,
|
||||
additionalParams: true
|
||||
}
|
||||
|
|
@ -408,6 +421,12 @@ class UnstructuredFolder_DocumentLoaders implements INode {
|
|||
const combineUnderNChars = nodeData.inputs?.combineUnderNChars as number
|
||||
const newAfterNChars = nodeData.inputs?.newAfterNChars as number
|
||||
const maxCharacters = nodeData.inputs?.maxCharacters as number
|
||||
const _omitMetadataKeys = nodeData.inputs?.omitMetadataKeys as string
|
||||
|
||||
let omitMetadataKeys: string[] = []
|
||||
if (_omitMetadataKeys) {
|
||||
omitMetadataKeys = _omitMetadataKeys.split(',').map((key) => key.trim())
|
||||
}
|
||||
|
||||
const obj: UnstructuredLoaderOptions = {
|
||||
apiUrl: unstructuredAPIUrl,
|
||||
|
|
@ -437,19 +456,25 @@ class UnstructuredFolder_DocumentLoaders implements INode {
|
|||
const parsedMetadata = typeof metadata === 'object' ? metadata : JSON.parse(metadata)
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: {
|
||||
...doc.metadata,
|
||||
...parsedMetadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
}
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
...parsedMetadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
} else {
|
||||
docs = docs.map((doc) => ({
|
||||
...doc,
|
||||
metadata: {
|
||||
...doc.metadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
}
|
||||
metadata: omit(
|
||||
{
|
||||
...doc.metadata,
|
||||
[sourceIdKey]: doc.metadata[sourceIdKey] || sourceIdKey
|
||||
},
|
||||
omitMetadataKeys
|
||||
)
|
||||
}))
|
||||
}
@ -27,6 +27,7 @@ class CharacterTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Size',
|
||||
name: 'chunkSize',
|
||||
type: 'number',
|
||||
description: 'Number of characters in each chunk. Default is 1000.',
|
||||
default: 1000,
|
||||
optional: true
|
||||
},
|
||||
|
|
@ -34,6 +35,8 @@ class CharacterTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Overlap',
|
||||
name: 'chunkOverlap',
|
||||
type: 'number',
|
||||
description: 'Number of characters to overlap between chunks. Default is 200.',
|
||||
default: 200,
|
||||
optional: true
|
||||
},
|
||||
{
|
||||
|
|
|
|||
|
|
@ -101,6 +101,7 @@ class CodeTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Size',
|
||||
name: 'chunkSize',
|
||||
type: 'number',
|
||||
description: 'Number of characters in each chunk. Default is 1000.',
|
||||
default: 1000,
|
||||
optional: true
|
||||
},
|
||||
|
|
@ -108,6 +109,8 @@ class CodeTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Overlap',
|
||||
name: 'chunkOverlap',
|
||||
type: 'number',
|
||||
description: 'Number of characters to overlap between chunks. Default is 200.',
|
||||
default: 200,
|
||||
optional: true
|
||||
}
|
||||
]
|
||||
|
|
|
|||
|
|
@ -28,6 +28,7 @@ class HtmlToMarkdownTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Size',
|
||||
name: 'chunkSize',
|
||||
type: 'number',
|
||||
description: 'Number of characters in each chunk. Default is 1000.',
|
||||
default: 1000,
|
||||
optional: true
|
||||
},
|
||||
|
|
@ -35,6 +36,8 @@ class HtmlToMarkdownTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Overlap',
|
||||
name: 'chunkOverlap',
|
||||
type: 'number',
|
||||
description: 'Number of characters to overlap between chunks. Default is 200.',
|
||||
default: 200,
|
||||
optional: true
|
||||
}
|
||||
]
|
||||
|
|
|
|||
|
|
@ -27,6 +27,7 @@ class MarkdownTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Size',
|
||||
name: 'chunkSize',
|
||||
type: 'number',
|
||||
description: 'Number of characters in each chunk. Default is 1000.',
|
||||
default: 1000,
|
||||
optional: true
|
||||
},
|
||||
|
|
@ -34,6 +35,8 @@ class MarkdownTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Overlap',
|
||||
name: 'chunkOverlap',
|
||||
type: 'number',
|
||||
description: 'Number of characters to overlap between chunks. Default is 200.',
|
||||
default: 200,
|
||||
optional: true
|
||||
}
|
||||
]
|
||||
|
|
|
|||
|
|
@ -27,6 +27,7 @@ class RecursiveCharacterTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Size',
|
||||
name: 'chunkSize',
|
||||
type: 'number',
|
||||
description: 'Number of characters in each chunk. Default is 1000.',
|
||||
default: 1000,
|
||||
optional: true
|
||||
},
|
||||
|
|
@ -34,6 +35,8 @@ class RecursiveCharacterTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Overlap',
|
||||
name: 'chunkOverlap',
|
||||
type: 'number',
|
||||
description: 'Number of characters to overlap between chunks. Default is 200.',
|
||||
default: 200,
|
||||
optional: true
|
||||
},
|
||||
{
|
||||
|
|
|
|||
|
|
@ -56,6 +56,7 @@ class TokenTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Size',
|
||||
name: 'chunkSize',
|
||||
type: 'number',
|
||||
description: 'Number of characters in each chunk. Default is 1000.',
|
||||
default: 1000,
|
||||
optional: true
|
||||
},
|
||||
|
|
@ -63,6 +64,8 @@ class TokenTextSplitter_TextSplitters implements INode {
|
|||
label: 'Chunk Overlap',
|
||||
name: 'chunkOverlap',
|
||||
type: 'number',
|
||||
description: 'Number of characters to overlap between chunks. Default is 200.',
|
||||
default: 200,
|
||||
optional: true
|
||||
}
|
||||
]
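The splitter changes above only add descriptions for the two knobs every splitter shares. A quick usage sketch with LangChain's `RecursiveCharacterTextSplitter` and the documented defaults:

import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter'

const splitExample = async () => {
    // 1000 characters per chunk, 200 characters of overlap, matching the new field descriptions.
    const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000, chunkOverlap: 200 })
    return splitter.createDocuments(['A long passage of text to be split into overlapping chunks...'])
}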
@@ -176,6 +176,11 @@ export type MessageContentImageUrl = {
     }
 }

+export interface IDocument<Metadata extends Record<string, any> = Record<string, any>> {
+    pageContent: string
+    metadata: Metadata
+}
+
 /**
  * Classes
  */
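`IDocument` mirrors the `pageContent`/`metadata` shape of LangChain's `Document`, which is why the loaders can keep returning LangChain documents while the components package types them as `IDocument`. A two-line compatibility sketch (the relative import path is assumed):

import { Document } from '@langchain/core/documents'
import { IDocument } from './Interface'

// Structural typing: a LangChain Document already satisfies IDocument.
const doc: IDocument = new Document({ pageContent: 'hello', metadata: { source: 'sketch.txt' } })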
@@ -135,6 +135,21 @@ export const removeFilesFromStorage = async (...paths: string[]) => {
     }
 }

+export const removeSpecificFileFromStorage = async (...paths: string[]) => {
+    const storageType = getStorageType()
+    if (storageType === 's3') {
+        let Key = paths.reduce((acc, cur) => acc + '/' + cur, '')
+        // remove the first '/' if it exists
+        if (Key.startsWith('/')) {
+            Key = Key.substring(1)
+        }
+        await _deleteS3Folder(Key)
+    } else {
+        const file = path.join(getStoragePath(), ...paths)
+        fs.unlinkSync(file)
+    }
+}
+
 export const removeFolderFromStorage = async (...paths: string[]) => {
     const storageType = getStorageType()
     if (storageType === 's3') {
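A usage sketch for the new helper, assuming it is re-exported from the components package alongside the existing storage utilities; the chatflow id and file name are made up:

import { removeSpecificFileFromStorage } from 'flowise-components'

const cleanup = async () => {
    // Path segments are joined in order: locally this resolves to `${storagePath}/<chatflowid>/example.pdf`,
    // on S3 it becomes the key `<chatflowid>/example.pdf`.
    await removeSpecificFileFromStorage('d0f2c1aa-chatflow-id', 'example.pdf')
}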
@@ -0,0 +1,165 @@
import { DocumentStore } from './database/entities/DocumentStore'

export enum DocumentStoreStatus {
    EMPTY_SYNC = 'EMPTY',
    SYNC = 'SYNC',
    SYNCING = 'SYNCING',
    STALE = 'STALE',
    NEW = 'NEW'
}

export interface IDocumentStore {
    id: string
    name: string
    description: string
    loaders: string // JSON string
    whereUsed: string // JSON string
    updatedDate: Date
    createdDate: Date
    status: DocumentStoreStatus
}

export interface IDocumentStoreFileChunk {
    id: string
    chunkNo: number
    docId: string
    storeId: string
    pageContent: string
    metadata: string
}

export interface IDocumentStoreFileChunkPagedResponse {
    chunks: IDocumentStoreFileChunk[]
    count: number
    file?: IDocumentStoreLoader
    currentPage: number
    storeName: string
    description: string
}

export interface IDocumentStoreLoader {
    id: string
    loaderId: string
    loaderName: string
    loaderConfig: any // JSON string
    splitterId: string
    splitterName: string
    splitterConfig: any // JSON string
    totalChunks: number
    totalChars: number
    status: DocumentStoreStatus
    storeId?: string
    files?: IDocumentStoreLoaderFile[]
    source?: string
    credential?: string
}

export interface IDocumentStoreLoaderForPreview extends IDocumentStoreLoader {
    rehydrated: boolean
    preview: boolean
    previewChunkCount: number
}

export interface IDocumentStoreLoaderFile {
    id: string
    name: string
    mimePrefix: string
    size: number
    status: DocumentStoreStatus
    uploaded: Date
}

export interface IDocumentStoreWhereUsed {
    id: string
    name: string
}

export class DocumentStoreDTO {
    id: string
    name: string
    description: string
    files: IDocumentStoreLoaderFile[]
    whereUsed: IDocumentStoreWhereUsed[]
    createdDate: Date
    updatedDate: Date
    status: DocumentStoreStatus
    chunkOverlap: number
    splitter: string
    totalChunks: number
    totalChars: number
    chunkSize: number
    loaders: IDocumentStoreLoader[]

    constructor() {}

    static fromEntity(entity: DocumentStore): DocumentStoreDTO {
        let documentStoreDTO = new DocumentStoreDTO()

        Object.assign(documentStoreDTO, entity)
        documentStoreDTO.id = entity.id
        documentStoreDTO.name = entity.name
        documentStoreDTO.description = entity.description
        documentStoreDTO.status = entity.status
        documentStoreDTO.totalChars = 0
        documentStoreDTO.totalChunks = 0

        if (entity.whereUsed) {
            documentStoreDTO.whereUsed = JSON.parse(entity.whereUsed)
        } else {
            documentStoreDTO.whereUsed = []
        }

        if (entity.loaders) {
            documentStoreDTO.loaders = JSON.parse(entity.loaders)
            documentStoreDTO.loaders.map((loader) => {
                documentStoreDTO.totalChars += loader.totalChars
                documentStoreDTO.totalChunks += loader.totalChunks
                switch (loader.loaderId) {
                    case 'pdfFile':
                        loader.source = loader.loaderConfig.pdfFile.replace('FILE-STORAGE::', '')
                        break
                    case 'apiLoader':
                        loader.source = loader.loaderConfig.url + ' (' + loader.loaderConfig.method + ')'
                        break
                    case 'cheerioWebScraper':
                        loader.source = loader.loaderConfig.url
                        break
                    case 'jsonFile':
                        loader.source = loader.loaderConfig.jsonFile.replace('FILE-STORAGE::', '')
                        break
                    case 'docxFile':
                        loader.source = loader.loaderConfig.docxFile.replace('FILE-STORAGE::', '')
                        break
                    case 'textFile':
                        loader.source = loader.loaderConfig.txtFile.replace('FILE-STORAGE::', '')
                        break
                    case 'unstructuredFileLoader':
                        loader.source = loader.loaderConfig.filePath
                        break
                    default:
                        loader.source = 'None'
                        break
                }
                if (loader.status !== 'SYNC') {
                    documentStoreDTO.status = DocumentStoreStatus.STALE
                }
            })
        }

        return documentStoreDTO
    }

    static fromEntities(entities: DocumentStore[]): DocumentStoreDTO[] {
        return entities.map((entity) => this.fromEntity(entity))
    }

    static toEntity(body: any): DocumentStore {
        const docStore = new DocumentStore()
        Object.assign(docStore, body)
        docStore.loaders = '[]'
        docStore.whereUsed = '[]'
        // when a new document store is created, it is empty and in sync
        docStore.status = DocumentStoreStatus.EMPTY_SYNC
        return docStore
    }
}
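Based on the class above, the DTO converts in both directions: `toEntity` on create, `fromEntity`/`fromEntities` when responding. A condensed sketch with a hypothetical request body:

import { DocumentStoreDTO, DocumentStoreStatus } from './Interface.DocumentStore'

// Incoming create request -> entity; a new store starts empty, in sync, with no loaders.
const entity = DocumentStoreDTO.toEntity({ name: 'Product docs', description: 'FAQ sources' })
console.log(entity.status === DocumentStoreStatus.EMPTY_SYNC) // true

// Entity -> DTO for the UI; totals are summed from the parsed loaders JSON.
const dto = DocumentStoreDTO.fromEntity(entity)
console.log(dto.totalChunks, dto.totalChars) // 0 0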
@@ -253,3 +253,6 @@ export interface IUploadFileSizeAndTypes {
     fileTypes: string[]
     maxUploadSize: number
 }
+
+// DocumentStore related
+export * from './Interface.DocumentStore'
@@ -0,0 +1,263 @@
import { NextFunction, Request, Response } from 'express'
import { StatusCodes } from 'http-status-codes'
import documentStoreService from '../../services/documentstore'
import { DocumentStore } from '../../database/entities/DocumentStore'
import { InternalFlowiseError } from '../../errors/internalFlowiseError'
import { DocumentStoreDTO } from '../../Interface'

const createDocumentStore = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.body === 'undefined') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.createDocumentStore - body not provided!`
            )
        }
        const body = req.body
        const docStore = DocumentStoreDTO.toEntity(body)
        const apiResponse = await documentStoreService.createDocumentStore(docStore)
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const getAllDocumentStores = async (req: Request, res: Response, next: NextFunction) => {
    try {
        const apiResponse = await documentStoreService.getAllDocumentStores()
        return res.json(DocumentStoreDTO.fromEntities(apiResponse))
    } catch (error) {
        next(error)
    }
}

const deleteLoaderFromDocumentStore = async (req: Request, res: Response, next: NextFunction) => {
    try {
        const storeId = req.params.id
        const loaderId = req.params.loaderId

        if (!storeId || !loaderId) {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.deleteLoaderFromDocumentStore - missing storeId or loaderId.`
            )
        }
        const apiResponse = await documentStoreService.deleteLoaderFromDocumentStore(storeId, loaderId)
        return res.json(DocumentStoreDTO.fromEntity(apiResponse))
    } catch (error) {
        next(error)
    }
}

const getDocumentStoreById = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.id === 'undefined' || req.params.id === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.getDocumentStoreById - id not provided!`
            )
        }
        const apiResponse = await documentStoreService.getDocumentStoreById(req.params.id)
        if (apiResponse && apiResponse.whereUsed) {
            apiResponse.whereUsed = JSON.stringify(await documentStoreService.getUsedChatflowNames(apiResponse))
        }
        return res.json(DocumentStoreDTO.fromEntity(apiResponse))
    } catch (error) {
        next(error)
    }
}

const getDocumentStoreFileChunks = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.storeId === 'undefined' || req.params.storeId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.getDocumentStoreFileChunks - storeId not provided!`
            )
        }
        if (typeof req.params.fileId === 'undefined' || req.params.fileId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.getDocumentStoreFileChunks - fileId not provided!`
            )
        }
        const page = req.params.pageNo ? parseInt(req.params.pageNo) : 1
        const apiResponse = await documentStoreService.getDocumentStoreFileChunks(req.params.storeId, req.params.fileId, page)
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const deleteDocumentStoreFileChunk = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.storeId === 'undefined' || req.params.storeId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.deleteDocumentStoreFileChunk - storeId not provided!`
            )
        }
        if (typeof req.params.loaderId === 'undefined' || req.params.loaderId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.deleteDocumentStoreFileChunk - loaderId not provided!`
            )
        }
        if (typeof req.params.chunkId === 'undefined' || req.params.chunkId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.deleteDocumentStoreFileChunk - chunkId not provided!`
            )
        }
        const apiResponse = await documentStoreService.deleteDocumentStoreFileChunk(
            req.params.storeId,
            req.params.loaderId,
            req.params.chunkId
        )
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const editDocumentStoreFileChunk = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.storeId === 'undefined' || req.params.storeId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.editDocumentStoreFileChunk - storeId not provided!`
            )
        }
        if (typeof req.params.loaderId === 'undefined' || req.params.loaderId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.editDocumentStoreFileChunk - loaderId not provided!`
            )
        }
        if (typeof req.params.chunkId === 'undefined' || req.params.chunkId === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.editDocumentStoreFileChunk - chunkId not provided!`
            )
        }
        const body = req.body
        if (typeof body === 'undefined' || typeof body.pageContent === 'undefined' || body.pageContent === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.editDocumentStoreFileChunk - body not provided!`
            )
        }
        const apiResponse = await documentStoreService.editDocumentStoreFileChunk(
            req.params.storeId,
            req.params.loaderId,
            req.params.chunkId,
            body.pageContent
        )
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const processFileChunks = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.body === 'undefined') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.processFileChunks - body not provided!`
            )
        }
        const body = req.body
        const apiResponse = await documentStoreService.processAndSaveChunks(body)
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const updateDocumentStore = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.id === 'undefined' || req.params.id === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.updateDocumentStore - storeId not provided!`
            )
        }
        if (typeof req.body === 'undefined') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.updateDocumentStore - body not provided!`
            )
        }
        const store = await documentStoreService.getDocumentStoreById(req.params.id)
        if (!store) {
            throw new InternalFlowiseError(
                StatusCodes.NOT_FOUND,
                `Error: documentStoreController.updateDocumentStore - DocumentStore ${req.params.id} not found in the database`
            )
        }
        const body = req.body
        const updateDocStore = new DocumentStore()
        Object.assign(updateDocStore, body)
        const apiResponse = await documentStoreService.updateDocumentStore(store, updateDocStore)
        return res.json(DocumentStoreDTO.fromEntity(apiResponse))
    } catch (error) {
        next(error)
    }
}

const deleteDocumentStore = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.id === 'undefined' || req.params.id === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.deleteDocumentStore - storeId not provided!`
            )
        }
        const apiResponse = await documentStoreService.deleteDocumentStore(req.params.id)
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const previewFileChunks = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.body === 'undefined') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: documentStoreController.previewFileChunks - body not provided!`
            )
        }
        const body = req.body
        body.preview = true
        const apiResponse = await documentStoreService.previewChunks(body)
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const getDocumentLoaders = async (req: Request, res: Response, next: NextFunction) => {
    try {
        const apiResponse = await documentStoreService.getDocumentLoaders()
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

export default {
    deleteDocumentStore,
    createDocumentStore,
    getAllDocumentStores,
    deleteLoaderFromDocumentStore,
    getDocumentStoreById,
    getDocumentStoreFileChunks,
    updateDocumentStore,
    processFileChunks,
    previewFileChunks,
    getDocumentLoaders,
    deleteDocumentStoreFileChunk,
    editDocumentStoreFileChunk
}
@@ -1,4 +1,5 @@
import { Request, Response, NextFunction } from 'express'
import _ from 'lodash'
import nodesService from '../../services/nodes'
import { InternalFlowiseError } from '../../errors/internalFlowiseError'
import { StatusCodes } from 'http-status-codes'

@@ -24,6 +25,22 @@ const getNodeByName = async (req: Request, res: Response, next: NextFunction) =>
    }
}

const getNodesByCategory = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params.name === 'undefined' || req.params.name === '') {
            throw new InternalFlowiseError(
                StatusCodes.PRECONDITION_FAILED,
                `Error: nodesController.getNodesByCategory - name not provided!`
            )
        }
        const name = _.unescape(req.params.name)
        const apiResponse = await nodesService.getAllNodesForCategory(name)
        return res.json(apiResponse)
    } catch (error) {
        next(error)
    }
}

const getSingleNodeIcon = async (req: Request, res: Response, next: NextFunction) => {
    try {
        if (typeof req.params === 'undefined' || !req.params.name) {

@@ -77,5 +94,6 @@ export default {
    getNodeByName,
    getSingleNodeIcon,
    getSingleNodeAsyncOptions,
    executeCustomFunction
    executeCustomFunction,
    getNodesByCategory
}
@@ -0,0 +1,31 @@
import { Column, CreateDateColumn, Entity, PrimaryGeneratedColumn, UpdateDateColumn } from 'typeorm'
import { DocumentStoreStatus, IDocumentStore } from '../../Interface'

@Entity()
export class DocumentStore implements IDocumentStore {
    @PrimaryGeneratedColumn('uuid')
    id: string

    @Column({ nullable: false, type: 'text' })
    name: string

    @Column({ nullable: true, type: 'text' })
    description: string

    @Column({ nullable: true, type: 'text' })
    loaders: string

    @Column({ nullable: true, type: 'text' })
    whereUsed: string

    @Column({ type: 'timestamp' })
    @CreateDateColumn()
    createdDate: Date

    @Column({ type: 'timestamp' })
    @UpdateDateColumn()
    updatedDate: Date

    @Column({ nullable: false, type: 'text' })
    status: DocumentStoreStatus
}

@@ -0,0 +1,25 @@
import { Column, Entity, Index, PrimaryGeneratedColumn } from 'typeorm'
import { IDocumentStoreFileChunk } from '../../Interface'

@Entity()
export class DocumentStoreFileChunk implements IDocumentStoreFileChunk {
    @PrimaryGeneratedColumn('uuid')
    id: string

    @Index()
    @Column({ type: 'uuid' })
    docId: string

    @Index()
    @Column({ type: 'uuid' })
    storeId: string

    @Column()
    chunkNo: number

    @Column({ nullable: false, type: 'text' })
    pageContent: string

    @Column({ nullable: true, type: 'text' })
    metadata: string
}

@@ -5,6 +5,8 @@ import { Credential } from './Credential'
import { Tool } from './Tool'
import { Assistant } from './Assistant'
import { Variable } from './Variable'
import { DocumentStore } from './DocumentStore'
import { DocumentStoreFileChunk } from './DocumentStoreFileChunk'
import { Lead } from './Lead'
import { UpsertHistory } from './UpsertHistory'

@@ -16,6 +18,8 @@ export const entities = {
    Tool,
    Assistant,
    Variable,
    DocumentStore,
    DocumentStoreFileChunk,
    Lead,
    UpsertHistory
}
@@ -0,0 +1,37 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddDocumentStore1711637331047 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(
            `CREATE TABLE IF NOT EXISTS \`document_store\` (
                \`id\` varchar(36) NOT NULL,
                \`name\` varchar(255) NOT NULL,
                \`description\` varchar(255),
                \`loaders\` text,
                \`whereUsed\` text,
                \`status\` varchar(20) NOT NULL,
                \`createdDate\` datetime(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6),
                \`updatedDate\` datetime(6) NOT NULL DEFAULT CURRENT_TIMESTAMP(6) ON UPDATE CURRENT_TIMESTAMP(6),
                PRIMARY KEY (\`id\`)
            ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;`
        )
        await queryRunner.query(
            `CREATE TABLE IF NOT EXISTS \`document_store_file_chunk\` (
                \`id\` varchar(36) NOT NULL,
                \`docId\` varchar(36) NOT NULL,
                \`storeId\` varchar(36) NOT NULL,
                \`chunkNo\` INT NOT NULL,
                \`pageContent\` text,
                \`metadata\` text,
                PRIMARY KEY (\`id\`),
                KEY \`IDX_e76bae1780b77e56aab1h2asd4\` (\`docId\`),
                KEY \`IDX_e213b811b01405a42309a6a410\` (\`storeId\`)
            ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;`
        )
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP TABLE document_store`)
        await queryRunner.query(`DROP TABLE document_store_file_chunk`)
    }
}

@@ -15,6 +15,7 @@ import { AddVariableEntity1699325775451 } from './1702200925471-AddVariableEntit
import { AddSpeechToText1706364937060 } from './1706364937060-AddSpeechToText'
import { AddUpsertHistoryEntity1709814301358 } from './1709814301358-AddUpsertHistoryEntity'
import { AddFeedback1707213626553 } from './1707213626553-AddFeedback'
import { AddDocumentStore1711637331047 } from './1711637331047-AddDocumentStore'
import { AddLead1710832127079 } from './1710832127079-AddLead'
import { AddLeadToChatMessage1711538023578 } from './1711538023578-AddLeadToChatMessage'

@@ -36,6 +37,7 @@ export const mysqlMigrations = [
    AddSpeechToText1706364937060,
    AddUpsertHistoryEntity1709814301358,
    AddFeedback1707213626553,
    AddDocumentStore1711637331047,
    AddLead1710832127079,
    AddLeadToChatMessage1711538023578
]
@@ -0,0 +1,41 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddDocumentStore1711637331047 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(
            `CREATE TABLE IF NOT EXISTS document_store (
                id uuid NOT NULL DEFAULT uuid_generate_v4(),
                "name" varchar NOT NULL,
                "description" varchar,
                "loaders" text,
                "whereUsed" text,
                "status" varchar NOT NULL,
                "createdDate" timestamp NOT NULL DEFAULT now(),
                "updatedDate" timestamp NOT NULL DEFAULT now(),
                CONSTRAINT "PK_98495043dd774f54-9830ab78f9" PRIMARY KEY (id)
            );`
        )
        await queryRunner.query(
            `CREATE TABLE IF NOT EXISTS document_store_file_chunk (
                id uuid NOT NULL DEFAULT uuid_generate_v4(),
                "docId" uuid NOT NULL,
                "chunkNo" integer NOT NULL,
                "storeId" uuid NOT NULL,
                "pageContent" text,
                "metadata" text,
                CONSTRAINT "PK_90005043dd774f54-9830ab78f9" PRIMARY KEY (id)
            );`
        )
        await queryRunner.query(
            `CREATE INDEX IF NOT EXISTS "IDX_e76bae1780b77e56aab1h2asd4" ON document_store_file_chunk USING btree ("docId");`
        )
        await queryRunner.query(
            `CREATE INDEX IF NOT EXISTS "IDX_e213b811b01405a42309a6a410" ON document_store_file_chunk USING btree ("storeId");`
        )
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP TABLE document_store`)
        await queryRunner.query(`DROP TABLE document_store_file_chunk`)
    }
}
@@ -16,6 +16,7 @@ import { AddSpeechToText1706364937060 } from './1706364937060-AddSpeechToText'
import { AddUpsertHistoryEntity1709814301358 } from './1709814301358-AddUpsertHistoryEntity'
import { AddFeedback1707213601923 } from './1707213601923-AddFeedback'
import { FieldTypes1710497452584 } from './1710497452584-FieldTypes'
import { AddDocumentStore1711637331047 } from './1711637331047-AddDocumentStore'
import { AddLead1710832137905 } from './1710832137905-AddLead'
import { AddLeadToChatMessage1711538016098 } from './1711538016098-AddLeadToChatMessage'

@@ -38,6 +39,7 @@ export const postgresMigrations = [
    AddUpsertHistoryEntity1709814301358,
    AddFeedback1707213601923,
    FieldTypes1710497452584,
    AddDocumentStore1711637331047,
    AddLead1710832137905,
    AddLeadToChatMessage1711538016098
]
@@ -0,0 +1,34 @@
import { MigrationInterface, QueryRunner } from 'typeorm'

export class AddDocumentStore1711637331047 implements MigrationInterface {
    public async up(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(
            `CREATE TABLE IF NOT EXISTS "document_store" (
                "id" varchar PRIMARY KEY NOT NULL,
                "name" varchar NOT NULL,
                "description" varchar,
                "status" varchar NOT NULL,
                "loaders" text,
                "whereUsed" text,
                "updatedDate" datetime NOT NULL DEFAULT (datetime('now')),
                "createdDate" datetime NOT NULL DEFAULT (datetime('now')));`
        )
        await queryRunner.query(
            `CREATE TABLE IF NOT EXISTS "document_store_file_chunk" (
                "id" varchar PRIMARY KEY NOT NULL,
                "docId" varchar NOT NULL,
                "storeId" varchar NOT NULL,
                "chunkNo" INTEGER NOT NULL,
                "pageContent" text,
                "metadata" text
            );`
        )
        await queryRunner.query(`CREATE INDEX "IDX_e76bae1780b77e56aab1h2asd4" ON "document_store_file_chunk" ("docId") ;`)
        await queryRunner.query(`CREATE INDEX "IDX_e213b811b01405a42309a6a410" ON "document_store_file_chunk" ("storeId") ;`)
    }

    public async down(queryRunner: QueryRunner): Promise<void> {
        await queryRunner.query(`DROP TABLE IF EXISTS "document_store";`)
        await queryRunner.query(`DROP TABLE IF EXISTS "document_store_file_chunk";`)
    }
}

@@ -15,6 +15,7 @@ import { AddVariableEntity1699325775451 } from './1702200925471-AddVariableEntit
import { AddSpeechToText1706364937060 } from './1706364937060-AddSpeechToText'
import { AddUpsertHistoryEntity1709814301358 } from './1709814301358-AddUpsertHistoryEntity'
import { AddFeedback1707213619308 } from './1707213619308-AddFeedback'
import { AddDocumentStore1711637331047 } from './1711637331047-AddDocumentStore'
import { AddLead1710832117612 } from './1710832117612-AddLead'
import { AddLeadToChatMessage1711537986113 } from './1711537986113-AddLeadToChatMessage'

@@ -36,6 +37,7 @@ export const sqliteMigrations = [
    AddSpeechToText1706364937060,
    AddUpsertHistoryEntity1709814301358,
    AddFeedback1707213619308,
    AddDocumentStore1711637331047,
    AddLead1710832117612,
    AddLeadToChatMessage1711537986113
]
@@ -5,11 +5,10 @@ import cors from 'cors'
import http from 'http'
import basicAuth from 'express-basic-auth'
import { Server } from 'socket.io'
import logger from './utils/logger'
import { expressRequestLogger } from './utils/logger'
import { DataSource } from 'typeorm'
import { IChatFlow } from './Interface'
import { getNodeModulesPackagePath, getEncryptionKey } from './utils'
import logger, { expressRequestLogger } from './utils/logger'
import { getDataSource } from './DataSource'
import { NodesPool } from './NodesPool'
import { ChatFlow } from './database/entities/ChatFlow'
@@ -0,0 +1,36 @@
import express from 'express'
import documentStoreController from '../../controllers/documentstore'
const router = express.Router()

/** Document Store Routes */
// Create document store
router.post('/store', documentStoreController.createDocumentStore)
// List all stores
router.get('/stores', documentStoreController.getAllDocumentStores)
// Get specific store
router.get('/store/:id', documentStoreController.getDocumentStoreById)
// Update documentStore
router.put('/store/:id', documentStoreController.updateDocumentStore)
// Delete documentStore
router.delete('/store/:id', documentStoreController.deleteDocumentStore)

/** Component Nodes = Document Store - Loaders */
// Get all loaders
router.get('/loaders', documentStoreController.getDocumentLoaders)

// delete loader from document store
router.delete('/loader/:id/:loaderId', documentStoreController.deleteLoaderFromDocumentStore)
// chunking preview
router.post('/loader/preview', documentStoreController.previewFileChunks)
// chunking process
router.post('/loader/process', documentStoreController.processFileChunks)

/** Document Store - Loaders - Chunks */
// delete specific file chunk from the store
router.delete('/chunks/:storeId/:loaderId/:chunkId', documentStoreController.deleteDocumentStoreFileChunk)
// edit specific file chunk from the store
router.put('/chunks/:storeId/:loaderId/:chunkId', documentStoreController.editDocumentStoreFileChunk)
// Get all file chunks from the store
router.get('/chunks/:storeId/:fileId/:pageNo', documentStoreController.getDocumentStoreFileChunks)

export default router
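A minimal usage sketch for the routes above, illustrative only and not part of this commit. It uses Node 18's global fetch; the host, port and the /api/v1 prefix under which the router is mounted are assumptions that may differ per deployment:

// Illustrative client for the new document store routes (assumed base URL).
const BASE_URL = 'http://localhost:3000/api/v1/document-store'

async function createStoreAndListChunks(): Promise<void> {
    // POST /store -> documentStoreController.createDocumentStore
    const created = await fetch(`${BASE_URL}/store`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ name: 'My Store', description: 'Example store' })
    }).then((res) => res.json())

    // GET /stores -> every document store as a DTO
    const stores = await fetch(`${BASE_URL}/stores`).then((res) => res.json())
    console.log(`found ${stores.length} stores`)

    // GET /chunks/:storeId/:fileId/:pageNo -> first page of chunks; 'all' aggregates every loader
    const paged = await fetch(`${BASE_URL}/chunks/${created.id}/all/1`).then((res) => res.json())
    console.log(`store has ${paged.count} chunks in total`)
}

createStoreAndListChunks().catch(console.error)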
@@ -8,6 +8,7 @@ import chatMessageRouter from './chat-messages'
import componentsCredentialsRouter from './components-credentials'
import componentsCredentialsIconRouter from './components-credentials-icon'
import credentialsRouter from './credentials'
import documentStoreRouter from './documentstore'
import feedbackRouter from './feedback'
import fetchLinksRouter from './fetch-links'
import flowConfigRouter from './flow-config'

@@ -49,6 +50,7 @@ router.use('/components-credentials', componentsCredentialsRouter)
router.use('/components-credentials-icon', componentsCredentialsIconRouter)
router.use('/chatflows-uploads', chatflowsUploadsRouter)
router.use('/credentials', credentialsRouter)
router.use('/document-store', documentStoreRouter)
router.use('/feedback', feedbackRouter)
router.use('/fetch-links', fetchLinksRouter)
router.use('/flow-config', flowConfigRouter)

@@ -5,5 +5,6 @@ const router = express.Router()
// READ
router.get('/', nodesController.getAllNodes)
router.get(['/', '/:name'], nodesController.getNodeByName)
router.get('/category/:name', nodesController.getNodesByCategory)

export default router
@@ -13,6 +13,7 @@ import { ChatMessageFeedback } from '../../database/entities/ChatMessageFeedback
import { UpsertHistory } from '../../database/entities/UpsertHistory'
import { containsBase64File, updateFlowDataWithFilePaths } from '../../utils/fileRepository'
import { getErrorMessage } from '../../errors/utils'
import documentStoreService from '../../services/documentstore'

// Check if chatflow valid for streaming
const checkIfChatflowIsValidForStreaming = async (chatflowId: string): Promise<any> => {

@@ -76,6 +77,7 @@ const deleteChatflow = async (chatflowId: string): Promise<any> => {
    try {
        // Delete all uploads corresponding to this chatflow
        await removeFolderFromStorage(chatflowId)
        await documentStoreService.updateDocumentStoreUsage(chatflowId, undefined)

        // Delete all chat messages
        await appServer.AppDataSource.getRepository(ChatMessage).delete({ chatflowid: chatflowId })

@@ -166,6 +168,7 @@ const saveChatflow = async (newChatFlow: ChatFlow): Promise<any> => {

            // step 2 - convert base64 to file paths and update the chatflow
            step1Results.flowData = await updateFlowDataWithFilePaths(step1Results.id, incomingFlowData)
            await _checkAndUpdateDocumentStoreUsage(step1Results)
            dbResponse = await appServer.AppDataSource.getRepository(ChatFlow).save(step1Results)
        } else {
            const chatflow = appServer.AppDataSource.getRepository(ChatFlow).create(newChatFlow)

@@ -192,6 +195,7 @@ const updateChatflow = async (chatflow: ChatFlow, updateChatFlow: ChatFlow): Pro
            updateChatFlow.flowData = await updateFlowDataWithFilePaths(chatflow.id, updateChatFlow.flowData)
        }
        const newDbChatflow = appServer.AppDataSource.getRepository(ChatFlow).merge(chatflow, updateChatFlow)
        await _checkAndUpdateDocumentStoreUsage(newDbChatflow)
        const dbResponse = await appServer.AppDataSource.getRepository(ChatFlow).save(newDbChatflow)

        // chatFlowPool is initialized only when a flow is opened

@@ -261,6 +265,18 @@ const getSinglePublicChatbotConfig = async (chatflowId: string): Promise<any> =>
    }
}

const _checkAndUpdateDocumentStoreUsage = async (chatflow: ChatFlow) => {
    const parsedFlowData: IReactFlowObject = JSON.parse(chatflow.flowData)
    const nodes = parsedFlowData.nodes
    // from the nodes array find if there is a node with name == documentStore
    const node = nodes.length > 0 && nodes.find((node) => node.data.name === 'documentStore')
    if (!node || !node.data || !node.data.inputs || node.data.inputs['selectedStore'] === undefined) {
        await documentStoreService.updateDocumentStoreUsage(chatflow.id, undefined)
    } else {
        await documentStoreService.updateDocumentStoreUsage(chatflow.id, node.data.inputs['selectedStore'])
    }
}

export default {
    checkIfChatflowIsValidForStreaming,
    checkIfChatflowIsValidForUploads,
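For reference, a sketch of the minimal flowData shape that _checkAndUpdateDocumentStoreUsage above inspects. Only data.name and data.inputs.selectedStore are actually read; every other field below is an illustrative assumption:

// Hypothetical parsed chatflow flowData containing a Document Store node.
const parsedFlowData = {
    nodes: [
        {
            id: 'documentStore_0', // assumed node id; not read by the check
            data: {
                name: 'documentStore', // matched against node.data.name
                inputs: {
                    selectedStore: '4d7f9c1e-0000-0000-0000-000000000000' // forwarded to updateDocumentStoreUsage
                }
            }
        }
    ],
    edges: [] // assumed field; not read by the check
}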
@@ -0,0 +1,710 @@
import { getRunningExpressApp } from '../../utils/getRunningExpressApp'
import { DocumentStore } from '../../database/entities/DocumentStore'
// @ts-ignore
import {
    addFileToStorage,
    getFileFromStorage,
    ICommonObject,
    IDocument,
    removeFilesFromStorage,
    removeSpecificFileFromStorage
} from 'flowise-components'
import {
    DocumentStoreStatus,
    IDocumentStoreFileChunkPagedResponse,
    IDocumentStoreLoader,
    IDocumentStoreLoaderFile,
    IDocumentStoreLoaderForPreview,
    IDocumentStoreWhereUsed
} from '../../Interface'
import { DocumentStoreFileChunk } from '../../database/entities/DocumentStoreFileChunk'
import { v4 as uuidv4 } from 'uuid'
import { databaseEntities } from '../../utils'
import logger from '../../utils/logger'
import nodesService from '../nodes'
import { InternalFlowiseError } from '../../errors/internalFlowiseError'
import { StatusCodes } from 'http-status-codes'
import { getErrorMessage } from '../../errors/utils'
import { ChatFlow } from '../../database/entities/ChatFlow'

const DOCUMENT_STORE_BASE_FOLDER = 'docustore'

const createDocumentStore = async (newDocumentStore: DocumentStore) => {
    try {
        const appServer = getRunningExpressApp()
        const documentStore = appServer.AppDataSource.getRepository(DocumentStore).create(newDocumentStore)
        const dbResponse = await appServer.AppDataSource.getRepository(DocumentStore).save(documentStore)
        return dbResponse
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.createDocumentStore - ${getErrorMessage(error)}`
        )
    }
}

const getAllDocumentStores = async () => {
    try {
        const appServer = getRunningExpressApp()
        const entities = await appServer.AppDataSource.getRepository(DocumentStore).find()
        return entities
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.getAllDocumentStores - ${getErrorMessage(error)}`
        )
    }
}

const deleteLoaderFromDocumentStore = async (storeId: string, loaderId: string) => {
    try {
        const appServer = getRunningExpressApp()
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: storeId
        })
        if (!entity) {
            throw new InternalFlowiseError(
                StatusCodes.NOT_FOUND,
                `Error: documentStoreServices.deleteLoaderFromDocumentStore - Document store ${storeId} not found`
            )
        }
        const existingLoaders = JSON.parse(entity.loaders)
        const found = existingLoaders.find((uFile: IDocumentStoreLoader) => uFile.id === loaderId)
        if (found) {
            if (found.path) {
                //remove the existing files, if any of the file loaders were used.
                await removeSpecificFileFromStorage(DOCUMENT_STORE_BASE_FOLDER, entity.id, found.path)
            }
            const index = existingLoaders.indexOf(found)
            if (index > -1) {
                existingLoaders.splice(index, 1)
            }
            // remove the chunks
            await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).delete({ docId: found.id })

            entity.loaders = JSON.stringify(existingLoaders)
            const results = await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
            return results
        } else {
            throw new InternalFlowiseError(StatusCodes.INTERNAL_SERVER_ERROR, `Unable to locate loader in Document Store ${entity.name}`)
        }
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.deleteLoaderFromDocumentStore - ${getErrorMessage(error)}`
        )
    }
}

const getDocumentStoreById = async (storeId: string) => {
    try {
        const appServer = getRunningExpressApp()
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: storeId
        })
        if (!entity) {
            throw new InternalFlowiseError(
                StatusCodes.NOT_FOUND,
                `Error: documentStoreServices.getDocumentStoreById - Document store ${storeId} not found`
            )
        }
        return entity
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.getDocumentStoreById - ${getErrorMessage(error)}`
        )
    }
}

const getUsedChatflowNames = async (entity: DocumentStore) => {
    try {
        const appServer = getRunningExpressApp()
        if (entity.whereUsed) {
            const whereUsed = JSON.parse(entity.whereUsed)
            const updatedWhereUsed: IDocumentStoreWhereUsed[] = []
            for (let i = 0; i < whereUsed.length; i++) {
                const associatedChatflow = await appServer.AppDataSource.getRepository(ChatFlow).findOne({
                    where: { id: whereUsed[i] },
                    select: ['id', 'name']
                })
                if (associatedChatflow) {
                    updatedWhereUsed.push({
                        id: whereUsed[i],
                        name: associatedChatflow.name
                    })
                }
            }
            return updatedWhereUsed
        }
        return []
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.getUsedChatflowNames - ${getErrorMessage(error)}`
        )
    }
}

// Get chunks for a specific loader or store
const getDocumentStoreFileChunks = async (storeId: string, fileId: string, pageNo: number = 1) => {
    try {
        const appServer = getRunningExpressApp()
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: storeId
        })
        if (!entity) {
            throw new InternalFlowiseError(
                StatusCodes.NOT_FOUND,
                `Error: documentStoreServices.getDocumentStoreById - Document store ${storeId} not found`
            )
        }
        const loaders = JSON.parse(entity.loaders)

        let found: IDocumentStoreLoader | undefined
        if (fileId !== 'all') {
            found = loaders.find((loader: IDocumentStoreLoader) => loader.id === fileId)
            if (!found) {
                throw new InternalFlowiseError(
                    StatusCodes.NOT_FOUND,
                    `Error: documentStoreServices.getDocumentStoreById - Document file ${fileId} not found`
                )
            }
        }
        let totalChars = 0
        loaders.forEach((loader: IDocumentStoreLoader) => {
            totalChars += loader.totalChars
        })
        if (found) {
            found.totalChars = totalChars
            found.id = entity.id
            found.status = entity.status
        }
        const PAGE_SIZE = 50
        const skip = (pageNo - 1) * PAGE_SIZE
        const take = PAGE_SIZE
        let whereCondition: any = { docId: fileId }
        if (fileId === 'all') {
            whereCondition = { storeId: storeId }
        }
        const count = await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).count({
            where: whereCondition
        })
        const chunksWithCount = await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).find({
            skip,
            take,
            where: whereCondition,
            order: {
                chunkNo: 'ASC'
            }
        })

        if (!chunksWithCount) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `File ${fileId} not found`)
        }

        const response: IDocumentStoreFileChunkPagedResponse = {
            chunks: chunksWithCount,
            count: count,
            file: found,
            currentPage: pageNo,
            storeName: entity.name,
            description: entity.description
        }
        return response
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.getDocumentStoreFileChunks - ${getErrorMessage(error)}`
        )
    }
}

const deleteDocumentStore = async (storeId: string) => {
    try {
        const appServer = getRunningExpressApp()
        // delete all the chunks associated with the store
        await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).delete({
            storeId: storeId
        })
        // now delete the files associated with the store
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: storeId
        })
        if (!entity) throw new Error(`Document store ${storeId} not found`)
        await removeFilesFromStorage(DOCUMENT_STORE_BASE_FOLDER, entity.id)
        // now delete the store
        const tbd = await appServer.AppDataSource.getRepository(DocumentStore).delete({
            id: storeId
        })

        return { deleted: tbd.affected }
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.deleteDocumentStore - ${getErrorMessage(error)}`
        )
    }
}

const deleteDocumentStoreFileChunk = async (storeId: string, docId: string, chunkId: string) => {
    try {
        const appServer = getRunningExpressApp()
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: storeId
        })
        if (!entity) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document store ${storeId} not found`)
        }
        const loaders = JSON.parse(entity.loaders)
        const found = loaders.find((ldr: IDocumentStoreLoader) => ldr.id === docId)
        if (!found) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document store loader ${docId} not found`)
        }

        const tbdChunk = await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).findOneBy({
            id: chunkId
        })
        if (!tbdChunk) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document Chunk ${chunkId} not found`)
        }
        await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).delete(chunkId)
        found.totalChunks--
        found.totalChars -= tbdChunk.pageContent.length
        entity.loaders = JSON.stringify(loaders)
        await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
        return getDocumentStoreFileChunks(storeId, docId)
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.deleteDocumentStoreFileChunk - ${getErrorMessage(error)}`
        )
    }
}

const editDocumentStoreFileChunk = async (storeId: string, docId: string, chunkId: string, content: string) => {
    try {
        const appServer = getRunningExpressApp()
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: storeId
        })
        if (!entity) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document store ${storeId} not found`)
        }
        const loaders = JSON.parse(entity.loaders)
        const found = loaders.find((ldr: IDocumentStoreLoader) => ldr.id === docId)
        if (!found) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document store loader ${docId} not found`)
        }

        const editChunk = await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).findOneBy({
            id: chunkId
        })
        if (!editChunk) {
            throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document Chunk ${chunkId} not found`)
        }
        found.totalChars -= editChunk.pageContent.length
        editChunk.pageContent = content
        found.totalChars += content.length
        await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).save(editChunk)
        entity.loaders = JSON.stringify(loaders)
        await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
        return getDocumentStoreFileChunks(storeId, docId)
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.editDocumentStoreFileChunk - ${getErrorMessage(error)}`
        )
    }
}

// Update documentStore
const updateDocumentStore = async (documentStore: DocumentStore, updatedDocumentStore: DocumentStore) => {
    try {
        const appServer = getRunningExpressApp()
        const tmpUpdatedDocumentStore = appServer.AppDataSource.getRepository(DocumentStore).merge(documentStore, updatedDocumentStore)
        const dbResponse = await appServer.AppDataSource.getRepository(DocumentStore).save(tmpUpdatedDocumentStore)
        return dbResponse
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.updateDocumentStore - ${getErrorMessage(error)}`
        )
    }
}

const _saveFileToStorage = async (fileBase64: string, entity: DocumentStore) => {
    const splitDataURI = fileBase64.split(',')
    const filename = splitDataURI.pop()?.split(':')[1] ?? ''
    const bf = Buffer.from(splitDataURI.pop() || '', 'base64')
    const mimePrefix = splitDataURI.pop()
    let mime = ''
    if (mimePrefix) {
        mime = mimePrefix.split(';')[0].split(':')[1]
    }
    await addFileToStorage(mime, bf, filename, DOCUMENT_STORE_BASE_FOLDER, entity.id)
    return {
        id: uuidv4(),
        name: filename,
        mimePrefix: mime,
        size: bf.length,
        status: DocumentStoreStatus.NEW,
        uploaded: new Date()
    }
}

const _splitIntoChunks = async (data: IDocumentStoreLoaderForPreview) => {
    try {
        const appServer = getRunningExpressApp()
        let splitterInstance = null
        if (data.splitterConfig && Object.keys(data.splitterConfig).length > 0) {
            const nodeInstanceFilePath = appServer.nodesPool.componentNodes[data.splitterId].filePath as string
            const nodeModule = await import(nodeInstanceFilePath)
            const newNodeInstance = new nodeModule.nodeClass()
            let nodeData = {
                inputs: { ...data.splitterConfig },
                id: 'splitter_0'
            }
            splitterInstance = await newNodeInstance.init(nodeData)
        }
        const nodeInstanceFilePath = appServer.nodesPool.componentNodes[data.loaderId].filePath as string
        const nodeModule = await import(nodeInstanceFilePath)
        // doc loader configs
        const nodeData = {
            credential: data.credential || undefined,
            inputs: { ...data.loaderConfig, textSplitter: splitterInstance },
            outputs: { output: 'document' }
        }
        const options: ICommonObject = {
            chatflowid: uuidv4(),
            appDataSource: appServer.AppDataSource,
            databaseEntities,
            logger
        }
        const docNodeInstance = new nodeModule.nodeClass()
        let docs: IDocument[] = await docNodeInstance.init(nodeData, '', options)
        return docs
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.splitIntoChunks - ${getErrorMessage(error)}`
        )
    }
}

const _normalizeFilePaths = async (data: IDocumentStoreLoaderForPreview, entity: DocumentStore | null) => {
    const keys = Object.getOwnPropertyNames(data.loaderConfig)
    let rehydrated = false
    for (let i = 0; i < keys.length; i++) {
        const input = data.loaderConfig[keys[i]]
        if (!input) {
            continue
        }
        if (typeof input !== 'string') {
            continue
        }
        let documentStoreEntity: DocumentStore | null = entity
        if (input.startsWith('FILE-STORAGE::')) {
            if (!documentStoreEntity) {
                const appServer = getRunningExpressApp()
                documentStoreEntity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
                    id: data.storeId
                })
                if (!documentStoreEntity) {
                    throw new InternalFlowiseError(StatusCodes.NOT_FOUND, `Document store ${data.storeId} not found`)
                }
            }
            const fileName = input.replace('FILE-STORAGE::', '')
            let files: string[] = []
            if (fileName.startsWith('[') && fileName.endsWith(']')) {
                files = JSON.parse(fileName)
            } else {
                files = [fileName]
            }
            const loaders = JSON.parse(documentStoreEntity.loaders)
            const currentLoader = loaders.find((ldr: IDocumentStoreLoader) => ldr.id === data.id)
            if (currentLoader) {
                const base64Files: string[] = []
                for (const file of files) {
                    const bf = await getFileFromStorage(file, DOCUMENT_STORE_BASE_FOLDER, documentStoreEntity.id)
                    // find the file entry that has the same name as the file
                    const uploadedFile = currentLoader.files.find((uFile: IDocumentStoreLoaderFile) => uFile.name === file)
                    const mimePrefix = 'data:' + uploadedFile.mimePrefix + ';base64'
                    const base64String = mimePrefix + ',' + bf.toString('base64') + `,filename:${file}`
                    base64Files.push(base64String)
                }
                data.loaderConfig[keys[i]] = JSON.stringify(base64Files)
                rehydrated = true
            }
        }
    }
    data.rehydrated = rehydrated
}

const previewChunks = async (data: IDocumentStoreLoaderForPreview) => {
    try {
        if (data.preview) {
            if (
                data.loaderId === 'cheerioWebScraper' ||
                data.loaderId === 'puppeteerWebScraper' ||
                data.loaderId === 'playwrightWebScraper'
            ) {
                data.loaderConfig['limit'] = 3
            }
        }
        if (!data.rehydrated) {
            await _normalizeFilePaths(data, null)
        }
        let docs = await _splitIntoChunks(data)
        const totalChunks = docs.length
        // if -1, return all chunks
        if (data.previewChunkCount === -1) data.previewChunkCount = totalChunks
        // return all docs if the user ask for more than we have
        if (totalChunks <= data.previewChunkCount) data.previewChunkCount = totalChunks
        // return only the first n chunks
        if (totalChunks > data.previewChunkCount) docs = docs.slice(0, data.previewChunkCount)

        return { chunks: docs, totalChunks: totalChunks, previewChunkCount: data.previewChunkCount }
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.previewChunks - ${getErrorMessage(error)}`
        )
    }
}

const processAndSaveChunks = async (data: IDocumentStoreLoaderForPreview) => {
    try {
        const appServer = getRunningExpressApp()
        const entity = await appServer.AppDataSource.getRepository(DocumentStore).findOneBy({
            id: data.storeId
        })
        if (!entity) {
            throw new InternalFlowiseError(
                StatusCodes.NOT_FOUND,
                `Error: documentStoreServices.processAndSaveChunks - Document store ${data.storeId} not found`
            )
        }

        const newLoaderId = data.id ?? uuidv4()
        const existingLoaders = JSON.parse(entity.loaders)
        const found = existingLoaders.find((ldr: IDocumentStoreLoader) => ldr.id === newLoaderId)
        if (found) {
            // clean up the current status and mark the loader as pending_sync
            found.totalChunks = 0
            found.totalChars = 0
            found.status = DocumentStoreStatus.SYNCING
            entity.loaders = JSON.stringify(existingLoaders)
        } else {
            let loader: IDocumentStoreLoader = {
                id: newLoaderId,
                loaderId: data.loaderId,
                loaderName: data.loaderName,
                loaderConfig: data.loaderConfig,
                splitterId: data.splitterId,
                splitterName: data.splitterName,
                splitterConfig: data.splitterConfig,
                totalChunks: 0,
                totalChars: 0,
                status: DocumentStoreStatus.SYNCING
            }
            if (data.credential) {
                loader.credential = data.credential
            }
            existingLoaders.push(loader)
            entity.loaders = JSON.stringify(existingLoaders)
        }
        await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
        // this method will run async, will have to be moved to a worker thread
        _saveChunksToStorage(data, entity, newLoaderId).then(() => {})
        return getDocumentStoreFileChunks(data.storeId as string, newLoaderId)
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.processAndSaveChunks - ${getErrorMessage(error)}`
        )
    }
}

const _saveChunksToStorage = async (data: IDocumentStoreLoaderForPreview, entity: DocumentStore, newLoaderId: string) => {
    const re = new RegExp('^data.*;base64', 'i')

    try {
        const appServer = getRunningExpressApp()
        //step 1: restore the full paths, if any
        await _normalizeFilePaths(data, entity)
        //step 2: split the file into chunks
        previewChunks(data).then(async (response) => {
            //{ chunks: docs, totalChunks: totalChunks, previewChunkCount: data.previewChunkCount }
            //step 3: remove base64 files and save them to storage, this needs to be rewritten
            let filesWithMetadata = []
            const keys = Object.getOwnPropertyNames(data.loaderConfig)
            for (let i = 0; i < keys.length; i++) {
                const input = data.loaderConfig[keys[i]]
                if (!input) {
                    continue
                }
                if (typeof input !== 'string') {
                    continue
                }
                if (input.startsWith('[') && input.endsWith(']')) {
                    const files = JSON.parse(input)
                    const fileNames: string[] = []
                    for (let j = 0; j < files.length; j++) {
                        const file = files[j]
                        if (re.test(file)) {
                            const fileMetadata = await _saveFileToStorage(file, entity)
                            fileNames.push(fileMetadata.name)
                            filesWithMetadata.push(fileMetadata)
                        }
                    }
                    data.loaderConfig[keys[i]] = 'FILE-STORAGE::' + JSON.stringify(fileNames)
                } else if (re.test(input)) {
                    const fileNames: string[] = []
                    const fileMetadata = await _saveFileToStorage(input, entity)
                    fileNames.push(fileMetadata.name)
                    filesWithMetadata.push(fileMetadata)
                    data.loaderConfig[keys[i]] = 'FILE-STORAGE::' + JSON.stringify(fileNames)
                    break
                }
            }
            const existingLoaders = JSON.parse(entity.loaders)
            const loader = existingLoaders.find((ldr: IDocumentStoreLoader) => ldr.id === newLoaderId)
            if (data.id) {
                //step 4: remove all files and chunks associated with the previous loader
                const index = existingLoaders.indexOf(loader)
                if (index > -1) {
                    existingLoaders.splice(index, 1)
                    if (!data.rehydrated) {
                        if (loader.files) {
                            loader.files.map(async (file: IDocumentStoreLoaderFile) => {
                                await removeSpecificFileFromStorage(DOCUMENT_STORE_BASE_FOLDER, entity.id, file.name)
                            })
                        }
                    }
                }
            }
            //step 5: upload with the new files and loaderConfig
            if (filesWithMetadata.length > 0) {
                loader.loaderConfig = data.loaderConfig
                loader.files = filesWithMetadata
            }
            //step 6: update the loaders with the new loaderConfig
            if (data.id) {
                existingLoaders.push(loader)
            }
            //step 7: remove all previous chunks
            await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).delete({ docId: newLoaderId })
            if (response.chunks) {
                //step 8: now save the new chunks
                const totalChars = response.chunks.reduce((acc: number, chunk) => acc + chunk.pageContent.length, 0)
                response.chunks.map(async (chunk: IDocument, index: number) => {
                    const docChunk: DocumentStoreFileChunk = {
                        docId: newLoaderId,
                        storeId: data.storeId || '',
                        id: uuidv4(),
                        chunkNo: index + 1,
                        pageContent: chunk.pageContent,
                        metadata: JSON.stringify(chunk.metadata)
                    }
                    const dChunk = appServer.AppDataSource.getRepository(DocumentStoreFileChunk).create(docChunk)
                    await appServer.AppDataSource.getRepository(DocumentStoreFileChunk).save(dChunk)
                })
                // update the loader with the new metrics
                loader.totalChunks = response.totalChunks
                loader.totalChars = totalChars
            }
            loader.status = 'SYNC'
            // have a flag and iterate over the loaders and update the entity status to SYNC
            const allSynced = existingLoaders.every((ldr: IDocumentStoreLoader) => ldr.status === 'SYNC')
            entity.status = allSynced ? DocumentStoreStatus.SYNC : DocumentStoreStatus.STALE
            entity.loaders = JSON.stringify(existingLoaders)
            //step 9: update the entity in the database
            await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
            return
        })
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices._saveChunksToStorage - ${getErrorMessage(error)}`
        )
    }
}

// Get all component nodes
const getDocumentLoaders = async () => {
    const removeDocumentLoadersWithName = ['documentStore', 'vectorStoreToDocument', 'unstructuredFolderLoader', 'folderFiles']

    try {
        const dbResponse = await nodesService.getAllNodesForCategory('Document Loaders')
        return dbResponse.filter((node) => !removeDocumentLoadersWithName.includes(node.name))
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.getDocumentLoaders - ${getErrorMessage(error)}`
        )
    }
}

const updateDocumentStoreUsage = async (chatId: string, storeId: string | undefined) => {
    try {
        // find the document store
        const appServer = getRunningExpressApp()
        // find all entities that have the chatId in their whereUsed
        const entities = await appServer.AppDataSource.getRepository(DocumentStore).find()
        entities.map(async (entity: DocumentStore) => {
            const whereUsed = JSON.parse(entity.whereUsed)
            const found = whereUsed.find((w: string) => w === chatId)
            if (found) {
                if (!storeId) {
                    // remove the chatId from the whereUsed, as the store is being deleted
                    const index = whereUsed.indexOf(chatId)
                    if (index > -1) {
                        whereUsed.splice(index, 1)
                        entity.whereUsed = JSON.stringify(whereUsed)
                        await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
                    }
                } else if (entity.id === storeId) {
                    // do nothing, already found and updated
                } else if (entity.id !== storeId) {
                    // remove the chatId from the whereUsed, as a new store is being used
                    const index = whereUsed.indexOf(chatId)
                    if (index > -1) {
                        whereUsed.splice(index, 1)
                        entity.whereUsed = JSON.stringify(whereUsed)
                        await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
                    }
                }
            } else {
                if (entity.id === storeId) {
                    // add the chatId to the whereUsed
                    whereUsed.push(chatId)
                    entity.whereUsed = JSON.stringify(whereUsed)
                    await appServer.AppDataSource.getRepository(DocumentStore).save(entity)
                }
            }
        })
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: documentStoreServices.updateDocumentStoreUsage - ${getErrorMessage(error)}`
        )
    }
}

export default {
    updateDocumentStoreUsage,
    deleteDocumentStore,
    createDocumentStore,
    deleteLoaderFromDocumentStore,
    getAllDocumentStores,
    getDocumentStoreById,
    getUsedChatflowNames,
    getDocumentStoreFileChunks,
    updateDocumentStore,
    previewChunks,
    processAndSaveChunks,
    deleteDocumentStoreFileChunk,
    editDocumentStoreFileChunk,
    getDocumentLoaders
}
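For reference, a sketch of a single entry in the JSON array that processAndSaveChunks serializes into DocumentStore.loaders. The field names come from the code above; the concrete values (loader and splitter names, config, ids) are illustrative assumptions:

// Hypothetical loader entry as persisted into entity.loaders by processAndSaveChunks/_saveChunksToStorage.
const exampleLoaderEntry = {
    id: 'b2c1d4e5-0000-0000-0000-000000000000', // existing data.id or a fresh uuidv4
    loaderId: 'pdfFile',
    loaderName: 'Pdf File', // assumed display name
    loaderConfig: { pdfFile: 'FILE-STORAGE::["example.pdf"]' }, // base64 inputs are swapped for FILE-STORAGE references
    splitterId: 'recursiveCharacterTextSplitter', // assumed splitter
    splitterName: 'Recursive Character Text Splitter',
    splitterConfig: { chunkSize: 1000, chunkOverlap: 200 }, // assumed values
    totalChunks: 0, // updated with the real counts once _saveChunksToStorage finishes
    totalChars: 0,
    status: 'SYNCING', // DocumentStoreStatus.SYNCING until the chunks are saved, then 'SYNC'
    files: [] // populated with the metadata returned by _saveFileToStorage
}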
@@ -23,6 +23,27 @@ const getAllNodes = async () => {
    }
}

// Get all component nodes for a specific category
const getAllNodesForCategory = async (category: string) => {
    try {
        const appServer = getRunningExpressApp()
        const dbResponse = []
        for (const nodeName in appServer.nodesPool.componentNodes) {
            const componentNode = appServer.nodesPool.componentNodes[nodeName]
            if (componentNode.category === category) {
                const clonedNode = cloneDeep(componentNode)
                dbResponse.push(clonedNode)
            }
        }
        return dbResponse
    } catch (error) {
        throw new InternalFlowiseError(
            StatusCodes.INTERNAL_SERVER_ERROR,
            `Error: nodesService.getAllNodesForCategory - ${getErrorMessage(error)}`
        )
    }
}

// Get specific component node via name
const getNodeByName = async (nodeName: string) => {
    try {

@@ -138,5 +159,6 @@ export default {
    getNodeByName,
    getSingleNodeIcon,
    getSingleNodeAsyncOptions,
    executeCustomFunction
    executeCustomFunction,
    getAllNodesForCategory
}
@ -41,6 +41,8 @@ import { Assistant } from '../database/entities/Assistant'
|
|||
import { DataSource } from 'typeorm'
|
||||
import { CachePool } from '../CachePool'
|
||||
import { Variable } from '../database/entities/Variable'
|
||||
import { DocumentStore } from '../database/entities/DocumentStore'
|
||||
import { DocumentStoreFileChunk } from '../database/entities/DocumentStoreFileChunk'
|
||||
import { InternalFlowiseError } from '../errors/internalFlowiseError'
|
||||
import { StatusCodes } from 'http-status-codes'
|
||||
|
||||
|
|
@ -54,7 +56,9 @@ export const databaseEntities: IDatabaseEntity = {
|
|||
Tool: Tool,
|
||||
Credential: Credential,
|
||||
Assistant: Assistant,
|
||||
Variable: Variable
|
||||
Variable: Variable,
|
||||
DocumentStore: DocumentStore,
|
||||
DocumentStoreFileChunk: DocumentStoreFileChunk
|
||||
}
|
||||
|
||||
/**
|
||||
|
|
@ -471,6 +475,7 @@ export const buildFlow = async (
|
|||
appDataSource,
|
||||
databaseEntities,
|
||||
cachePool,
|
||||
isUpsert,
|
||||
dynamicVariables,
|
||||
uploads
|
||||
})
|
||||
|
|
@@ -1384,3 +1389,10 @@ export const getAppVersion = async () => {
        return ''
    }
}

export const convertToValidFilename = (word: string) => {
    return word
        .replace(/[/|\\:*?"<>]/g, ' ')
        .replace(' ', '')
        .toLowerCase()
}
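A small worked example of what convertToValidFilename produces; the input string is made up, and note that the string form of replace(' ', '') only strips the first space.

// Hypothetical input traced through the chain above:
convertToValidFilename('PDF*Loader')
// '*' is replaced with a space -> 'PDF Loader'
// the first space is removed   -> 'PDFLoader'
// lowercased                   -> 'pdfloader'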
|
||||
|
|
|
|||
|
|
@@ -0,0 +1,32 @@
import client from './client'

const getAllDocumentStores = () => client.get('/document-store/stores')
const getDocumentLoaders = () => client.get('/document-store/loaders')
const getSpecificDocumentStore = (id) => client.get(`/document-store/store/${id}`)
const createDocumentStore = (body) => client.post(`/document-store/store`, body)
const updateDocumentStore = (id, body) => client.put(`/document-store/store/${id}`, body)
const deleteDocumentStore = (id) => client.delete(`/document-store/store/${id}`)

const deleteLoaderFromStore = (id, fileId) => client.delete(`/document-store/loader/${id}/${fileId}`)
const deleteChunkFromStore = (storeId, loaderId, chunkId) => client.delete(`/document-store/chunks/${storeId}/${loaderId}/${chunkId}`)
const editChunkFromStore = (storeId, loaderId, chunkId, body) =>
    client.put(`/document-store/chunks/${storeId}/${loaderId}/${chunkId}`, body)

const getFileChunks = (storeId, fileId, pageNo) => client.get(`/document-store/chunks/${storeId}/${fileId}/${pageNo}`)
const previewChunks = (body) => client.post('/document-store/loader/preview', body)
const processChunks = (body) => client.post(`/document-store/loader/process`, body)

export default {
    getAllDocumentStores,
    getSpecificDocumentStore,
    createDocumentStore,
    deleteLoaderFromStore,
    getFileChunks,
    updateDocumentStore,
    previewChunks,
    processChunks,
    getDocumentLoaders,
    deleteChunkFromStore,
    editChunkFromStore,
    deleteDocumentStore
}
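A brief sketch of calling this client directly from a view; the ids are placeholders and error handling is omitted (the views in this PR mostly go through the useApi hook instead).

// Hypothetical direct usage (ids are placeholders):
import documentStoreApi from '@/api/documentstore'

const { data: stores } = await documentStoreApi.getAllDocumentStores()
const { data: store } = await documentStoreApi.getSpecificDocumentStore(stores[0].id)
const { data: chunks } = await documentStoreApi.getFileChunks(store.id, 'loader-id-placeholder', 1)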
|
||||
|
|
@@ -3,11 +3,13 @@ import client from './client'
const getAllNodes = () => client.get('/nodes')

const getSpecificNode = (name) => client.get(`/nodes/${name}`)
const getNodesByCategory = (name) => client.get(`/nodes/category/${name}`)

const executeCustomFunctionNode = (body) => client.post(`/node-custom-function`, body)

export default {
    getAllNodes,
    getSpecificNode,
    executeCustomFunctionNode
    executeCustomFunctionNode,
    getNodesByCategory
}
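A short sketch tying the new UI call to the new server endpoint; the category string is illustrative, not something this hunk defines.

// Hypothetical usage (category name is an assumption):
import nodesApi from '@/api/nodes'

const response = await nodesApi.getNodesByCategory('Document Loaders')
// response.data is the array assembled by nodesService.getAllNodesForCategory on the server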
|
||||
|
|
|
|||
|
|
@ -0,0 +1 @@
|
|||
<svg xmlns="http://www.w3.org/2000/svg" data-name="Layer 1" width="647.63626" height="632.17383" viewBox="0 0 647.63626 632.17383" xmlns:xlink="http://www.w3.org/1999/xlink"><path d="M687.3279,276.08691H512.81813a15.01828,15.01828,0,0,0-15,15v387.85l-2,.61005-42.81006,13.11a8.00676,8.00676,0,0,1-9.98974-5.31L315.678,271.39691a8.00313,8.00313,0,0,1,5.31006-9.99l65.97022-20.2,191.25-58.54,65.96972-20.2a7.98927,7.98927,0,0,1,9.99024,5.3l32.5498,106.32Z" transform="translate(-276.18187 -133.91309)" fill="#f2f2f2"/><path d="M725.408,274.08691l-39.23-128.14a16.99368,16.99368,0,0,0-21.23-11.28l-92.75,28.39L380.95827,221.60693l-92.75,28.4a17.0152,17.0152,0,0,0-11.28028,21.23l134.08008,437.93a17.02661,17.02661,0,0,0,16.26026,12.03,16.78926,16.78926,0,0,0,4.96972-.75l63.58008-19.46,2-.62v-2.09l-2,.61-64.16992,19.65a15.01489,15.01489,0,0,1-18.73-9.95l-134.06983-437.94a14.97935,14.97935,0,0,1,9.94971-18.73l92.75-28.4,191.24024-58.54,92.75-28.4a15.15551,15.15551,0,0,1,4.40966-.66,15.01461,15.01461,0,0,1,14.32032,10.61l39.0498,127.56.62012,2h2.08008Z" transform="translate(-276.18187 -133.91309)" fill="#3f3d56"/><path d="M398.86279,261.73389a9.0157,9.0157,0,0,1-8.61133-6.3667l-12.88037-42.07178a8.99884,8.99884,0,0,1,5.9712-11.24023l175.939-53.86377a9.00867,9.00867,0,0,1,11.24072,5.9707l12.88037,42.07227a9.01029,9.01029,0,0,1-5.9707,11.24072L401.49219,261.33887A8.976,8.976,0,0,1,398.86279,261.73389Z" transform="translate(-276.18187 -133.91309)" fill="#673ab7"/><circle cx="190.15351" cy="24.95465" r="20" fill="#673ab7"/><circle cx="190.15351" cy="24.95465" r="12.66462" fill="#fff"/><path d="M878.81836,716.08691h-338a8.50981,8.50981,0,0,1-8.5-8.5v-405a8.50951,8.50951,0,0,1,8.5-8.5h338a8.50982,8.50982,0,0,1,8.5,8.5v405A8.51013,8.51013,0,0,1,878.81836,716.08691Z" transform="translate(-276.18187 -133.91309)" fill="#e6e6e6"/><path d="M723.31813,274.08691h-210.5a17.02411,17.02411,0,0,0-17,17v407.8l2-.61v-407.19a15.01828,15.01828,0,0,1,15-15H723.93825Zm183.5,0h-394a17.02411,17.02411,0,0,0-17,17v458a17.0241,17.0241,0,0,0,17,17h394a17.0241,17.0241,0,0,0,17-17v-458A17.02411,17.02411,0,0,0,906.81813,274.08691Zm15,475a15.01828,15.01828,0,0,1-15,15h-394a15.01828,15.01828,0,0,1-15-15v-458a15.01828,15.01828,0,0,1,15-15h394a15.01828,15.01828,0,0,1,15,15Z" transform="translate(-276.18187 -133.91309)" fill="#3f3d56"/><path d="M801.81836,318.08691h-184a9.01015,9.01015,0,0,1-9-9v-44a9.01016,9.01016,0,0,1,9-9h184a9.01016,9.01016,0,0,1,9,9v44A9.01015,9.01015,0,0,1,801.81836,318.08691Z" transform="translate(-276.18187 -133.91309)" fill="#673ab7"/><circle cx="433.63626" cy="105.17383" r="20" fill="#673ab7"/><circle cx="433.63626" cy="105.17383" r="12.18187" fill="#fff"/></svg>
After Width: | Height: | Size: 2.6 KiB
File diff suppressed because one or more lines are too long
After Width: | Height: | Size: 9.4 KiB
File diff suppressed because one or more lines are too long
After Width: | Height: | Size: 20 KiB
|
|
@ -1,13 +1,26 @@
|
|||
import PropTypes from 'prop-types'
|
||||
|
||||
// material-ui
|
||||
import { Box, OutlinedInput, Toolbar, Typography } from '@mui/material'
|
||||
import { IconButton, Box, OutlinedInput, Toolbar, Typography } from '@mui/material'
|
||||
import { useTheme } from '@mui/material/styles'
|
||||
import { StyledFab } from '@/ui-component/button/StyledFab'
|
||||
|
||||
// icons
|
||||
import { IconSearch } from '@tabler/icons'
|
||||
import { IconSearch, IconArrowLeft, IconEdit } from '@tabler/icons'
|
||||
|
||||
const ViewHeader = ({ children, filters = null, onSearchChange, search, searchPlaceholder = 'Search', title }) => {
|
||||
const ViewHeader = ({
|
||||
children,
|
||||
filters = null,
|
||||
onSearchChange,
|
||||
search,
|
||||
searchPlaceholder = 'Search',
|
||||
title,
|
||||
description,
|
||||
isBackButton,
|
||||
onBack,
|
||||
isEditButton,
|
||||
onEdit
|
||||
}) => {
|
||||
const theme = useTheme()
|
||||
|
||||
return (
|
||||
|
|
@ -21,15 +34,54 @@ const ViewHeader = ({ children, filters = null, onSearchChange, search, searchPl
|
|||
width: '100%'
|
||||
}}
|
||||
>
|
||||
<Typography
|
||||
sx={{
|
||||
fontSize: '2rem',
|
||||
fontWeight: 600
|
||||
}}
|
||||
variant='h1'
|
||||
>
|
||||
{title}
|
||||
</Typography>
|
||||
<Box sx={{ display: 'flex', alignItems: 'center', flexDirection: 'row' }}>
|
||||
{isBackButton && (
|
||||
<StyledFab sx={{ mr: 3 }} size='small' color='secondary' aria-label='back' title='Back' onClick={onBack}>
|
||||
<IconArrowLeft />
|
||||
</StyledFab>
|
||||
)}
|
||||
<Box sx={{ display: 'flex', alignItems: 'start', flexDirection: 'column' }}>
|
||||
<Typography
|
||||
sx={{
|
||||
fontSize: '2rem',
|
||||
fontWeight: 600,
|
||||
display: '-webkit-box',
|
||||
WebkitLineClamp: 3,
|
||||
WebkitBoxOrient: 'vertical',
|
||||
textOverflow: 'ellipsis',
|
||||
overflow: 'hidden',
|
||||
flex: 1,
|
||||
maxWidth: 'calc(100vh - 100px)'
|
||||
}}
|
||||
variant='h1'
|
||||
>
|
||||
{title}
|
||||
</Typography>
|
||||
{description && (
|
||||
<Typography
|
||||
sx={{
|
||||
fontSize: '1rem',
|
||||
fontWeight: 500,
|
||||
mt: 2,
|
||||
display: '-webkit-box',
|
||||
WebkitLineClamp: 5,
|
||||
WebkitBoxOrient: 'vertical',
|
||||
textOverflow: 'ellipsis',
|
||||
overflow: 'hidden',
|
||||
flex: 1,
|
||||
maxWidth: 'calc(100vh - 100px)'
|
||||
}}
|
||||
>
|
||||
{description}
|
||||
</Typography>
|
||||
)}
|
||||
</Box>
|
||||
{isEditButton && (
|
||||
<IconButton sx={{ ml: 3 }} color='secondary' title='Edit' onClick={onEdit}>
|
||||
<IconEdit />
|
||||
</IconButton>
|
||||
)}
|
||||
</Box>
|
||||
<Box sx={{ height: 40, display: 'flex', alignItems: 'center', gap: 1 }}>
|
||||
{search && (
|
||||
<OutlinedInput
|
||||
|
|
@ -77,7 +129,12 @@ ViewHeader.propTypes = {
|
|||
onSearchChange: PropTypes.func,
|
||||
search: PropTypes.bool,
|
||||
searchPlaceholder: PropTypes.string,
|
||||
title: PropTypes.string
|
||||
title: PropTypes.string,
|
||||
description: PropTypes.string,
|
||||
isBackButton: PropTypes.bool,
|
||||
onBack: PropTypes.func,
|
||||
isEditButton: PropTypes.bool,
|
||||
onEdit: PropTypes.func
|
||||
}
|
||||
|
||||
export default ViewHeader
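A minimal sketch of the extended ViewHeader in use; the title, description and handlers are placeholders.

// Hypothetical usage of the new props (values are placeholders):
<ViewHeader
    isBackButton={true}
    onBack={() => navigate(-1)}
    isEditButton={true}
    onEdit={() => setShowEditDialog(true)}
    search={false}
    title='Product Docs'
    description='PDF manuals split into chunks'
/>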
|
||||
|
|
|
|||
|
|
@@ -1,8 +1,8 @@
// assets
import { IconHierarchy, IconBuildingStore, IconKey, IconTool, IconLock, IconRobot, IconVariable } from '@tabler/icons'
import { IconHierarchy, IconBuildingStore, IconKey, IconTool, IconLock, IconRobot, IconVariable, IconFiles } from '@tabler/icons'

// constant
const icons = { IconHierarchy, IconBuildingStore, IconKey, IconTool, IconLock, IconRobot, IconVariable }
const icons = { IconHierarchy, IconBuildingStore, IconKey, IconTool, IconLock, IconRobot, IconVariable, IconFiles }

// ==============================|| DASHBOARD MENU ITEMS ||============================== //

@@ -66,6 +66,14 @@ const dashboard = {
            url: '/apikey',
            icon: icons.IconKey,
            breadcrumbs: true
        },
        {
            id: 'documents',
            title: 'Document Stores',
            type: 'item',
            url: '/document-stores',
            icon: icons.IconFiles,
            breadcrumbs: true
        }
    ]
}
|
||||
|
|
|
|||
|
|
@@ -25,6 +25,12 @@ const Credentials = Loadable(lazy(() => import('@/views/credentials')))
// variables routing
const Variables = Loadable(lazy(() => import('@/views/variables')))

// documents routing
const Documents = Loadable(lazy(() => import('@/views/docstore')))
const DocumentStoreDetail = Loadable(lazy(() => import('@/views/docstore/DocumentStoreDetail')))
const ShowStoredChunks = Loadable(lazy(() => import('@/views/docstore/ShowStoredChunks')))
const LoaderConfigPreviewChunks = Loadable(lazy(() => import('@/views/docstore/LoaderConfigPreviewChunks')))

// ==============================|| MAIN ROUTING ||============================== //

const MainRoutes = {

@@ -62,6 +68,22 @@ const MainRoutes = {
        {
            path: '/variables',
            element: <Variables />
        },
        {
            path: '/document-stores',
            element: <Documents />
        },
        {
            path: '/document-stores/:id',
            element: <DocumentStoreDetail />
        },
        {
            path: '/document-stores/chunks/:id/:id',
            element: <ShowStoredChunks />
        },
        {
            path: '/document-stores/:id/:name',
            element: <LoaderConfigPreviewChunks />
        }
    ]
}
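For reference, these paths line up with the navigate() calls made later in the docstore views; the ids below are placeholders.

// Hypothetical navigation targets (ids are placeholders):
navigate('/document-stores/store-id-123')                      // DocumentStoreDetail
navigate('/document-stores/store-id-123/pdfFile')              // LoaderConfigPreviewChunks
navigate('/document-stores/chunks/store-id-123/loader-id-456') // ShowStoredChunks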
|
||||
|
|
|
|||
|
|
@ -0,0 +1,191 @@
|
|||
import PropTypes from 'prop-types'
|
||||
import { useSelector } from 'react-redux'
|
||||
|
||||
// material-ui
|
||||
import { styled } from '@mui/material/styles'
|
||||
import { Box, Grid, Typography, useTheme } from '@mui/material'
|
||||
import { IconVectorBezier2, IconLanguage, IconScissors } from '@tabler/icons'
|
||||
|
||||
// project imports
|
||||
import MainCard from '@/ui-component/cards/MainCard'
|
||||
import DocumentStoreStatus from '@/views/docstore/DocumentStoreStatus'
|
||||
|
||||
import { kFormatter } from '@/utils/genericHelper'
|
||||
|
||||
const CardWrapper = styled(MainCard)(({ theme }) => ({
|
||||
background: theme.palette.card.main,
|
||||
color: theme.darkTextPrimary,
|
||||
overflow: 'auto',
|
||||
position: 'relative',
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 8%)',
|
||||
cursor: 'pointer',
|
||||
'&:hover': {
|
||||
background: theme.palette.card.hover,
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 20%)'
|
||||
},
|
||||
height: '100%',
|
||||
minHeight: '160px',
|
||||
maxHeight: '300px',
|
||||
width: '100%',
|
||||
overflowWrap: 'break-word',
|
||||
whiteSpace: 'pre-line'
|
||||
}))
|
||||
|
||||
// ===========================|| DOC STORE CARD ||=========================== //
|
||||
|
||||
const DocumentStoreCard = ({ data, images, onClick }) => {
|
||||
const theme = useTheme()
|
||||
const customization = useSelector((state) => state.customization)
|
||||
|
||||
return (
|
||||
<CardWrapper content={false} onClick={onClick} sx={{ border: 1, borderColor: theme.palette.grey[900] + 25, borderRadius: 2 }}>
|
||||
<Box sx={{ height: '100%', p: 2.25 }}>
|
||||
<Grid container justifyContent='space-between' direction='column' sx={{ height: '100%' }} gap={2}>
|
||||
<Box display='flex' flexDirection='column' sx={{ flex: 1, width: '100%' }}>
|
||||
<div
|
||||
style={{
|
||||
width: '100%',
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center',
|
||||
overflow: 'hidden'
|
||||
}}
|
||||
>
|
||||
<Typography
|
||||
sx={{
|
||||
display: '-webkit-box',
|
||||
fontSize: '1.25rem',
|
||||
fontWeight: 500,
|
||||
WebkitLineClamp: 2,
|
||||
WebkitBoxOrient: 'vertical',
|
||||
textOverflow: 'ellipsis',
|
||||
overflow: 'hidden',
|
||||
flex: 1
|
||||
}}
|
||||
>
|
||||
{data.name}
|
||||
</Typography>
|
||||
<DocumentStoreStatus status={data.status} />
|
||||
</div>
|
||||
<span
|
||||
style={{
|
||||
display: '-webkit-box',
|
||||
marginTop: 10,
|
||||
overflowWrap: 'break-word',
|
||||
WebkitLineClamp: 2,
|
||||
WebkitBoxOrient: 'vertical',
|
||||
textOverflow: 'ellipsis',
|
||||
overflow: 'hidden'
|
||||
}}
|
||||
>
|
||||
{data.description || ' '}
|
||||
</span>
|
||||
</Box>
|
||||
<Grid container columnGap={2} rowGap={1}>
|
||||
<div
|
||||
style={{
|
||||
paddingLeft: '7px',
|
||||
paddingRight: '7px',
|
||||
paddingTop: '3px',
|
||||
paddingBottom: '3px',
|
||||
fontSize: '11px',
|
||||
width: 'max-content',
|
||||
borderRadius: '25px',
|
||||
boxShadow: customization.isDarkMode
|
||||
? '0 2px 14px 0 rgb(255 255 255 / 20%)'
|
||||
: '0 2px 14px 0 rgb(32 40 45 / 20%)',
|
||||
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center'
|
||||
}}
|
||||
>
|
||||
<IconVectorBezier2 style={{ marginRight: 5 }} size={15} />
|
||||
{data.whereUsed?.length ?? 0} {data.whereUsed?.length <= 1 ? 'flow' : 'flows'}
|
||||
</div>
|
||||
<div
|
||||
style={{
|
||||
paddingLeft: '7px',
|
||||
paddingRight: '7px',
|
||||
paddingTop: '3px',
|
||||
paddingBottom: '3px',
|
||||
fontSize: '11px',
|
||||
width: 'max-content',
|
||||
borderRadius: '25px',
|
||||
boxShadow: customization.isDarkMode
|
||||
? '0 2px 14px 0 rgb(255 255 255 / 20%)'
|
||||
: '0 2px 14px 0 rgb(32 40 45 / 20%)',
|
||||
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center'
|
||||
}}
|
||||
>
|
||||
<IconLanguage style={{ marginRight: 5 }} size={15} />
|
||||
{kFormatter(data.totalChars ?? 0)} chars
|
||||
</div>
|
||||
<div
|
||||
style={{
|
||||
paddingLeft: '7px',
|
||||
paddingRight: '7px',
|
||||
paddingTop: '3px',
|
||||
paddingBottom: '3px',
|
||||
fontSize: '11px',
|
||||
width: 'max-content',
|
||||
borderRadius: '25px',
|
||||
boxShadow: customization.isDarkMode
|
||||
? '0 2px 14px 0 rgb(255 255 255 / 20%)'
|
||||
: '0 2px 14px 0 rgb(32 40 45 / 20%)',
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center'
|
||||
}}
|
||||
>
|
||||
<IconScissors style={{ marginRight: 5 }} size={15} />
|
||||
{kFormatter(data.totalChunks ?? 0)} chunks
|
||||
</div>
|
||||
</Grid>
|
||||
{images && images.length > 0 && (
|
||||
<Box
|
||||
sx={{
|
||||
display: 'flex',
|
||||
alignItems: 'center',
|
||||
justifyContent: 'start',
|
||||
gap: 1
|
||||
}}
|
||||
>
|
||||
{images.slice(0, images.length > 3 ? 3 : images.length).map((img) => (
|
||||
<Box
|
||||
key={img}
|
||||
sx={{
|
||||
width: 30,
|
||||
height: 30,
|
||||
borderRadius: '50%',
|
||||
backgroundColor: customization.isDarkMode
|
||||
? theme.palette.common.white
|
||||
: theme.palette.grey[300] + 75
|
||||
}}
|
||||
>
|
||||
<img style={{ width: '100%', height: '100%', padding: 5, objectFit: 'contain' }} alt='' src={img} />
|
||||
</Box>
|
||||
))}
|
||||
{images.length > 3 && (
|
||||
<Typography sx={{ alignItems: 'center', display: 'flex', fontSize: '.9rem', fontWeight: 200 }}>
|
||||
+ {images.length - 3} More
|
||||
</Typography>
|
||||
)}
|
||||
</Box>
|
||||
)}
|
||||
</Grid>
|
||||
</Box>
|
||||
</CardWrapper>
|
||||
)
|
||||
}
|
||||
|
||||
DocumentStoreCard.propTypes = {
|
||||
data: PropTypes.object,
|
||||
images: PropTypes.array,
|
||||
onClick: PropTypes.func
|
||||
}
|
||||
|
||||
export default DocumentStoreCard
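A sketch of the data shape this card reads (name, description, status, whereUsed, totalChars, totalChunks); every value below is made up, including the status string.

// Hypothetical usage (all values are placeholders):
<DocumentStoreCard
    data={{
        name: 'Product Docs',
        description: 'PDF manuals split into chunks',
        status: 'SYNC',
        whereUsed: ['chatflow-id-1'],
        totalChars: 120000,
        totalChunks: 350
    }}
    images={[]}
    onClick={() => navigate('/document-stores/store-id-123')}
/>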
|
||||
|
|
@ -144,7 +144,6 @@ const ItemCard = ({ data, images, onClick }) => {
|
|||
}
|
||||
|
||||
ItemCard.propTypes = {
|
||||
isLoading: PropTypes.bool,
|
||||
data: PropTypes.object,
|
||||
images: PropTypes.array,
|
||||
onClick: PropTypes.func
|
||||
|
|
|
|||
|
|
@@ -23,7 +23,7 @@ const StatsCard = ({ title, stat }) => {

StatsCard.propTypes = {
    title: PropTypes.string,
    stat: PropTypes.string
    stat: PropTypes.oneOfType([PropTypes.string, PropTypes.number])
}

export default StatsCard
|
||||
|
|
|
|||
|
|
@ -113,7 +113,7 @@ const ExpandTextDialog = ({ show, dialogProps, onCancel, onConfirm }) => {
|
|||
lang={languageType}
|
||||
placeholder={inputParam.placeholder}
|
||||
basicSetup={
|
||||
languageType === 'json'
|
||||
languageType !== 'js'
|
||||
? { lineNumbers: false, foldGutter: false, autocompletion: false, highlightActiveLine: false }
|
||||
: {}
|
||||
}
|
||||
|
|
|
|||
|
|
@ -161,7 +161,7 @@ const ManageScrapedLinksDialog = ({ show, dialogProps, onCancel, onSave }) => {
|
|||
<Box sx={{ display: 'flex', alignItems: 'center', justifyContent: 'space-between', mb: 1.5 }}>
|
||||
<Typography sx={{ fontWeight: 500 }}>Scraped Links</Typography>
|
||||
{selectedLinks.length > 0 ? (
|
||||
<StyledButton
|
||||
<Button
|
||||
sx={{ height: 'max-content', width: 'max-content' }}
|
||||
variant='outlined'
|
||||
color='error'
|
||||
|
|
@ -170,7 +170,7 @@ const ManageScrapedLinksDialog = ({ show, dialogProps, onCancel, onSave }) => {
|
|||
startIcon={<IconEraser />}
|
||||
>
|
||||
Clear All
|
||||
</StyledButton>
|
||||
</Button>
|
||||
) : null}
|
||||
</Box>
|
||||
<>
|
||||
|
|
|
|||
|
|
@ -22,7 +22,7 @@ export const MultiDropdown = ({ name, value, options, onSelect, formControlSx =
|
|||
const customization = useSelector((state) => state.customization)
|
||||
const findMatchingOptions = (options = [], internalValue) => {
|
||||
let values = []
|
||||
if (internalValue && typeof internalValue === 'string') values = JSON.parse(internalValue)
|
||||
if ('choose an option' !== internalValue && internalValue && typeof internalValue === 'string') values = JSON.parse(internalValue)
|
||||
else values = internalValue
|
||||
return options.filter((option) => values.includes(option.name))
|
||||
}
|
||||
|
|
|
|||
|
|
@ -5,8 +5,21 @@ import { json } from '@codemirror/lang-json'
|
|||
import { vscodeDark } from '@uiw/codemirror-theme-vscode'
|
||||
import { sublime } from '@uiw/codemirror-theme-sublime'
|
||||
import { EditorView } from '@codemirror/view'
|
||||
import { useTheme } from '@mui/material/styles'
|
||||
|
||||
export const CodeEditor = ({
|
||||
value,
|
||||
height,
|
||||
theme,
|
||||
lang,
|
||||
placeholder,
|
||||
disabled = false,
|
||||
autoFocus = false,
|
||||
basicSetup = {},
|
||||
onValueChange
|
||||
}) => {
|
||||
const colorTheme = useTheme()
|
||||
|
||||
export const CodeEditor = ({ value, height, theme, lang, placeholder, disabled = false, basicSetup = {}, onValueChange }) => {
|
||||
const customStyle = EditorView.baseTheme({
|
||||
'&': {
|
||||
color: '#191b1f',
|
||||
|
|
@ -14,7 +27,18 @@ export const CodeEditor = ({ value, height, theme, lang, placeholder, disabled =
|
|||
},
|
||||
'.cm-placeholder': {
|
||||
color: 'rgba(120, 120, 120, 0.5)'
|
||||
}
|
||||
},
|
||||
'.cm-content':
|
||||
lang !== 'js'
|
||||
? {
|
||||
fontFamily: 'Roboto, sans-serif',
|
||||
fontSize: '0.95rem',
|
||||
letterSpacing: '0em',
|
||||
fontWeight: 400,
|
||||
lineHeight: '1.5em',
|
||||
color: colorTheme.darkTextPrimary
|
||||
}
|
||||
: {}
|
||||
})
|
||||
|
||||
return (
|
||||
|
|
@ -31,6 +55,8 @@ export const CodeEditor = ({ value, height, theme, lang, placeholder, disabled =
|
|||
onChange={onValueChange}
|
||||
readOnly={disabled}
|
||||
editable={!disabled}
|
||||
// eslint-disable-next-line
|
||||
autoFocus={autoFocus}
|
||||
basicSetup={basicSetup}
|
||||
/>
|
||||
)
|
||||
|
|
@ -43,6 +69,7 @@ CodeEditor.propTypes = {
|
|||
lang: PropTypes.string,
|
||||
placeholder: PropTypes.string,
|
||||
disabled: PropTypes.bool,
|
||||
autoFocus: PropTypes.bool,
|
||||
basicSetup: PropTypes.object,
|
||||
onValueChange: PropTypes.func
|
||||
}
|
||||
|
|
|
|||
|
|
@ -0,0 +1,105 @@
|
|||
import { styled } from '@mui/material/styles'
|
||||
import Box from '@mui/material/Box'
|
||||
import Slider from '@mui/material/Slider'
|
||||
import { Grid, Input } from '@mui/material'
|
||||
import PropTypes from 'prop-types'
|
||||
|
||||
const BoxShadow = '0 3px 1px rgba(0,0,0,0.1),0 4px 8px rgba(0,0,0,0.13),0 0 0 1px rgba(0,0,0,0.02)'
|
||||
|
||||
const CustomInputSlider = styled(Slider)(({ theme }) => ({
|
||||
color: theme.palette.mode === 'dark' ? '#0a84ff' : '#007bff',
|
||||
height: 5,
|
||||
padding: '15px 0',
|
||||
'& .MuiSlider-thumb': {
|
||||
height: 20,
|
||||
width: 20,
|
||||
backgroundColor: '#333',
|
||||
boxShadow: '0 0 2px 0px rgba(0, 0, 0, 0.1)',
|
||||
'&:focus, &:hover, &.Mui-active': {
|
||||
boxShadow: '0px 0px 3px 1px rgba(0, 0, 0, 0.1)',
|
||||
// Reset on touch devices, it doesn't add specificity
|
||||
'@media (hover: none)': {
|
||||
boxShadow: BoxShadow
|
||||
}
|
||||
},
|
||||
'&:before': {
|
||||
boxShadow: '0px 0px 1px 0px rgba(0,0,0,0.2), 0px 0px 0px 0px rgba(0,0,0,0.14), 0px 0px 1px 0px rgba(0,0,0,0.12)'
|
||||
}
|
||||
},
|
||||
'& .MuiSlider-valueLabel': {
|
||||
fontSize: 12,
|
||||
fontWeight: 'normal',
|
||||
top: -1,
|
||||
backgroundColor: 'unset',
|
||||
color: theme.palette.text.primary,
|
||||
'&::before': {
|
||||
display: 'none'
|
||||
},
|
||||
'& *': {
|
||||
background: 'transparent',
|
||||
color: theme.palette.mode === 'dark' ? '#000' : '#000'
|
||||
}
|
||||
},
|
||||
'& .MuiSlider-track': {
|
||||
border: 'none',
|
||||
height: 5
|
||||
},
|
||||
'& .MuiSlider-rail': {
|
||||
opacity: 0.5,
|
||||
boxShadow: 'inset 0px 0px 4px -2px #000',
|
||||
backgroundColor: '#d0d0d0'
|
||||
}
|
||||
}))
|
||||
|
||||
export const InputSlider = ({ value, onChange }) => {
|
||||
const handleSliderChange = (event, newValue) => onChange(newValue)
|
||||
|
||||
const handleInputChange = (event) => {
|
||||
onChange(event.target.value === '' ? 0 : Number(event.target.value))
|
||||
}
|
||||
|
||||
const handleBlur = () => {
|
||||
if (value < 0) {
|
||||
onChange(0)
|
||||
}
|
||||
}
|
||||
|
||||
return (
|
||||
<Box sx={{ width: '100%' }}>
|
||||
<Grid container spacing={2} sx={{ mt: 1 }} alignItems='center'>
|
||||
<Grid item xs>
|
||||
<CustomInputSlider
|
||||
value={typeof value === 'number' ? value : 0}
|
||||
onChange={handleSliderChange}
|
||||
valueLabelDisplay='on'
|
||||
aria-labelledby='input-slider'
|
||||
step={10}
|
||||
min={0}
|
||||
max={5000}
|
||||
/>
|
||||
</Grid>
|
||||
<Grid item>
|
||||
<Input
|
||||
sx={{ ml: 3, mr: 3 }}
|
||||
value={value}
|
||||
size='small'
|
||||
onChange={handleInputChange}
|
||||
onBlur={handleBlur}
|
||||
inputProps={{
|
||||
step: 10,
|
||||
min: 0,
|
||||
max: 10000,
|
||||
type: 'number',
|
||||
'aria-labelledby': 'input-slider'
|
||||
}}
|
||||
/>
|
||||
</Grid>
|
||||
</Grid>
|
||||
</Box>
|
||||
)
|
||||
}
|
||||
|
||||
InputSlider.propTypes = {
|
||||
value: PropTypes.number,
|
||||
onChange: PropTypes.func
|
||||
}
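A minimal example of wiring the slider into a parent component; the import path and the state it edits are assumptions, since the caller is not shown in this hunk.

// Hypothetical parent component (path and state name are assumptions):
import { useState } from 'react'
import { InputSlider } from '@/ui-component/slider/InputSlider'

const ChunkOverlapSetting = () => {
    const [overlap, setOverlap] = useState(200)
    // the slider steps by 10 between 0 and 5000, as configured above
    return <InputSlider value={overlap} onChange={(newValue) => setOverlap(newValue)} />
}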
|
||||
|
|
@ -737,14 +737,37 @@ export const getOS = () => {
|
|||
return os
|
||||
}
|
||||
|
||||
export const formatBytes = (bytes, decimals = 2) => {
|
||||
if (!+bytes) return '0 Bytes'
|
||||
|
||||
const k = 1024
|
||||
const dm = decimals < 0 ? 0 : decimals
|
||||
const sizes = ['Bytes', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB', 'ZiB', 'YiB']
|
||||
|
||||
const i = Math.floor(Math.log(bytes) / Math.log(k))
|
||||
|
||||
return `${parseFloat((bytes / Math.pow(k, i)).toFixed(dm))} ${sizes[i]}`
|
||||
export const formatBytes = (number) => {
    // guard against null/undefined/zero/negative sizes (== null also covers undefined)
    if (number == null || number <= 0) {
        return '0 Bytes'
    }
    let scaleCounter = 0
    const scaleInitials = [' Bytes', ' KB', ' MB', ' GB', ' TB', ' PB', ' EB', ' ZB', ' YB']
    while (number >= 1024 && scaleCounter < scaleInitials.length - 1) {
        number /= 1024
        scaleCounter++
    }
    let compactNumber = number
        .toFixed(2)
        .replace(/\.?0+$/, '')
        .replace(/\B(?=(\d{3})+(?!\d))/g, ',')
    compactNumber += scaleInitials[scaleCounter]
    return compactNumber.trim()
}
|
||||
|
||||
// Formatter from: https://stackoverflow.com/a/9462382
|
||||
export const kFormatter = (num) => {
|
||||
const lookup = [
|
||||
{ value: 1, symbol: '' },
|
||||
{ value: 1e3, symbol: 'k' },
|
||||
{ value: 1e6, symbol: 'M' },
|
||||
{ value: 1e9, symbol: 'G' },
|
||||
{ value: 1e12, symbol: 'T' },
|
||||
{ value: 1e15, symbol: 'P' },
|
||||
{ value: 1e18, symbol: 'E' }
|
||||
]
|
||||
const regexp = /\.0+$|(?<=\.[0-9]*[1-9])0+$/
|
||||
const item = lookup.findLast((item) => num >= item.value)
|
||||
return item ? (num / item.value).toFixed(1).replace(regexp, '').concat(item.symbol) : '0'
|
||||
}
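A few illustrative inputs and outputs for the two formatters above, based on the logic as written.

// Examples (derived from the implementations above):
formatBytes(0)        // '0 Bytes'
formatBytes(1536)     // '1.5 KB'
formatBytes(10485760) // '10 MB'
kFormatter(999)       // '999'
kFormatter(2500)      // '2.5k'
kFormatter(1200000)   // '1.2M'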
|
||||
|
|
|
|||
|
|
@ -1,3 +1,5 @@
|
|||
import * as PropTypes from 'prop-types'
|
||||
import moment from 'moment/moment'
|
||||
import { useEffect, useState } from 'react'
|
||||
import { useDispatch, useSelector } from 'react-redux'
|
||||
import { enqueueSnackbar as enqueueSnackbarAction, closeSnackbar as closeSnackbarAction } from '@/store/actions'
|
||||
|
|
@ -28,6 +30,8 @@ import MainCard from '@/ui-component/cards/MainCard'
|
|||
import { StyledButton } from '@/ui-component/button/StyledButton'
|
||||
import APIKeyDialog from './APIKeyDialog'
|
||||
import ConfirmDialog from '@/ui-component/dialog/ConfirmDialog'
|
||||
import ViewHeader from '@/layout/MainLayout/ViewHeader'
|
||||
import ErrorBoundary from '@/ErrorBoundary'
|
||||
|
||||
// API
|
||||
import apiKeyApi from '@/api/apikey'
|
||||
|
|
@ -42,12 +46,9 @@ import useNotifier from '@/utils/useNotifier'
|
|||
// Icons
|
||||
import { IconTrash, IconEdit, IconCopy, IconChevronsUp, IconChevronsDown, IconX, IconPlus, IconEye, IconEyeOff } from '@tabler/icons'
|
||||
import APIEmptySVG from '@/assets/images/api_empty.svg'
|
||||
import * as PropTypes from 'prop-types'
|
||||
import moment from 'moment/moment'
|
||||
import ViewHeader from '@/layout/MainLayout/ViewHeader'
|
||||
import ErrorBoundary from '@/ErrorBoundary'
|
||||
|
||||
// ==============================|| APIKey ||============================== //
|
||||
|
||||
const StyledTableCell = styled(TableCell)(({ theme }) => ({
|
||||
borderColor: theme.palette.grey[900] + 25,
|
||||
padding: '6px 16px',
|
||||
|
|
|
|||
|
|
@ -2,7 +2,7 @@ import { useEffect, useState } from 'react'
|
|||
import { useNavigate } from 'react-router-dom'
|
||||
|
||||
// material-ui
|
||||
import { Box, Skeleton, Stack, ToggleButton } from '@mui/material'
|
||||
import { Box, Skeleton, Stack, ToggleButton, ToggleButtonGroup } from '@mui/material'
|
||||
import { useTheme } from '@mui/material/styles'
|
||||
|
||||
// project imports
|
||||
|
|
@ -12,6 +12,10 @@ import { gridSpacing } from '@/store/constant'
|
|||
import WorkflowEmptySVG from '@/assets/images/workflow_empty.svg'
|
||||
import LoginDialog from '@/ui-component/dialog/LoginDialog'
|
||||
import ConfirmDialog from '@/ui-component/dialog/ConfirmDialog'
|
||||
import { FlowListTable } from '@/ui-component/table/FlowListTable'
|
||||
import { StyledButton } from '@/ui-component/button/StyledButton'
|
||||
import ViewHeader from '@/layout/MainLayout/ViewHeader'
|
||||
import ErrorBoundary from '@/ErrorBoundary'
|
||||
|
||||
// API
|
||||
import chatflowsApi from '@/api/chatflows'
|
||||
|
|
@ -24,12 +28,6 @@ import { baseURL } from '@/store/constant'
|
|||
|
||||
// icons
|
||||
import { IconPlus, IconLayoutGrid, IconList } from '@tabler/icons'
|
||||
import * as React from 'react'
|
||||
import ToggleButtonGroup from '@mui/material/ToggleButtonGroup'
|
||||
import { FlowListTable } from '@/ui-component/table/FlowListTable'
|
||||
import { StyledButton } from '@/ui-component/button/StyledButton'
|
||||
import ViewHeader from '@/layout/MainLayout/ViewHeader'
|
||||
import ErrorBoundary from '@/ErrorBoundary'
|
||||
|
||||
// ==============================|| CHATFLOWS ||============================== //
|
||||
|
||||
|
|
@ -45,7 +43,7 @@ const Chatflows = () => {
|
|||
const [loginDialogProps, setLoginDialogProps] = useState({})
|
||||
|
||||
const getAllChatflowsApi = useApi(chatflowsApi.getAllChatflows)
|
||||
const [view, setView] = React.useState(localStorage.getItem('flowDisplayStyle') || 'card')
|
||||
const [view, setView] = useState(localStorage.getItem('flowDisplayStyle') || 'card')
|
||||
|
||||
const handleChange = (event, nextView) => {
|
||||
if (nextView === null) return
|
||||
|
|
|
|||
|
|
@ -0,0 +1,228 @@
|
|||
import { createPortal } from 'react-dom'
|
||||
import PropTypes from 'prop-types'
|
||||
import { useState, useEffect } from 'react'
|
||||
import { useDispatch } from 'react-redux'
|
||||
import {
|
||||
HIDE_CANVAS_DIALOG,
|
||||
SHOW_CANVAS_DIALOG,
|
||||
enqueueSnackbar as enqueueSnackbarAction,
|
||||
closeSnackbar as closeSnackbarAction
|
||||
} from '@/store/actions'
|
||||
|
||||
// Material
|
||||
import { Button, Dialog, DialogActions, DialogContent, DialogTitle, Box, Typography, OutlinedInput } from '@mui/material'
|
||||
|
||||
// Project imports
|
||||
import { StyledButton } from '@/ui-component/button/StyledButton'
|
||||
import ConfirmDialog from '@/ui-component/dialog/ConfirmDialog'
|
||||
|
||||
// Icons
|
||||
import { IconX, IconFiles } from '@tabler/icons'
|
||||
|
||||
// API
|
||||
import documentStoreApi from '@/api/documentstore'
|
||||
|
||||
// utils
|
||||
import useNotifier from '@/utils/useNotifier'
|
||||
|
||||
const AddDocStoreDialog = ({ show, dialogProps, onCancel, onConfirm }) => {
|
||||
const portalElement = document.getElementById('portal')
|
||||
|
||||
const dispatch = useDispatch()
|
||||
|
||||
// ==============================|| Snackbar ||============================== //
|
||||
|
||||
useNotifier()
|
||||
|
||||
const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args))
|
||||
const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args))
|
||||
|
||||
const [documentStoreName, setDocumentStoreName] = useState('')
|
||||
const [documentStoreDesc, setDocumentStoreDesc] = useState('')
|
||||
const [dialogType, setDialogType] = useState('ADD')
|
||||
const [docStoreId, setDocumentStoreId] = useState()
|
||||
|
||||
useEffect(() => {
|
||||
setDialogType(dialogProps.type)
|
||||
if (dialogProps.type === 'EDIT' && dialogProps.data) {
|
||||
setDocumentStoreName(dialogProps.data.name)
|
||||
setDocumentStoreDesc(dialogProps.data.description)
|
||||
setDocumentStoreId(dialogProps.data.id)
|
||||
} else if (dialogProps.type === 'ADD') {
|
||||
setDocumentStoreName('')
|
||||
setDocumentStoreDesc('')
|
||||
}
|
||||
|
||||
return () => {
|
||||
setDocumentStoreName('')
|
||||
setDocumentStoreDesc('')
|
||||
}
|
||||
}, [dialogProps])
|
||||
|
||||
useEffect(() => {
|
||||
if (show) dispatch({ type: SHOW_CANVAS_DIALOG })
|
||||
else dispatch({ type: HIDE_CANVAS_DIALOG })
|
||||
return () => dispatch({ type: HIDE_CANVAS_DIALOG })
|
||||
}, [show, dispatch])
|
||||
|
||||
const createDocumentStore = async () => {
|
||||
try {
|
||||
const obj = {
|
||||
name: documentStoreName,
|
||||
description: documentStoreDesc
|
||||
}
|
||||
const createResp = await documentStoreApi.createDocumentStore(obj)
|
||||
if (createResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'New Document Store created.',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
onConfirm(createResp.data.id)
|
||||
}
|
||||
} catch (err) {
|
||||
const errorData = typeof err === 'string' ? err : err.response?.data || `${err.response?.status}: ${err.response?.statusText}`
|
||||
enqueueSnackbar({
|
||||
message: `Failed to add new Document Store: ${errorData}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
persist: true,
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
onCancel()
|
||||
}
|
||||
}
|
||||
|
||||
const updateDocumentStore = async () => {
|
||||
try {
|
||||
const saveObj = {
|
||||
name: documentStoreName,
|
||||
description: documentStoreDesc
|
||||
}
|
||||
|
||||
const saveResp = await documentStoreApi.updateDocumentStore(docStoreId, saveObj)
|
||||
if (saveResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'Document Store Updated!',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
onConfirm(saveResp.data.id)
|
||||
}
|
||||
} catch (error) {
|
||||
const errorData = error.response?.data || `${error.response?.status}: ${error.response?.statusText}`
|
||||
enqueueSnackbar({
|
||||
message: `Failed to update Document Store: ${errorData}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
persist: true,
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
onCancel()
|
||||
}
|
||||
}
|
||||
|
||||
const component = show ? (
|
||||
<Dialog
|
||||
fullWidth
|
||||
maxWidth='sm'
|
||||
open={show}
|
||||
onClose={onCancel}
|
||||
aria-labelledby='alert-dialog-title'
|
||||
aria-describedby='alert-dialog-description'
|
||||
>
|
||||
<DialogTitle style={{ fontSize: '1rem' }} id='alert-dialog-title'>
|
||||
<div style={{ display: 'flex', flexDirection: 'row', alignItems: 'center' }}>
|
||||
<IconFiles style={{ marginRight: '10px' }} />
|
||||
{dialogProps.title}
|
||||
</div>
|
||||
</DialogTitle>
|
||||
<DialogContent>
|
||||
<Box sx={{ p: 2 }}>
|
||||
<div style={{ display: 'flex', flexDirection: 'row' }}>
|
||||
<Typography>
|
||||
Name<span style={{ color: 'red' }}> *</span>
|
||||
</Typography>
|
||||
|
||||
<div style={{ flexGrow: 1 }}></div>
|
||||
</div>
|
||||
<OutlinedInput
|
||||
size='small'
|
||||
sx={{ mt: 1 }}
|
||||
type='string'
|
||||
fullWidth
|
||||
key='documentStoreName'
|
||||
onChange={(e) => setDocumentStoreName(e.target.value)}
|
||||
value={documentStoreName ?? ''}
|
||||
/>
|
||||
</Box>
|
||||
<Box sx={{ p: 2 }}>
|
||||
<div style={{ display: 'flex', flexDirection: 'row' }}>
|
||||
<Typography>Description</Typography>
|
||||
|
||||
<div style={{ flexGrow: 1 }}></div>
|
||||
</div>
|
||||
<OutlinedInput
|
||||
size='small'
|
||||
multiline={true}
|
||||
rows={7}
|
||||
sx={{ mt: 1 }}
|
||||
type='string'
|
||||
fullWidth
|
||||
key='documentStoreDesc'
|
||||
onChange={(e) => setDocumentStoreDesc(e.target.value)}
|
||||
value={documentStoreDesc ?? ''}
|
||||
/>
|
||||
</Box>
|
||||
</DialogContent>
|
||||
<DialogActions>
|
||||
<Button onClick={() => onCancel()}>Cancel</Button>
|
||||
<StyledButton
|
||||
disabled={!documentStoreName}
|
||||
variant='contained'
|
||||
onClick={() => (dialogType === 'ADD' ? createDocumentStore() : updateDocumentStore())}
|
||||
>
|
||||
{dialogProps.confirmButtonName}
|
||||
</StyledButton>
|
||||
</DialogActions>
|
||||
<ConfirmDialog />
|
||||
</Dialog>
|
||||
) : null
|
||||
|
||||
return createPortal(component, portalElement)
|
||||
}
|
||||
|
||||
AddDocStoreDialog.propTypes = {
|
||||
show: PropTypes.bool,
|
||||
dialogProps: PropTypes.object,
|
||||
onCancel: PropTypes.func,
|
||||
onConfirm: PropTypes.func
|
||||
}
|
||||
|
||||
export default AddDocStoreDialog
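A sketch of how a list view might open this dialog; the dialogProps shape follows what the component reads (type, data, title, confirmButtonName), while the surrounding state names and navigation target are assumptions.

// Hypothetical caller (state names are assumptions):
const [showDialog, setShowDialog] = useState(false)
const [dialogProps, setDialogProps] = useState({})

const addNew = () => {
    setDialogProps({ type: 'ADD', title: 'Add New Document Store', confirmButtonName: 'Add' })
    setShowDialog(true)
}

<AddDocStoreDialog
    show={showDialog}
    dialogProps={dialogProps}
    onCancel={() => setShowDialog(false)}
    onConfirm={(storeId) => navigate('/document-stores/' + storeId)}
/>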
|
||||
|
|
@ -0,0 +1,275 @@
|
|||
import PropTypes from 'prop-types'
|
||||
import { useState } from 'react'
|
||||
import { useSelector } from 'react-redux'
|
||||
|
||||
// material-ui
|
||||
import { Box, Typography, IconButton, Button } from '@mui/material'
|
||||
import { IconArrowsMaximize, IconAlertTriangle } from '@tabler/icons'
|
||||
|
||||
// project import
|
||||
import { Dropdown } from '@/ui-component/dropdown/Dropdown'
|
||||
import { MultiDropdown } from '@/ui-component/dropdown/MultiDropdown'
|
||||
import { AsyncDropdown } from '@/ui-component/dropdown/AsyncDropdown'
|
||||
import { Input } from '@/ui-component/input/Input'
|
||||
import { DataGrid } from '@/ui-component/grid/DataGrid'
|
||||
import { File } from '@/ui-component/file/File'
|
||||
import { SwitchInput } from '@/ui-component/switch/Switch'
|
||||
import { JsonEditorInput } from '@/ui-component/json/JsonEditor'
|
||||
import { TooltipWithParser } from '@/ui-component/tooltip/TooltipWithParser'
|
||||
import { CodeEditor } from '@/ui-component/editor/CodeEditor'
|
||||
import ExpandTextDialog from '@/ui-component/dialog/ExpandTextDialog'
|
||||
import ManageScrapedLinksDialog from '@/ui-component/dialog/ManageScrapedLinksDialog'
|
||||
import CredentialInputHandler from '@/views/canvas/CredentialInputHandler'
|
||||
|
||||
// const
|
||||
import { FLOWISE_CREDENTIAL_ID } from '@/store/constant'
|
||||
|
||||
// ===========================|| DocStoreInputHandler ||=========================== //
|
||||
|
||||
const DocStoreInputHandler = ({ inputParam, data, disabled = false }) => {
|
||||
const customization = useSelector((state) => state.customization)
|
||||
|
||||
const [showExpandDialog, setShowExpandDialog] = useState(false)
|
||||
const [expandDialogProps, setExpandDialogProps] = useState({})
|
||||
const [showManageScrapedLinksDialog, setShowManageScrapedLinksDialog] = useState(false)
|
||||
const [manageScrapedLinksDialogProps, setManageScrapedLinksDialogProps] = useState({})
|
||||
|
||||
const onExpandDialogClicked = (value, inputParam) => {
|
||||
const dialogProps = {
|
||||
value,
|
||||
inputParam,
|
||||
disabled,
|
||||
confirmButtonName: 'Save',
|
||||
cancelButtonName: 'Cancel'
|
||||
}
|
||||
setExpandDialogProps(dialogProps)
|
||||
setShowExpandDialog(true)
|
||||
}
|
||||
|
||||
const onManageLinksDialogClicked = (url, selectedLinks, relativeLinksMethod, limit) => {
|
||||
const dialogProps = {
|
||||
url,
|
||||
relativeLinksMethod,
|
||||
limit,
|
||||
selectedLinks,
|
||||
confirmButtonName: 'Save',
|
||||
cancelButtonName: 'Cancel'
|
||||
}
|
||||
setManageScrapedLinksDialogProps(dialogProps)
|
||||
setShowManageScrapedLinksDialog(true)
|
||||
}
|
||||
|
||||
const onManageLinksDialogSave = (url, links) => {
|
||||
setShowManageScrapedLinksDialog(false)
|
||||
data.inputs.url = url
|
||||
data.inputs.selectedLinks = links
|
||||
}
|
||||
|
||||
const onExpandDialogSave = (newValue, inputParamName) => {
|
||||
setShowExpandDialog(false)
|
||||
data.inputs[inputParamName] = newValue
|
||||
}
|
||||
|
||||
return (
|
||||
<div>
|
||||
{inputParam && (
|
||||
<>
|
||||
<Box sx={{ p: 2 }}>
|
||||
<div style={{ display: 'flex', flexDirection: 'row' }}>
|
||||
<Typography>
|
||||
{inputParam.label}
|
||||
{!inputParam.optional && <span style={{ color: 'red' }}> *</span>}
|
||||
{inputParam.description && <TooltipWithParser style={{ marginLeft: 10 }} title={inputParam.description} />}
|
||||
</Typography>
|
||||
<div style={{ flexGrow: 1 }}></div>
|
||||
{((inputParam.type === 'string' && inputParam.rows) || inputParam.type === 'code') && (
|
||||
<IconButton
|
||||
size='small'
|
||||
sx={{
|
||||
height: 25,
|
||||
width: 25
|
||||
}}
|
||||
title='Expand'
|
||||
color='primary'
|
||||
onClick={() =>
|
||||
onExpandDialogClicked(data.inputs[inputParam.name] ?? inputParam.default ?? '', inputParam)
|
||||
}
|
||||
>
|
||||
<IconArrowsMaximize />
|
||||
</IconButton>
|
||||
)}
|
||||
</div>
|
||||
{inputParam.warning && (
|
||||
<div
|
||||
style={{
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center',
|
||||
borderRadius: 10,
|
||||
background: 'rgb(254,252,191)',
|
||||
padding: 10,
|
||||
marginTop: 10,
|
||||
marginBottom: 10
|
||||
}}
|
||||
>
|
||||
<IconAlertTriangle size={30} color='orange' />
|
||||
<span style={{ color: 'rgb(116,66,16)', marginLeft: 10 }}>{inputParam.warning}</span>
|
||||
</div>
|
||||
)}
|
||||
{inputParam.type === 'credential' && (
|
||||
<CredentialInputHandler
|
||||
disabled={disabled}
|
||||
data={data}
|
||||
inputParam={inputParam}
|
||||
onSelect={(newValue) => {
|
||||
data.credential = newValue
|
||||
data.inputs[FLOWISE_CREDENTIAL_ID] = newValue // in case data.credential is not updated
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
|
||||
{inputParam.type === 'file' && (
|
||||
<File
|
||||
disabled={disabled}
|
||||
fileType={inputParam.fileType || '*'}
|
||||
onChange={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? 'Choose a file to upload'}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'boolean' && (
|
||||
<SwitchInput
|
||||
disabled={disabled}
|
||||
onChange={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? false}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'datagrid' && (
|
||||
<DataGrid
|
||||
disabled={disabled}
|
||||
columns={inputParam.datagrid}
|
||||
hideFooter={true}
|
||||
rows={data.inputs[inputParam.name] ?? JSON.stringify(inputParam.default) ?? []}
|
||||
onChange={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'code' && (
|
||||
<>
|
||||
<div style={{ height: '5px' }}></div>
|
||||
<div style={{ height: inputParam.rows ? '100px' : '200px' }}>
|
||||
<CodeEditor
|
||||
disabled={disabled}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? ''}
|
||||
height={inputParam.rows ? '100px' : '200px'}
|
||||
theme={customization.isDarkMode ? 'dark' : 'light'}
|
||||
lang={'js'}
|
||||
placeholder={inputParam.placeholder}
|
||||
onValueChange={(code) => (data.inputs[inputParam.name] = code)}
|
||||
basicSetup={{ highlightActiveLine: false, highlightActiveLineGutter: false }}
|
||||
/>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
{(inputParam.type === 'string' || inputParam.type === 'password' || inputParam.type === 'number') && (
|
||||
<Input
|
||||
key={data.inputs[inputParam.name]}
|
||||
disabled={disabled}
|
||||
inputParam={inputParam}
|
||||
onChange={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? ''}
|
||||
nodeId={data.id}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'json' && (
|
||||
<JsonEditorInput
|
||||
disabled={disabled}
|
||||
onChange={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? ''}
|
||||
isDarkMode={customization.isDarkMode}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'options' && (
|
||||
<Dropdown
|
||||
disabled={disabled}
|
||||
name={inputParam.name}
|
||||
options={inputParam.options}
|
||||
onSelect={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? 'choose an option'}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'multiOptions' && (
|
||||
<MultiDropdown
|
||||
disabled={disabled}
|
||||
name={inputParam.name}
|
||||
options={inputParam.options}
|
||||
onSelect={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? 'choose an option'}
|
||||
/>
|
||||
)}
|
||||
{inputParam.type === 'asyncOptions' && (
|
||||
<>
|
||||
{data.inputParams.length === 1 && <div style={{ marginTop: 10 }} />}
|
||||
<div style={{ display: 'flex', flexDirection: 'row' }}>
|
||||
<AsyncDropdown
|
||||
disabled={disabled}
|
||||
name={inputParam.name}
|
||||
nodeData={data}
|
||||
value={data.inputs[inputParam.name] ?? inputParam.default ?? 'choose an option'}
|
||||
onSelect={(newValue) => (data.inputs[inputParam.name] = newValue)}
|
||||
onCreateNew={() => addAsyncOption(inputParam.name)}
|
||||
/>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
{(data.name === 'cheerioWebScraper' ||
|
||||
data.name === 'puppeteerWebScraper' ||
|
||||
data.name === 'playwrightWebScraper') &&
|
||||
inputParam.name === 'url' && (
|
||||
<>
|
||||
<Button
|
||||
style={{
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
width: '100%'
|
||||
}}
|
||||
disabled={disabled}
|
||||
sx={{ borderRadius: '12px', width: '100%', mt: 1 }}
|
||||
variant='outlined'
|
||||
onClick={() =>
|
||||
onManageLinksDialogClicked(
|
||||
data.inputs[inputParam.name] ?? inputParam.default ?? '',
|
||||
data.inputs.selectedLinks,
|
||||
data.inputs['relativeLinksMethod'] ?? 'webCrawl',
|
||||
parseInt(data.inputs['limit']) ?? 0
|
||||
)
|
||||
}
|
||||
>
|
||||
Manage Links
|
||||
</Button>
|
||||
<ManageScrapedLinksDialog
|
||||
show={showManageScrapedLinksDialog}
|
||||
dialogProps={manageScrapedLinksDialogProps}
|
||||
onCancel={() => setShowManageScrapedLinksDialog(false)}
|
||||
onSave={onManageLinksDialogSave}
|
||||
/>
|
||||
</>
|
||||
)}
|
||||
</Box>
|
||||
</>
|
||||
)}
|
||||
<ExpandTextDialog
|
||||
show={showExpandDialog}
|
||||
dialogProps={expandDialogProps}
|
||||
onCancel={() => setShowExpandDialog(false)}
|
||||
onConfirm={(newValue, inputParamName) => onExpandDialogSave(newValue, inputParamName)}
|
||||
></ExpandTextDialog>
|
||||
</div>
|
||||
)
|
||||
}
|
||||
|
||||
DocStoreInputHandler.propTypes = {
|
||||
inputParam: PropTypes.object,
|
||||
data: PropTypes.object,
|
||||
disabled: PropTypes.bool
|
||||
}
|
||||
|
||||
export default DocStoreInputHandler
|
||||
|
|
@ -0,0 +1,189 @@
|
|||
import { useState, useEffect } from 'react'
|
||||
import { createPortal } from 'react-dom'
|
||||
import { useDispatch } from 'react-redux'
|
||||
import PropTypes from 'prop-types'
|
||||
import { List, ListItemButton, Dialog, DialogContent, DialogTitle, Box, OutlinedInput, InputAdornment, Typography } from '@mui/material'
|
||||
import { useTheme } from '@mui/material/styles'
|
||||
import { IconSearch, IconX } from '@tabler/icons'
|
||||
|
||||
// API
|
||||
import documentStoreApi from '@/api/documentstore'
|
||||
|
||||
// const
|
||||
import { baseURL } from '@/store/constant'
|
||||
import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from '@/store/actions'
|
||||
import useApi from '@/hooks/useApi'
|
||||
|
||||
const DocumentLoaderListDialog = ({ show, dialogProps, onCancel, onDocLoaderSelected }) => {
|
||||
const portalElement = document.getElementById('portal')
|
||||
const dispatch = useDispatch()
|
||||
const theme = useTheme()
|
||||
const [searchValue, setSearchValue] = useState('')
|
||||
const [documentLoaders, setDocumentLoaders] = useState([])
|
||||
|
||||
const getDocumentLoadersApi = useApi(documentStoreApi.getDocumentLoaders)
|
||||
|
||||
const onSearchChange = (val) => {
|
||||
setSearchValue(val)
|
||||
}
|
||||
|
||||
function filterFlows(data) {
|
||||
return data.name.toLowerCase().indexOf(searchValue.toLowerCase()) > -1
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
if (dialogProps.documentLoaders) {
|
||||
setDocumentLoaders(dialogProps.documentLoaders)
|
||||
}
|
||||
}, [dialogProps])
|
||||
|
||||
useEffect(() => {
|
||||
getDocumentLoadersApi.request()
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [])
|
||||
|
||||
useEffect(() => {
|
||||
if (getDocumentLoadersApi.data) {
|
||||
setDocumentLoaders(getDocumentLoadersApi.data)
|
||||
}
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getDocumentLoadersApi.data])
|
||||
|
||||
useEffect(() => {
|
||||
if (show) dispatch({ type: SHOW_CANVAS_DIALOG })
|
||||
else dispatch({ type: HIDE_CANVAS_DIALOG })
|
||||
return () => dispatch({ type: HIDE_CANVAS_DIALOG })
|
||||
}, [show, dispatch])
|
||||
|
||||
const component = show ? (
|
||||
<Dialog
|
||||
fullWidth
|
||||
maxWidth='md'
|
||||
open={show}
|
||||
onClose={onCancel}
|
||||
aria-labelledby='alert-dialog-title'
|
||||
aria-describedby='alert-dialog-description'
|
||||
>
|
||||
<DialogTitle sx={{ fontSize: '1rem', p: 3, pb: 0 }} id='alert-dialog-title'>
|
||||
{dialogProps.title}
|
||||
</DialogTitle>
|
||||
<DialogContent sx={{ display: 'flex', flexDirection: 'column', gap: 2, maxHeight: '75vh', position: 'relative', px: 3, pb: 3 }}>
|
||||
<Box
|
||||
sx={{
|
||||
backgroundColor: theme.palette.background.paper,
|
||||
pt: 2,
|
||||
position: 'sticky',
|
||||
top: 0,
|
||||
zIndex: 10
|
||||
}}
|
||||
>
|
||||
<OutlinedInput
|
||||
sx={{ width: '100%', pr: 2, pl: 2, position: 'sticky' }}
|
||||
id='input-search-credential'
|
||||
value={searchValue}
|
||||
onChange={(e) => onSearchChange(e.target.value)}
|
||||
placeholder='Search'
|
||||
startAdornment={
|
||||
<InputAdornment position='start'>
|
||||
<IconSearch stroke={1.5} size='1rem' color={theme.palette.grey[500]} />
|
||||
</InputAdornment>
|
||||
}
|
||||
endAdornment={
|
||||
<InputAdornment
|
||||
position='end'
|
||||
sx={{
|
||||
cursor: 'pointer',
|
||||
color: theme.palette.grey[500],
|
||||
'&:hover': {
|
||||
color: theme.palette.grey[900]
|
||||
}
|
||||
}}
|
||||
title='Clear Search'
|
||||
>
|
||||
<IconX
|
||||
stroke={1.5}
|
||||
size='1rem'
|
||||
onClick={() => onSearchChange('')}
|
||||
style={{
|
||||
cursor: 'pointer'
|
||||
}}
|
||||
/>
|
||||
</InputAdornment>
|
||||
}
|
||||
aria-describedby='search-helper-text'
|
||||
inputProps={{
|
||||
'aria-label': 'weight'
|
||||
}}
|
||||
/>
|
||||
</Box>
|
||||
<List
|
||||
sx={{
|
||||
width: '100%',
|
||||
display: 'grid',
|
||||
gridTemplateColumns: 'repeat(3, 1fr)',
|
||||
gap: 2,
|
||||
py: 0,
|
||||
zIndex: 9,
|
||||
borderRadius: '10px',
|
||||
[theme.breakpoints.down('md')]: {
|
||||
maxWidth: 370
|
||||
}
|
||||
}}
|
||||
>
|
||||
{[...documentLoaders].filter(filterFlows).map((documentLoader) => (
|
||||
<ListItemButton
|
||||
alignItems='center'
|
||||
key={documentLoader.name}
|
||||
onClick={() => onDocLoaderSelected(documentLoader.name)}
|
||||
sx={{
|
||||
border: 1,
|
||||
borderColor: theme.palette.grey[900] + 25,
|
||||
borderRadius: 2,
|
||||
display: 'flex',
|
||||
alignItems: 'center',
|
||||
justifyContent: 'start',
|
||||
textAlign: 'left',
|
||||
gap: 1,
|
||||
p: 2
|
||||
}}
|
||||
>
|
||||
<div
|
||||
style={{
|
||||
width: 50,
|
||||
height: 50,
|
||||
borderRadius: '50%',
|
||||
backgroundColor: 'white'
|
||||
}}
|
||||
>
|
||||
<img
|
||||
style={{
|
||||
width: '100%',
|
||||
height: '100%',
|
||||
padding: 7,
|
||||
borderRadius: '50%',
|
||||
objectFit: 'contain'
|
||||
}}
|
||||
alt={documentLoader.name}
|
||||
src={`${baseURL}/api/v1/node-icon/${documentLoader.name}`}
|
||||
/>
|
||||
</div>
|
||||
<Typography>{documentLoader.label}</Typography>
|
||||
</ListItemButton>
|
||||
))}
|
||||
</List>
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
) : null
|
||||
|
||||
return createPortal(component, portalElement)
|
||||
}
|
||||
|
||||
DocumentLoaderListDialog.propTypes = {
|
||||
show: PropTypes.bool,
|
||||
dialogProps: PropTypes.object,
|
||||
onCancel: PropTypes.func,
|
||||
onDocLoaderSelected: PropTypes.func
|
||||
}
|
||||
|
||||
export default DocumentLoaderListDialog
|
||||
|
|
@ -0,0 +1,626 @@
|
|||
import { useEffect, useState } from 'react'
|
||||
import { useDispatch, useSelector } from 'react-redux'
|
||||
import * as PropTypes from 'prop-types'
|
||||
import { useNavigate } from 'react-router-dom'
|
||||
|
||||
// material-ui
|
||||
import {
|
||||
Box,
|
||||
Stack,
|
||||
Typography,
|
||||
TableContainer,
|
||||
Paper,
|
||||
Table,
|
||||
TableHead,
|
||||
TableRow,
|
||||
TableCell,
|
||||
TableBody,
|
||||
Chip,
|
||||
Menu,
|
||||
MenuItem,
|
||||
Divider,
|
||||
Button,
|
||||
Skeleton,
|
||||
IconButton
|
||||
} from '@mui/material'
|
||||
import { alpha, styled, useTheme } from '@mui/material/styles'
|
||||
import { tableCellClasses } from '@mui/material/TableCell'
|
||||
|
||||
// project imports
|
||||
import MainCard from '@/ui-component/cards/MainCard'
|
||||
import AddDocStoreDialog from '@/views/docstore/AddDocStoreDialog'
|
||||
import ConfirmDialog from '@/ui-component/dialog/ConfirmDialog'
|
||||
import DocumentLoaderListDialog from '@/views/docstore/DocumentLoaderListDialog'
|
||||
import ErrorBoundary from '@/ErrorBoundary'
|
||||
|
||||
// API
|
||||
import documentsApi from '@/api/documentstore'
|
||||
|
||||
// Hooks
|
||||
import useApi from '@/hooks/useApi'
|
||||
import useConfirm from '@/hooks/useConfirm'
|
||||
import useNotifier from '@/utils/useNotifier'
|
||||
|
||||
// icons
|
||||
import { IconPlus, IconRefresh, IconScissors, IconTrash, IconX, IconVectorBezier2 } from '@tabler/icons'
|
||||
import KeyboardArrowDownIcon from '@mui/icons-material/KeyboardArrowDown'
|
||||
import FileDeleteIcon from '@mui/icons-material/Delete'
|
||||
import FileEditIcon from '@mui/icons-material/Edit'
|
||||
import FileChunksIcon from '@mui/icons-material/AppRegistration'
|
||||
import doc_store_details_emptySVG from '@/assets/images/doc_store_details_empty.svg'
|
||||
|
||||
// store
|
||||
import { closeSnackbar as closeSnackbarAction, enqueueSnackbar as enqueueSnackbarAction } from '@/store/actions'
|
||||
import { StyledButton } from '@/ui-component/button/StyledButton'
|
||||
import ViewHeader from '@/layout/MainLayout/ViewHeader'
|
||||
|
||||
// ==============================|| DOCUMENTS ||============================== //
|
||||
|
||||
const StyledTableCell = styled(TableCell)(({ theme }) => ({
|
||||
borderColor: theme.palette.grey[900] + 25,
|
||||
padding: '6px 16px',
|
||||
|
||||
[`&.${tableCellClasses.head}`]: {
|
||||
color: theme.palette.grey[900]
|
||||
},
|
||||
[`&.${tableCellClasses.body}`]: {
|
||||
fontSize: 14,
|
||||
height: 64
|
||||
}
|
||||
}))
|
||||
|
||||
const StyledTableRow = styled(TableRow)(() => ({
|
||||
// hide last border
|
||||
'&:last-child td, &:last-child th': {
|
||||
border: 0
|
||||
}
|
||||
}))
|
||||
|
||||
const StyledMenu = styled((props) => (
|
||||
<Menu
|
||||
elevation={0}
|
||||
anchorOrigin={{
|
||||
vertical: 'bottom',
|
||||
horizontal: 'right'
|
||||
}}
|
||||
transformOrigin={{
|
||||
vertical: 'top',
|
||||
horizontal: 'right'
|
||||
}}
|
||||
{...props}
|
||||
/>
|
||||
))(({ theme }) => ({
|
||||
'& .MuiPaper-root': {
|
||||
borderRadius: 6,
|
||||
marginTop: theme.spacing(1),
|
||||
minWidth: 180,
|
||||
boxShadow:
|
||||
'rgb(255, 255, 255) 0px 0px 0px 0px, rgba(0, 0, 0, 0.05) 0px 0px 0px 1px, rgba(0, 0, 0, 0.1) 0px 10px 15px -3px, rgba(0, 0, 0, 0.05) 0px 4px 6px -2px',
|
||||
'& .MuiMenu-list': {
|
||||
padding: '4px 0'
|
||||
},
|
||||
'& .MuiMenuItem-root': {
|
||||
'& .MuiSvgIcon-root': {
|
||||
fontSize: 18,
|
||||
color: theme.palette.text.secondary,
|
||||
marginRight: theme.spacing(1.5)
|
||||
},
|
||||
'&:active': {
|
||||
backgroundColor: alpha(theme.palette.primary.main, theme.palette.action.selectedOpacity)
|
||||
}
|
||||
}
|
||||
}
|
||||
}))
|
||||
|
||||
const DocumentStoreDetails = () => {
|
||||
const theme = useTheme()
|
||||
const customization = useSelector((state) => state.customization)
|
||||
const navigate = useNavigate()
|
||||
const dispatch = useDispatch()
|
||||
useNotifier()
|
||||
|
||||
const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args))
|
||||
const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args))
|
||||
const { confirm } = useConfirm()
|
||||
|
||||
const getSpecificDocumentStore = useApi(documentsApi.getSpecificDocumentStore)
|
||||
|
||||
const [error, setError] = useState(null)
|
||||
const [isLoading, setLoading] = useState(true)
|
||||
const [showDialog, setShowDialog] = useState(false)
|
||||
const [documentStore, setDocumentStore] = useState({})
|
||||
const [dialogProps, setDialogProps] = useState({})
|
||||
const [showDocumentLoaderListDialog, setShowDocumentLoaderListDialog] = useState(false)
|
||||
const [documentLoaderListDialogProps, setDocumentLoaderListDialogProps] = useState({})
|
||||
|
||||
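// Derive the current document store id from the URL (last path segment, empty on the root list view)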
const URLpath = document.location.pathname.toString().split('/')
|
||||
const storeId = URLpath[URLpath.length - 1] === 'document-stores' ? '' : URLpath[URLpath.length - 1]
|
||||
|
||||
const openPreviewSettings = (id) => {
|
||||
navigate('/document-stores/' + storeId + '/' + id)
|
||||
}
|
||||
|
||||
const showStoredChunks = (id) => {
|
||||
navigate('/document-stores/chunks/' + storeId + '/' + id)
|
||||
}
|
||||
|
||||
const onDocLoaderSelected = (docLoaderComponentName) => {
|
||||
setShowDocumentLoaderListDialog(false)
|
||||
navigate('/document-stores/' + storeId + '/' + docLoaderComponentName)
|
||||
}
|
||||
|
||||
const listLoaders = () => {
|
||||
const dialogProp = {
|
||||
title: 'Select Document Loader'
|
||||
}
|
||||
setDocumentLoaderListDialogProps(dialogProp)
|
||||
setShowDocumentLoaderListDialog(true)
|
||||
}
|
||||
|
||||
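// Confirm, then delete the selected loader and its chunks from the store; refresh the store details on success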
const onLoaderDelete = async (file) => {
|
||||
const confirmPayload = {
|
||||
title: `Delete`,
|
||||
description: `Delete Loader ${file.loaderName}? This will delete all the associated document chunks.`,
|
||||
confirmButtonName: 'Delete',
|
||||
cancelButtonName: 'Cancel'
|
||||
}
|
||||
const isConfirmed = await confirm(confirmPayload)
|
||||
|
||||
if (isConfirmed) {
|
||||
try {
|
||||
const deleteResp = await documentsApi.deleteLoaderFromStore(storeId, file.id)
|
||||
if (deleteResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'Loader and associated document chunks deleted',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
onConfirm()
|
||||
}
|
||||
} catch (error) {
|
||||
setError(error)
|
||||
const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}`
|
||||
enqueueSnackbar({
|
||||
message: `Failed to delete loader: ${errorData}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
persist: true,
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
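// Confirm, then delete the entire document store (all loaders and chunks) and return to the list view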
const onStoreDelete = async () => {
|
||||
const confirmPayload = {
|
||||
title: `Delete`,
|
||||
description: `Delete Store ${getSpecificDocumentStore.data?.name}? This will delete all the associated loaders and document chunks.`,
|
||||
confirmButtonName: 'Delete',
|
||||
cancelButtonName: 'Cancel'
|
||||
}
|
||||
const isConfirmed = await confirm(confirmPayload)
|
||||
|
||||
if (isConfirmed) {
|
||||
try {
|
||||
const deleteResp = await documentsApi.deleteDocumentStore(storeId)
|
||||
if (deleteResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'Store, Loader and associated document chunks deleted',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
navigate('/document-stores/')
|
||||
}
|
||||
} catch (error) {
|
||||
setError(error)
|
||||
const errorData = error.response.data || `${error.response.status}: ${error.response.statusText}`
|
||||
enqueueSnackbar({
|
||||
message: `Failed to delete document store: ${errorData}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
persist: true,
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const onEditClicked = () => {
|
||||
const data = {
|
||||
name: documentStore.name,
|
||||
description: documentStore.description,
|
||||
id: documentStore.id
|
||||
}
|
||||
const dialogProp = {
|
||||
title: 'Edit Document Store',
|
||||
type: 'EDIT',
|
||||
cancelButtonName: 'Cancel',
|
||||
confirmButtonName: 'Update',
|
||||
data: data
|
||||
}
|
||||
setDialogProps(dialogProp)
|
||||
setShowDialog(true)
|
||||
}
|
||||
|
||||
const onConfirm = () => {
|
||||
setShowDialog(false)
|
||||
getSpecificDocumentStore.request(storeId)
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
getSpecificDocumentStore.request(storeId)
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [])
|
||||
|
||||
useEffect(() => {
|
||||
if (getSpecificDocumentStore.data) {
|
||||
setDocumentStore(getSpecificDocumentStore.data)
|
||||
// TODO: total the chunks and chars across loaders
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getSpecificDocumentStore.data])
|
||||
|
||||
useEffect(() => {
|
||||
if (getSpecificDocumentStore.error) {
|
||||
setError(getSpecificDocumentStore.error)
|
||||
}
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getSpecificDocumentStore.error])
|
||||
|
||||
useEffect(() => {
|
||||
setLoading(getSpecificDocumentStore.loading)
|
||||
}, [getSpecificDocumentStore.loading])
|
||||
|
||||
return (
|
||||
<>
|
||||
<MainCard>
|
||||
{error ? (
|
||||
<ErrorBoundary error={error} />
|
||||
) : (
|
||||
<Stack flexDirection='column' sx={{ gap: 3 }}>
|
||||
<ViewHeader
|
||||
isBackButton={true}
|
||||
isEditButton={true}
|
||||
search={false}
|
||||
title={documentStore?.name}
|
||||
description={documentStore?.description}
|
||||
onBack={() => navigate('/document-stores')}
|
||||
onEdit={() => onEditClicked()}
|
||||
>
|
||||
<IconButton onClick={onStoreDelete} size='small' color='error' title='Delete Document Store' sx={{ mr: 2 }}>
|
||||
<IconTrash />
|
||||
</IconButton>
|
||||
{documentStore?.status === 'STALE' && (
|
||||
<Button variant='outlined' sx={{ mr: 2 }} startIcon={<IconRefresh />} onClick={onConfirm}>
|
||||
Refresh
|
||||
</Button>
|
||||
)}
|
||||
{documentStore?.totalChunks > 0 && (
|
||||
<Button
|
||||
variant='outlined'
|
||||
sx={{ borderRadius: 2, height: '100%' }}
|
||||
startIcon={<IconScissors />}
|
||||
onClick={() => showStoredChunks('all')}
|
||||
>
|
||||
View Chunks
|
||||
</Button>
|
||||
)}
|
||||
<StyledButton
|
||||
variant='contained'
|
||||
sx={{ borderRadius: 2, height: '100%', color: 'white' }}
|
||||
startIcon={<IconPlus />}
|
||||
onClick={listLoaders}
|
||||
>
|
||||
Add Document Loader
|
||||
</StyledButton>
|
||||
</ViewHeader>
|
||||
{getSpecificDocumentStore.data?.whereUsed?.length > 0 && (
|
||||
<Stack flexDirection='row' sx={{ gap: 2, alignItems: 'center', flexWrap: 'wrap' }}>
|
||||
<div
|
||||
style={{
|
||||
paddingLeft: '15px',
|
||||
paddingRight: '15px',
|
||||
paddingTop: '10px',
|
||||
paddingBottom: '10px',
|
||||
fontSize: '0.9rem',
|
||||
width: 'max-content',
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center'
|
||||
}}
|
||||
>
|
||||
<IconVectorBezier2 style={{ marginRight: 5 }} size={17} />
|
||||
Chatflows Used:
|
||||
</div>
|
||||
{getSpecificDocumentStore.data.whereUsed.map((chatflowUsed, index) => (
|
||||
<Chip
|
||||
key={index}
|
||||
clickable
|
||||
style={{
|
||||
width: 'max-content',
|
||||
borderRadius: '25px',
|
||||
boxShadow: customization.isDarkMode
|
||||
? '0 2px 14px 0 rgb(255 255 255 / 10%)'
|
||||
: '0 2px 14px 0 rgb(32 40 45 / 10%)'
|
||||
}}
|
||||
label={chatflowUsed.name}
|
||||
onClick={() => navigate('/canvas/' + chatflowUsed.id)}
|
||||
></Chip>
|
||||
))}
|
||||
</Stack>
|
||||
)}
|
||||
{!isLoading && documentStore && !documentStore?.loaders?.length ? (
|
||||
<Stack sx={{ alignItems: 'center', justifyContent: 'center' }} flexDirection='column'>
|
||||
<Box sx={{ p: 2, height: 'auto' }}>
|
||||
<img
|
||||
style={{ objectFit: 'cover', height: '16vh', width: 'auto' }}
|
||||
src={doc_store_details_emptySVG}
|
||||
alt='doc_store_details_emptySVG'
|
||||
/>
|
||||
</Box>
|
||||
<div>No Document Added Yet</div>
|
||||
<StyledButton
|
||||
variant='contained'
|
||||
sx={{ borderRadius: 2, height: '100%', mt: 2, color: 'white' }}
|
||||
startIcon={<IconPlus />}
|
||||
onClick={listLoaders}
|
||||
>
|
||||
Add Document Loader
|
||||
</StyledButton>
|
||||
</Stack>
|
||||
) : (
|
||||
<TableContainer
|
||||
sx={{ border: 1, borderColor: theme.palette.grey[900] + 25, borderRadius: 2 }}
|
||||
component={Paper}
|
||||
>
|
||||
<Table sx={{ minWidth: 650 }} aria-label='simple table'>
|
||||
<TableHead
|
||||
sx={{
|
||||
backgroundColor: customization.isDarkMode
|
||||
? theme.palette.common.black
|
||||
: theme.palette.grey[100],
|
||||
height: 56
|
||||
}}
|
||||
>
|
||||
<TableRow>
|
||||
<StyledTableCell> </StyledTableCell>
|
||||
<StyledTableCell>Loader</StyledTableCell>
|
||||
<StyledTableCell>Splitter</StyledTableCell>
|
||||
<StyledTableCell>Source(s)</StyledTableCell>
|
||||
<StyledTableCell>Chunks</StyledTableCell>
|
||||
<StyledTableCell>Chars</StyledTableCell>
|
||||
<StyledTableCell>Actions</StyledTableCell>
|
||||
</TableRow>
|
||||
</TableHead>
|
||||
<TableBody>
|
||||
{isLoading ? (
|
||||
<>
|
||||
<StyledTableRow>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
</StyledTableRow>
|
||||
<StyledTableRow>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<Skeleton variant='text' />
|
||||
</StyledTableCell>
|
||||
</StyledTableRow>
|
||||
</>
|
||||
) : (
|
||||
<>
|
||||
{documentStore?.loaders &&
|
||||
documentStore?.loaders.length > 0 &&
|
||||
documentStore?.loaders.map((loader, index) => (
|
||||
<LoaderRow
|
||||
key={index}
|
||||
index={index}
|
||||
loader={loader}
|
||||
theme={theme}
|
||||
onEditClick={() => openPreviewSettings(loader.id)}
|
||||
onViewChunksClick={() => showStoredChunks(loader.id)}
|
||||
onDeleteClick={() => onLoaderDelete(loader)}
|
||||
/>
|
||||
))}
|
||||
</>
|
||||
)}
|
||||
</TableBody>
|
||||
</Table>
|
||||
</TableContainer>
|
||||
)}
|
||||
{getSpecificDocumentStore.data?.status === 'STALE' && (
|
||||
<div style={{ width: '100%', textAlign: 'center', marginTop: '20px' }}>
|
||||
<Typography
|
||||
color='warning'
|
||||
style={{ color: 'darkred', fontWeight: 500, fontStyle: 'italic', fontSize: 12 }}
|
||||
>
|
||||
Some files are pending processing. Please Refresh to get the latest status.
|
||||
</Typography>
|
||||
</div>
|
||||
)}
|
||||
</Stack>
|
||||
)}
|
||||
</MainCard>
|
||||
{showDialog && (
|
||||
<AddDocStoreDialog
|
||||
dialogProps={dialogProps}
|
||||
show={showDialog}
|
||||
onCancel={() => setShowDialog(false)}
|
||||
onConfirm={onConfirm}
|
||||
/>
|
||||
)}
|
||||
{showDocumentLoaderListDialog && (
|
||||
<DocumentLoaderListDialog
|
||||
show={showDocumentLoaderListDialog}
|
||||
dialogProps={documentLoaderListDialogProps}
|
||||
onCancel={() => setShowDocumentLoaderListDialog(false)}
|
||||
onDocLoaderSelected={onDocLoaderSelected}
|
||||
/>
|
||||
)}
|
||||
<ConfirmDialog />
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
function LoaderRow(props) {
|
||||
const [anchorEl, setAnchorEl] = useState(null)
|
||||
const open = Boolean(anchorEl)
|
||||
|
||||
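// Anchor the per-row options menu to the clicked button and stop the click from bubbling to the row cells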
const handleClick = (event) => {
|
||||
event.preventDefault()
|
||||
event.stopPropagation()
|
||||
setAnchorEl(event.currentTarget)
|
||||
}
|
||||
|
||||
const handleClose = () => {
|
||||
setAnchorEl(null)
|
||||
}
|
||||
|
||||
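// Sources may be stored as a JSON-encoded array of strings; render them as a comma-separated list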
const formatSources = (source) => {
|
||||
if (source && typeof source === 'string' && source.startsWith('[') && source.endsWith(']')) {
|
||||
return JSON.parse(source).join(', ')
|
||||
}
|
||||
return source
|
||||
}
|
||||
|
||||
return (
|
||||
<>
|
||||
<TableRow hover key={props.index} sx={{ '&:last-child td, &:last-child th': { border: 0 }, cursor: 'pointer' }}>
|
||||
<StyledTableCell onClick={props.onViewChunksClick} scope='row' style={{ width: '5%' }}>
|
||||
<div
|
||||
style={{
|
||||
display: 'flex',
|
||||
width: '20px',
|
||||
height: '20px',
|
||||
backgroundColor: props.loader?.status === 'SYNC' ? '#00e676' : '#ffe57f',
|
||||
borderRadius: '50%'
|
||||
}}
|
||||
></div>
|
||||
</StyledTableCell>
|
||||
<StyledTableCell onClick={props.onViewChunksClick} scope='row'>
|
||||
{props.loader.loaderName}
|
||||
</StyledTableCell>
|
||||
<StyledTableCell onClick={props.onViewChunksClick}>{props.loader.splitterName ?? 'None'}</StyledTableCell>
|
||||
<StyledTableCell onClick={props.onViewChunksClick}>{formatSources(props.loader.source)}</StyledTableCell>
|
||||
<StyledTableCell onClick={props.onViewChunksClick}>
|
||||
{props.loader.totalChunks && <Chip variant='outlined' size='small' label={props.loader.totalChunks.toLocaleString()} />}
|
||||
</StyledTableCell>
|
||||
<StyledTableCell onClick={props.onViewChunksClick}>
|
||||
{props.loader.totalChars && <Chip variant='outlined' size='small' label={props.loader.totalChars.toLocaleString()} />}
|
||||
</StyledTableCell>
|
||||
<StyledTableCell>
|
||||
<div>
|
||||
<Button
|
||||
id='document-store-actions-customized-button'
|
||||
aria-controls={open ? 'document-store-actions-customized-menu' : undefined}
|
||||
aria-haspopup='true'
|
||||
aria-expanded={open ? 'true' : undefined}
|
||||
disableElevation
|
||||
onClick={(e) => handleClick(e)}
|
||||
endIcon={<KeyboardArrowDownIcon />}
|
||||
>
|
||||
Options
|
||||
</Button>
|
||||
<StyledMenu
|
||||
id='document-store-actions-customized-menu'
|
||||
MenuListProps={{
|
||||
'aria-labelledby': 'document-store-actions-customized-button'
|
||||
}}
|
||||
anchorEl={anchorEl}
|
||||
open={open}
|
||||
onClose={handleClose}
|
||||
>
|
||||
<MenuItem onClick={props.onEditClick} disableRipple>
|
||||
<FileEditIcon />
|
||||
Preview & Process
|
||||
</MenuItem>
|
||||
<MenuItem onClick={props.onViewChunksClick} disableRipple>
|
||||
<FileChunksIcon />
|
||||
View & Edit Chunks
|
||||
</MenuItem>
|
||||
<Divider sx={{ my: 0.5 }} />
|
||||
<MenuItem onClick={props.onDeleteClick} disableRipple>
|
||||
<FileDeleteIcon />
|
||||
Delete
|
||||
</MenuItem>
|
||||
</StyledMenu>
|
||||
</div>
|
||||
</StyledTableCell>
|
||||
</TableRow>
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
LoaderRow.propTypes = {
|
||||
loader: PropTypes.any,
|
||||
index: PropTypes.number,
|
||||
open: PropTypes.bool,
|
||||
theme: PropTypes.any,
|
||||
onViewChunksClick: PropTypes.func,
|
||||
onEditClick: PropTypes.func,
|
||||
onDeleteClick: PropTypes.func
|
||||
}
|
||||
export default DocumentStoreDetails
|
||||
|
|
@@ -0,0 +1,85 @@
|
|||
import { useTheme } from '@mui/material'
|
||||
import { useSelector } from 'react-redux'
|
||||
import PropTypes from 'prop-types'
|
||||
|
||||
const DocumentStoreStatus = ({ status, isTableView }) => {
|
||||
const theme = useTheme()
|
||||
const customization = useSelector((state) => state.customization)
|
||||
|
||||
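// Returns [background, dot, text] colors for the given document store status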
const getColor = (status) => {
|
||||
switch (status) {
|
||||
case 'STALE':
|
||||
return customization.isDarkMode
|
||||
? [theme.palette.grey[400], theme.palette.grey[600], theme.palette.grey[700]]
|
||||
: [theme.palette.grey[300], theme.palette.grey[500], theme.palette.grey[700]]
|
||||
case 'EMPTY':
|
||||
return ['#673ab7', '#673ab7', '#673ab7']
|
||||
case 'SYNCING':
|
||||
return ['#fff8e1', '#ffe57f', '#ffc107']
|
||||
case 'SYNC':
|
||||
return ['#cdf5d8', '#00e676', '#00c853']
|
||||
case 'NEW':
|
||||
return ['#e3f2fd', '#2196f3', '#1e88e5']
|
||||
default:
|
||||
return customization.isDarkMode
|
||||
? [theme.palette.grey[300], theme.palette.grey[500], theme.palette.grey[700]]
|
||||
: [theme.palette.grey[300], theme.palette.grey[500], theme.palette.grey[700]]
|
||||
}
|
||||
}
|
||||
|
||||
return (
|
||||
<>
|
||||
{!isTableView && (
|
||||
<div
|
||||
style={{
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignContent: 'center',
|
||||
alignItems: 'center',
|
||||
background: status === 'EMPTY' ? 'transparent' : getColor(status)[0],
|
||||
border: status === 'EMPTY' ? '1px solid' : 'none',
|
||||
borderColor: status === 'EMPTY' ? getColor(status)[0] : 'transparent',
|
||||
borderRadius: '25px',
|
||||
paddingTop: '3px',
|
||||
paddingBottom: '3px',
|
||||
paddingLeft: '10px',
|
||||
paddingRight: '10px'
|
||||
}}
|
||||
>
|
||||
<div
|
||||
style={{
|
||||
width: '10px',
|
||||
height: '10px',
|
||||
borderRadius: '50%',
|
||||
backgroundColor: status === 'EMPTY' ? 'transparent' : getColor(status)[1],
|
||||
border: status === 'EMPTY' ? '3px solid' : 'none',
|
||||
borderColor: status === 'EMPTY' ? getColor(status)[1] : 'transparent'
|
||||
}}
|
||||
/>
|
||||
<span style={{ fontSize: '0.7rem', color: getColor(status)[2], marginLeft: 5 }}>{status}</span>
|
||||
</div>
|
||||
)}
|
||||
{isTableView && (
|
||||
<div
|
||||
style={{
|
||||
display: 'flex',
|
||||
width: '20px',
|
||||
height: '20px',
|
||||
borderRadius: '50%',
|
||||
backgroundColor: status === 'EMPTY' ? 'transparent' : getColor(status)[1],
|
||||
border: status === 'EMPTY' ? '3px solid' : 'none',
|
||||
borderColor: status === 'EMPTY' ? getColor(status)[1] : 'transparent'
|
||||
}}
|
||||
title={status}
|
||||
></div>
|
||||
)}
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
DocumentStoreStatus.propTypes = {
|
||||
status: PropTypes.string,
|
||||
isTableView: PropTypes.bool
|
||||
}
|
||||
|
||||
export default DocumentStoreStatus
|
||||
|
|
@@ -0,0 +1,236 @@
|
|||
import { useEffect, useState } from 'react'
|
||||
import { createPortal } from 'react-dom'
|
||||
import PropTypes from 'prop-types'
|
||||
import { useDispatch, useSelector } from 'react-redux'
|
||||
import ReactJson from 'flowise-react-json-view'
|
||||
import { HIDE_CANVAS_DIALOG, SHOW_CANVAS_DIALOG } from '@/store/actions'
|
||||
|
||||
// Material
|
||||
import { Button, Dialog, IconButton, DialogContent, DialogTitle, Typography } from '@mui/material'
|
||||
import { IconEdit, IconTrash, IconX, IconLanguage } from '@tabler/icons'
|
||||
|
||||
// Project imports
|
||||
import { CodeEditor } from '@/ui-component/editor/CodeEditor'
|
||||
|
||||
const ExpandedChunkDialog = ({ show, dialogProps, onCancel, onChunkEdit, onDeleteChunk, isReadOnly }) => {
|
||||
const portalElement = document.getElementById('portal')
|
||||
|
||||
const customization = useSelector((state) => state.customization)
|
||||
const dispatch = useDispatch()
|
||||
|
||||
const [selectedChunk, setSelectedChunk] = useState()
|
||||
const [selectedChunkNumber, setSelectedChunkNumber] = useState()
|
||||
const [isEdit, setIsEdit] = useState(false)
|
||||
const [contentValue, setContentValue] = useState('')
|
||||
const [metadata, setMetadata] = useState({})
|
||||
|
||||
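// Clipboard handler for ReactJson: serialise objects/arrays as indented JSON, copy plain values as-is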
const onClipboardCopy = (e) => {
|
||||
const src = e.src
|
||||
if (Array.isArray(src) || typeof src === 'object') {
|
||||
navigator.clipboard.writeText(JSON.stringify(src, null, ' '))
|
||||
} else {
|
||||
navigator.clipboard.writeText(src)
|
||||
}
|
||||
}
|
||||
|
||||
const onEditCancel = () => {
|
||||
setContentValue(selectedChunk?.pageContent)
|
||||
setMetadata(selectedChunk?.metadata ? JSON.parse(selectedChunk?.metadata) : {})
|
||||
setIsEdit(false)
|
||||
}
|
||||
|
||||
const onEditSaved = () => {
|
||||
onChunkEdit(contentValue, metadata, selectedChunk)
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
if (dialogProps.data) {
|
||||
setSelectedChunk(dialogProps.data?.selectedChunk)
|
||||
setContentValue(dialogProps.data?.selectedChunk?.pageContent)
|
||||
setSelectedChunkNumber(dialogProps?.data.selectedChunkNumber)
|
||||
if (dialogProps.data?.selectedChunk?.metadata) {
|
||||
if (typeof dialogProps.data?.selectedChunk?.metadata === 'string') {
|
||||
setMetadata(JSON.parse(dialogProps.data?.selectedChunk?.metadata))
|
||||
} else if (typeof dialogProps.data?.selectedChunk?.metadata === 'object') {
|
||||
setMetadata(dialogProps.data?.selectedChunk?.metadata)
|
||||
}
|
||||
}
|
||||
}
|
||||
return () => {
|
||||
setSelectedChunk()
|
||||
setSelectedChunkNumber()
|
||||
setContentValue('')
|
||||
setMetadata({})
|
||||
setIsEdit(false)
|
||||
}
|
||||
}, [dialogProps])
|
||||
|
||||
useEffect(() => {
|
||||
if (show) dispatch({ type: SHOW_CANVAS_DIALOG })
|
||||
else dispatch({ type: HIDE_CANVAS_DIALOG })
|
||||
return () => dispatch({ type: HIDE_CANVAS_DIALOG })
|
||||
}, [show, dispatch])
|
||||
|
||||
const component = show ? (
|
||||
<Dialog
|
||||
fullWidth
|
||||
maxWidth='md'
|
||||
open={show}
|
||||
onClose={onCancel}
|
||||
aria-labelledby='alert-dialog-title'
|
||||
aria-describedby='alert-dialog-description'
|
||||
>
|
||||
<DialogTitle style={{ fontSize: '1rem' }} id='alert-dialog-title'>
|
||||
{selectedChunk && selectedChunkNumber && (
|
||||
<div style={{ display: 'flex', flexDirection: 'row', alignItems: 'center' }}>
|
||||
<Typography sx={{ flex: 1 }} variant='h4'>
|
||||
#{selectedChunkNumber}. {selectedChunk.id}
|
||||
</Typography>
|
||||
{!isEdit && !isReadOnly && (
|
||||
<IconButton onClick={() => setIsEdit(true)} size='small' color='primary' title='Edit Chunk' sx={{ ml: 2 }}>
|
||||
<IconEdit />
|
||||
</IconButton>
|
||||
)}
|
||||
{isEdit && !isReadOnly && (
|
||||
<Button onClick={() => onEditCancel()} color='primary' title='Cancel' sx={{ ml: 2 }}>
|
||||
Cancel
|
||||
</Button>
|
||||
)}
|
||||
{isEdit && !isReadOnly && (
|
||||
<Button
|
||||
onClick={() => onEditSaved(true)}
|
||||
color='primary'
|
||||
title='Save'
|
||||
variant='contained'
|
||||
sx={{ ml: 2, mr: 1 }}
|
||||
>
|
||||
Save
|
||||
</Button>
|
||||
)}
|
||||
{!isEdit && !isReadOnly && (
|
||||
<IconButton
|
||||
onClick={() => onDeleteChunk(selectedChunk)}
|
||||
size='small'
|
||||
color='error'
|
||||
title='Delete Chunk'
|
||||
sx={{ ml: 1 }}
|
||||
>
|
||||
<IconTrash />
|
||||
</IconButton>
|
||||
)}
|
||||
<IconButton onClick={onCancel} size='small' color='inherit' title='Close' sx={{ ml: 1 }}>
|
||||
<IconX />
|
||||
</IconButton>
|
||||
</div>
|
||||
)}
|
||||
</DialogTitle>
|
||||
<DialogContent>
|
||||
{selectedChunk && selectedChunkNumber && (
|
||||
<div>
|
||||
<div
|
||||
style={{
|
||||
paddingLeft: '10px',
|
||||
paddingRight: '10px',
|
||||
paddingTop: '5px',
|
||||
paddingBottom: '5px',
|
||||
fontSize: '15px',
|
||||
width: 'max-content',
|
||||
borderRadius: '25px',
|
||||
boxShadow: customization.isDarkMode
|
||||
? '0 2px 14px 0 rgb(255 255 255 / 20%)'
|
||||
: '0 2px 14px 0 rgb(32 40 45 / 20%)',
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center',
|
||||
marginTop: '5px',
|
||||
marginBottom: '10px'
|
||||
}}
|
||||
>
|
||||
<IconLanguage style={{ marginRight: 5 }} size={15} />
|
||||
{selectedChunk?.pageContent?.length} characters
|
||||
</div>
|
||||
<div style={{ marginTop: '5px' }}></div>
|
||||
{!isEdit && (
|
||||
<CodeEditor
|
||||
disabled={true}
|
||||
height='max-content'
|
||||
value={contentValue}
|
||||
theme={customization.isDarkMode ? 'dark' : 'light'}
|
||||
basicSetup={{
|
||||
lineNumbers: false,
|
||||
foldGutter: false,
|
||||
autocompletion: false,
|
||||
highlightActiveLine: false
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
{isEdit && (
|
||||
<CodeEditor
|
||||
disabled={false}
|
||||
// eslint-disable-next-line
|
||||
autoFocus={true}
|
||||
height='max-content'
|
||||
value={contentValue}
|
||||
theme={customization.isDarkMode ? 'dark' : 'light'}
|
||||
basicSetup={{
|
||||
lineNumbers: false,
|
||||
foldGutter: false,
|
||||
autocompletion: false,
|
||||
highlightActiveLine: false
|
||||
}}
|
||||
onValueChange={(text) => setContentValue(text)}
|
||||
/>
|
||||
)}
|
||||
<div style={{ marginTop: '20px', marginBottom: '15px' }}>
|
||||
{!isEdit && (
|
||||
<ReactJson
|
||||
theme={customization.isDarkMode ? 'ocean' : 'rjv-default'}
|
||||
src={metadata}
|
||||
style={{ padding: '10px' }}
|
||||
name={null}
|
||||
quotesOnKeys={false}
|
||||
enableClipboard={false}
|
||||
displayDataTypes={false}
|
||||
collapsed={1}
|
||||
/>
|
||||
)}
|
||||
{isEdit && (
|
||||
<ReactJson
|
||||
theme={customization.isDarkMode ? 'ocean' : 'rjv-default'}
|
||||
src={metadata}
|
||||
style={{ padding: '10px' }}
|
||||
name={null}
|
||||
quotesOnKeys={false}
|
||||
displayDataTypes={false}
|
||||
enableClipboard={(e) => onClipboardCopy(e)}
|
||||
onEdit={(edit) => {
|
||||
setMetadata(edit.updated_src)
|
||||
}}
|
||||
onAdd={() => {
|
||||
//console.log(add)
|
||||
}}
|
||||
onDelete={(deleteobj) => {
|
||||
setMetadata(deleteobj.updated_src)
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
</DialogContent>
|
||||
</Dialog>
|
||||
) : null
|
||||
|
||||
return createPortal(component, portalElement)
|
||||
}
|
||||
|
||||
ExpandedChunkDialog.propTypes = {
|
||||
show: PropTypes.bool,
|
||||
dialogProps: PropTypes.object,
|
||||
onCancel: PropTypes.func,
|
||||
onChunkEdit: PropTypes.func,
|
||||
onDeleteChunk: PropTypes.func,
|
||||
isReadOnly: PropTypes.bool
|
||||
}
|
||||
|
||||
export default ExpandedChunkDialog
|
||||
|
|
@@ -0,0 +1,666 @@
|
|||
import { cloneDeep } from 'lodash'
|
||||
import { useEffect, useState } from 'react'
|
||||
import { validate as uuidValidate, v4 as uuidv4 } from 'uuid'
|
||||
import { useDispatch, useSelector } from 'react-redux'
|
||||
import { useNavigate } from 'react-router-dom'
|
||||
import ReactJson from 'flowise-react-json-view'
|
||||
|
||||
// Hooks
|
||||
import useApi from '@/hooks/useApi'
|
||||
|
||||
// Material-UI
|
||||
import { Skeleton, Toolbar, Box, Button, Card, CardContent, Grid, OutlinedInput, Stack, Typography } from '@mui/material'
|
||||
import { useTheme, styled } from '@mui/material/styles'
|
||||
import { IconScissors, IconArrowLeft, IconDatabaseImport, IconBook, IconX, IconEye } from '@tabler/icons'
|
||||
|
||||
// Project import
|
||||
import MainCard from '@/ui-component/cards/MainCard'
|
||||
import { StyledButton } from '@/ui-component/button/StyledButton'
|
||||
import { BackdropLoader } from '@/ui-component/loading/BackdropLoader'
|
||||
import DocStoreInputHandler from '@/views/docstore/DocStoreInputHandler'
|
||||
import { Dropdown } from '@/ui-component/dropdown/Dropdown'
|
||||
import { StyledFab } from '@/ui-component/button/StyledFab'
|
||||
import ErrorBoundary from '@/ErrorBoundary'
|
||||
import ExpandedChunkDialog from './ExpandedChunkDialog'
|
||||
|
||||
// API
|
||||
import nodesApi from '@/api/nodes'
|
||||
import documentStoreApi from '@/api/documentstore'
|
||||
|
||||
// Const
|
||||
import { baseURL, gridSpacing } from '@/store/constant'
|
||||
import { closeSnackbar as closeSnackbarAction, enqueueSnackbar as enqueueSnackbarAction } from '@/store/actions'
|
||||
|
||||
// Utils
|
||||
import { initNode } from '@/utils/genericHelper'
|
||||
import useNotifier from '@/utils/useNotifier'
|
||||
|
||||
const CardWrapper = styled(MainCard)(({ theme }) => ({
|
||||
background: theme.palette.card.main,
|
||||
color: theme.darkTextPrimary,
|
||||
overflow: 'auto',
|
||||
position: 'relative',
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 8%)',
|
||||
cursor: 'pointer',
|
||||
'&:hover': {
|
||||
background: theme.palette.card.hover,
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 20%)'
|
||||
},
|
||||
maxHeight: '250px',
|
||||
minHeight: '250px',
|
||||
maxWidth: '100%',
|
||||
overflowWrap: 'break-word',
|
||||
whiteSpace: 'pre-line',
|
||||
padding: 1
|
||||
}))
|
||||
|
||||
// ===========================|| DOCUMENT LOADER CHUNKS ||=========================== //
|
||||
|
||||
const LoaderConfigPreviewChunks = () => {
|
||||
const customization = useSelector((state) => state.customization)
|
||||
const navigate = useNavigate()
|
||||
const theme = useTheme()
|
||||
|
||||
const getNodeDetailsApi = useApi(nodesApi.getSpecificNode)
|
||||
const getNodesByCategoryApi = useApi(nodesApi.getNodesByCategory)
|
||||
const getSpecificDocumentStoreApi = useApi(documentStoreApi.getSpecificDocumentStore)
|
||||
|
||||
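// Route shape: /document-stores/<storeId>/<loader component name | existing loader id>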
const URLpath = document.location.pathname.toString().split('/')
|
||||
const docLoaderNodeName = URLpath[URLpath.length - 1] === 'document-stores' ? '' : URLpath[URLpath.length - 1]
|
||||
const storeId = URLpath[URLpath.length - 2] === 'document-stores' ? '' : URLpath[URLpath.length - 2]
|
||||
|
||||
const [selectedDocumentLoader, setSelectedDocumentLoader] = useState({})
|
||||
|
||||
const [loading, setLoading] = useState(false)
|
||||
const [error, setError] = useState(null)
|
||||
|
||||
const [textSplitterNodes, setTextSplitterNodes] = useState([])
|
||||
const [splitterOptions, setTextSplitterOptions] = useState([])
|
||||
const [selectedTextSplitter, setSelectedTextSplitter] = useState({})
|
||||
|
||||
const [documentChunks, setDocumentChunks] = useState([])
|
||||
const [totalChunks, setTotalChunks] = useState(0)
|
||||
|
||||
const [currentPreviewCount, setCurrentPreviewCount] = useState(0)
|
||||
const [previewChunkCount, setPreviewChunkCount] = useState(20)
|
||||
const [existingLoaderFromDocStoreTable, setExistingLoaderFromDocStoreTable] = useState()
|
||||
|
||||
const [showExpandedChunkDialog, setShowExpandedChunkDialog] = useState(false)
|
||||
const [expandedChunkDialogProps, setExpandedChunkDialogProps] = useState({})
|
||||
|
||||
const dispatch = useDispatch()
|
||||
|
||||
// ==============================|| Snackbar ||============================== //
|
||||
useNotifier()
|
||||
const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args))
|
||||
const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args))
|
||||
|
||||
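// Swap the selected text splitter node when the dropdown changes; 'none' (or an unknown name) clears the selection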
const onSplitterChange = (name) => {
|
||||
const textSplitter = (textSplitterNodes ?? []).find((splitter) => splitter.name === name)
|
||||
if (textSplitter) {
|
||||
setSelectedTextSplitter(textSplitter)
|
||||
} else {
|
||||
setSelectedTextSplitter({})
|
||||
}
|
||||
}
|
||||
|
||||
const onChunkClick = (selectedChunk, selectedChunkNumber) => {
|
||||
const dialogProps = {
|
||||
data: {
|
||||
selectedChunk,
|
||||
selectedChunkNumber
|
||||
}
|
||||
}
|
||||
setExpandedChunkDialogProps(dialogProps)
|
||||
setShowExpandedChunkDialog(true)
|
||||
}
|
||||
|
||||
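// Ensure every visible, non-optional loader input has a value; warn and block submission otherwise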
const checkMandatoryFields = () => {
|
||||
let canSubmit = true
|
||||
const inputParams = (selectedDocumentLoader.inputParams ?? []).filter((inputParam) => !inputParam.hidden)
|
||||
for (const inputParam of inputParams) {
|
||||
if (!inputParam.optional && !selectedDocumentLoader.inputs[inputParam.name]) {
|
||||
canSubmit = false
|
||||
break
|
||||
}
|
||||
}
|
||||
if (!canSubmit) {
|
||||
enqueueSnackbar({
|
||||
message: 'Please fill in all mandatory fields.',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'warning',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
return canSubmit
|
||||
}
|
||||
|
||||
const onPreviewChunks = async () => {
|
||||
if (checkMandatoryFields()) {
|
||||
setLoading(true)
|
||||
const config = prepareConfig()
|
||||
config.previewChunkCount = previewChunkCount
|
||||
|
||||
try {
|
||||
const previewResp = await documentStoreApi.previewChunks(config)
|
||||
if (previewResp.data) {
|
||||
setTotalChunks(previewResp.data.totalChunks)
|
||||
setDocumentChunks(previewResp.data.chunks)
|
||||
setCurrentPreviewCount(previewResp.data.previewChunkCount)
|
||||
}
|
||||
setLoading(false)
|
||||
} catch (error) {
|
||||
setLoading(false)
|
||||
enqueueSnackbar({
|
||||
message: `Failed to preview chunks: ${
|
||||
typeof error.response.data === 'object' ? error.response.data.message : error.response.data
|
||||
}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
const onSaveAndProcess = async () => {
|
||||
if (checkMandatoryFields()) {
|
||||
setLoading(true)
|
||||
const config = prepareConfig()
|
||||
try {
|
||||
const processResp = await documentStoreApi.processChunks(config)
|
||||
setLoading(false)
|
||||
if (processResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'File submitted for processing. Redirecting to Document Store...',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
navigate('/document-stores/' + storeId)
|
||||
}
|
||||
} catch (error) {
|
||||
setLoading(false)
|
||||
enqueueSnackbar({
|
||||
message: `Failed to process chunks: ${
|
||||
typeof error.response.data === 'object' ? error.response.data.message : error.response.data
|
||||
}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
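// Assemble the payload (loader, splitter and credential config) sent to the preview/process endpoints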
const prepareConfig = () => {
|
||||
const config = {}
|
||||
|
||||
// Set loader id & name
|
||||
if (existingLoaderFromDocStoreTable) {
|
||||
config.loaderId = existingLoaderFromDocStoreTable.loaderId
|
||||
config.id = existingLoaderFromDocStoreTable.id
|
||||
} else {
|
||||
config.loaderId = docLoaderNodeName
|
||||
}
|
||||
|
||||
// Set store id & loader name
|
||||
config.storeId = storeId
|
||||
config.loaderName = selectedDocumentLoader?.label
|
||||
|
||||
// Set loader config
|
||||
if (selectedDocumentLoader.inputs) {
|
||||
config.loaderConfig = {}
|
||||
Object.keys(selectedDocumentLoader.inputs).forEach((key) => {
|
||||
config.loaderConfig[key] = selectedDocumentLoader.inputs[key]
|
||||
})
|
||||
}
|
||||
|
||||
// If Text splitter is set
|
||||
if (selectedTextSplitter.inputs && selectedTextSplitter.name && Object.keys(selectedTextSplitter).length > 0) {
|
||||
config.splitterId = selectedTextSplitter.name
|
||||
config.splitterConfig = {}
|
||||
|
||||
Object.keys(selectedTextSplitter.inputs).forEach((key) => {
|
||||
config.splitterConfig[key] = selectedTextSplitter.inputs[key]
|
||||
})
|
||||
const textSplitter = textSplitterNodes.find((splitter) => splitter.name === selectedTextSplitter.name)
|
||||
if (textSplitter) config.splitterName = textSplitter.label
|
||||
}
|
||||
|
||||
if (selectedDocumentLoader.credential) {
|
||||
config.credential = selectedDocumentLoader.credential
|
||||
}
|
||||
|
||||
return config
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
if (uuidValidate(docLoaderNodeName)) {
|
||||
// this is a document store edit config
|
||||
getSpecificDocumentStoreApi.request(storeId)
|
||||
} else {
|
||||
getNodeDetailsApi.request(docLoaderNodeName)
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [])
|
||||
|
||||
useEffect(() => {
|
||||
if (getNodeDetailsApi.data) {
|
||||
const nodeData = cloneDeep(initNode(getNodeDetailsApi.data, uuidv4()))
|
||||
|
||||
// If this is a document store edit config, set the existing input values
|
||||
if (existingLoaderFromDocStoreTable && existingLoaderFromDocStoreTable.loaderConfig) {
|
||||
nodeData.inputs = existingLoaderFromDocStoreTable.loaderConfig
|
||||
}
|
||||
setSelectedDocumentLoader(nodeData)
|
||||
|
||||
// Check if the loader has a text splitter, if yes, get the text splitter nodes
|
||||
const textSplitter = nodeData.inputAnchors.find((inputAnchor) => inputAnchor.name === 'textSplitter')
|
||||
if (textSplitter) {
|
||||
getNodesByCategoryApi.request('Text Splitters')
|
||||
}
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getNodeDetailsApi.data])
|
||||
|
||||
useEffect(() => {
|
||||
if (getNodesByCategoryApi.data) {
|
||||
// Set available text splitter nodes
|
||||
const nodes = []
|
||||
for (const node of getNodesByCategoryApi.data) {
|
||||
nodes.push(cloneDeep(initNode(node, uuidv4())))
|
||||
}
|
||||
setTextSplitterNodes(nodes)
|
||||
|
||||
// Set options
|
||||
const options = getNodesByCategoryApi.data.map((splitter) => ({
|
||||
label: splitter.label,
|
||||
name: splitter.name
|
||||
}))
|
||||
options.unshift({ label: 'None', name: 'none' })
|
||||
setTextSplitterOptions(options)
|
||||
|
||||
// If this is a document store edit config, set the existing input values
|
||||
if (
|
||||
existingLoaderFromDocStoreTable &&
|
||||
existingLoaderFromDocStoreTable.splitterConfig &&
|
||||
existingLoaderFromDocStoreTable.splitterId
|
||||
) {
|
||||
const textSplitter = nodes.find((splitter) => splitter.name === existingLoaderFromDocStoreTable.splitterId)
|
||||
if (textSplitter) {
|
||||
textSplitter.inputs = cloneDeep(existingLoaderFromDocStoreTable.splitterConfig)
|
||||
setSelectedTextSplitter(textSplitter)
|
||||
} else {
|
||||
setSelectedTextSplitter({})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getNodesByCategoryApi.data])
|
||||
|
||||
useEffect(() => {
|
||||
if (getSpecificDocumentStoreApi.data) {
|
||||
if (getSpecificDocumentStoreApi.data?.loaders.length > 0) {
|
||||
const loader = getSpecificDocumentStoreApi.data.loaders.find((loader) => loader.id === docLoaderNodeName)
|
||||
if (loader) {
|
||||
setExistingLoaderFromDocStoreTable(loader)
|
||||
getNodeDetailsApi.request(loader.loaderId)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getSpecificDocumentStoreApi.data])
|
||||
|
||||
useEffect(() => {
|
||||
if (getSpecificDocumentStoreApi.error) {
|
||||
setError(getSpecificDocumentStoreApi.error)
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getSpecificDocumentStoreApi.error])
|
||||
|
||||
useEffect(() => {
|
||||
if (getNodeDetailsApi.error) {
|
||||
setError(getNodeDetailsApi.error)
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getNodeDetailsApi.error])
|
||||
|
||||
useEffect(() => {
|
||||
if (getNodesByCategoryApi.error) {
|
||||
setError(getNodesByCategoryApi.error)
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getNodesByCategoryApi.error])
|
||||
|
||||
return (
|
||||
<>
|
||||
<MainCard>
|
||||
{error ? (
|
||||
<ErrorBoundary error={error} />
|
||||
) : (
|
||||
<Stack flexDirection='column'>
|
||||
<Box sx={{ flexGrow: 1, py: 1.25, width: '100%' }}>
|
||||
<Toolbar
|
||||
disableGutters={true}
|
||||
sx={{
|
||||
p: 0,
|
||||
display: 'flex',
|
||||
justifyContent: 'space-between',
|
||||
width: '100%'
|
||||
}}
|
||||
>
|
||||
<Box sx={{ display: 'flex', alignItems: 'center', flexDirection: 'row' }}>
|
||||
<StyledFab size='small' color='secondary' aria-label='back' title='Back' onClick={() => navigate(-1)}>
|
||||
<IconArrowLeft />
|
||||
</StyledFab>
|
||||
<Typography sx={{ ml: 2, mr: 2 }} variant='h3'>
|
||||
{selectedDocumentLoader?.label}
|
||||
</Typography>
|
||||
<div
|
||||
style={{
|
||||
width: 40,
|
||||
height: 40,
|
||||
borderRadius: '50%',
|
||||
display: 'flex',
|
||||
alignItems: 'center',
|
||||
justifyContent: 'center',
|
||||
backgroundColor: 'white',
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 25%)'
|
||||
}}
|
||||
>
|
||||
{selectedDocumentLoader?.name ? (
|
||||
<img
|
||||
style={{
|
||||
width: '100%',
|
||||
height: '100%',
|
||||
padding: 7,
|
||||
borderRadius: '50%',
|
||||
objectFit: 'contain'
|
||||
}}
|
||||
alt={selectedDocumentLoader?.name ?? 'docloader'}
|
||||
src={`${baseURL}/api/v1/node-icon/${selectedDocumentLoader?.name}`}
|
||||
/>
|
||||
) : (
|
||||
<IconBook color='black' />
|
||||
)}
|
||||
</div>
|
||||
</Box>
|
||||
<Box>
|
||||
<StyledButton
|
||||
variant='contained'
|
||||
onClick={onSaveAndProcess}
|
||||
sx={{ borderRadius: 2, height: '100%' }}
|
||||
startIcon={<IconDatabaseImport />}
|
||||
>
|
||||
Process
|
||||
</StyledButton>
|
||||
</Box>
|
||||
</Toolbar>
|
||||
</Box>
|
||||
<Box>
|
||||
<Grid container spacing='2'>
|
||||
<Grid item xs={4} md={6} lg={6} sm={4}>
|
||||
<div
|
||||
style={{
|
||||
display: 'flex',
|
||||
flexDirection: 'column',
|
||||
paddingRight: 15
|
||||
}}
|
||||
>
|
||||
{selectedDocumentLoader &&
|
||||
Object.keys(selectedDocumentLoader).length > 0 &&
|
||||
(selectedDocumentLoader.inputParams ?? [])
|
||||
.filter((inputParam) => !inputParam.hidden)
|
||||
.map((inputParam, index) => (
|
||||
<DocStoreInputHandler
|
||||
key={index}
|
||||
inputParam={inputParam}
|
||||
data={selectedDocumentLoader}
|
||||
/>
|
||||
))}
|
||||
{textSplitterNodes && textSplitterNodes.length > 0 && (
|
||||
<>
|
||||
<Box sx={{ display: 'flex', alignItems: 'center', flexDirection: 'row', p: 2, mt: 5 }}>
|
||||
<Typography sx={{ mr: 2 }} variant='h3'>
|
||||
{(splitterOptions ?? []).find(
|
||||
(splitter) => splitter.name === selectedTextSplitter?.name
|
||||
)?.label ?? 'Select Text Splitter'}
|
||||
</Typography>
|
||||
<div
|
||||
style={{
|
||||
width: 40,
|
||||
height: 40,
|
||||
borderRadius: '50%',
|
||||
backgroundColor: 'white',
|
||||
display: 'flex',
|
||||
alignItems: 'center',
|
||||
justifyContent: 'center',
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 25%)'
|
||||
}}
|
||||
>
|
||||
{selectedTextSplitter?.name ? (
|
||||
<img
|
||||
style={{
|
||||
width: '100%',
|
||||
height: '100%',
|
||||
padding: 7,
|
||||
borderRadius: '50%',
|
||||
objectFit: 'contain'
|
||||
}}
|
||||
alt={selectedTextSplitter?.name ?? 'textsplitter'}
|
||||
src={`${baseURL}/api/v1/node-icon/${selectedTextSplitter?.name}`}
|
||||
/>
|
||||
) : (
|
||||
<IconScissors color='black' />
|
||||
)}
|
||||
</div>
|
||||
</Box>
|
||||
<Box sx={{ p: 2 }}>
|
||||
<Typography>Splitter</Typography>
|
||||
<Dropdown
|
||||
key={JSON.stringify(selectedTextSplitter)}
|
||||
name='textSplitter'
|
||||
options={splitterOptions}
|
||||
onSelect={(newValue) => onSplitterChange(newValue)}
|
||||
value={selectedTextSplitter?.name ?? 'none'}
|
||||
/>
|
||||
</Box>
|
||||
</>
|
||||
)}
|
||||
{Object.keys(selectedTextSplitter).length > 0 &&
|
||||
(selectedTextSplitter.inputParams ?? [])
|
||||
.filter((inputParam) => !inputParam.hidden)
|
||||
.map((inputParam, index) => (
|
||||
<DocStoreInputHandler key={index} data={selectedTextSplitter} inputParam={inputParam} />
|
||||
))}
|
||||
</div>
|
||||
</Grid>
|
||||
<Grid item xs={8} md={6} lg={6} sm={8}>
|
||||
{(!documentChunks || documentChunks.length === 0) && (
|
||||
<div style={{ position: 'relative' }}>
|
||||
<Box display='grid' gridTemplateColumns='repeat(2, 1fr)' gap={gridSpacing}>
|
||||
<Skeleton
|
||||
animation={false}
|
||||
sx={{ bgcolor: customization.isDarkMode ? '#23262c' : '#fafafa' }}
|
||||
variant='rounded'
|
||||
height={160}
|
||||
/>
|
||||
<Skeleton
|
||||
animation={false}
|
||||
sx={{ bgcolor: customization.isDarkMode ? '#23262c' : '#fafafa' }}
|
||||
variant='rounded'
|
||||
height={160}
|
||||
/>
|
||||
<Skeleton
|
||||
animation={false}
|
||||
sx={{ bgcolor: customization.isDarkMode ? '#23262c' : '#fafafa' }}
|
||||
variant='rounded'
|
||||
height={160}
|
||||
/>
|
||||
<Skeleton
|
||||
animation={false}
|
||||
sx={{ bgcolor: customization.isDarkMode ? '#23262c' : '#fafafa' }}
|
||||
variant='rounded'
|
||||
height={160}
|
||||
/>
|
||||
<Skeleton
|
||||
animation={false}
|
||||
sx={{ bgcolor: customization.isDarkMode ? '#23262c' : '#fafafa' }}
|
||||
variant='rounded'
|
||||
height={160}
|
||||
/>
|
||||
<Skeleton
|
||||
animation={false}
|
||||
sx={{ bgcolor: customization.isDarkMode ? '#23262c' : '#fafafa' }}
|
||||
variant='rounded'
|
||||
height={160}
|
||||
/>
|
||||
</Box>
|
||||
<div
|
||||
style={{
|
||||
position: 'absolute',
|
||||
top: 0,
|
||||
right: 0,
|
||||
width: '100%',
|
||||
height: '100%',
|
||||
backdropFilter: `blur(1px)`,
|
||||
background: `transparent`,
|
||||
display: 'flex',
|
||||
alignItems: 'center',
|
||||
justifyContent: 'center'
|
||||
}}
|
||||
>
|
||||
<StyledFab
|
||||
color='secondary'
|
||||
aria-label='preview'
|
||||
title='Preview'
|
||||
variant='extended'
|
||||
onClick={onPreviewChunks}
|
||||
>
|
||||
<IconEye style={{ marginRight: '5px' }} />
|
||||
Preview Chunks
|
||||
</StyledFab>
|
||||
</div>
|
||||
</div>
|
||||
)}
|
||||
{documentChunks && documentChunks.length > 0 && (
|
||||
<>
|
||||
<Typography sx={{ wordWrap: 'break-word', textAlign: 'left', mb: 2 }} variant='h3'>
|
||||
{currentPreviewCount} of {totalChunks} Chunks
|
||||
</Typography>
|
||||
<Box sx={{ mb: 3 }}>
|
||||
<Typography>Show Chunks in Preview</Typography>
|
||||
<div style={{ display: 'flex', flexDirection: 'row' }}>
|
||||
<OutlinedInput
|
||||
size='small'
|
||||
multiline={false}
|
||||
sx={{ mt: 1, flex: 1, mr: 2 }}
|
||||
type='number'
|
||||
key='previewChunkCount'
|
||||
onChange={(e) => setPreviewChunkCount(e.target.value)}
|
||||
value={previewChunkCount ?? 20}
|
||||
/>
|
||||
<StyledFab
|
||||
color='secondary'
|
||||
aria-label='preview'
|
||||
title='Preview'
|
||||
variant='extended'
|
||||
onClick={onPreviewChunks}
|
||||
>
|
||||
<IconEye style={{ marginRight: '5px' }} />
|
||||
Preview
|
||||
</StyledFab>
|
||||
</div>
|
||||
</Box>
|
||||
<div style={{ height: '800px', overflow: 'scroll', padding: '5px' }}>
|
||||
<Grid container spacing={2}>
|
||||
{documentChunks?.map((row, index) => (
|
||||
<Grid item lg={6} md={6} sm={6} xs={6} key={index}>
|
||||
<CardWrapper
|
||||
content={false}
|
||||
onClick={() => onChunkClick(row, index + 1)}
|
||||
sx={{
|
||||
border: 1,
|
||||
borderColor: theme.palette.grey[900] + 25,
|
||||
borderRadius: 2
|
||||
}}
|
||||
>
|
||||
<Card>
|
||||
<CardContent sx={{ p: 1 }}>
|
||||
<Typography sx={{ wordWrap: 'break-word', mb: 1 }} variant='h5'>
|
||||
{`#${index + 1}. Characters: ${row.pageContent.length}`}
|
||||
</Typography>
|
||||
<Typography sx={{ wordWrap: 'break-word' }} variant='body2'>
|
||||
{row.pageContent}
|
||||
</Typography>
|
||||
<ReactJson
|
||||
theme={customization.isDarkMode ? 'ocean' : 'rjv-default'}
|
||||
style={{ paddingTop: 10 }}
|
||||
src={row.metadata}
|
||||
name={null}
|
||||
quotesOnKeys={false}
|
||||
enableClipboard={false}
|
||||
displayDataTypes={false}
|
||||
collapsed={1}
|
||||
/>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</CardWrapper>
|
||||
</Grid>
|
||||
))}
|
||||
</Grid>
|
||||
</div>
|
||||
</>
|
||||
)}
|
||||
</Grid>
|
||||
</Grid>
|
||||
</Box>
|
||||
</Stack>
|
||||
)}
|
||||
</MainCard>
|
||||
<ExpandedChunkDialog
|
||||
show={showExpandedChunkDialog}
|
||||
isReadOnly={true}
|
||||
dialogProps={expandedChunkDialogProps}
|
||||
onCancel={() => setShowExpandedChunkDialog(false)}
|
||||
></ExpandedChunkDialog>
|
||||
{loading && <BackdropLoader open={loading} />}
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
export default LoaderConfigPreviewChunks
|
||||
|
|
@@ -0,0 +1,386 @@
|
|||
import { useEffect, useState } from 'react'
|
||||
import { useDispatch, useSelector } from 'react-redux'
|
||||
import { useNavigate } from 'react-router-dom'
|
||||
import ReactJson from 'flowise-react-json-view'
|
||||
|
||||
// material-ui
|
||||
import { Box, Card, Button, Grid, IconButton, Stack, Typography } from '@mui/material'
|
||||
import { useTheme, styled } from '@mui/material/styles'
|
||||
import CardContent from '@mui/material/CardContent'
|
||||
import { IconLanguage, IconX, IconChevronLeft, IconChevronRight } from '@tabler/icons'
|
||||
import chunks_emptySVG from '@/assets/images/chunks_empty.svg'
|
||||
|
||||
// project imports
|
||||
import MainCard from '@/ui-component/cards/MainCard'
|
||||
import { BackdropLoader } from '@/ui-component/loading/BackdropLoader'
|
||||
import ConfirmDialog from '@/ui-component/dialog/ConfirmDialog'
|
||||
import ExpandedChunkDialog from './ExpandedChunkDialog'
|
||||
import ViewHeader from '@/layout/MainLayout/ViewHeader'
|
||||
|
||||
// API
|
||||
import documentsApi from '@/api/documentstore'
|
||||
|
||||
// Hooks
|
||||
import useApi from '@/hooks/useApi'
|
||||
import useConfirm from '@/hooks/useConfirm'
|
||||
import useNotifier from '@/utils/useNotifier'
|
||||
|
||||
// store
|
||||
import { closeSnackbar as closeSnackbarAction, enqueueSnackbar as enqueueSnackbarAction } from '@/store/actions'
|
||||
|
||||
const CardWrapper = styled(MainCard)(({ theme }) => ({
|
||||
background: theme.palette.card.main,
|
||||
color: theme.darkTextPrimary,
|
||||
overflow: 'auto',
|
||||
position: 'relative',
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 8%)',
|
||||
cursor: 'pointer',
|
||||
'&:hover': {
|
||||
background: theme.palette.card.hover,
|
||||
boxShadow: '0 2px 14px 0 rgb(32 40 45 / 20%)'
|
||||
},
|
||||
maxHeight: '250px',
|
||||
minHeight: '250px',
|
||||
maxWidth: '100%',
|
||||
overflowWrap: 'break-word',
|
||||
whiteSpace: 'pre-line',
|
||||
padding: 1
|
||||
}))
|
||||
|
||||
const ShowStoredChunks = () => {
|
||||
const customization = useSelector((state) => state.customization)
|
||||
const navigate = useNavigate()
|
||||
const dispatch = useDispatch()
|
||||
const theme = useTheme()
|
||||
const { confirm } = useConfirm()
|
||||
|
||||
useNotifier()
|
||||
const enqueueSnackbar = (...args) => dispatch(enqueueSnackbarAction(...args))
|
||||
const closeSnackbar = (...args) => dispatch(closeSnackbarAction(...args))
|
||||
|
||||
const getChunksApi = useApi(documentsApi.getFileChunks)
|
||||
|
||||
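// Route shape: /document-stores/chunks/<storeId>/<fileId | 'all'>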
const URLpath = document.location.pathname.toString().split('/')
|
||||
const fileId = URLpath[URLpath.length - 1] === 'document-stores' ? '' : URLpath[URLpath.length - 1]
|
||||
const storeId = URLpath[URLpath.length - 2] === 'document-stores' ? '' : URLpath[URLpath.length - 2]
|
||||
|
||||
const [documentChunks, setDocumentChunks] = useState([])
|
||||
const [totalChunks, setTotalChunks] = useState(0)
|
||||
const [currentPage, setCurrentPage] = useState(1)
|
||||
const [start, setStart] = useState(1)
|
||||
const [end, setEnd] = useState(50)
|
||||
const [loading, setLoading] = useState(false)
|
||||
const [showExpandedChunkDialog, setShowExpandedChunkDialog] = useState(false)
|
||||
const [expandedChunkDialogProps, setExpandedChunkDialogProps] = useState({})
|
||||
const [fileNames, setFileNames] = useState([])
|
||||
|
||||
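// Open the expanded chunk dialog, numbering the chunk relative to the current page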
const chunkSelected = (chunkId) => {
|
||||
const selectedChunk = documentChunks.find((chunk) => chunk.id === chunkId)
|
||||
const selectedChunkNumber = documentChunks.findIndex((chunk) => chunk.id === chunkId) + start
|
||||
const dialogProps = {
|
||||
data: {
|
||||
selectedChunk,
|
||||
selectedChunkNumber
|
||||
}
|
||||
}
|
||||
setExpandedChunkDialogProps(dialogProps)
|
||||
setShowExpandedChunkDialog(true)
|
||||
}
|
||||
|
||||
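// Persist an edited chunk (page content and metadata), then reload the current page of chunks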
const onChunkEdit = async (newPageContent, newMetadata, chunk) => {
|
||||
setLoading(true)
|
||||
setShowExpandedChunkDialog(false)
|
||||
try {
|
||||
const editResp = await documentsApi.editChunkFromStore(
|
||||
chunk.storeId,
|
||||
chunk.docId,
|
||||
chunk.id,
|
||||
{ pageContent: newPageContent, metadata: newMetadata },
|
||||
true
|
||||
)
|
||||
if (editResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'Document chunk successfully edited!',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
getChunksApi.request(storeId, fileId, currentPage)
|
||||
}
|
||||
setLoading(false)
|
||||
} catch (error) {
|
||||
setLoading(false)
|
||||
enqueueSnackbar({
|
||||
message: `Failed to edit chunk: ${
|
||||
typeof error.response.data === 'object' ? error.response.data.message : error.response.data
|
||||
}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
const onDeleteChunk = async (chunk) => {
|
||||
const confirmPayload = {
|
||||
title: `Delete`,
|
||||
description: `Delete chunk ${chunk.id}? This action cannot be undone.`,
|
||||
confirmButtonName: 'Delete',
|
||||
cancelButtonName: 'Cancel'
|
||||
}
|
||||
const isConfirmed = await confirm(confirmPayload)
|
||||
|
||||
if (isConfirmed) {
|
||||
setLoading(true)
|
||||
setShowExpandedChunkDialog(false)
|
||||
try {
|
||||
const delResp = await documentsApi.deleteChunkFromStore(chunk.storeId, chunk.docId, chunk.id)
|
||||
if (delResp.data) {
|
||||
enqueueSnackbar({
|
||||
message: 'Document chunk successfully deleted!',
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'success',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
getChunksApi.request(storeId, fileId, currentPage)
|
||||
}
|
||||
setLoading(false)
|
||||
} catch (error) {
|
||||
setLoading(false)
|
||||
enqueueSnackbar({
|
||||
message: `Failed to delete chunk: ${
|
||||
typeof error.response.data === 'object' ? error.response.data.message : error.response.data
|
||||
}`,
|
||||
options: {
|
||||
key: new Date().getTime() + Math.random(),
|
||||
variant: 'error',
|
||||
action: (key) => (
|
||||
<Button style={{ color: 'white' }} onClick={() => closeSnackbar(key)}>
|
||||
<IconX />
|
||||
</Button>
|
||||
)
|
||||
}
|
||||
})
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
setLoading(true)
|
||||
getChunksApi.request(storeId, fileId, currentPage)
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [])
|
||||
|
||||
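// Load the requested page of stored chunks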
const changePage = (newPage) => {
|
||||
setLoading(true)
|
||||
setCurrentPage(newPage)
|
||||
getChunksApi.request(storeId, fileId, newPage)
|
||||
}
|
||||
|
||||
useEffect(() => {
|
||||
if (getChunksApi.data) {
|
||||
const data = getChunksApi.data
|
||||
setTotalChunks(data.count)
|
||||
setDocumentChunks(data.chunks)
|
||||
setLoading(false)
|
||||
setCurrentPage(data.currentPage)
|
||||
setStart(data.currentPage * 50 - 49)
|
||||
setEnd(data.currentPage * 50 > data.count ? data.count : data.currentPage * 50)
|
||||
if (data.file?.files && data.file.files.length > 0) {
|
||||
const fileNames = []
|
||||
for (const attachedFile of data.file.files) {
|
||||
fileNames.push(attachedFile.name)
|
||||
}
|
||||
setFileNames(fileNames)
|
||||
}
|
||||
}
|
||||
|
||||
// eslint-disable-next-line react-hooks/exhaustive-deps
|
||||
}, [getChunksApi.data])
|
||||
|
||||
return (
|
||||
<>
|
||||
<MainCard style={{ position: 'relative' }}>
|
||||
<Stack flexDirection='column' sx={{ gap: 1 }}>
|
||||
<ViewHeader
|
||||
isBackButton={true}
|
||||
search={false}
|
||||
title={getChunksApi.data?.file?.loaderName || getChunksApi.data?.storeName}
|
||||
description={getChunksApi.data?.file?.splitterName || getChunksApi.data?.description}
|
||||
onBack={() => navigate(-1)}
|
||||
></ViewHeader>
|
||||
<div style={{ width: '100%' }}>
|
||||
{fileNames.length > 0 && (
|
||||
<Grid sx={{ mt: 1 }} container>
|
||||
{fileNames.map((fileName, index) => (
|
||||
<div
|
||||
key={index}
|
||||
style={{
|
||||
paddingLeft: '15px',
|
||||
paddingRight: '15px',
|
||||
paddingTop: '10px',
|
||||
paddingBottom: '10px',
|
||||
fontSize: '0.9rem',
|
||||
width: 'max-content',
|
||||
borderRadius: '25px',
|
||||
boxShadow: customization.isDarkMode
|
||||
? '0 2px 14px 0 rgb(255 255 255 / 20%)'
|
||||
: '0 2px 14px 0 rgb(32 40 45 / 20%)',
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center',
|
||||
marginRight: '10px'
|
||||
}}
|
||||
>
|
||||
{fileName}
|
||||
</div>
|
||||
))}
|
||||
</Grid>
|
||||
)}
|
||||
<div
|
||||
style={{
|
||||
width: '100%',
|
||||
display: 'flex',
|
||||
flexDirection: 'row',
|
||||
alignItems: 'center',
|
||||
alignContent: 'center',
|
||||
overflow: 'hidden',
|
||||
marginTop: 15,
|
||||
marginBottom: 10
|
||||
}}
|
||||
>
|
||||
<div style={{ marginRight: 20, display: 'flex', flexDirection: 'row', alignItems: 'center' }}>
|
||||
<IconButton
|
||||
size='small'
|
||||
onClick={() => changePage(currentPage - 1)}
|
||||
style={{ marginRight: 10 }}
|
||||
variant='outlined'
|
||||
disabled={currentPage === 1}
|
||||
>
|
||||
<IconChevronLeft
|
||||
color={
|
||||
customization.isDarkMode
|
||||
? currentPage === 1
|
||||
? '#616161'
|
||||
: 'white'
|
||||
: currentPage === 1
|
||||
? '#e0e0e0'
|
||||
: 'black'
|
||||
}
|
||||
/>
|
||||
</IconButton>
|
||||
Showing {Math.min(start, totalChunks)}-{end} of {totalChunks} chunks
|
||||
<IconButton
|
||||
size='small'
|
||||
onClick={() => changePage(currentPage + 1)}
|
||||
style={{ marginLeft: 10 }}
|
||||
variant='outlined'
|
||||
disabled={end >= totalChunks}
|
||||
>
|
||||
<IconChevronRight
|
||||
color={
|
||||
customization.isDarkMode
|
||||
? end >= totalChunks
|
||||
? '#616161'
|
||||
: 'white'
|
||||
: end >= totalChunks
|
||||
? '#e0e0e0'
|
||||
: 'black'
|
||||
}
|
||||
/>
|
||||
</IconButton>
|
||||
</div>
|
||||
<div style={{ marginRight: 20, display: 'flex', flexDirection: 'row', alignItems: 'center' }}>
|
||||
<IconLanguage style={{ marginRight: 10 }} size={20} />
|
||||
{getChunksApi.data?.file?.totalChars?.toLocaleString()} characters
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
<div>
|
||||
<Grid container spacing={2}>
|
||||
{!documentChunks.length && (
|
||||
<div
|
||||
style={{
|
||||
display: 'flex',
|
||||
flexDirection: 'column',
|
||||
alignItems: 'center',
|
||||
width: '100%'
|
||||
}}
|
||||
>
|
||||
<Box sx={{ mt: 5, p: 2, height: 'auto' }}>
|
||||
<img
|
||||
style={{ objectFit: 'cover', height: '16vh', width: 'auto' }}
|
||||
src={chunks_emptySVG}
|
||||
alt='chunks_emptySVG'
|
||||
/>
|
||||
</Box>
|
||||
<div>No Chunks</div>
|
||||
</div>
|
||||
)}
|
||||
{documentChunks.length > 0 &&
|
||||
documentChunks.map((row, index) => (
|
||||
<Grid item lg={4} md={4} sm={6} xs={6} key={index}>
|
||||
<CardWrapper
|
||||
content={false}
|
||||
onClick={() => chunkSelected(row.id)}
|
||||
sx={{ border: 1, borderColor: theme.palette.grey[900] + 25, borderRadius: 2 }}
|
||||
>
|
||||
<Card>
|
||||
<CardContent sx={{ p: 2 }}>
|
||||
<Typography sx={{ wordWrap: 'break-word', mb: 1 }} variant='h5'>
|
||||
{`#${row.chunkNo}. Characters: ${row.pageContent.length}`}
|
||||
</Typography>
|
||||
<Typography sx={{ wordWrap: 'break-word' }} variant='body2'>
|
||||
{row.pageContent}
|
||||
</Typography>
|
||||
<ReactJson
|
||||
theme={customization.isDarkMode ? 'ocean' : 'rjv-default'}
|
||||
style={{ paddingTop: 10 }}
|
||||
src={row.metadata ? JSON.parse(row.metadata) : {}}
|
||||
name={null}
|
||||
quotesOnKeys={false}
|
||||
enableClipboard={false}
|
||||
displayDataTypes={false}
|
||||
collapsed={1}
|
||||
/>
|
||||
</CardContent>
|
||||
</Card>
|
||||
</CardWrapper>
|
||||
</Grid>
|
||||
))}
|
||||
</Grid>
|
||||
</div>
|
||||
</Stack>
|
||||
</MainCard>
|
||||
<ConfirmDialog />
|
||||
<ExpandedChunkDialog
|
||||
show={showExpandedChunkDialog}
|
||||
dialogProps={expandedChunkDialogProps}
|
||||
onCancel={() => setShowExpandedChunkDialog(false)}
|
||||
onChunkEdit={(newPageContent, newMetadata, selectedChunk) => onChunkEdit(newPageContent, newMetadata, selectedChunk)}
|
||||
onDeleteChunk={(selectedChunk) => onDeleteChunk(selectedChunk)}
|
||||
></ExpandedChunkDialog>
|
||||
{loading && <BackdropLoader open={loading} />}
|
||||
</>
|
||||
)
|
||||
}
|
||||
|
||||
export default ShowStoredChunks
|
||||
|
|
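For reviewers: the "Showing X-Y of N chunks" label above is driven by the start/end values set in the getChunksApi effect, with a fixed page size of 50. A minimal sketch of that windowing arithmetic follows, using illustrative names (PAGE_SIZE, getChunkWindow) that are not part of this PR.

// Illustrative sketch only, not part of the diff: the 1-based chunk window the
// view derives from the paginated API response (page size of 50, as above).
const PAGE_SIZE = 50

const getChunkWindow = (currentPage, totalCount) => {
    const start = currentPage * PAGE_SIZE - (PAGE_SIZE - 1)
    const end = Math.min(currentPage * PAGE_SIZE, totalCount)
    return { start, end }
}

// Example: with 120 chunks, page 3 renders "Showing 101-120 of 120 chunks"
// getChunkWindow(3, 120) -> { start: 101, end: 120 }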
@@ -0,0 +1,352 @@
import { useEffect, useState } from 'react'
import { useNavigate } from 'react-router-dom'
import { useSelector } from 'react-redux'

// material-ui
import {
    Box,
    Paper,
    Skeleton,
    Stack,
    Table,
    TableBody,
    TableCell,
    TableContainer,
    TableHead,
    TableRow,
    ToggleButton,
    ToggleButtonGroup,
    Typography
} from '@mui/material'
import { useTheme } from '@mui/material/styles'

// project imports
import MainCard from '@/ui-component/cards/MainCard'
import DocumentStoreCard from '@/ui-component/cards/DocumentStoreCard'
import { StyledButton } from '@/ui-component/button/StyledButton'
import AddDocStoreDialog from '@/views/docstore/AddDocStoreDialog'
import ErrorBoundary from '@/ErrorBoundary'
import ViewHeader from '@/layout/MainLayout/ViewHeader'
import DocumentStoreStatus from '@/views/docstore/DocumentStoreStatus'

// API
import useApi from '@/hooks/useApi'
import documentsApi from '@/api/documentstore'

// icons
import { IconPlus, IconLayoutGrid, IconList } from '@tabler/icons'
import doc_store_empty from '@/assets/images/doc_store_empty.svg'

// const
import { baseURL, gridSpacing } from '@/store/constant'

// ==============================|| DOCUMENTS ||============================== //

const Documents = () => {
    const theme = useTheme()
    const customization = useSelector((state) => state.customization)

    const navigate = useNavigate()
    const getAllDocumentStores = useApi(documentsApi.getAllDocumentStores)

    const [error, setError] = useState(null)
    const [isLoading, setLoading] = useState(true)
    const [images, setImages] = useState({})
    const [search, setSearch] = useState('')
    const [showDialog, setShowDialog] = useState(false)
    const [dialogProps, setDialogProps] = useState({})
    const [docStores, setDocStores] = useState([])
    const [view, setView] = useState(localStorage.getItem('docStoreDisplayStyle') || 'card')

    const handleChange = (event, nextView) => {
        if (nextView === null) return
        localStorage.setItem('docStoreDisplayStyle', nextView)
        setView(nextView)
    }

    function filterDocStores(data) {
        return data.name.toLowerCase().indexOf(search.toLowerCase()) > -1
    }

    const onSearchChange = (event) => {
        setSearch(event.target.value)
    }

    const goToDocumentStore = (id) => {
        navigate('/document-stores/' + id)
    }

    const addNew = () => {
        const dialogProp = {
            title: 'Add New Document Store',
            type: 'ADD',
            cancelButtonName: 'Cancel',
            confirmButtonName: 'Add'
        }
        setDialogProps(dialogProp)
        setShowDialog(true)
    }

    const onConfirm = () => {
        setShowDialog(false)
        getAllDocumentStores.request()
    }

    useEffect(() => {
        getAllDocumentStores.request()

        // eslint-disable-next-line react-hooks/exhaustive-deps
    }, [])

    useEffect(() => {
        if (getAllDocumentStores.data) {
            try {
                const docStores = getAllDocumentStores.data
                if (!Array.isArray(docStores)) return
                const loaderImages = {}

                for (let i = 0; i < docStores.length; i += 1) {
                    const loaders = docStores[i].loaders ?? []

                    let totalChunks = 0
                    let totalChars = 0
                    loaderImages[docStores[i].id] = []
                    for (let j = 0; j < loaders.length; j += 1) {
                        const imageSrc = `${baseURL}/api/v1/node-icon/${loaders[j].loaderId}`
                        if (!loaderImages[docStores[i].id].includes(imageSrc)) {
                            loaderImages[docStores[i].id].push(imageSrc)
                        }
                        totalChunks += loaders[j]?.totalChunks ?? 0
                        totalChars += loaders[j]?.totalChars ?? 0
                    }
                    docStores[i].totalDocs = loaders?.length ?? 0
                    docStores[i].totalChunks = totalChunks
                    docStores[i].totalChars = totalChars
                }
                setDocStores(docStores)
                setImages(loaderImages)
            } catch (e) {
                console.error(e)
            }
        }
    }, [getAllDocumentStores.data])

    useEffect(() => {
        setLoading(getAllDocumentStores.loading)
    }, [getAllDocumentStores.loading])

    useEffect(() => {
        setError(getAllDocumentStores.error)
    }, [getAllDocumentStores.error])

    return (
        <MainCard>
            {error ? (
                <ErrorBoundary error={error} />
            ) : (
                <Stack flexDirection='column' sx={{ gap: 3 }}>
                    <ViewHeader onSearchChange={onSearchChange} search={true} searchPlaceholder='Search Name' title='Document Store'>
                        <ToggleButtonGroup
                            sx={{ borderRadius: 2, maxHeight: 40 }}
                            value={view}
                            color='primary'
                            exclusive
                            onChange={handleChange}
                        >
                            <ToggleButton
                                sx={{
                                    borderColor: theme.palette.grey[900] + 25,
                                    borderRadius: 2,
                                    color: theme?.customization?.isDarkMode ? 'white' : 'inherit'
                                }}
                                variant='contained'
                                value='card'
                                title='Card View'
                            >
                                <IconLayoutGrid />
                            </ToggleButton>
                            <ToggleButton
                                sx={{
                                    borderColor: theme.palette.grey[900] + 25,
                                    borderRadius: 2,
                                    color: theme?.customization?.isDarkMode ? 'white' : 'inherit'
                                }}
                                variant='contained'
                                value='list'
                                title='List View'
                            >
                                <IconList />
                            </ToggleButton>
                        </ToggleButtonGroup>
                        <StyledButton
                            variant='contained'
                            sx={{ borderRadius: 2, height: '100%' }}
                            onClick={addNew}
                            startIcon={<IconPlus />}
                            id='btn_createVariable'
                        >
                            Add New
                        </StyledButton>
                    </ViewHeader>
                    {!view || view === 'card' ? (
                        <>
                            {isLoading && !docStores ? (
                                <Box display='grid' gridTemplateColumns='repeat(3, 1fr)' gap={gridSpacing}>
                                    <Skeleton variant='rounded' height={160} />
                                    <Skeleton variant='rounded' height={160} />
                                    <Skeleton variant='rounded' height={160} />
                                </Box>
                            ) : (
                                <Box display='grid' gridTemplateColumns='repeat(3, 1fr)' gap={gridSpacing}>
                                    {docStores?.filter(filterDocStores).map((data, index) => (
                                        <DocumentStoreCard
                                            key={index}
                                            images={images[data.id]}
                                            data={data}
                                            onClick={() => goToDocumentStore(data.id)}
                                        />
                                    ))}
                                </Box>
                            )}
                        </>
                    ) : (
                        <TableContainer sx={{ border: 1, borderColor: theme.palette.grey[900] + 25, borderRadius: 2 }} component={Paper}>
                            <Table aria-label='documents table'>
                                <TableHead
                                    sx={{
                                        backgroundColor: customization.isDarkMode ? theme.palette.common.black : theme.palette.grey[100],
                                        height: 56
                                    }}
                                >
                                    <TableRow>
                                        <TableCell> </TableCell>
                                        <TableCell>Name</TableCell>
                                        <TableCell>Description</TableCell>
                                        <TableCell>Connected flows</TableCell>
                                        <TableCell>Total characters</TableCell>
                                        <TableCell>Total chunks</TableCell>
                                        <TableCell>Loader types</TableCell>
                                    </TableRow>
                                </TableHead>
                                <TableBody>
                                    {docStores?.filter(filterDocStores).map((data, index) => (
                                        <TableRow
                                            onClick={() => goToDocumentStore(data.id)}
                                            hover
                                            key={index}
                                            sx={{ cursor: 'pointer', '&:last-child td, &:last-child th': { border: 0 } }}
                                        >
                                            <TableCell align='center'>
                                                <DocumentStoreStatus isTableView={true} status={data.status} />
                                            </TableCell>
                                            <TableCell>
                                                <Typography
                                                    sx={{
                                                        display: '-webkit-box',
                                                        WebkitLineClamp: 5,
                                                        WebkitBoxOrient: 'vertical',
                                                        textOverflow: 'ellipsis',
                                                        overflow: 'hidden'
                                                    }}
                                                >
                                                    {data.name}
                                                </Typography>
                                            </TableCell>
                                            <TableCell>
                                                <Typography
                                                    sx={{
                                                        display: '-webkit-box',
                                                        WebkitLineClamp: 5,
                                                        WebkitBoxOrient: 'vertical',
                                                        textOverflow: 'ellipsis',
                                                        overflow: 'hidden'
                                                    }}
                                                >
                                                    {data?.description}
                                                </Typography>
                                            </TableCell>
                                            <TableCell>{data.whereUsed?.length ?? 0}</TableCell>
                                            <TableCell>{data.totalChars}</TableCell>
                                            <TableCell>{data.totalChunks}</TableCell>
                                            <TableCell>
                                                {images[data.id] && (
                                                    <Box
                                                        sx={{
                                                            display: 'flex',
                                                            alignItems: 'center',
                                                            justifyContent: 'start',
                                                            gap: 1
                                                        }}
                                                    >
                                                        {images[data.id].slice(0, 3).map((img) => (
                                                            <Box
                                                                key={img}
                                                                sx={{
                                                                    width: 30,
                                                                    height: 30,
                                                                    borderRadius: '50%',
                                                                    backgroundColor: customization.isDarkMode
                                                                        ? theme.palette.common.white
                                                                        : theme.palette.grey[300] + 75
                                                                }}
                                                            >
                                                                <img
                                                                    style={{
                                                                        width: '100%',
                                                                        height: '100%',
                                                                        padding: 5,
                                                                        objectFit: 'contain'
                                                                    }}
                                                                    alt=''
                                                                    src={img}
                                                                />
                                                            </Box>
                                                        ))}
                                                        {images[data.id].length > 3 && (
                                                            <Typography
                                                                sx={{
                                                                    alignItems: 'center',
                                                                    display: 'flex',
                                                                    fontSize: '.9rem',
                                                                    fontWeight: 200
                                                                }}
                                                            >
                                                                + {images[data.id].length - 3} More
                                                            </Typography>
                                                        )}
                                                    </Box>
                                                )}
                                            </TableCell>
                                        </TableRow>
                                    ))}
                                </TableBody>
                            </Table>
                        </TableContainer>
                    )}
                    {!isLoading && (!docStores || docStores.length === 0) && (
                        <Stack sx={{ alignItems: 'center', justifyContent: 'center' }} flexDirection='column'>
                            <Box sx={{ p: 2, height: 'auto' }}>
                                <img
                                    style={{ objectFit: 'cover', height: '16vh', width: 'auto' }}
                                    src={doc_store_empty}
                                    alt='doc_store_empty'
                                />
                            </Box>
                            <div>No Document Stores Created Yet</div>
                        </Stack>
                    )}
                </Stack>
            )}
            {showDialog && (
                <AddDocStoreDialog
                    dialogProps={dialogProps}
                    show={showDialog}
                    onCancel={() => setShowDialog(false)}
                    onConfirm={onConfirm}
                />
            )}
        </MainCard>
    )
}

export default Documents
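The second useEffect above decorates each store with its totals before the card or table view renders. A compact restatement of that per-store aggregation as a pure helper is sketched below (aggregateLoaderStats is an illustrative name, not an API added by this PR).

// Illustrative sketch only: sums chunk/character totals and collects de-duplicated
// loader icon URLs for a single store, mirroring the loop in the effect above.
const aggregateLoaderStats = (loaders = [], baseURL) => {
    const icons = []
    let totalChunks = 0
    let totalChars = 0
    for (const loader of loaders) {
        const imageSrc = `${baseURL}/api/v1/node-icon/${loader.loaderId}`
        if (!icons.includes(imageSrc)) icons.push(imageSrc)
        totalChunks += loader?.totalChunks ?? 0
        totalChars += loader?.totalChars ?? 0
    }
    return { icons, totalDocs: loaders.length, totalChunks, totalChars }
}

// Usage: const stats = aggregateLoaderStats(store.loaders, baseURL)
// -> { icons: [...unique icon URLs], totalDocs, totalChunks, totalChars }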
@@ -187,25 +187,7 @@ const AddEditVariableDialog = ({ show, dialogProps, onCancel, onConfirm, setErro
         >
             <DialogTitle sx={{ fontSize: '1rem' }} id='alert-dialog-title'>
                 <div style={{ display: 'flex', flexDirection: 'row', alignItems: 'center' }}>
-                    <div
-                        style={{
-                            width: 50,
-                            height: 50,
-                            marginRight: 10,
-                            borderRadius: '50%',
-                            backgroundColor: 'white'
-                        }}
-                    >
-                        <IconVariable
-                            style={{
-                                width: '100%',
-                                height: '100%',
-                                padding: 7,
-                                borderRadius: '50%',
-                                objectFit: 'contain'
-                            }}
-                        />
-                    </div>
+                    <IconVariable style={{ marginRight: '10px' }} />
                     {dialogProps.type === 'ADD' ? 'Add Variable' : 'Edit Variable'}
                 </div>
             </DialogTitle>
pnpm-lock.yaml (64657): file diff suppressed because it is too large.